This commit improves DNA parsing resilience to data corruption in two ways:
* Detect and abort on failure to allocate the requested amount of memory.
* Detect multiple usages of the same `type_index` by different struct
definitions.
The second part fixes the `dna_genfile.cc:1918:40.blend` case reported
in #137870.
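The duplicate-`type_index` detection can be sketched roughly as follows (illustrative Python; the real check lives in the C++ DNA parser and uses different names):

```python
# Hypothetical sketch of the duplicate type_index check. Names are
# illustrative, not Blender's actual dna_genfile internals.
def check_struct_type_indices(structs):
    """Abort parsing if two struct definitions claim the same type_index."""
    seen = {}
    for name, type_index in structs:
        if type_index in seen:
            raise ValueError(
                f"type_index {type_index} used by both "
                f"{seen[type_index]!r} and {name!r}: corrupt DNA")
        seen[type_index] = name
```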
Pull Request: https://projects.blender.org/blender/blender/pulls/139803
The Movie Distortion node crops its data if the movie size differs from
the input size. That's because boundary extensions do not take the
calibration size into account. To fix this, we use the same coordinate
range as the distortion grid computation, which computes the distortion
in the space of the calibration size.
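Conceptually, the fix evaluates distortion over coordinates expressed in calibration space; a hypothetical sketch of such a mapping (the function name and exact mapping are illustrative, not the compositor's code):

```python
def to_calibration_space(x, y, input_size, calibration_size):
    # Map a pixel coordinate from the input image into the space of the
    # calibration (movie clip) size, so the distortion is evaluated over
    # the same coordinate range as the distortion grid computation.
    # Purely illustrative; not the actual compositor implementation.
    iw, ih = input_size
    cw, ch = calibration_size
    return (x / iw * cw, y / ih * ch)
```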
Pull Request: https://projects.blender.org/blender/blender/pulls/139822
Adds a new operator in Grease Pencil edit mode to convert between curve
types. This acts as a replacement for the `Set Curve Type` operator, as
the new operator better aligns with previous workflows and artist
expectations. Specifically, it uses a threshold to adjust how well the
resulting curves fit the original.
It can be found in the `Stroke` > `Convert Type` menu.
This operator aims at keeping visual fidelity between the curves. When
converting to a non-poly curve type, a `threshold` parameter dictates
how closely the shapes will match: a value of zero means an almost
perfect match, while higher values result in less accuracy but a lower
control point count.
The conversion to `Catmull-Rom` does not do actual curve fitting.
For now, it resamples the curves and then does an adaptive
simplification of the line (using the threshold parameter)
to simulate curve fitting.
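The adaptive simplification step is in the spirit of Ramer-Douglas-Peucker: keep a point only if it deviates from the current chord by more than the threshold. A rough Python sketch of that idea (not the actual Blender implementation):

```python
import math

def _point_segment_distance(p, a, b):
    # Distance from point p to the segment a-b.
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify_adaptive(points, threshold):
    # Ramer-Douglas-Peucker style simplification: drop points that stay
    # within `threshold` of the chord, recurse around the farthest one.
    if len(points) < 3:
        return list(points)
    max_dist, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _point_segment_distance(points[i], points[0], points[-1])
        if d > max_dist:
            max_dist, index = d, i
    if max_dist <= threshold:
        return [points[0], points[-1]]
    left = simplify_adaptive(points[:index + 1], threshold)
    right = simplify_adaptive(points[index:], threshold)
    return left[:-1] + right
```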
The `Set Curve Type` operator is no longer exposed in the
`Stroke` menu.
This also adds a new `geometry::fit_curves` function.
The function fits a selection of curves to Bézier curves. The
selected curves are treated as if they were poly curves.
The `thresholds` virtual array holds, per curve, the error threshold
distance that the fit should stay within. The virtual array is assumed
to have the same size as the total number of input curves.
The `corners` virtual array allows specific input points to be treated
as sharp corners. The resulting Bézier curve will include these points,
with their handles set to "Free".
There are two fitting methods:
* **Split**: Uses a least squares solver to find the control
points (faster, but less accurate).
* **Refit**: Iteratively removes knots with the least error starting
with a dense curve (slower, more accurate fit).
Co-authored-by: Casey Bianco-Davis <caseycasey739@gmail.com>
Co-authored-by: Hans Goudey <hans@blender.org>
Pull Request: https://projects.blender.org/blender/blender/pulls/137808
Set the `UI_BLOCK_LIST_ITEM` flag for the block; this assigns the
`UI_BUT_LIST_ITEM` flag to tree view label buttons (see `uiItemL_()`).
That way `wcol_list_item` is used for tree views
(see `widget_state()` / `widget_state_label()`).
Pull Request: https://projects.blender.org/blender/blender/pulls/126026
This avoids having to compile specializations just-in-time and
uses the same API as subprocess compilation.
This bridges the gap between subprocess and threaded
compilation.
Pull Request: https://projects.blender.org/blender/blender/pulls/139702
It would end up using the same frame twice, which looked like "freeze
frames".
Apparently we had a similar issue before, see 3f8ec963e3.
Just a PoC to show that this looks like a precision/rounding issue when
getting a "working" `UsdTimeCode`.
In the modifier code, we do a round trip from frame to time
(in seconds, via `BKE_cachefile_time_offset`) and then back to frame
before storing that in `USDMeshReadParams`.
To avoid the precision loss, this PR introduces
`BKE_cachefile_frame_offset` to stay in the "frame" domain and
circumvent going through FPS altogether.
There might be better ways to let USD handle the "slightly off"
`UsdTimeCode`, though.
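A minimal sketch of the precision issue and the frame-domain workaround (illustrative Python; `FPS` and the helper names are stand-ins, not Blender's API):

```python
FPS = 29.97  # hypothetical scene frame rate; most non-trivial rates behave alike

def frame_to_time(frame, fps=FPS):
    # frame -> seconds, analogous in spirit to BKE_cachefile_time_offset.
    return frame / fps

def time_to_frame(time, fps=FPS):
    return time * fps

def frame_offset(frame, offset):
    # Staying in the frame domain (the idea behind the new
    # BKE_cachefile_frame_offset) never divides by FPS, so it is exact.
    return frame + offset

# Many frames do not survive the frame -> seconds -> frame round trip
# exactly; a result just below the intended frame makes UsdTimeCode land
# on the previous sample and shows up as a "freeze frame".
drifted = [f for f in range(1, 1000) if time_to_frame(frame_to_time(f)) != f]
```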
Pull Request: https://projects.blender.org/blender/blender/pulls/139793
Duplicating collections doesn't work when multiple collections are
selected; instead, only the first selected collection is duplicated.
This is now fixed by iterating over the list of selected collections
returned by `outliner_collect_selected_parent_collections` after
traversing. In that function, child collections are skipped if their
parent collection is already selected, which avoids generating extra
copies (i.e. it creates one copy of nested collections).
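The parent-already-selected filtering can be sketched like this (illustrative Python, not the outliner code):

```python
def collect_selected_parent_collections(selected, parent_of):
    # Keep only collections that have no selected ancestor, so duplication
    # creates exactly one copy of a nested selection.
    # Illustrative sketch of the outliner logic, not Blender's code.
    selected_set = set(selected)
    result = []
    for coll in selected:
        ancestor = parent_of.get(coll)
        while ancestor is not None and ancestor not in selected_set:
            ancestor = parent_of.get(ancestor)
        if ancestor is None:  # no selected ancestor: keep it
            result.append(coll)
    return result
```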
Resolves #139651
Pull Request: https://projects.blender.org/blender/blender/pulls/139719
Comparing the object with "edit_object" isn't correct as
multiple objects may be in edit-mode across multiple scenes.
Check the object for edit-mode data instead.
The annotation `Callable[[Any, ...], str | None]` is not supported by the
Python typing system: the `...` will be misinterpreted as an unknown type
instead of as a way to allow a variable number of arguments.
Ref !138804
Regression in [0] which would re-highlight gizmos when they had been
tagged for highlighting.
This caused highlighting to be recalculated unexpectedly while
blocking modal operators that used a timer were running.
The timer events would be passed through to the gizmo handler, which
then re-evaluated the highlighted gizmo based on the cursor position.
Resolve by skipping pass-through for gizmos.
[0]: f839847d3b4849425c3b06a52aae4361d384fea4
Object::actcol assignments from edit-mode data weren't clamping
the index to the valid range. This caused an out-of-bounds read when
accessing Object::matbits.
While material indices should typically be within the material bounds,
this isn't guaranteed. Selecting a face with a material index outside
the range, for example, would crash.
Add a utility function that sets the active material index to replace
existing inline checks.
Follow up to the fix for #139369.
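The clamping utility amounts to something like this (illustrative Python with a 1-based index, mirroring `Object::actcol`; names are stand-ins for the actual C++ helper):

```python
def set_active_material_index(actcol_candidate, material_count):
    # Clamp a 1-based active material index (Object::actcol-style) into
    # the valid range so later array accesses (e.g. Object::matbits)
    # cannot read out of bounds. Names are illustrative.
    if material_count <= 0:
        return 0  # no materials: no valid active index
    return max(1, min(actcol_candidate, material_count))
```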
This changes how tooltips for dragging multiple files are shown: a
`Documents` icon and a counter of how many files are dragged are
displayed. When multiple files are dragged from Blender's internal file
browser, this avoids showing the thumbnail of the file that was selected
to start dragging; if the selection is a single file, its thumbnail is
still visible.
Pull Request: https://projects.blender.org/blender/blender/pulls/136276
A small number of USD files in the wild contain invalid face index data
for some of their meshes. This leads to asserts in debug builds and
(sometimes) crashes for users in release builds. There is already an
import option, Validate Meshes, but it turns out that we, and most
other importers, perform validation too late: we crash before getting to
that validate option (see notes).
This PR implements a cheap detection mechanism and will auto-fix if we
detect broken data. The detection may not find all types of bad data,
but it will detect what is known to fail today: duplicate vertex
indices. We immediately validate/fix before loading in the rest of the
data. The downside is that no additional data will be loaded: normals,
edge creases, velocities, UVs, and all other attributes will be lost
because the incoming data arrays will no longer align.
It should be noted that Alembic has also chosen this approach. Its
check is significantly weaker, though, and can be improved separately if
needed.
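The cheap per-face scan for duplicate vertex indices can be sketched as (illustrative Python; the real check is C++ in the USD mesh reader):

```python
def face_has_duplicate_vertices(face_vertex_indices):
    return len(set(face_vertex_indices)) != len(face_vertex_indices)

def detect_invalid_faces(face_vertex_counts, face_vertex_indices):
    # Walk the flat faceVertexIndices array face by face and report which
    # faces are degenerate (too few vertices or repeated vertex indices).
    # Illustrative sketch, not the actual importer code.
    bad = []
    offset = 0
    for face_i, count in enumerate(face_vertex_counts):
        face = face_vertex_indices[offset:offset + count]
        if count < 3 or face_has_duplicate_vertices(face):
            bad.append(face_i)
        offset += count
    return bad
```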
If auto-fix is triggered, it will typically appear as one trace on the
terminal.
```
WARN (io.usd): <path...>\io\usd\intern\usd_reader_mesh.cc:684
read_mesh_sample: Invalid face data detected for mesh
'/degenerate/m_degenerate'. Automatic correction will be used.
```
A more general downside of these fixes is that this applies to each
frame of animated mesh data. The mesh will be fixed, and re-fixed, on
every frame update when the frame in question contains bad data.
For well-behaved USD scenes, the penalty for this check is between 2-4%.
For broken USD scenes, it depends on how many meshes need the fixup. In
the case of the Intel 4004 Moore Lane scene, the penalty is a 2.7x
slowdown in import time (4.5 s to 12.5 s).
Pull Request: https://projects.blender.org/blender/blender/pulls/138633
The checks and related warnings detecting usage of blendfiles generated
by newer versions of Blender were not fully behaving as expected for
libraries. In particular, opening an older main blendfile linking
against newer library ones would not always detect and report the
`has_forward_compatibility_issues` status properly.
Found out while working on 'longer ID names' compatibility PR for 4.5
(!139336).
Compilation constants are constants defined in the create info.
They cannot be changed after the shader is created.
They are a replacement for macros, with added type safety.
This reuses most of the logic from specialization constants.
Pull Request: https://projects.blender.org/blender/blender/pulls/139703
The "All Libraries" library didn't free its assets correctly on refresh,
so the asset previews didn't refresh correctly either. That's because it
didn't forward the removal request to the asset library that actually
owns the asset. It only freed assets from its own storage, which is
always empty.
This might make refreshing the "All Libraries" library feel a little
slower, since previews are now refreshed too. But in general this is
still fairly fast, and there's an optimization to only load visible
previews.
This adds a new function to query a `GPUTexture` from an
Image data-block without actually creating it.
This allows keeping track of all the textures that
need to be loaded and deferring their loading to
end_sync. The textures are then only used in the
next sync, because we do not want to stage a
texture for drawing when that would require a
valid texture.
Multithreading is used to load the textures from disk
as soon as possible. Loading is still blocking, but
much faster (depending on hardware).
Before: 5.7s. After: 2.5s.
On a Linux workstation: 2.28x speedup in texture loading.
On an M1 MacBook Pro: 2.72x speedup in texture loading.
This includes redraw overhead but it is not super significant.
Having a vector of all the textures to be loaded
will eventually be helpful in making texture
uploading multi-threaded. Currently, that is a bit
difficult given the need for a valid GPUContext
per thread.
- [x] Bypass deferred loading on animated textures
- [x] Add throttling to only load a few textures per frame
- [x] Do not delay for viewport render
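The deferred-loading idea, with per-frame throttling and threaded disk reads, can be sketched as (illustrative Python; the real code uses Blender's draw-manager types and a `GPUContext`-aware scheme):

```python
from concurrent.futures import ThreadPoolExecutor

class DeferredTextureLoader:
    """Collect texture requests during sync, load them in end_sync.

    Illustrative sketch only: `max_per_frame` models the throttling,
    and `load_fn` stands in for the actual disk/GPU upload work.
    """

    def __init__(self, max_per_frame=8):
        self.pending = []
        self.max_per_frame = max_per_frame

    def request(self, image_name):
        # Called during sync; does not create or load anything yet.
        self.pending.append(image_name)

    def end_sync(self, load_fn):
        # Throttle: only a few textures per frame. Loading is threaded,
        # but still blocks until this frame's batch is done.
        batch = self.pending[:self.max_per_frame]
        self.pending = self.pending[self.max_per_frame:]
        with ThreadPoolExecutor() as pool:
            return list(pool.map(load_fn, batch))
```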
Pull Request: https://projects.blender.org/blender/blender/pulls/139644
Legacy curves can carry material information, and the fill material is
especially useful for grease pencil. This patch converts the base color
of materials from legacy curves when converting to grease pencil.
Limitations:
- This patch does not take node-based materials into account.
- Neither legacy curves nor grease pencil supports a per-stroke fill
attribute yet, thus the converted grease pencil will be shown as
either all fills or all strokes, depending on the configuration in the
original legacy curve object.
Pull Request: https://projects.blender.org/blender/blender/pulls/139212
Apparently the vertex group list was missing when converting meshes to
grease pencil, while all the attributes seem to be transferring just
fine. This is because of a missing `BKE_defgroup_copy_list` call. Now
all vertex group names show up correctly in the list after conversion.
Pull Request: https://projects.blender.org/blender/blender/pulls/139786
The "Transfer Mesh Data" operator only works on meshes; however, its
`poll` callback doesn't do complete checks for all selected objects
because that would be too slow. Now an error message is added when
invalid objects are encountered during data transfer (e.g. the target
object type is not a mesh): a report notifies users that some errors
have occurred, so there is less confusion.
Pull Request: https://projects.blender.org/blender/blender/pulls/139568
In the Geometry Nodes workspace, the viewport has a default value of 0.0
for `gpencil_vertex_paint_opacity`; this causes the material preview to
not show proper vertex colors even when strokes have color. Considering
this property is a bit obscure, setting a default value of 1.0 here
makes sense, and it's also consistent with the rest of the viewport
editors in other workspaces.
Pull Request: https://projects.blender.org/blender/blender/pulls/139356
`blender::geometry::fillet_curves` should check for situations where empty
curves are passed in (this could happen in geometry nodes) and in those
cases it should not run.
Pull Request: https://projects.blender.org/blender/blender/pulls/139787
If the compilation subprocess crashes due to an internal driver error,
the end semaphore is never signaled and leaves Blender hanging.
This replaces the `decrement` calls with `try_decrement` and checks
every second if the subprocess is lost.
Additionally, if the issue comes from a broken binary, the crashes will
happen every time a Blender session tries to load it.
So now we store the shader hash in the shared memory before trying to
load the binary. If the subprocess is lost mid-compilation, the main
process will delete the broken cached binary.
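The `try_decrement` polling loop amounts to something like this (illustrative Python using a semaphore timeout; names differ from Blender's):

```python
import threading

def wait_for_compile(end_semaphore, subprocess_alive, poll_seconds=1.0):
    # Instead of a blocking decrement that can hang forever, try to
    # decrement with a timeout and check whether the subprocess died.
    # Sketch of the idea only; names do not match Blender's internals.
    while True:
        if end_semaphore.acquire(timeout=poll_seconds):
            return True   # compilation finished and signaled the semaphore
        if not subprocess_alive():
            return False  # subprocess lost: discard the broken cached binary
```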
Pull Request: https://projects.blender.org/blender/blender/pulls/139747
Objects inside `lc.collection->gobject / te_parent->subtree` are not
ordered according to the object-parent hierarchy. Due to this, some
child object tree elements are iterated before their parent inside
`make_object_parent_hierarchy_collections` while building the hierarchy.
This ignores the case where objects are in a different collection
(let's say "Collection2"), see `if (!found)`. To fix this, store an
ordered list of the object hierarchy in a Vector `ordered_objects`.
This way, the parent element is added to Collection2 first.
Then, while iterating its children, `parent_ob_tree_elements` has more
than one element (i.e. it is possible to add the child object inside
Collection2).
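Building `ordered_objects` so parents precede children can be sketched as (illustrative Python, not the outliner code):

```python
def parents_first(objects, parent_of):
    # Order objects so every parent appears before its children, mirroring
    # the `ordered_objects` Vector idea. Illustrative sketch only.
    in_scope = set(objects)
    ordered, seen = [], set()

    def visit(ob):
        if ob in seen:
            return
        seen.add(ob)
        parent = parent_of.get(ob)
        if parent in in_scope:
            visit(parent)  # ensure the parent is emitted first
        ordered.append(ob)

    for ob in objects:
        visit(ob)
    return ordered
```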
Also resolves #100393
Steps to Reproduce: Just open the .blend from above report
Alternative for !136872
Pull Request: https://projects.blender.org/blender/blender/pulls/136971
Previously, dragging the screenshot selection area beyond Blender's
window borders would cause a crash. This PR addresses that issue by:
- Clamping the selection area to valid window bounds
- Displaying an "Invalid screenshot area" error when the selection is invalid
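The clamping plus invalid-area check can be sketched as (illustrative Python; names are stand-ins for the window-manager code):

```python
def clamp_screenshot_area(x, y, w, h, win_w, win_h):
    # Clamp a drag-selected rectangle to the window bounds; return None
    # when nothing valid remains (the "Invalid screenshot area" case).
    # Illustrative sketch, not the actual operator code.
    x0 = max(0, min(x, win_w))
    y0 = max(0, min(y, win_h))
    x1 = max(0, min(x + w, win_w))
    y1 = max(0, min(y + h, win_h))
    if x1 <= x0 or y1 <= y0:
        return None
    return (x0, y0, x1 - x0, y1 - y0)
```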
Pull Request: https://projects.blender.org/blender/blender/pulls/139680
The Cryptomatte pick layer in the compositor is wrongly displayed due to
color management getting applied even though it is non-color data. This
is because the non-color metadata member of viewer images was not set
if the image was newly created. To fix this, we make sure it gets set in
all code paths for the viewer.
The Cryptomatte node fails to work for some animated images in certain
cases. That's because the auto-refresh frame mechanism is not reliable.
To fix this, we use the context's frame to get the Cryptomatte layers.
When selecting gizmo nodes using box selection, circle selection, or
lasso selection, the gizmos don't show in the viewport.
This is fixed by notifying that the gizmo nodes have been updated to
redraw the gizmos using `WM_event_add_notifier`.
Pull Request: https://projects.blender.org/blender/blender/pulls/139728