Introduced in 853269aeb0
Prior to this commit, the PBVH partitioning process did not work
correctly for multires meshes with materials. Specifically, it failed
when mapping the partitioned faces to their corresponding corners.
The rough process here is as follows:
* Flatten the array of face indices into an array of corner indices.
* Sum each `GridsNode`'s `prim_indices` corner count into an array.
* Create an `OffsetIndices` from these sums.
* Use the `OffsetIndices` to slice the array created at the beginning
  and assign each slice to its node.
However, this process requires that the main PBVH array of corner
indices is in the same order as iterating over the nodes, which the
partitioning algorithm does not guarantee.
To solve this, the commit iterates over each node's `prim_indices`
`Span` in the same order that the nodes are stored when flattening the
corner data, ensuring a correct mapping.
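A minimal standalone sketch of the corrected flattening order, using
std:: containers in place of Blender's `Span`/`OffsetIndices`; the
types, names and the fixed corners-per-face assumption are illustrative,
not the actual PBVH code:

```cpp
#include <span>
#include <vector>

/* Illustrative stand-in for a node's face list. */
struct NodeSketch {
  std::vector<int> prim_indices; /* Faces owned by this node. */
};

static std::vector<int> flatten_corners(std::span<const NodeSketch> nodes,
                                        const int corners_per_face)
{
  /* Accumulate per-node corner counts into offsets (prefix sum). */
  std::vector<int> offsets(nodes.size() + 1, 0);
  for (size_t i = 0; i < nodes.size(); i++) {
    offsets[i + 1] = offsets[i] + int(nodes[i].prim_indices.size()) * corners_per_face;
  }

  /* Fill the flat corner array by iterating the nodes in the same order
   * they are stored, so slicing `corners` with `offsets` hands each node
   * exactly its own corners. */
  std::vector<int> corners(offsets.back());
  for (size_t i = 0; i < nodes.size(); i++) {
    int *dst = corners.data() + offsets[i];
    for (const int face : nodes[i].prim_indices) {
      for (int c = 0; c < corners_per_face; c++) {
        *dst++ = face * corners_per_face + c;
      }
    }
  }
  return corners;
}
```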
Pull Request: https://projects.blender.org/blender/blender/pulls/129392
The length checking wasn't accounting for null bytes within multi-byte
sequences and could step over them.
For BLI_strlen_utf8 this could result in an out-of-bounds read.
In practice most UTF8 data is validated, so the extra checks
are mainly to prevent errors on invalid or corrupt UTF8 text.
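A minimal sketch of null-aware stepping (not Blender's actual
implementation): instead of skipping a fixed sequence length derived
from the lead byte, only consume bytes that really are continuation
bytes, so an embedded null terminator is never stepped over:

```cpp
#include <cstddef>

static size_t strlen_utf8_sketch(const char *str)
{
  const unsigned char *p = reinterpret_cast<const unsigned char *>(str);
  size_t len = 0;
  while (*p) {
    p++; /* Consume the lead (or ASCII) byte. */
    /* Consume continuation bytes only; a null byte is not a
     * continuation byte, so a truncated or corrupt sequence stops
     * here instead of reading out of bounds. */
    while ((*p & 0xC0) == 0x80) {
      p++;
    }
    len++;
  }
  return len;
}
```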
Python wheels from extensions were not being removed after
install/uninstall in some cases, although installing an extension
afterwards that used wheels would recalculate deps & remove them.
- Installing an extension didn't include it in the
  compatibility-cache, so uninstalling didn't remove its deps.
- Uninstalling an extension wasn't re-calculating the deps,
  leaving them as-is.
Always write the compatibility-cache after installing and uninstalling
so changes are detected & handled.
This PR fixes a latent issue arising from invalid use of `accept_any_intersection(true)` when performing SSS ray-stepping with MetalRT. The comment incorrectly states that "we can optimize and accept the first hit", but to guarantee correct behaviour in future we need to request the closest hit.
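A Metal Shading Language sketch of the change in intent (illustrative
only; it assumes Apple's `metal::raytracing` intersector API rather
than Cycles' actual kernel code, and the function name is made up):

```cpp
#include <metal_raytracing>
using namespace metal::raytracing;

/* For SSS ray-stepping the hit must be deterministic: accepting any
 * intersection may return an arbitrary hit, so request the closest one. */
intersection_result<triangle_data> sss_step_hit(
    ray r, primitive_acceleration_structure accel)
{
  intersector<triangle_data> isect;
  isect.accept_any_intersection(false); /* Closest hit, not first-found. */
  return isect.intersect(r, accel);
}
```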
Python scripts could perform actions that created notifiers
which would not be handled until the script was complete.
In the case of adding & removing objects a notifier would be created
for adding the object, then cleared when the ID was removed.
This led to the notifier queue filling up with cleared notifiers
which were included in the search whenever an ID was removed.
The result was that adding and removing objects from a script
would become increasingly slow & use more memory.
Resolve by storing the current notifier being handled, which isn't
freed (only cleared). The notifier handling loop detects cleared
notifiers and frees them after use.
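A simplified sketch of the handling loop described above (illustrative
types, not Blender's `wmNotifier` API): clearing only nulls the
reference, and the loop itself frees entries after use, so removing an
ID never frees the notifier currently being handled:

```cpp
#include <deque>

struct NotifierSketch {
  const void *reference; /* The ID this notifier refers to; null once cleared. */
};

static NotifierSketch *current_notifier = nullptr;

/* Called when an ID is removed: clear matching notifiers (including the
 * one currently being handled) instead of freeing them. */
static void clear_notifiers_for_id(std::deque<NotifierSketch *> &queue, const void *id)
{
  for (NotifierSketch *note : queue) {
    if (note->reference == id) {
      note->reference = nullptr;
    }
  }
  if (current_notifier && current_notifier->reference == id) {
    current_notifier->reference = nullptr;
  }
}

static void handle_notifiers(std::deque<NotifierSketch *> &queue)
{
  while (!queue.empty()) {
    current_notifier = queue.front();
    queue.pop_front();
    if (current_notifier->reference != nullptr) {
      /* ... dispatch to listeners; scripts run here may remove IDs and
       * clear `current_notifier`, safe because clearing doesn't free. */
    }
    delete current_notifier; /* Freed after use, whether cleared or handled. */
    current_notifier = nullptr;
  }
}
```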
Remove a workaround for #23871 which manipulated the "__main__" module
to prevent classes defined in the text editor from having their
name-space cleared.
This caused the "multiprocessing" module to store the "__main__" module
as "__mp_main__" for later use.
Accessing attributes from this module would then attempt to read from
a null "mp_dict", which crashed. This happened when showing the
extensions preferences but would have occurred if "__mp_main__" was
accessed from elsewhere too.
Resolve by removing the workaround, as it hasn't been needed
since Python 3.2.