Ensure that `__main__.__file__` is not set to "<blender string>" when
spawning the subprocess for the background downloader. Blender sets this
value when executing a Python string from C++ code. However, this value
causes issues with Python's `multiprocessing` module, which expects that,
if the attribute is set, it points to the actual file of the main
module. Clearing it works fine, as `__file__` is optional anyway.
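A minimal sketch of the workaround, assuming the check runs right before the
`multiprocessing` subprocess is spawned; apart from `__main__.__file__` and
the `"<blender string>"` placeholder, the names below are illustrative:

```python
import multiprocessing
import sys


def _clear_fake_main_file() -> None:
    main_module = sys.modules["__main__"]
    # Blender sets this placeholder when executing a Python string from C++.
    if getattr(main_module, "__file__", None) == "<blender string>":
        # `__file__` is optional, so removing it is safe and keeps
        # `multiprocessing` from treating the placeholder as a real path.
        del main_module.__file__


def start_background_downloader(target) -> multiprocessing.Process:
    _clear_fake_main_file()
    process = multiprocessing.Process(target=target)
    process.start()
    return process
```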
Pull Request: https://projects.blender.org/blender/blender/pulls/143985
Add a [Python code generator][1] that takes an OpenAPI definition and
outputs the corresponding data model as [dataclasses][2].
This is intended to be used in the Remote Asset Library project, to
create, download, parse, and validate information of a remote asset
library.
[1]: https://koxudaxi.github.io/datamodel-code-generator/
[2]: https://docs.python.org/3/library/dataclasses.html
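To give an idea of the output, a hypothetical generated model could look
roughly like this; the class and field names are invented for illustration
and do not come from the actual OpenAPI definition:

```python
from __future__ import annotations

from dataclasses import dataclass, field


# Hypothetical generator output, for illustration only.
@dataclass
class AssetLibraryMeta:
    name: str
    version: str


@dataclass
class AssetIndex:
    meta: AssetLibraryMeta
    assets: list[str] = field(default_factory=list)
```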
## Running the Generator
The generator is a Python script, which creates its own Python
virtualenv, installs the dependencies it needs, and then runs the
generator within that virtualenv.
The script is intended to run via the `generate_datamodels` CMake
target. For example, `ninja generate_datamodels` in the build
directory.
## Details
The virtualenv is created in Blender's build directory, and is not
cleaned up after running. This means that subsequent runs will just
use it directly, instead of reinstalling dependencies on every run.
## Generated Code & Interaction with Build System
It is my intention that the code generation _only_ happens when the
OpenAPI specification changes. This means that the generated code will
be committed to Git like any hand-written code. Building Blender will
therefore _not_ require the code generator to run. Only people working
on the area that uses the generated code will have to deal with this.
Pull Request: https://projects.blender.org/blender/blender/pulls/139495
Add a new package `scripts/modules/_bpy_internal/http`, containing
classes to download files via HTTP.
The code is intentionally put into the `_bpy_internal` package, as I
don't intend it to be the end-all-be-all of downloaders for general
use in add-ons. It's been written to support the Remote Asset Library
project (#134495), where it will be used to download JSON files (to
get the list of assets on the server) as well as the asset files
themselves.
The module consists of several parts. The main ones are:
`class ConditionalDownloader`
: File downloader, which downloads a URL to a file on disk.
It supports conditional requests via `ETag`/`If-None-Match` and
`Last-Modified`/`If-Modified-Since` HTTP headers (RFC 7232, section 3,
Precondition Header Fields). A `304 Not Modified` response is
treated as a successful download.
Metadata of the request (the response length in bytes, and the above
headers) is stored on disk, in a location that is determined by the
user of the class. In the future it might be nice to have a
single SQLite database for this (there's a TODO in the code about
this).
The downloader uses the Requests library, and manages its own HTTP
session object. This way it can handle TCP/IP connection reuse,
automatically retry failing connections, and, in the future, support
HTTP-level authentication.
`class BackgroundDownloader`
: Wrapper for a `ConditionalDownloader` that manages a background
process for the actual downloading.
It runs the downloader in a background process, while ensuring that
its reporters (see below) get called on the main process. This way
it's possible to do background downloading, while still receiving
progress reports in a modal operator, which in turn can directly
call Blender's Python API. Care was taken to [not use Python
threads][1].
`class DownloadReporter`
: Protocol class. Objects adhering to the protocol can be given to a
`ConditionalDownloader` or `BackgroundDownloader`. The protocol has
functions like `download_starts(…)`, `download_progress(…)`,
`download_error(…)`, which will be called by the downloader to
report on what it's doing.
I chose to make this a protocol, rather than an abstract superclass,
because that makes it possible to turn an Operator into a
DownloadReporter without requiring multiple inheritance (see the
sketch below).
[1]: https://docs.blender.org/api/main/info_gotchas_threading.html
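A minimal sketch of the two main ideas above, combining a reporter protocol
with a conditional GET via Requests. The exact callback signatures and the
metadata handling are assumptions; only the class/method names and the HTTP
headers come from the description:

```python
from typing import Protocol

import requests


class DownloadReporter(Protocol):
    def download_starts(self, url: str) -> None: ...
    def download_progress(self, url: str, bytes_downloaded: int) -> None: ...
    def download_error(self, url: str, error: Exception) -> None: ...


def conditional_download(
    session: requests.Session,
    url: str,
    filepath: str,
    reporter: DownloadReporter,
    etag: str | None = None,
    last_modified: str | None = None,
) -> None:
    # Send the stored validators so the server can answer `304 Not Modified`.
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    if last_modified:
        headers["If-Modified-Since"] = last_modified

    reporter.download_starts(url)
    try:
        response = session.get(url, headers=headers, stream=True, timeout=30)
        if response.status_code == 304:
            return  # Not Modified: the file on disk is still current.
        response.raise_for_status()

        bytes_downloaded = 0
        with open(filepath, "wb") as outfile:
            for chunk in response.iter_content(chunk_size=64 * 1024):
                outfile.write(chunk)
                bytes_downloaded += len(chunk)
                reporter.download_progress(url, bytes_downloaded)
        # The response's `ETag` / `Last-Modified` headers would be stored
        # here, for use in the next conditional request.
    except Exception as ex:
        reporter.download_error(url, ex)
```

Any object with matching methods (an Operator, for example) satisfies the
protocol without having to inherit from a common superclass.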
Pull Request: https://projects.blender.org/blender/blender/pulls/138327
e.g. stands for "exempli gratia" in Latin which means "for example".
The best way to make sure it makes sense when writing is to just expand
it to "for example". In these cases where the text was "for e.g.", that
leaves us with "for for example", which makes no sense. This commit fixes
all 110 cases, mostly by just replacing the words with "for example",
but also restructuring the text a bit more in a few cases, mostly by
moving "e.g." to the beginning of a list in parentheses.
Pull Request: https://projects.blender.org/blender/blender/pulls/139596
Exposes Bézier handle data through a new `GreasePencilStrokePointHandle`
class on each point, which provides `position`, `type`, and `select` for both
`point.handle_left` and `point.handle_right`.
Also removes the old separate point attributes:
* `handle_left_type`, `handle_right_type`
* `select_handle_left`, `select_handle_right`
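A short sketch of the new access pattern; the handle properties come from the
description above, while the data path used to reach a point is an assumption:

```python
import bpy

# Assumed data path to a stroke point; adjust to the actual object/layer/frame.
drawing = bpy.context.object.data.layers[0].frames[0].drawing
point = drawing.strokes[0].points[0]

# Replaces the removed per-point attributes listed above.
print(point.handle_left.position, point.handle_left.type, point.handle_left.select)
point.handle_right.select = True
```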
Pull Request: https://projects.blender.org/blender/blender/pulls/137780
This adds two new python properties to `GreasePencilStrokePoint`
* `select_handle_left`: The selection of the left handle of this point
* `select_handle_right`: The selection of the right handle of this point
These could already be written to using `drawing.attributes[".selection_handle_left"]`
and `drawing.attributes[".selection_handle_right"]`, but this makes it
easier to work on individual points from the higher-level
Python API.
Resolves #137639.
Pull Request: https://projects.blender.org/blender/blender/pulls/137730
Some wheels don't use "Purelib", meaning the directory layout of the
wheel needs to be manipulated on installation.
Add partial support for Python's "Binary distribution format" when
Purelib is false. Paths such as `includes` & `scripts` are not
remapped into user-accessible locations - they will remain in the
wheel's "*.data" directory. These could be supported if needed,
although it seems like a fairly niche use case.
Ref !135709
Support differentiating between portable & system installations,
useful to properly locate relative paths which would not work
on system installations.
Ref !133143
addon_utils.enable/disable now handle wheels so that the functions can
be used with extension add-ons.
A new argument `refresh_handled` supports scripts handling the refresh
themselves, which is needed to avoid refreshing many times when
there are multiple calls to enable/disable.
This is mostly useful for internal operations.
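A hedged sketch of how a script might batch such calls; the keyword usage and
the final refresh call are assumptions based on the argument name above and on
`addon_utils.extensions_refresh(..)` mentioned later in this log, and the
module names are hypothetical:

```python
import addon_utils

extension_modules = [
    "bl_ext.blender_org.some_extension_a",  # hypothetical module names
    "bl_ext.blender_org.some_extension_b",
]

for module_name in extension_modules:
    # Tell `enable` that this script performs the refresh itself, so each
    # call doesn't trigger its own (potentially slow) refresh.
    addon_utils.enable(module_name, refresh_handled=True)

# One refresh for the whole batch.
addon_utils.extensions_refresh()
```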
Adds the current GPU backend to the `Help -> Report a bug` script
when using the OpenGL or Vulkan backend. This makes it easier for
triagers to figure out if a user is using OpenGL or Vulkan
in cases where the user doesn't explicitly state which they're using.
Pull Request: https://projects.blender.org/blender/blender/pulls/130006
The default value for `FLOAT_COLOR` attributes is white. We can't
change this default easily.
This fix will initialize the attributes accessed through the high-level
Python API with their expected default value. In the case of
vertex colors, this is fully transparent black.
Pull Request: https://projects.blender.org/blender/blender/pulls/129638
The issue was that the strokes were not using the `POLY` type and
needed to be tagged.
This adds a function `tag_positions_changed` on the `GreasePencilDrawing`
so that the high-level Python API can tag the positions if
the `point.position` attribute is written to.
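A rough sketch of what this enables; only `tag_positions_changed` comes from
the description, the data path and the attribute access are assumptions:

```python
import bpy

# Assumed data path; adjust to the actual object/layer/frame.
drawing = bpy.context.object.data.layers[0].frames[0].drawing

# Writing positions through the attribute API doesn't tag updates by itself.
drawing.attributes["position"].data[0].vector = (0.0, 1.0, 2.0)

# Tag the drawing so dependent caches are rebuilt.
drawing.tag_positions_changed()
```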
Pull Request: https://projects.blender.org/blender/blender/pulls/129292
When writing to a property that doesn't exist, e.g. `frame.drawing.strokes.test = 42`,
no exception would be raised and the assignment would silently fail.
The fix defines the `__slots__` on the classes explicitly which then raises an exception
if the user tries to write something that wasn't previously defined.
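A minimal, generic illustration of the `__slots__` behaviour (not the actual
Blender classes):

```python
class StrokesWithoutSlots:
    def __init__(self):
        self.drawing = None


class StrokesWithSlots:
    __slots__ = ("drawing",)

    def __init__(self):
        self.drawing = None


StrokesWithoutSlots().test = 42  # silently creates a new, unused attribute

try:
    StrokesWithSlots().test = 42  # typo / unknown property
except AttributeError as ex:
    print(ex)  # the mistake is reported instead of silently ignored
```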
Pull Request: https://projects.blender.org/blender/blender/pulls/129047
When e.g. executing `drawing.strokes[0].softness = 3`, the API would
always create a new attribute `softness` even if that attribute existed
already.
The issue was that the code was using the `.get(value, fallback)` syntax,
but the `fallback` expression is always evaluated by Python.
The fix removes the use of the `fallback` and uses a simple `if/else` to
check if the attribute doesn't exist yet and only then create it.
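A generic illustration of the pitfall and the fix (not the actual Blender code):

```python
attributes = {}


def create_attribute(name):
    print(f"creating {name}")
    attributes[name] = 0.0
    return attributes[name]


# Broken: `create_attribute()` runs even when "softness" already exists,
# because Python evaluates the fallback argument before calling `.get()`.
value = attributes.get("softness", create_attribute("softness"))

# Fixed: only create the attribute when it's actually missing.
if "softness" not in attributes:
    create_attribute("softness")
value = attributes["softness"]
```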
Pull Request: https://projects.blender.org/blender/blender/pulls/129044
The issue was that the API assumed that the `.selection` attribute
was always on either the point domain for points or the stroke domain
for strokes.
Internally the attribute domain depends on the current selection mode.
To fix the issue, the API now checks for the domain of the attribute
and handles it accordingly.
If the selection attribute is on the `'POINT'` domain:
* Reading the `stroke.select` property will check if *any* of the points of
the stroke are selected and return `True` or `False`.
* Writing to the `stroke.select` property will write `True` or `False` to *all* the
points in the stroke.
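In plain Python, the described semantics amount to something like this (an
illustration, not the actual implementation):

```python
def stroke_select_get(point_selection: list[bool]) -> bool:
    # True if *any* point of the stroke is selected.
    return any(point_selection)


def stroke_select_set(point_selection: list[bool], value: bool) -> None:
    # Write the value to *all* points of the stroke.
    point_selection[:] = [value] * len(point_selection)
```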
Also resolves #128645.
Pull Request: https://projects.blender.org/blender/blender/pulls/128687
On Windows an entire directory may be locked when any files inside it
are opened by another process. This can cause operations that
recursively remove a directory (uninstalling & updating) to fail
with a partially removed extension.
The case of uninstalling was already handled, where failure to remove
a directory would stage the extension for later removal.
In the case of updating however, the user could be left with a broken
(partially removed) extension: some files were removed, and since the
directory was locked, the update would fail to extract the new files.
Address this issue by renaming the directory before recursive removal.
The following logic has been implemented:
- If any files in the directory are locked, renaming will fail.
So even though the operation fails the extension is left intact.
- If renaming succeeds, it's possible to apply the update.
Recursive removal may still fail, although this is unlikely
(caused by file-system permission issues, corruption, or a process
opening a file between rename & removal).
In this case the renamed directory is staged for later removal.
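A hedged sketch of the rename-before-removal idea described above; the
staging suffix and the function name are invented for illustration:

```python
import os
import shutil
import uuid


def remove_extension_dir(directory: str) -> bool:
    staged = directory + ".~removed~" + uuid.uuid4().hex
    try:
        # Fails if any file inside is locked, leaving the extension intact.
        os.rename(directory, staged)
    except OSError:
        return False

    try:
        shutil.rmtree(staged)
    except OSError:
        # Unlikely (permissions, corruption, a late file lock):
        # leave the renamed directory staged for later removal.
        pass
    return True
```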
Other changes:
- Resolve a related problem where the user could install an
extension previously staged for removal; installing an extension
now ensures it's not removed later.
This would occur if uninstalling failed, the user resolved the
directory lock, uninstalled again, then re-installed the extension.
- When an extension fails to be removed, don't attempt to remove
the user configuration for that extension.
Prefer to keep the extension & its settings in their "current state"
if it can't be removed.
BaseException was used as a catch-all in situations where it
didn't make sense and where "Exception" is more appropriate
based on Python's documentation & error checking tools
(`pylint`, for example, warns about this with `broad-exception-caught`).
BaseException includes SystemExit, KeyboardInterrupt & GeneratorExit,
so unless the intention is to catch calls to `sys.exit(..)`,
breaking out of a loop using Ctrl-C, or a generator exit,
it shouldn't be used.
Even then, it's preferable to catch those exceptions explicitly.
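A small illustration of the difference (generic Python, not code from this
change):

```python
import sys


def run_and_report(func) -> None:
    try:
        func()
    except Exception as ex:
        # Regular errors are reported...
        print(f"error: {ex}")
    # ...while SystemExit, KeyboardInterrupt & GeneratorExit derive only
    # from BaseException, so they still propagate as intended.


run_and_report(lambda: 1 / 0)          # prints: error: division by zero
# run_and_report(lambda: sys.exit(1))  # would exit Python, not be swallowed
```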
Disable dynamic SDL loading as well as disable SDL for release builds.
This was only used for audio output which can already use OpenAL
if there are back-ends not natively supported by Blender.
- Remove extern/sdlew/
- Remove the WITH_SDL_DYNLOAD build option.
- Remove `bpy.app.sdl.available`.
Ref !127554
- Move sys_info into an internal module to avoid having so many
top level modules for Blender's internal functionality.
- Rename system_info sub-modules that pre-fill URLs for clarity.
- Move top-level exception handling into the operator.
- Report an error if an unexpected exception occurs.
- Use `Exception` instead of `BaseException` as there is no reason to
catch the additional exceptions.
- Remove use of sys_info from the command line example,
replace with in-lined system info.
This commit adds a Python script that can collect some of the
information necessary to fill out a bug report.
The primary use case is to help users collect system information for
a bug report in the case that Blender can't open.
CMD and sh files are included to help users use the Python script.
Ref !122191
Writing to the `curve_type` attribute directly is not allowed, as other
updates are needed as well; writing it directly would result in a crash.
The fix makes the `curve_type` attribute read-only. To change the curve type,
the `grease_pencil.set_curve_type` operator has to be used for now.
This moves the helper Python classes from `scripts/modules/grease_pencil_python.py`
to `scripts/modules/_bpy_internal/grease_pencil.py`.
It also cleans up the code a bit. No functional changes.
Pull Request: https://projects.blender.org/blender/blender/pulls/126403
When deleting files on WIN32, open files cannot be removed.
This is especially a problem for compiled Python modules which
remain open once imported.
Previously it was not as common for add-ons to include compiled Python
modules, however with extensions supporting Python wheels,
it's increasingly likely users run into this.
Workaround the problem by:
- Scheduling the files for removal next time Blender starts.
- Rename paths that cannot be removed to avoid collisions when
the path is reused (re-installing for example).
This is supported for:
- Extensions.
- Python wheels.
- Legacy user add-ons.
- App-templates.
Details:
- On startup, a marker file indicates that cleanup is needed.
In the common case the file doesn't exist;
otherwise module paths are scanned for files to remove.
- Since errors resolving paths to remove could result in user data loss,
ensure the paths are always within the (extension/addon/app-template)
directory.
- File locking isn't used; if multiple Blender instances start at the
same time and try to remove the same files, this won't cause errors.
Even so, remove the marker file immediately to avoid unnecessary
file-system access overhead for other Blender instances.
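A hedged sketch of this startup logic; the marker-file name, the staging
suffix and the directory layout are invented for illustration:

```python
import os
import shutil


def cleanup_stale_files(extensions_dir: str) -> None:
    marker = os.path.join(extensions_dir, ".remove-pending")
    # Common case: no marker, no further file-system access.
    if not os.path.exists(marker):
        return
    # Remove the marker first, so concurrent Blender instances skip the scan.
    try:
        os.remove(marker)
    except OSError:
        pass

    root = os.path.realpath(extensions_dir)
    for name in os.listdir(extensions_dir):
        path = os.path.join(extensions_dir, name)
        # Only ever touch paths *inside* the extensions directory,
        # so a bad path can never cause data loss elsewhere.
        if not os.path.realpath(path).startswith(root + os.sep):
            continue
        if name.endswith(".~stale~"):  # staged during a failed removal
            shutil.rmtree(path, ignore_errors=True)
```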
Also resolves #125049.
Expose arguments to use when creating a Python sub-process.
Python could fail to start when Blender runs in a customized environment,
e.g. with PYTHONPATH set. Blender itself ignores these variables and
loads fine, but a Python sub-process would attempt to use them, and
they may point to incompatible Python versions.
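A minimal sketch of the idea using only the standard library; how Blender
exposes the arguments is not shown here, and the use of `sys.executable` is
an assumption:

```python
import subprocess
import sys

python_exe = sys.executable  # assumed to point at a compatible interpreter

# `-I` runs the sub-process in isolated mode: PYTHONPATH, PYTHONHOME and the
# user site-packages are ignored, so a customized environment can't pull in
# incompatible modules.
subprocess.run([python_exe, "-I", "-c", "import sys; print(sys.path)"], check=True)
```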
Resolve the root cause of #124731.
Changes to an extension's manifest weren't accounted for.
This was particularly a problem for "System" extensions, which aren't
intended to be managed inside Blender, however the problem existed for
any changes made outside of Blender.
Now enabled extensions are checked on startup to ensure:
- They are compatible with Blender.
- The Python wheels are synchronized.
Resolves #123645.
Details:
- Any extension incompatibility prevents the add-on from being enabled,
with a message printing the reason for it being disabled.
- Incompatible add-ons are kept enabled in the preferences to avoid
losing their own preferences and to allow an upgrade to restore
compatibility.
- To avoid slowing down Blender's startup:
- Checks are skipped when no extensions are enabled
(as is the case for `--factory-startup` & running tests).
- Compatibility data is cached, so in the common case
the cache is loaded and all enabled extensions only `stat` their
manifests to detect changes without having to parse them.
- The cache is re-generated if any extensions change or the
Blender/Python version changes.
- Compatibility data is updated:
- On startup (when needed).
- On an explicit "Refresh Local"
(mainly for developers who may edit the manifest).
- When refreshing extensions after install/uninstall etc.
since an incompatible extension may become compatible
after an update.
- When reloading preferences.
- Additional info is shown when `--debug-python` is enabled,
in case there are ever issues with the extension compatibility cache
generation not working as expected.
- The behavior for Python wheels has changed so they are only set up
when the extension is enabled. This was done to simplify startup
checks and has the benefit that an installed but disabled extension
never runs code - as the ability to install wheels means it could
have been imported from other scripts. It also means users can disable
an extension to avoid wheel version conflicts.
This does add the complication, however, that enabling an add-on which
is an extension must first ensure its wheels are set up.
See `addon_utils.extensions_refresh(..)`.
See code-comments for further details.