# test/source/blender/python/intern/CMakeLists.txt

# SPDX-FileCopyrightText: 2006 Blender Authors
#
# SPDX-License-Identifier: GPL-2.0-or-later

set(INC
  ..
  ../../editors/include
  ../../imbuf/intern/oiio
  ../../makesrna
  ../../../../intern/mantaflow/extern
  # RNA_prototypes.hh
  ${CMAKE_BINARY_DIR}/source/blender/makesrna
)

set(INC_SYS
  ${PYTHON_INCLUDE_DIRS}
)

set(SRC
  bpy.cc
  bpy_app.cc
  bpy_app_alembic.cc
  bpy_app_build_options.cc
  bpy_app_ffmpeg.cc
  bpy_app_handlers.cc
  bpy_app_icons.cc
  bpy_app_ocio.cc
  bpy_app_oiio.cc
  bpy_app_opensubdiv.cc
  bpy_app_openvdb.cc
  bpy_app_sdl.cc
  bpy_app_timers.cc
  bpy_app_translations.cc
  bpy_app_usd.cc
  bpy_capi_utils.cc
  bpy_cli_command.cc
  bpy_driver.cc
  bpy_geometry_set.cc
  bpy_gizmo_wrap.cc
  bpy_interface.cc
  bpy_interface_atexit.cc
  bpy_interface_run.cc
  bpy_intern_string.cc
  bpy_library_load.cc
  bpy_library_write.cc
  bpy_msgbus.cc
  bpy_operator.cc
  bpy_operator_wrap.cc
  bpy_path.cc
  bpy_props.cc
  bpy_rna.cc
  bpy_rna_anim.cc
  bpy_rna_array.cc
  bpy_rna_callback.cc
  bpy_rna_context.cc
  bpy_rna_data.cc
  bpy_rna_driver.cc
  bpy_rna_gizmo.cc
  bpy_rna_id_collection.cc
  bpy_rna_operator.cc
  bpy_rna_text.cc
  bpy_rna_types_capi.cc
  bpy_rna_ui.cc
  bpy_traceback.cc
  bpy_utils_previews.cc
  bpy_utils_units.cc

  bpy.hh
  bpy_app.hh
  bpy_app_alembic.hh
  bpy_app_build_options.hh
  bpy_app_ffmpeg.hh
  bpy_app_handlers.hh
  bpy_app_icons.hh
  bpy_app_ocio.hh
  bpy_app_oiio.hh
  bpy_app_opensubdiv.hh
  bpy_app_openvdb.hh
  bpy_app_sdl.hh
  bpy_app_timers.hh
  bpy_app_translations.hh
  bpy_app_usd.hh
  bpy_capi_utils.hh
  bpy_cli_command.hh
  bpy_driver.hh
  bpy_geometry_set.hh
  bpy_gizmo_wrap.hh
  bpy_intern_string.hh
  bpy_library.hh
  bpy_msgbus.hh
  bpy_operator.hh
  bpy_operator_wrap.hh
  bpy_path.hh
  bpy_props.hh
  bpy_rna.hh
  bpy_rna_anim.hh
  bpy_rna_callback.hh
  bpy_rna_context.hh
  bpy_rna_data.hh
  bpy_rna_driver.hh
  bpy_rna_gizmo.hh
  bpy_rna_id_collection.hh
  bpy_rna_operator.hh
  bpy_rna_text.hh
  bpy_rna_types_capi.hh
  bpy_rna_ui.hh
  bpy_traceback.hh
  bpy_utils_previews.hh
  bpy_utils_units.hh

  ../BPY_extern.hh
  ../BPY_extern_clog.hh
  ../BPY_extern_python.hh
  ../BPY_extern_run.hh
)

set(LIB
  PRIVATE bf::blenkernel
  PRIVATE bf::blenlib
  PRIVATE bf::blenloader
  PRIVATE bf::blentranslation
  PRIVATE bf::depsgraph
  PRIVATE bf::dna
  bf_editor_animation
  bf_editor_interface
  bf_editor_space_api
  PRIVATE bf::gpu
  PRIVATE bf::imbuf
  PRIVATE bf::intern::clog
  PRIVATE bf::intern::guardedalloc
  PRIVATE bf::animrig
  bf_python_gpu
  PRIVATE bf::imbuf::opencolorio
  ${PYTHON_LINKFLAGS}
  ${PYTHON_LIBRARIES}
  PRIVATE bf::windowmanager
)

# Only to check if `buildinfo` is available.
if(WITH_BUILDINFO)
  add_definitions(-DBUILD_DATE)
endif()

if(WITH_INSTALL_PORTABLE)
  add_definitions(-DWITH_INSTALL_PORTABLE)
endif()

if(WITH_PYTHON_MODULE)
  add_definitions(-DWITH_PYTHON_MODULE)
endif()

# Find the SSL certificate for the portable Blender installation.
# Without this, the absolute path on the builder is used, causing HTTPS access to fail.
# For example `urllib.request.urlopen("https://projects.blender.org")` fails
# (or any other HTTPS site), see #102300 for details.
# NOTE: this isn't necessary on WIN32.
if(WITH_PYTHON AND WITH_PYTHON_INSTALL AND (APPLE OR WITH_INSTALL_PORTABLE) AND (NOT WIN32))
  # - `PYTHON_SSL_CERT_FILE` is the absolute path to the PEM file.
  find_python_module_file("certifi/cacert.pem" PYTHON_SSL_CERT_FILE _python_ssl_cert_file_relative)
  mark_as_advanced(PYTHON_SSL_CERT_FILE)
  if(PYTHON_SSL_CERT_FILE)
    add_definitions(-DPYTHON_SSL_CERT_FILE="${_python_ssl_cert_file_relative}")
  else()
    message(WARNING
      "Unable to find \"certifi/cacert.pem\" within \"${PYTHON_LIBPATH}\", "
      "this build will not be able to use bundled certificates with the \"ssl\" module!"
    )
  endif()
  unset(_python_ssl_cert_file_relative)
endif()

if(WITH_PYTHON_SAFETY)
  add_definitions(-DWITH_PYTHON_SAFETY)
endif()

if(WITH_AUDASPACE)
  # It's possible to build with AUDASPACE (for file IO) but without the `aud` Python API
  # when building without NUMPY, so define both `WITH_AUDASPACE` & `WITH_AUDASPACE_PY`.
  add_definitions(-DWITH_AUDASPACE)
  if(WITH_PYTHON_NUMPY)
    add_definitions(-DWITH_AUDASPACE_PY)
  endif()
endif()

if(WITH_BULLET)
  add_definitions(-DWITH_BULLET)
endif()

if(WITH_CODEC_FFMPEG)
  list(APPEND INC_SYS
    ${FFMPEG_INCLUDE_DIRS}
  )
  list(APPEND LIB
    ${FFMPEG_LIBRARIES}
  )
  add_definitions(-DWITH_FFMPEG)
endif()

if(WITH_CODEC_SNDFILE)
  add_definitions(-DWITH_SNDFILE)
endif()

if(WITH_CYCLES)
  list(APPEND INC
    ../../../../intern/cycles/blender
  )
  list(APPEND LIB
    bf_intern_cycles
  )
  add_definitions(-DWITH_CYCLES)
endif()

if(WITH_CYCLES_OSL)
  add_definitions(-DWITH_CYCLES_OSL)
endif()

if(WITH_CYCLES_EMBREE)
  add_definitions(-DWITH_CYCLES_EMBREE)
endif()

if(WITH_FREESTYLE)
  list(APPEND INC
    ../../freestyle/intern/python
  )
  add_definitions(-DWITH_FREESTYLE)
endif()

if(WITH_IMAGE_CINEON)
  add_definitions(-DWITH_IMAGE_CINEON)
endif()

if(WITH_IMAGE_OPENEXR)
  add_definitions(-DWITH_IMAGE_OPENEXR)
endif()

if(WITH_IMAGE_OPENJPEG)
  add_definitions(-DWITH_IMAGE_OPENJPEG)
endif()

if(WITH_IMAGE_WEBP)
  add_definitions(-DWITH_IMAGE_WEBP)
endif()

if(WITH_INPUT_NDOF)
  add_definitions(-DWITH_INPUT_NDOF)
endif()

if(WITH_INTERNATIONAL)
  add_definitions(-DWITH_INTERNATIONAL)
endif()

if(WITH_OPENAL)
  add_definitions(-DWITH_OPENAL)
endif()

if(WITH_OPENSUBDIV)
  add_definitions(-DWITH_OPENSUBDIV)
endif()

if(WITH_SDL)
  list(APPEND INC_SYS
    ${SDL_INCLUDE_DIR}
  )
  list(APPEND LIB
    ${SDL_LIBRARY}
  )
  add_definitions(-DWITH_SDL)
endif()

if(WITH_JACK)
  add_definitions(-DWITH_JACK)
endif()

if(WITH_COREAUDIO)
  add_definitions(-DWITH_COREAUDIO)
endif()

if(WITH_LIBMV)
  add_definitions(-DWITH_LIBMV)
endif()

if(WITH_PULSEAUDIO)
  add_definitions(-DWITH_PULSEAUDIO)
endif()

if(WITH_WASAPI)
  add_definitions(-DWITH_WASAPI)
endif()

if(WITH_MOD_OCEANSIM)
  add_definitions(-DWITH_OCEANSIM)
endif()

if(WITH_MOD_REMESH)
  add_definitions(-DWITH_MOD_REMESH)
endif()

if(WITH_MOD_FLUID)
  add_definitions(-DWITH_FLUID)
endif()

if(WITH_OPENCOLLADA)
  add_definitions(-DWITH_COLLADA)
endif()

if(WITH_IO_WAVEFRONT_OBJ)
  add_definitions(-DWITH_IO_WAVEFRONT_OBJ)
endif()

if(WITH_IO_PLY)
  add_definitions(-DWITH_IO_PLY)
endif()

if(WITH_IO_STL)
  add_definitions(-DWITH_IO_STL)
endif()

if(WITH_IO_FBX)
  add_definitions(-DWITH_IO_FBX)
endif()

if(WITH_IO_GREASE_PENCIL)
  add_definitions(-DWITH_IO_GREASE_PENCIL)
endif()

if(WITH_OPENVDB)
  add_definitions(-DWITH_OPENVDB)
  list(APPEND INC
    ../../../../intern/openvdb
  )
endif()

if(WITH_ALEMBIC)
  add_definitions(-DWITH_ALEMBIC)
  list(APPEND INC
    ../../io/alembic
  )
endif()

if(WITH_USD)
  add_definitions(-DWITH_USD)
  list(APPEND INC
    ../../io/usd
  )
endif()

if(WITH_OPENSUBDIV)
  add_definitions(-DWITH_OPENSUBDIV)
  list(APPEND INC
    ../../../../intern/opensubdiv
  )
endif()

if(WITH_XR_OPENXR)
  add_definitions(-DWITH_XR_OPENXR)
endif()

if(WITH_POTRACE)
  add_definitions(-DWITH_POTRACE)
endif()

if(WITH_PUGIXML)
  add_definitions(-DWITH_PUGIXML)
endif()

if(WITH_HARU)
  add_definitions(-DWITH_HARU)
endif()

if(WITH_HYDRA)
  list(APPEND LIB
    bf_render_hydra
  )
  add_definitions(-DWITH_HYDRA)
endif()


blender_add_lib(bf_python "${SRC}" "${INC}" "${INC_SYS}" "${LIB}")

# RNA_prototypes.hh
add_dependencies(bf_python bf_rna)