/* SPDX-License-Identifier: GPL-2.0-or-later
 * Copyright 2001-2002 NaN Holding BV. All rights reserved. */

/** \file
 * \ingroup render
 */

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#include "MEM_guardedalloc.h"

#include "BLI_blenlib.h"
#include "BLI_kdopbvh.h"
#include "BLI_math.h"
#include "BLI_noise.h"
#include "BLI_task.h"
#include "BLI_utildefines.h"

#include "BLT_translation.h"

#include "DNA_mesh_types.h"
#include "DNA_meshdata_types.h"
#include "DNA_object_types.h"
#include "DNA_particle_types.h"
#include "DNA_scene_types.h"
#include "DNA_texture_types.h"

#include "BKE_colorband.h"
#include "BKE_colortools.h"
#include "BKE_customdata.h"
#include "BKE_deform.h"
#include "BKE_lattice.h"
#include "BKE_mesh.h"
#include "BKE_object.h"
#include "BKE_particle.h"
#include "BKE_scene.h"

#include "DEG_depsgraph.h"
#include "DEG_depsgraph_query.h"

#include "texture_common.h"

#include "RE_texture.h"

static ThreadMutex sample_mutex = PTHREAD_MUTEX_INITIALIZER;

static int point_data_used(PointDensity *pd)
{
  int pd_bitflag = 0;

  if (pd->source == TEX_PD_PSYS) {
    if ((pd->falloff_type == TEX_PD_FALLOFF_PARTICLE_VEL) ||
        (pd->color_source == TEX_PD_COLOR_PARTVEL) ||
        (pd->color_source == TEX_PD_COLOR_PARTSPEED)) {
      pd_bitflag |= POINT_DATA_VEL;
    }
    if ((pd->color_source == TEX_PD_COLOR_PARTAGE) ||
        (pd->falloff_type == TEX_PD_FALLOFF_PARTICLE_AGE)) {
      pd_bitflag |= POINT_DATA_LIFE;
    }
  }
  else if (pd->source == TEX_PD_OBJECT) {
    if (ELEM(pd->ob_color_source,
             TEX_PD_COLOR_VERTCOL,
             TEX_PD_COLOR_VERTWEIGHT,
             TEX_PD_COLOR_VERTNOR)) {
      pd_bitflag |= POINT_DATA_COLOR;
    }
  }

  return pd_bitflag;
}

static void point_data_pointers(PointDensity *pd,
                                float **r_data_velocity,
                                float **r_data_life,
                                float **r_data_color)
{
  const int data_used = point_data_used(pd);
  const int totpoint = pd->totpoints;
  float *data = pd->point_data;
  int offset = 0;

  if (data_used & POINT_DATA_VEL) {
    if (r_data_velocity) {
      *r_data_velocity = data + offset;
    }
    offset += 3 * totpoint;
  }
  else {
    if (r_data_velocity) {
      *r_data_velocity = NULL;
    }
  }

  if (data_used & POINT_DATA_LIFE) {
    if (r_data_life) {
      *r_data_life = data + offset;
    }
    offset += totpoint;
  }
  else {
    if (r_data_life) {
      *r_data_life = NULL;
    }
  }

  if (data_used & POINT_DATA_COLOR) {
    if (r_data_color) {
      *r_data_color = data + offset;
    }
    offset += 3 * totpoint;
  }
  else {
    if (r_data_color) {
      *r_data_color = NULL;
    }
  }
}

/* additional data stored alongside the point density BVH,
 * accessible by point index number to retrieve other information
 * such as particle velocity or lifetime */
static void alloc_point_data(PointDensity *pd)
{
  const int totpoints = pd->totpoints;
  int data_used = point_data_used(pd);
  int data_size = 0;

  if (data_used & POINT_DATA_VEL) {
    /* store 3 channels of velocity data */
    data_size += 3;
  }
  if (data_used & POINT_DATA_LIFE) {
    /* store 1 channel of lifetime data */
    data_size += 1;
  }
  if (data_used & POINT_DATA_COLOR) {
    /* store 3 channels of RGB data */
    data_size += 3;
  }

  if (data_size) {
    pd->point_data = MEM_callocN(sizeof(float) * data_size * totpoints, "particle point data");
  }
}

static void pointdensity_cache_psys(
    Depsgraph *depsgraph, Scene *scene, PointDensity *pd, Object *ob, ParticleSystem *psys)
{
  ParticleKey state;
  ParticleCacheKey *cache;
  ParticleSimulationData sim = {NULL};
  ParticleData *pa = NULL;
  float cfra = BKE_scene_ctime_get(scene);
  int i /*, childexists*/ /* UNUSED */;
  int total_particles;
  int data_used;
  float *data_vel, *data_life;
  float partco[3];
  const bool use_render_params = (DEG_get_mode(depsgraph) == DAG_EVAL_RENDER);

  data_used = point_data_used(pd);

  if (!psys_check_enabled(ob, psys, use_render_params)) {
    return;
  }

  sim.depsgraph = depsgraph;
  sim.scene = scene;
  sim.ob = ob;
  sim.psys = psys;
  sim.psmd = psys_get_modifier(ob, psys);

  /* In case ob->world_to_object isn't up-to-date. */
  invert_m4_m4(ob->world_to_object, ob->object_to_world);

  total_particles = psys->totpart + psys->totchild;
  psys_sim_data_init(&sim);

  pd->point_tree = BLI_bvhtree_new(total_particles, 0.0, 4, 6);
  pd->totpoints = total_particles;
  alloc_point_data(pd);
  point_data_pointers(pd, &data_vel, &data_life, NULL);

#if 0 /* UNUSED */
  if (psys->totchild > 0 && !(psys->part->draw & PART_DRAW_PARENT)) {
    childexists = 1;
  }
#endif

  for (i = 0, pa = psys->particles; i < total_particles; i++, pa++) {
    if (psys->part->type == PART_HAIR) {
      /* Hair particles. */
      if (i < psys->totpart && psys->pathcache) {
        cache = psys->pathcache[i];
      }
      else if (i >= psys->totpart && psys->childcache) {
        cache = psys->childcache[i - psys->totpart];
      }
      else {
        continue;
      }

      cache += cache->segments; /* Use endpoint. */

      copy_v3_v3(state.co, cache->co);
      zero_v3(state.vel);
      state.time = 0.0f;
    }
    else {
      /* Emitter particles. */
      state.time = cfra;

      if (!psys_get_particle_state(&sim, i, &state, 0)) {
        continue;
      }

      if (data_used & POINT_DATA_LIFE) {
        if (i < psys->totpart) {
          state.time = (cfra - pa->time) / pa->lifetime;
        }
        else {
          ChildParticle *cpa = (psys->child + i) - psys->totpart;
          float pa_birthtime, pa_dietime;

          state.time = psys_get_child_time(psys, cpa, cfra, &pa_birthtime, &pa_dietime);
        }
      }
    }

    copy_v3_v3(partco, state.co);

    if (pd->psys_cache_space == TEX_PD_OBJECTSPACE) {
      mul_m4_v3(ob->world_to_object, partco);
    }
    else if (pd->psys_cache_space == TEX_PD_OBJECTLOC) {
      sub_v3_v3(partco, ob->loc);
    }
    else {
      /* TEX_PD_WORLDSPACE */
    }

    BLI_bvhtree_insert(pd->point_tree, i, partco, 1);

    if (data_vel) {
      data_vel[i * 3 + 0] = state.vel[0];
      data_vel[i * 3 + 1] = state.vel[1];
      data_vel[i * 3 + 2] = state.vel[2];
    }
    if (data_life) {
      data_life[i] = state.time;
    }
  }

  BLI_bvhtree_balance(pd->point_tree);

  psys_sim_data_free(&sim);
}

static void pointdensity_cache_vertex_color(PointDensity *pd,
                                            Object *UNUSED(ob),
                                            Mesh *mesh,
                                            float *data_color)
{
  const MLoop *mloop = BKE_mesh_loops(mesh);
  const int totloop = mesh->totloop;
  char layername[MAX_CUSTOMDATA_LAYER_NAME];
  int i;

  BLI_assert(data_color);

  if (!CustomData_has_layer(&mesh->ldata, CD_PROP_BYTE_COLOR)) {
    return;
  }
  CustomData_validate_layer_name(
      &mesh->ldata, CD_PROP_BYTE_COLOR, pd->vertex_attribute_name, layername);
  const MLoopCol *mcol = CustomData_get_layer_named(&mesh->ldata, CD_PROP_BYTE_COLOR, layername);
  if (!mcol) {
    return;
  }

  /* Stores the number of MLoops using the same vertex, so we can normalize colors. */
  int *mcorners = MEM_callocN(sizeof(int) * pd->totpoints, "point density corner count");

  for (i = 0; i < totloop; i++) {
    int v = mloop[i].v;

    if (mcorners[v] == 0) {
      rgb_uchar_to_float(&data_color[v * 3], &mcol[i].r);
    }
    else {
      float col[3];
      rgb_uchar_to_float(col, &mcol[i].r);
      add_v3_v3(&data_color[v * 3], col);
    }

    ++mcorners[v];
  }

  /* Normalize colors by averaging over mcorners.
   * All the corners share the same vertex, i.e. occupy the same point in space. */
  for (i = 0; i < pd->totpoints; i++) {
    if (mcorners[i] > 0) {
      mul_v3_fl(&data_color[i * 3], 1.0f / mcorners[i]);
    }
  }

  MEM_freeN(mcorners);
}

static void pointdensity_cache_vertex_weight(PointDensity *pd,
                                             Object *ob,
                                             Mesh *mesh,
                                             float *data_color)
{
  const int totvert = mesh->totvert;
  int mdef_index;
  int i;

  BLI_assert(data_color);

  const MDeformVert *mdef = CustomData_get_layer(&mesh->vdata, CD_MDEFORMVERT);
  if (!mdef) {
    return;
  }
  mdef_index = BKE_id_defgroup_name_index(&mesh->id, pd->vertex_attribute_name);
  if (mdef_index < 0) {
    mdef_index = BKE_object_defgroup_active_index_get(ob) - 1;
  }
  if (mdef_index < 0) {
    return;
  }

  const MDeformVert *dv;
  for (i = 0, dv = mdef; i < totvert; i++, dv++, data_color += 3) {
    MDeformWeight *dw;
    int j;

    for (j = 0, dw = dv->dw; j < dv->totweight; j++, dw++) {
      if (dw->def_nr == mdef_index) {
        copy_v3_fl(data_color, dw->weight);
        break;
      }
    }
  }
}

static void pointdensity_cache_vertex_normal(Mesh *mesh, float *data_color)
{
  BLI_assert(data_color);

  const float(*vert_normals)[3] = BKE_mesh_vertex_normals_ensure(mesh);
  memcpy(data_color, vert_normals, sizeof(float[3]) * mesh->totvert);
}

static void pointdensity_cache_object(PointDensity *pd, Object *ob)
{
  float *data_color;
  int i;
  const MVert *mvert = NULL, *mv;
  Mesh *mesh = ob->data;

#if 0 /* UNUSED */
  CustomData_MeshMasks mask = CD_MASK_BAREMESH;
  mask.fmask |= CD_MASK_MTFACE | CD_MASK_MCOL;
  switch (pd->ob_color_source) {
    case TEX_PD_COLOR_VERTCOL:
      mask.lmask |= CD_MASK_PROP_BYTE_COLOR;
      break;
    case TEX_PD_COLOR_VERTWEIGHT:
      mask.vmask |= CD_MASK_MDEFORMVERT;
      break;
  }
#endif

  mvert = BKE_mesh_verts(mesh); /* local object space */
  pd->totpoints = mesh->totvert;
  if (pd->totpoints == 0) {
    return;
  }

  pd->point_tree = BLI_bvhtree_new(pd->totpoints, 0.0, 4, 6);
  alloc_point_data(pd);
  point_data_pointers(pd, NULL, NULL, &data_color);

  for (i = 0, mv = mvert; i < pd->totpoints; i++, mv++) {
    float co[3];

    copy_v3_v3(co, mv->co);

    switch (pd->ob_cache_space) {
      case TEX_PD_OBJECTSPACE:
        break;
      case TEX_PD_OBJECTLOC:
        mul_m4_v3(ob->object_to_world, co);
        sub_v3_v3(co, ob->loc);
        break;
      case TEX_PD_WORLDSPACE:
      default:
        mul_m4_v3(ob->object_to_world, co);
        break;
    }

    BLI_bvhtree_insert(pd->point_tree, i, co, 1);
  }

  switch (pd->ob_color_source) {
    case TEX_PD_COLOR_VERTCOL:
      pointdensity_cache_vertex_color(pd, ob, mesh, data_color);
      break;
    case TEX_PD_COLOR_VERTWEIGHT:
      pointdensity_cache_vertex_weight(pd, ob, mesh, data_color);
      break;
    case TEX_PD_COLOR_VERTNOR:
      pointdensity_cache_vertex_normal(mesh, data_color);
      break;
  }

  BLI_bvhtree_balance(pd->point_tree);
}

static void cache_pointdensity(Depsgraph *depsgraph, Scene *scene, PointDensity *pd)
{
  if (pd == NULL) {
    return;
  }

  if (pd->point_tree) {
    BLI_bvhtree_free(pd->point_tree);
    pd->point_tree = NULL;
  }

  if (pd->source == TEX_PD_PSYS) {
    Object *ob = pd->object;
    ParticleSystem *psys;

    if (!ob || !pd->psys) {
      return;
    }

    psys = BLI_findlink(&ob->particlesystem, pd->psys - 1);
    if (!psys) {
      return;
    }

    pointdensity_cache_psys(depsgraph, scene, pd, ob, psys);
  }
  else if (pd->source == TEX_PD_OBJECT) {
    Object *ob = pd->object;
    if (ob && ob->type == OB_MESH) {
      pointdensity_cache_object(pd, ob);
    }
  }
}
|
|
|
|
|
|
Remove Blender Internal and legacy viewport from Blender 2.8.
Brecht authored this commit, but he gave me the honours to actually
do it. Here it goes; Blender Internal. Bye bye, you did great!
* Point density, voxel data, ocean, environment map textures were removed,
as these only worked within BI rendering. Note that the ocean modifier
and the Cycles point density shader node continue to work.
* Dynamic paint using material shading was removed, as this only worked
with BI. If we ever want to support this again, it should probably go
through the baking API.
* GPU shader export through the Python API was removed. This only worked
for the old BI GLSL shaders, which no longer exist. Doing something
similar for Eevee would be significantly more complicated because it
uses a lot of multi-pass rendering and logic outside the shader, so it's
probably impractical.
* Collada material import / export code is mostly gone, as it only worked
for BI materials. We need to add Cycles / Eevee material support at some
point.
* The mesh noise operator was removed since it only worked with BI
material texture slots. A displacement modifier can be used instead.
* The delete texture paint slot operator was removed since it only worked
for BI material texture slots. Could be added back with node support.
* Not all legacy viewport features are supported in the new viewport, but
their code was removed. If we need to bring anything back we can look at
older git revisions.
* There is some legacy viewport code that I could not remove yet, and some
that I probably missed.
* Shader node execution code was left mostly intact, even though it is not
used anywhere now. We may eventually use this to replace the texture
nodes with Cycles / Eevee shader nodes.
* The Cycles Bake panel now includes settings for baking multires normal
and displacement maps. The underlying code needs to be merged properly,
and we plan to add back support for multires AO baking and add support
to Cycles baking for features like vertex color, displacement, and other
missing baking features.
* This commit removes DNA and the Python API for BI material, lamp, world
and scene settings. This breaks a lot of addons.
* There is more DNA that can be removed or renamed, where Cycles or Eevee
are reusing some old BI properties but the names are not really correct
anymore.
* Texture slots for materials, lamps and world were removed. They remain
for brushes, particles and freestyle linestyles.
* 'BLENDER_RENDER' remains in the COMPAT_ENGINES of UI panels. Cycles and
other renderers use this to find all panels to show, minus a few panels
that they have their own replacement for.
2018-04-19 17:34:44 +02:00

static void free_pointdensity(PointDensity *pd)
{
  if (pd == NULL) {
    return;
  }

  if (pd->point_tree) {
    BLI_bvhtree_free(pd->point_tree);
    pd->point_tree = NULL;
  }

  MEM_SAFE_FREE(pd->point_data);
  pd->totpoints = 0;
}

* New point density update: Turbulence
This addition allows you to perturb the point density with noise, to give
the impression of more resolution. It's a quick way to add detail, without
having to use large, complex, and slower-to-render particle systems.
Rather than just overlaying noise, like you might do by adding a secondary
clouds texture, it uses noise to perturb the actual coordinate looked up
in the density evaluation. This gives a much better-looking result, as it
actually alters the original density.
Comparison of the particle cloud render without, and with, added turbulence
(the render with turbulence renders only slightly more slowly):
http://mke3.net/blender/devel/rendering/volumetrics/pd_turbulence.jpg
Using the same constant noise function/spatial coordinates will give a
static appearance. This is fine (and quicker) if the particles aren't
moving, but on animated particle systems it looks bad, as if the
particles are moving through a static noise field. To overcome this, there
are additional options for particle systems, to influence the turbulence
with the particles' average velocity or average angular velocity. This
information is only available for particle systems at present.
Here you can see the (dramatic) difference between no turbulence, static
turbulence, and turbulence influenced by particle velocity:
http://mke3.net/blender/devel/rendering/volumetrics/turbu_compare.mov
2008-10-06 12:25:22 +00:00

typedef struct PointDensityRangeData {
  float *density;
  float squared_radius;
  float *point_data_life;
  float *point_data_velocity;
  float *point_data_color;
  float *vec;
  float *col;
  float softness;
  short falloff_type;
  short noise_influence;
  float *age;
  struct CurveMapping *density_curve;
  float velscale;
} PointDensityRangeData;
static float density_falloff(PointDensityRangeData *pdr, int index, float squared_dist)
{
  const float dist = (pdr->squared_radius - squared_dist) / pdr->squared_radius * 0.5f;
  float density = 0.0f;

  switch (pdr->falloff_type) {
    case TEX_PD_FALLOFF_STD:
      density = dist;
      break;
    case TEX_PD_FALLOFF_SMOOTH:
      density = 3.0f * dist * dist - 2.0f * dist * dist * dist;
      break;
    case TEX_PD_FALLOFF_SOFT:
      density = pow(dist, pdr->softness);
      break;
    case TEX_PD_FALLOFF_CONSTANT:
      density = pdr->squared_radius;
      break;
    case TEX_PD_FALLOFF_ROOT:
      density = sqrtf(dist);
      break;
    case TEX_PD_FALLOFF_PARTICLE_AGE:
      if (pdr->point_data_life) {
        density = dist * MIN2(pdr->point_data_life[index], 1.0f);
      }
      else {
        density = dist;
      }
      break;
    case TEX_PD_FALLOFF_PARTICLE_VEL:
      if (pdr->point_data_velocity) {
        density = dist * len_v3(&pdr->point_data_velocity[index * 3]) * pdr->velscale;
      }
      else {
        density = dist;
      }
      break;
  }

  if (pdr->density_curve && dist != 0.0f) {
    BKE_curvemapping_init(pdr->density_curve);
    density = BKE_curvemapping_evaluateF(pdr->density_curve, 0, density / dist) * dist;
  }

  return density;
}
static void accum_density(void *userdata, int index, const float co[3], float squared_dist)
{
  PointDensityRangeData *pdr = (PointDensityRangeData *)userdata;
  float density = 0.0f;

  UNUSED_VARS(co);

  if (pdr->point_data_velocity) {
    pdr->vec[0] += pdr->point_data_velocity[index * 3 + 0]; // * density;
    pdr->vec[1] += pdr->point_data_velocity[index * 3 + 1]; // * density;
    pdr->vec[2] += pdr->point_data_velocity[index * 3 + 2]; // * density;
  }
  if (pdr->point_data_life) {
    *pdr->age += pdr->point_data_life[index]; // * density;
  }
  if (pdr->point_data_color) {
    add_v3_v3(pdr->col, &pdr->point_data_color[index * 3]); // * density;
  }

  density = density_falloff(pdr, index, squared_dist);

  *pdr->density += density;
}

static void init_pointdensityrangedata(PointDensity *pd,
                                       PointDensityRangeData *pdr,
                                       float *density,
                                       float *vec,
                                       float *age,
                                       float *col,
                                       struct CurveMapping *density_curve,
                                       float velscale)
{
  pdr->squared_radius = pd->radius * pd->radius;
  pdr->density = density;
  pdr->falloff_type = pd->falloff_type;
  pdr->vec = vec;
  pdr->age = age;
  pdr->col = col;
  pdr->softness = pd->falloff_softness;
  pdr->noise_influence = pd->noise_influence;
  point_data_pointers(
      pd, &pdr->point_data_velocity, &pdr->point_data_life, &pdr->point_data_color);
  pdr->density_curve = density_curve;
  pdr->velscale = velscale;
}

static int pointdensity(PointDensity *pd,
                        const float texvec[3],
                        TexResult *texres,
                        float r_vec[3],
                        float *r_age,
                        float r_col[3])
{
  int retval = TEX_INT;
  PointDensityRangeData pdr;
  float density = 0.0f, age = 0.0f;
  float vec[3] = {0.0f, 0.0f, 0.0f}, col[3] = {0.0f, 0.0f, 0.0f}, co[3];
  float turb, noise_fac;
  int num = 0;

  texres->tin = 0.0f;

  init_pointdensityrangedata(pd,
                             &pdr,
                             &density,
                             vec,
                             &age,
                             col,
                             (pd->flag & TEX_PD_FALLOFF_CURVE ? pd->falloff_curve : NULL),
                             pd->falloff_speed_scale * 0.001f);
  noise_fac = pd->noise_fac * 0.5f; /* better default */

  copy_v3_v3(co, texvec);

  if (point_data_used(pd)) {
    /* Does a BVH lookup to find accumulated density and additional point data:
     * stores the particle velocity vector in 'vec' and the particle lifetime in 'age'. */
    num = BLI_bvhtree_range_query(pd->point_tree, co, pd->radius, accum_density, &pdr);
    if (num > 0) {
      age /= num;
      mul_v3_fl(vec, 1.0f / num);
      mul_v3_fl(col, 1.0f / num);
    }

    /* reset */
    density = 0.0f;
    zero_v3(vec);
    zero_v3(col);
  }
  if (pd->flag & TEX_PD_TURBULENCE) {
    turb = BLI_noise_generic_turbulence(pd->noise_size,
                                        texvec[0] + vec[0],
                                        texvec[1] + vec[1],
                                        texvec[2] + vec[2],
                                        pd->noise_depth,
                                        0,
                                        pd->noise_basis);

    turb -= 0.5f; /* re-center 0.0-1.0 range around 0 to prevent offsetting result */

    /* now we have an offset coordinate to use for the density lookup */
    co[0] = texvec[0] + noise_fac * turb;
    co[1] = texvec[1] + noise_fac * turb;
    co[2] = texvec[2] + noise_fac * turb;
  }
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2009-03-17 05:33:05 +00:00
|
|
|
/* BVH query with the potentially perturbed coordinates */
|
2008-11-19 05:30:52 +00:00
|
|
|
num = BLI_bvhtree_range_query(pd->point_tree, co, pd->radius, accum_density, &pdr);
|
|
|
|
|
if (num > 0) {
|
|
|
|
|
age /= num;
|
2015-03-28 23:50:36 +05:00
|
|
|
mul_v3_fl(vec, 1.0f / num);
|
2016-03-24 11:41:44 +01:00
|
|
|
mul_v3_fl(col, 1.0f / num);
|
  }

  texres->tin = density;
  if (r_age != NULL) {
    *r_age = age;
  }
  if (r_vec != NULL) {
    copy_v3_v3(r_vec, vec);
  }
  if (r_col != NULL) {
    copy_v3_v3(r_col, col);
  }

  return retval;
}

static void pointdensity_color(
    PointDensity *pd, TexResult *texres, float age, const float vec[3], const float col[3])
{
  copy_v4_fl(texres->trgba, 1.0f);

  if (pd->source == TEX_PD_PSYS) {
    float rgba[4];

    switch (pd->color_source) {
      case TEX_PD_COLOR_PARTAGE:
        if (pd->coba) {
          if (BKE_colorband_evaluate(pd->coba, age, rgba)) {
            texres->talpha = true;
            copy_v3_v3(texres->trgba, rgba);
            texres->tin *= rgba[3];
            texres->trgba[3] = texres->tin;
          }
        }
        break;
      case TEX_PD_COLOR_PARTSPEED: {
        float speed = len_v3(vec) * pd->speed_scale;

        if (pd->coba) {
          if (BKE_colorband_evaluate(pd->coba, speed, rgba)) {
            texres->talpha = true;
            copy_v3_v3(texres->trgba, rgba);
            texres->tin *= rgba[3];
            texres->trgba[3] = texres->tin;
          }
        }
        break;
      }
      case TEX_PD_COLOR_PARTVEL:
        texres->talpha = true;
        mul_v3_v3fl(texres->trgba, vec, pd->speed_scale);
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_CONSTANT:
      default:
        break;
    }
  }
  else {
    float rgba[4];

    switch (pd->ob_color_source) {
      case TEX_PD_COLOR_VERTCOL:
        texres->talpha = true;
        copy_v3_v3(texres->trgba, col);
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_VERTWEIGHT:
        texres->talpha = true;
        if (pd->coba && BKE_colorband_evaluate(pd->coba, col[0], rgba)) {
          copy_v3_v3(texres->trgba, rgba);
          texres->tin *= rgba[3];
        }
        else {
          copy_v3_v3(texres->trgba, col);
        }
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_VERTNOR:
        texres->talpha = true;
        copy_v3_v3(texres->trgba, col);
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_CONSTANT:
      default:
        break;
    }
  }
}

static void sample_dummy_point_density(int resolution, float *values)
{
  memset(values, 0, sizeof(float[4]) * resolution * resolution * resolution);
}

static void particle_system_minmax(Depsgraph *depsgraph,
                                   Scene *scene,
                                   Object *object,
                                   ParticleSystem *psys,
                                   float radius,
                                   float min[3],
                                   float max[3])
{
  const float size[3] = {radius, radius, radius};
  const float cfra = BKE_scene_ctime_get(scene);
  ParticleSettings *part = psys->part;
  ParticleSimulationData sim = {NULL};
  ParticleData *pa = NULL;
  int i;
  int total_particles;
  float mat[4][4], imat[4][4];

  INIT_MINMAX(min, max);
  if (part->type == PART_HAIR) {
    /* TODO(sergey): Not supported currently. */
    return;
  }

  unit_m4(mat);

  sim.depsgraph = depsgraph;
  sim.scene = scene;
  sim.ob = object;
  sim.psys = psys;
  sim.psmd = psys_get_modifier(object, psys);

  invert_m4_m4(imat, object->object_to_world);
  total_particles = psys->totpart + psys->totchild;
  psys_sim_data_init(&sim);

  for (i = 0, pa = psys->particles; i < total_particles; i++, pa++) {
    float co_object[3], co_min[3], co_max[3];
    ParticleKey state;
    state.time = cfra;
    if (!psys_get_particle_state(&sim, i, &state, 0)) {
      continue;
    }
    mul_v3_m4v3(co_object, imat, state.co);
    sub_v3_v3v3(co_min, co_object, size);
    add_v3_v3v3(co_max, co_object, size);
    minmax_v3v3_v3(min, max, co_min);
    minmax_v3v3_v3(min, max, co_max);
  }

  psys_sim_data_free(&sim);
}
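The per-particle bounds logic is axis-aligned box accumulation: every particle contributes a cube of half-size `radius` around its object-space position. A dependency-free sketch of that accumulation, mirroring the `INIT_MINMAX` / `minmax_v3v3_v3` pattern (the point data is hypothetical):

```c
#include <assert.h>
#include <float.h>

/* Expand [min, max] so it contains a cube of half-size `radius` around each point. */
static void points_minmax(const float (*pts)[3], int num, float radius, float min[3], float max[3])
{
  for (int i = 0; i < 3; i++) {
    min[i] = FLT_MAX;
    max[i] = -FLT_MAX;
  }
  for (int p = 0; p < num; p++) {
    for (int i = 0; i < 3; i++) {
      if (pts[p][i] - radius < min[i]) {
        min[i] = pts[p][i] - radius;
      }
      if (pts[p][i] + radius > max[i]) {
        max[i] = pts[p][i] + radius;
      }
    }
  }
}
```

Padding by `radius` matters: a particle sitting exactly on the bound still smears density up to `radius` past it.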

void RE_point_density_cache(struct Depsgraph *depsgraph, PointDensity *pd)
{
  Scene *scene = DEG_get_evaluated_scene(depsgraph);

  /* Same matrices/resolution as dupli_render_particle_set(). */
  BLI_mutex_lock(&sample_mutex);
  cache_pointdensity(depsgraph, scene, pd);
  BLI_mutex_unlock(&sample_mutex);
}

void RE_point_density_minmax(struct Depsgraph *depsgraph,
                             struct PointDensity *pd,
                             float r_min[3],
                             float r_max[3])
{
  Scene *scene = DEG_get_evaluated_scene(depsgraph);
  Object *object = pd->object;

  if (object == NULL) {
    zero_v3(r_min);
    zero_v3(r_max);
    return;
  }

  if (pd->source == TEX_PD_PSYS) {
    ParticleSystem *psys;

    if (pd->psys == 0) {
      zero_v3(r_min);
      zero_v3(r_max);
      return;
    }
    psys = BLI_findlink(&object->particlesystem, pd->psys - 1);
    if (psys == NULL) {
      zero_v3(r_min);
      zero_v3(r_max);
      return;
    }

    particle_system_minmax(depsgraph, scene, object, psys, pd->radius, r_min, r_max);
  }
  else {
    const float radius[3] = {pd->radius, pd->radius, pd->radius};
    const BoundBox *bb = BKE_object_boundbox_get(object);

    if (bb != NULL) {
      BLI_assert((bb->flag & BOUNDBOX_DIRTY) == 0);
      copy_v3_v3(r_min, bb->vec[0]);
      copy_v3_v3(r_max, bb->vec[6]);
      /* Adjust texture space to include density points on the boundaries. */
      sub_v3_v3(r_min, radius);
      add_v3_v3(r_max, radius);
    }
    else {
      zero_v3(r_min);
      zero_v3(r_max);
    }
  }
}

typedef struct SampleCallbackData {
  PointDensity *pd;
  int resolution;
  float *min, *dim;
  float *values;
} SampleCallbackData;

static void point_density_sample_func(void *__restrict data_v,
                                      const int iter,
                                      const TaskParallelTLS *__restrict UNUSED(tls))
{
  SampleCallbackData *data = (SampleCallbackData *)data_v;

  const int resolution = data->resolution;
  const int resolution2 = resolution * resolution;
  const float *min = data->min, *dim = data->dim;
  PointDensity *pd = data->pd;
  float *values = data->values;

  if (!pd || !pd->point_tree) {
    return;
  }

  size_t z = (size_t)iter;
  for (size_t y = 0; y < resolution; y++) {
    for (size_t x = 0; x < resolution; x++) {
      size_t index = z * resolution2 + y * resolution + x;
      float texvec[3];
      float age, vec[3], col[3];
      TexResult texres;

      copy_v3_v3(texvec, min);
      texvec[0] += dim[0] * (float)x / (float)resolution;
      texvec[1] += dim[1] * (float)y / (float)resolution;
      texvec[2] += dim[2] * (float)z / (float)resolution;

      pointdensity(pd, texvec, &texres, vec, &age, col);
      pointdensity_color(pd, &texres, age, vec, col);

      copy_v3_v3(&values[index * 4 + 0], texres.trgba);
      values[index * 4 + 3] = texres.tin;
    }
  }
}
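The sampling callback addresses the output as a flat `resolution^3 x 4` float array (RGBA per voxel) and places each sample at `min + dim * i / resolution` along each axis. That index/coordinate math can be checked standalone (toy values, no Blender types):

```c
#include <assert.h>
#include <stddef.h>

/* Flat RGBA start index of voxel (x, y, z), matching the callback's indexing. */
static size_t voxel_rgba_index(size_t x, size_t y, size_t z, size_t resolution)
{
  return (z * resolution * resolution + y * resolution + x) * 4;
}

/* Sample position along one axis, matching the texvec computation. */
static float voxel_coord(float min, float dim, size_t i, size_t resolution)
{
  return min + dim * (float)i / (float)resolution;
}
```

Note the samples cover `[min, min + dim)` half-open: index `resolution - 1` lands one step short of `max`, which is why the texture-space bounds are padded by the lookup radius.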

void RE_point_density_sample(Depsgraph *depsgraph,
                             PointDensity *pd,
                             const int resolution,
                             float *values)
{
  Object *object = pd->object;
  float min[3], max[3], dim[3];

  /* TODO(sergey): Implement some sort of assert() that point density
   * was cached already.
   */

  if (object == NULL) {
    sample_dummy_point_density(resolution, values);
    return;
  }

  BLI_mutex_lock(&sample_mutex);
  RE_point_density_minmax(depsgraph, pd, min, max);
  BLI_mutex_unlock(&sample_mutex);
  sub_v3_v3v3(dim, max, min);
  if (dim[0] <= 0.0f || dim[1] <= 0.0f || dim[2] <= 0.0f) {
    sample_dummy_point_density(resolution, values);
    return;
  }

  SampleCallbackData data;
  data.pd = pd;
  data.resolution = resolution;
  data.min = min;
  data.dim = dim;
  data.values = values;
  TaskParallelSettings settings;
  BLI_parallel_range_settings_defaults(&settings);
  settings.use_threading = (resolution > 32);
  BLI_task_parallel_range(0, resolution, &data, point_density_sample_func, &settings);

  free_pointdensity(pd);
}

void RE_point_density_free(struct PointDensity *pd)
{
  free_pointdensity(pd);
}

Remove Blender Internal and legacy viewport from Blender 2.8.
Brecht authored this commit, but he gave me the honours to actually
do it. Here it goes; Blender Internal. Bye bye, you did great!
* Point density, voxel data, ocean, environment map textures were removed,
as these only worked within BI rendering. Note that the ocean modifier
and the Cycles point density shader node continue to work.
* Dynamic paint using material shading was removed, as this only worked
  with BI. If we ever want to support this again, it should probably go
  through the baking API.
* GPU shader export through the Python API was removed. This only worked
  for the old BI GLSL shaders, which no longer exist. Doing something
  similar for Eevee would be significantly more complicated because it
  uses a lot of multipass rendering and logic outside the shader, so it's
  probably impractical.
* Collada material import / export code is mostly gone, as it only worked
for BI materials. We need to add Cycles / Eevee material support at some
point.
* The mesh noise operator was removed since it only worked with BI
material texture slots. A displacement modifier can be used instead.
* The delete texture paint slot operator was removed since it only worked
for BI material texture slots. Could be added back with node support.
* Not all legacy viewport features are supported in the new viewport, but
their code was removed. If we need to bring anything back we can look at
older git revisions.
* There is some legacy viewport code that I could not remove yet, and some
that I probably missed.
* Shader node execution code was left mostly intact, even though it is not
used anywhere now. We may eventually use this to replace the texture
nodes with Cycles / Eevee shader nodes.
* The Cycles Bake panel now includes settings for baking multires normal
and displacement maps. The underlying code needs to be merged properly,
and we plan to add back support for multires AO baking and add support
to Cycles baking for features like vertex color, displacement, and other
missing baking features.
* This commit removes DNA and the Python API for BI material, lamp, world
and scene settings. This breaks a lot of addons.
* There is more DNA that can be removed or renamed, where Cycles or Eevee
are reusing some old BI properties but the names are not really correct
anymore.
* Texture slots for materials, lamps and world were removed. They remain
for brushes, particles and freestyle linestyles.
* 'BLENDER_RENDER' remains in the COMPAT_ENGINES of UI panels. Cycles and
other renderers use this to find all panels to show, minus a few panels
that they have their own replacement for.

void RE_point_density_fix_linking(void)
{
}