2023-05-31 16:19:06 +02:00
|
|
|
/* SPDX-FileCopyrightText: 2001-2002 NaN Holding BV. All rights reserved.
|
|
|
|
|
*
|
|
|
|
|
* SPDX-License-Identifier: GPL-2.0-or-later */
|
* Volumetrics
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3D texture that finds the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now
it's at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number, this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works just like any other procedural texture, so
previously where I've mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It works for solid surfaces too; it just gets the density at that
particular point on the surface, e.g.:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
2008-09-28 08:00:22 +00:00
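The density estimation described above can be sketched as a brute-force kernel sum. This is only an illustration, not Blender's API: the real code does the range query with a BVH, and `Point3`, `density_at`, and the linear falloff kernel here are assumptions made for the sketch.

```cpp
#include <cmath>
#include <vector>

struct Point3 {
  float x, y, z;
};

/* Brute-force stand-in for the BVH range query: accumulate a simple
 * linear falloff kernel over every cached point within `radius` of the
 * sample position, returning the result as an intensity value. */
float density_at(const std::vector<Point3> &points, const Point3 &sample, float radius)
{
  float density = 0.0f;
  for (const Point3 &p : points) {
    const float dx = sample.x - p.x;
    const float dy = sample.y - p.y;
    const float dz = sample.z - p.z;
    const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (dist < radius) {
      /* 1.0 at the point itself, fading to 0.0 at the search radius. */
      density += 1.0f - dist / radius;
    }
  }
  return density;
}
```

Because each sample is an independent lookup, the same cache can be evaluated from any material that maps the texture, which is what makes the instancing described above essentially free.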
|
|
|
|
2019-02-18 08:08:12 +11:00
|
|
|
/** \file
|
|
|
|
|
* \ingroup render
|
2011-02-27 19:31:27 +00:00
|
|
|
*/
|
|
|
|
|
|
2023-07-27 21:49:56 +10:00
|
|
|
#include <cmath>
|
|
|
|
|
#include <cstdio>
|
|
|
|
|
#include <cstdlib>
|
|
|
|
|
2008-10-02 01:38:12 +00:00
|
|
|
#include "MEM_guardedalloc.h"
|
|
|
|
|
|
* New point density update: Turbulence
This addition allows you to perturb the point density with noise, to give
the impression of more resolution. It's a quick way to add detail without
having to use large, complex, slower-to-render particle systems.
Rather than just overlaying noise, like you might do by adding a secondary
clouds texture, it uses noise to perturb the actual coordinate looked up
in the density evaluation. This gives a much better looking result, as it
actually alters the original density.
Comparison of the particle cloud render without, and with added turbulence
(the render with turbulence only renders slightly more slowly):
http://mke3.net/blender/devel/rendering/volumetrics/pd_turbulence.jpg
Using the same constant noise function/spatial coordinates will give a
static appearance. This is fine (and quicker) if the particles aren't
moving, but on animated particle systems, it looks bad, as if the
particles are moving through a static noise field. To overcome this, there
are additional options for particle systems, to influence the turbulence
with the particles' average velocity or average angular velocity. This
information is currently only available for particle systems.
Here you can see the (dramatic) difference between no turbulence, static
turbulence, and turbulence influenced by particle velocity:
http://mke3.net/blender/devel/rendering/volumetrics/turbu_compare.mov
2008-10-06 12:25:22 +00:00
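The perturbation idea above can be sketched in a few lines: offset the lookup coordinate by a noise vector before evaluating density, rather than blending noise over the result. `cheap_noise` below is a hypothetical stand-in for Blender's BLI_noise functions, chosen only so the sketch is self-contained.

```cpp
#include <cmath>

/* Hypothetical stand-in for a 3D noise function returning [-1, 1]. */
static float cheap_noise(float x, float y, float z)
{
  return std::sin(x * 12.9898f + y * 78.233f + z * 37.719f);
}

/* Perturb the texture lookup coordinate in place. With strength 0 the
 * coordinate is unchanged; larger strengths warp the density field
 * itself instead of overlaying noise on top of it. */
void turbulence_perturb(float co[3], float noise_size, float strength)
{
  float offset[3];
  for (int i = 0; i < 3; i++) {
    /* Offset each axis with a differently-seeded noise sample. */
    offset[i] = cheap_noise(co[0] / noise_size + i * 19.7f,
                            co[1] / noise_size,
                            co[2] / noise_size);
  }
  for (int i = 0; i < 3; i++) {
    co[i] += strength * offset[i];
  }
}
```

Feeding particle velocity into the noise input (as the commit describes) amounts to animating the seed of this lookup, so the warp moves with the particles instead of staying static in space.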
|
|
|
#include "BLI_blenlib.h"
|
2008-10-01 03:35:53 +00:00
|
|
|
#include "BLI_kdopbvh.h"
|
Cleanup: reduce amount of math-related includes
Using ClangBuildAnalyzer on the whole Blender build, it was pointing
out that BLI_math.h is the heaviest "header hub" (i.e. a non-tiny file
that is included a lot).
However, there are very few (actually zero) source files in Blender
that need "all the math" (base, colors, vectors, matrices,
quaternions, intersection, interpolation, statistics, solvers and
time). A common use case is source files needing just vectors, or
just vectors & matrices, or just colors etc. Actually, 181 files
were including the whole math thing without needing it at all.
This change removes BLI_math.h completely, and instead in all the
places that need it, includes BLI_math_vector.h or BLI_math_color.h
and so on.
Changes from that:
- BLI_math_color.h was included 1399 times -> now 408 (took 114.0sec
to parse -> now 36.3sec)
- BLI_simd.h 1403 -> 418 (109.7sec -> 34.9sec).
Full rebuild of Blender (Apple M1, Xcode, RelWithDebInfo) is not
affected much (342sec -> 334sec). Most of the benefit comes when
someone changes BLI_simd.h, BLI_math_color.h, or a similar file:
now 3x fewer files result in a recompile.
Pull Request #110944
2023-08-09 11:39:20 +03:00
|
|
|
#include "BLI_math_color.h"
|
|
|
|
|
#include "BLI_math_matrix.h"
|
|
|
|
|
#include "BLI_math_vector.h"
|
2020-03-19 09:33:03 +01:00
|
|
|
#include "BLI_noise.h"
|
2016-02-23 12:19:56 +01:00
|
|
|
#include "BLI_task.h"
|
2020-03-19 09:33:03 +01:00
|
|
|
#include "BLI_utildefines.h"
|
|
|
|
|
2015-08-16 17:32:01 +10:00
|
|
|
#include "BLT_translation.h"
|
2013-03-10 16:55:01 +00:00
|
|
|
|
2018-05-23 11:11:34 +02:00
|
|
|
#include "DNA_mesh_types.h"
|
2016-03-24 11:41:44 +01:00
|
|
|
#include "DNA_meshdata_types.h"
|
|
|
|
|
#include "DNA_object_types.h"
|
|
|
|
|
#include "DNA_particle_types.h"
|
2022-08-20 13:42:10 +02:00
|
|
|
#include "DNA_scene_types.h"
|
2016-03-24 11:41:44 +01:00
|
|
|
#include "DNA_texture_types.h"
|
|
|
|
|
|
2017-12-07 15:36:26 +11:00
|
|
|
#include "BKE_colorband.h"
|
2020-03-19 09:33:03 +01:00
|
|
|
#include "BKE_colortools.h"
|
2020-12-15 10:47:58 +11:00
|
|
|
#include "BKE_customdata.h"
|
2016-03-24 11:41:44 +01:00
|
|
|
#include "BKE_deform.h"
|
2008-10-31 05:29:54 +00:00
|
|
|
#include "BKE_lattice.h"
|
2023-08-02 22:14:18 +02:00
|
|
|
#include "BKE_mesh.hh"
|
2015-07-18 21:42:39 +02:00
|
|
|
#include "BKE_object.h"
|
Point Density texture
The Point Density texture now has some additional options for how
the point locations are cached. Previously it was all relative to
worldspace, but there are now some other options that make things
a lot more convenient for mapping the texture to Local (or Orco).
Thanks to theeth for helping with the space conversions!
The new Object space options allow this sort of thing to be possible
- a particle system, instanced on a transformed renderable object:
http://mke3.net/blender/devel/rendering/volumetrics/pd_objectspace.mov
It's also a lot easier to use multiple instances, just duplicate
the renderable objects and move them around.
The new particle cache options are:
* Emit Object Space
This caches the particles relative to the emitter object's
coordinate space (i.e. relative to the emitter's object center).
This makes it possible to map the Texture to Local or Orco
easily, so you can easily move, rotate or scale the rendering
object that has the Point Density texture. It's relative to the
emitter's location, rotation and scale, so if the object you're
rendering the texture on is aligned differently to the emitter,
the results will be rotated etc.
* Emit Object Location
This offsets the particles to the emitter object's location in 3D
space. It's similar to Emit Object Space, however the emitter
object's rotation and scale are ignored. This is probably the
easiest to use, since you don't need to worry about the rotation
and scale of the emitter object (just the rendered object), so
it's the default.
* Global Space
This is the same as previously: the particles are cached in global
space, so to use this effectively you'll need to map the texture to
Global, and have the rendered object in the right global location.
2008-09-29 04:19:24 +00:00
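The Emit Object Location mode above amounts to a simple offset: subtract the emitter object's world location from each cached particle position, ignoring rotation and scale (Emit Object Space would instead transform by the emitter's full inverted 4x4 matrix). A minimal sketch, with a hypothetical helper name:

```cpp
/* Offset a world-space particle position so it is stored relative to the
 * emitter object's location. Rotation and scale are deliberately ignored,
 * which is what makes this mode the easiest to use. */
void world_to_emitter_location(float co[3], const float emitter_loc[3])
{
  co[0] -= emitter_loc[0];
  co[1] -= emitter_loc[1];
  co[2] -= emitter_loc[2];
}
```

Once positions are cached this way, mapping the texture to Local on the rendered object "just works", because the cache no longer depends on where the emitter sits in world space.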
|
|
|
#include "BKE_particle.h"
|
2010-06-27 05:39:55 +00:00
|
|
|
#include "BKE_scene.h"
|
|
|
|
|
2023-09-22 03:18:17 +02:00
|
|
|
#include "DEG_depsgraph.hh"
|
|
|
|
|
#include "DEG_depsgraph_query.hh"
|
2017-07-21 11:53:13 +02:00
|
|
|
|
2020-11-06 14:16:27 -05:00
|
|
|
#include "texture_common.h"
|
|
|
|
|
2020-11-09 15:42:38 +01:00
|
|
|
#include "RE_texture.h"
|
2008-11-09 01:16:12 +00:00
|
|
|
|
2015-07-18 21:42:39 +02:00
|
|
|
static ThreadMutex sample_mutex = PTHREAD_MUTEX_INITIALIZER;
|
2008-11-09 01:16:12 +00:00
|
|
|
|
2023-06-21 11:29:00 +10:00
|
|
|
enum {
|
|
|
|
|
POINT_DATA_VEL = 1 << 0,
|
|
|
|
|
POINT_DATA_LIFE = 1 << 1,
|
|
|
|
|
POINT_DATA_COLOR = 1 << 2,
|
|
|
|
|
};
|
|
|
|
|
|
2008-11-09 01:16:12 +00:00
|
|
|
static int point_data_used(PointDensity *pd)
|
|
|
|
|
{
|
|
|
|
|
int pd_bitflag = 0;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2011-03-16 18:21:31 +00:00
|
|
|
if (pd->source == TEX_PD_PSYS) {
|
Remove Blender Internal and legacy viewport from Blender 2.8.
Brecht authored this commit, but he gave me the honours to actually
do it. Here it goes; Blender Internal. Bye bye, you did great!
* Point density, voxel data, ocean, environment map textures were removed,
as these only worked within BI rendering. Note that the ocean modifier
and the Cycles point density shader node continue to work.
* Dynamic paint using material shading was removed, as this only worked
with BI. If we ever wanted to support this again probably it should go
through the baking API.
* GPU shader export through the Python API was removed. This only worked
for the old BI GLSL shaders, which no longer exist. Doing something
similar for Eevee would be significantly more complicated because it
uses a lot of multipass rendering and logic outside the shader; it's
probably impractical.
* Collada material import / export code is mostly gone, as it only worked
for BI materials. We need to add Cycles / Eevee material support at some
point.
* The mesh noise operator was removed since it only worked with BI
material texture slots. A displacement modifier can be used instead.
* The delete texture paint slot operator was removed since it only worked
for BI material texture slots. Could be added back with node support.
* Not all legacy viewport features are supported in the new viewport, but
their code was removed. If we need to bring anything back we can look at
older git revisions.
* There is some legacy viewport code that I could not remove yet, and some
that I probably missed.
* Shader node execution code was left mostly intact, even though it is not
used anywhere now. We may eventually use this to replace the texture
nodes with Cycles / Eevee shader nodes.
* The Cycles Bake panel now includes settings for baking multires normal
and displacement maps. The underlying code needs to be merged properly,
and we plan to add back support for multires AO baking and add support
to Cycles baking for features like vertex color, displacement, and other
missing baking features.
* This commit removes DNA and the Python API for BI material, lamp, world
and scene settings. This breaks a lot of addons.
* There is more DNA that can be removed or renamed, where Cycles or Eevee
are reusing some old BI properties but the names are not really correct
anymore.
* Texture slots for materials, lamps and world were removed. They remain
for brushes, particles and freestyle linestyles.
* 'BLENDER_RENDER' remains in the COMPAT_ENGINES of UI panels. Cycles and
other renderers use this to find all panels to show, minus a few panels
that they have their own replacement for.
2018-04-19 17:34:44 +02:00
|
|
|
if ((pd->falloff_type == TEX_PD_FALLOFF_PARTICLE_VEL) ||
|
2015-03-28 23:50:36 +05:00
|
|
|
(pd->color_source == TEX_PD_COLOR_PARTVEL) || (pd->color_source == TEX_PD_COLOR_PARTSPEED))
|
|
|
|
|
{
|
2011-03-16 18:21:31 +00:00
|
|
|
pd_bitflag |= POINT_DATA_VEL;
|
2015-03-28 23:50:36 +05:00
|
|
|
}
|
|
|
|
if ((pd->color_source == TEX_PD_COLOR_PARTAGE) ||
|
2015-03-28 23:50:36 +05:00
|
|
|
(pd->falloff_type == TEX_PD_FALLOFF_PARTICLE_AGE))
|
|
|
|
|
{
|
2011-03-16 18:21:31 +00:00
|
|
|
pd_bitflag |= POINT_DATA_LIFE;
|
2015-03-28 23:50:36 +05:00
|
|
|
}
|
2011-03-16 18:21:31 +00:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
else if (pd->source == TEX_PD_OBJECT) {
|
|
|
|
|
if (ELEM(pd->ob_color_source,
|
|
|
|
|
TEX_PD_COLOR_VERTCOL,
|
|
|
|
|
TEX_PD_COLOR_VERTWEIGHT,
|
|
|
|
|
TEX_PD_COLOR_VERTNOR)) {
|
|
|
|
|
pd_bitflag |= POINT_DATA_COLOR;
|
|
|
|
|
}
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-11-09 01:16:12 +00:00
|
|
|
return pd_bitflag;
|
|
|
|
|
}
|
|
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
static void point_data_pointers(PointDensity *pd,
|
|
|
|
|
float **r_data_velocity,
|
|
|
|
|
float **r_data_life,
|
|
|
|
|
float **r_data_color)
|
|
|
|
|
{
|
|
|
|
|
const int data_used = point_data_used(pd);
|
|
|
|
|
const int totpoint = pd->totpoints;
|
|
|
|
|
float *data = pd->point_data;
|
|
|
|
|
int offset = 0;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_used & POINT_DATA_VEL) {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (r_data_velocity) {
|
2016-03-24 11:41:44 +01:00
|
|
|
*r_data_velocity = data + offset;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
offset += 3 * totpoint;
|
|
|
|
|
}
|
|
|
|
|
else {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (r_data_velocity) {
|
2023-07-27 13:10:42 +02:00
|
|
|
*r_data_velocity = nullptr;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_used & POINT_DATA_LIFE) {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (r_data_life) {
|
2016-03-24 11:41:44 +01:00
|
|
|
*r_data_life = data + offset;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
offset += totpoint;
|
|
|
|
|
}
|
|
|
|
|
else {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (r_data_life) {
|
2023-07-27 13:10:42 +02:00
|
|
|
*r_data_life = nullptr;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_used & POINT_DATA_COLOR) {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (r_data_color) {
|
2016-03-24 11:41:44 +01:00
|
|
|
*r_data_color = data + offset;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
offset += 3 * totpoint;
|
|
|
|
|
}
|
|
|
|
|
else {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (r_data_color) {
|
2023-07-27 13:10:42 +02:00
|
|
|
*r_data_color = nullptr;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
}
|
|
|
|
|
}
|
2008-11-09 01:16:12 +00:00
|
|
|
|
2015-03-28 23:50:36 +05:00
|
|
|
/* additional data stored alongside the point density BVH,
|
|
|
|
|
* accessible by point index number to retrieve other information
|
2008-11-09 01:16:12 +00:00
|
|
|
* such as particle velocity or lifetime */
|
2016-03-24 11:41:44 +01:00
|
|
|
static void alloc_point_data(PointDensity *pd)
|
2008-11-09 01:16:12 +00:00
|
|
|
{
|
2016-03-24 11:41:44 +01:00
|
|
|
const int totpoints = pd->totpoints;
|
|
|
|
|
int data_used = point_data_used(pd);
|
2008-11-09 01:16:12 +00:00
|
|
|
int data_size = 0;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_used & POINT_DATA_VEL) {
|
2008-11-09 01:16:12 +00:00
|
|
|
/* store 3 channels of velocity data */
|
|
|
|
|
data_size += 3;
|
|
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_used & POINT_DATA_LIFE) {
|
2008-11-09 01:16:12 +00:00
|
|
|
/* store 1 channel of lifetime data */
|
|
|
|
|
data_size += 1;
|
|
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_used & POINT_DATA_COLOR) {
|
|
|
|
|
/* store 3 channels of RGB data */
|
|
|
|
|
data_size += 3;
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2015-03-28 23:50:36 +05:00
|
|
|
if (data_size) {
|
2023-07-27 13:10:42 +02:00
|
|
|
pd->point_data = static_cast<float *>(
|
|
|
|
|
MEM_callocN(sizeof(float) * data_size * totpoints, "particle point data"));
|
2015-03-28 23:50:36 +05:00
|
|
|
}
|
2008-11-09 01:16:12 +00:00
|
|
|
}
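The channel bookkeeping shared by `alloc_point_data()` above and `point_data_pointers()` can be summarized as follows. This helper is not part of Blender, just an illustration of the single packed allocation: velocity first (3 floats per point), then lifetime (1), then color (3).

```cpp
enum {
  POINT_DATA_VEL = 1 << 0,
  POINT_DATA_LIFE = 1 << 1,
  POINT_DATA_COLOR = 1 << 2,
};

/* Total number of floats in the packed per-point data allocation. */
int point_data_float_count(int data_used, int totpoints)
{
  int per_point = 0;
  if (data_used & POINT_DATA_VEL) {
    per_point += 3; /* velocity: x, y, z */
  }
  if (data_used & POINT_DATA_LIFE) {
    per_point += 1; /* lifetime */
  }
  if (data_used & POINT_DATA_COLOR) {
    per_point += 3; /* color: r, g, b */
  }
  return per_point * totpoints;
}
```

`point_data_pointers()` walks the same blocks in the same order, which is why its `offset` advances by `3 * totpoint`, then `totpoint`, then `3 * totpoint`.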
|
|
|
|
|
|
2018-04-06 12:07:27 +02:00
|
|
|
static void pointdensity_cache_psys(
|
|
|
|
|
Depsgraph *depsgraph, Scene *scene, PointDensity *pd, Object *ob, ParticleSystem *psys)
|
|
|
|
{
|
|
|
|
|
ParticleKey state;
|
2013-08-08 15:36:03 +00:00
|
|
|
ParticleCacheKey *cache;
|
2023-07-27 13:10:42 +02:00
|
|
|
ParticleSimulationData sim = {nullptr};
|
|
|
|
|
ParticleData *pa = nullptr;
|
2021-07-12 16:15:03 +02:00
|
|
|
float cfra = BKE_scene_ctime_get(scene);
|
2023-03-03 10:09:20 +11:00
|
|
|
int i;
|
|
|
|
|
// int childexists = 0; /* UNUSED */
|
2016-03-24 11:41:44 +01:00
|
|
|
int total_particles;
|
|
|
|
|
int data_used;
|
|
|
|
|
float *data_vel, *data_life;
|
Point Density texture
The Point Density texture now has some additional options for how
the point locations are cached. Previously it was all relative to
worldspace, but there are now some other options that make things
a lot more convenient for mapping the texture to Local (or Orco).
Thanks to theeth for helping with the space conversions!
The new Object space options allow this sort of thing to be possible
- a particle system, instanced on a transformed renderable object:
http://mke3.net/blender/devel/rendering/volumetrics/pd_objectspace.mov
It's also a lot easier to use multiple instances: just duplicate
the renderable objects and move them around.
The new particle cache options are:
* Emit Object space
This caches the particles relative to the emitter object's
coordinate space (i.e. relative to the emitter's object center).
This makes it possible to map the Texture to Local or Orco
easily, so you can easily move, rotate or scale the rendering
object that has the Point Density texture. It's relative to the
emitter's location, rotation and scale, so if the object you're
rendering the texture on is aligned differently to the emitter,
the results will be rotated etc.
* Emit Object Location
This offsets the particles to the emitter object's location in 3D
space. It's similar to Emit Object Space, however the emitter
object's rotation and scale are ignored. This is probably the
easiest to use, since you don't need to worry about the rotation
and scale of the emitter object (just the rendered object), so
it's the default.
* Global Space
This is the same as before: the particles are cached in global space,
so to use this effectively you'll need to map the texture to Global,
and have the rendered object in the right global location.
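The three cache-space options can be sketched as a single coordinate transform per particle. This is a hypothetical illustration using an emitter with only a location and a uniform scale; Blender's Object-space case multiplies by the full 4x4 world_to_object matrix, which also undoes rotation.

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };
struct Emitter { Vec3 loc; float scale; };

enum CacheSpace { EMIT_OBJECT_SPACE, EMIT_OBJECT_LOCATION, GLOBAL_SPACE };

/* Sketch of where a world-space particle coordinate ends up in the cache
 * under each option (translation + uniform scale only, for brevity). */
static Vec3 cache_particle_co(Vec3 co, const Emitter &em, CacheSpace space)
{
  switch (space) {
    case EMIT_OBJECT_SPACE:
      /* Inverse of the emitter transform: un-translate, then un-scale. */
      return {(co.x - em.loc.x) / em.scale,
              (co.y - em.loc.y) / em.scale,
              (co.z - em.loc.z) / em.scale};
    case EMIT_OBJECT_LOCATION:
      /* Only the emitter's location is removed; rotation/scale are ignored. */
      return {co.x - em.loc.x, co.y - em.loc.y, co.z - em.loc.z};
    default:
      /* GLOBAL_SPACE: particles stay in world coordinates. */
      return co;
  }
}
```

This is why Emit Object Location is the forgiving default: it never bakes the emitter's rotation or scale into the cached coordinates.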
2008-09-29 04:19:24 +00:00
|
|
|
float partco[3];
|
2018-05-23 11:11:34 +02:00
|
|
|
const bool use_render_params = (DEG_get_mode(depsgraph) == DAG_EVAL_RENDER);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
data_used = point_data_used(pd);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-06-23 07:53:49 +10:00
|
|
|
if (!psys_check_enabled(ob, psys, use_render_params)) {
|
2008-09-28 08:00:22 +00:00
|
|
|
return;
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2018-04-06 12:07:27 +02:00
|
|
|
sim.depsgraph = depsgraph;
|
2015-03-29 02:34:44 +05:00
|
|
|
sim.scene = scene;
|
2015-03-28 23:50:36 +05:00
|
|
|
sim.ob = ob;
|
|
|
|
|
sim.psys = psys;
|
2015-07-18 22:36:09 +02:00
|
|
|
sim.psmd = psys_get_modifier(ob, psys);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2022-11-02 14:41:49 +01:00
|
|
|
/* in case ob->world_to_object isn't up-to-date */
|
|
|
|
|
invert_m4_m4(ob->world_to_object, ob->object_to_world);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2015-03-28 23:50:36 +05:00
|
|
|
total_particles = psys->totpart + psys->totchild;
|
2022-11-09 20:30:41 +01:00
|
|
|
psys_sim_data_init(&sim);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-10-12 11:38:28 +00:00
|
|
|
pd->point_tree = BLI_bvhtree_new(total_particles, 0.0, 4, 6);
|
2008-11-09 01:16:12 +00:00
|
|
|
pd->totpoints = total_particles;
|
2016-03-24 11:41:44 +01:00
|
|
|
alloc_point_data(pd);
|
2023-07-27 13:10:42 +02:00
|
|
|
point_data_pointers(pd, &data_vel, &data_life, nullptr);
|
2015-03-28 23:50:36 +05:00
|
|
|
|
2011-09-20 08:48:48 +00:00
|
|
|
#if 0 /* UNUSED */
|
2019-05-31 23:21:16 +10:00
|
|
|
if (psys->totchild > 0 && !(psys->part->draw & PART_DRAW_PARENT)) {
|
2008-09-28 08:00:22 +00:00
|
|
|
childexists = 1;
|
2019-05-31 23:21:16 +10:00
|
|
|
}
|
2011-09-20 08:48:48 +00:00
|
|
|
#endif
|
|
|
|
|
|
2015-03-28 23:50:36 +05:00
|
|
|
for (i = 0, pa = psys->particles; i < total_particles; i++, pa++) {
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2013-08-08 15:36:03 +00:00
|
|
|
if (psys->part->type == PART_HAIR) {
|
|
|
|
|
/* hair particles */
|
2019-04-22 09:08:06 +10:00
|
|
|
if (i < psys->totpart && psys->pathcache) {
|
2013-08-08 15:36:03 +00:00
|
|
|
cache = psys->pathcache[i];
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
|
|
|
|
else if (i >= psys->totpart && psys->childcache) {
|
2013-08-08 15:36:03 +00:00
|
|
|
cache = psys->childcache[i - psys->totpart];
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
|
|
|
|
else {
|
2013-08-08 15:36:03 +00:00
|
|
|
continue;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2015-01-13 17:24:20 +01:00
|
|
|
cache += cache->segments; /* use endpoint */
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2013-08-08 15:36:03 +00:00
|
|
|
copy_v3_v3(state.co, cache->co);
|
|
|
|
|
zero_v3(state.vel);
|
|
|
|
|
state.time = 0.0f;
|
|
|
|
|
}
|
|
|
|
|
else {
|
|
|
|
|
/* emitter particles */
|
|
|
|
|
state.time = cfra;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2023-07-27 21:49:56 +10:00
|
|
|
if (!psys_get_particle_state(&sim, i, &state, false)) {
|
2013-08-08 15:36:03 +00:00
|
|
|
continue;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-11-09 01:16:12 +00:00
|
|
|
if (data_used & POINT_DATA_LIFE) {
|
2008-11-10 00:14:35 +00:00
|
|
|
if (i < psys->totpart) {
|
2015-03-28 23:50:36 +05:00
|
|
|
state.time = (cfra - pa->time) / pa->lifetime;
|
2012-03-24 06:38:07 +00:00
|
|
|
}
|
|
|
|
|
else {
|
2015-03-28 23:50:36 +05:00
|
|
|
ChildParticle *cpa = (psys->child + i) - psys->totpart;
|
2009-08-13 05:21:25 +00:00
|
|
|
float pa_birthtime, pa_dietime;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2013-08-08 15:36:03 +00:00
|
|
|
state.time = psys_get_child_time(psys, cpa, cfra, &pa_birthtime, &pa_dietime);
|
2008-11-10 00:14:35 +00:00
|
|
|
}
|
* New point density update: Turbulence
This addition allows you to perturb the point density with noise, to give
the impression of more resolution. It's a quick way to add detail, without
having to use large, complex, and slower to render particle systems.
Rather than just overlaying noise, like you might do by adding a secondary
clouds texture, it uses noise to perturb the actual coordinate looked up
in the density evaluation. This gives a much better looking result, as it
actually alters the original density.
Comparison of the particle cloud render without, and with added turbulence
(the render with turbulence only renders slightly more slowly):
http://mke3.net/blender/devel/rendering/volumetrics/pd_turbulence.jpg
Using the same constant noise function/spatial coordinates will give a
static appearance. This is fine (and quicker) if the particles aren't
moving, but on animated particle systems, it looks bad, as if the
particles are moving through a static noise field. To overcome this, there
are additional options for particle systems, to influence the turbulence
with the particles' average velocity, or average angular velocity. This
information is only available for particle systems at present.
Here you can see the (dramatic) difference between no turbulence, static
turbulence, and turbulence influenced by particle velocity:
http://mke3.net/blender/devel/rendering/volumetrics/turbu_compare.mov
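The coordinate-perturbation idea described above can be sketched as follows. This is a toy illustration, not Blender's noise: `toy_noise` is a stand-in hash, and advecting the noise domain by an average (angular) velocity is the part that keeps turbulence moving with animated particles instead of looking like a static field.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

/* Toy deterministic value noise standing in for Blender's noise function
 * (an assumption for this sketch, not the real implementation). */
static float toy_noise(float x, float y, float z)
{
  return std::sin(x * 12.9898f + y * 78.233f + z * 37.719f);
}

/* Perturb the density-lookup coordinate rather than the density itself:
 * the returned coordinate is what gets fed into the density evaluation. */
static Vec3 turbulent_lookup_co(Vec3 p, float freq, float strength, Vec3 flow, float time)
{
  /* Advect the noise domain by the average particle velocity over time. */
  const float nx = p.x * freq + flow.x * time;
  const float ny = p.y * freq + flow.y * time;
  const float nz = p.z * freq + flow.z * time;
  return {p.x + strength * toy_noise(nx, ny, nz),
          p.y + strength * toy_noise(ny, nz, nx),
          p.z + strength * toy_noise(nz, nx, ny)};
}
```

With `strength` at zero the lookup is unchanged, which is why turbulence-off renders are identical and only slightly faster.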
2008-10-06 12:25:22 +00:00
|
|
|
}
|
2008-09-28 08:00:22 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2013-08-08 15:36:03 +00:00
|
|
|
copy_v3_v3(partco, state.co);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2019-04-22 09:08:06 +10:00
|
|
|
if (pd->psys_cache_space == TEX_PD_OBJECTSPACE) {
|
2022-11-02 14:41:49 +01:00
|
|
|
mul_m4_v3(ob->world_to_object, partco);
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2013-08-08 15:36:03 +00:00
|
|
|
else if (pd->psys_cache_space == TEX_PD_OBJECTLOC) {
|
|
|
|
|
sub_v3_v3(partco, ob->loc);
|
|
|
|
|
}
|
|
|
|
|
else {
|
|
|
|
|
/* TEX_PD_WORLDSPACE */
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2023-07-27 13:10:42 +02:00
|
|
|
BLI_bvhtree_insert(static_cast<BVHTree *>(pd->point_tree), i, partco, 1);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_vel) {
|
2019-03-25 11:55:36 +11:00
|
|
|
data_vel[i * 3 + 0] = state.vel[0];
|
|
|
|
|
data_vel[i * 3 + 1] = state.vel[1];
|
|
|
|
|
data_vel[i * 3 + 2] = state.vel[2];
|
2013-08-08 15:36:03 +00:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
if (data_life) {
|
|
|
|
|
data_life[i] = state.time;
|
2013-08-08 15:36:03 +00:00
|
|
|
}
|
2008-09-28 08:00:22 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2023-07-27 13:10:42 +02:00
|
|
|
BLI_bvhtree_balance(static_cast<BVHTree *>(pd->point_tree));
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2022-11-09 20:30:41 +01:00
|
|
|
psys_sim_data_free(&sim);
|
2008-09-28 08:00:22 +00:00
|
|
|
}
|
|
|
|
|
|
2018-05-23 11:11:34 +02:00
|
|
|
static void pointdensity_cache_vertex_color(PointDensity *pd,
|
2023-07-27 13:10:42 +02:00
|
|
|
Object * /*ob*/,
|
2018-05-23 11:11:34 +02:00
|
|
|
Mesh *mesh,
|
|
|
|
|
float *data_color)
|
2016-03-24 11:41:44 +01:00
|
|
|
{
|
Mesh: Replace MLoop struct with generic attributes
Implements #102359.
Split the `MLoop` struct into two separate integer arrays called
`corner_verts` and `corner_edges`, referring to the vertex each corner
is attached to and the next edge around the face at each corner. These
arrays can be sliced to give access to the edges or vertices in a face.
Then they are often referred to as "poly_verts" or "poly_edges".
The main benefits are halving the necessary memory bandwidth when only
one array is used and simplifications from using regular integer indices
instead of a special-purpose struct.
The commit also starts a renaming from "loop" to "corner" in mesh code.
Like the other mesh struct of array refactors, forward compatibility is
kept by writing files with the older format. This will be done until 4.0
to ease the transition process.
Looking at a small portion of the patch should give a good impression
for the rest of the changes. I tried to make the changes as small as
possible so it's easy to tell the correctness from the diff. Though I
found Blender developers have been very inventive over the last decade
when finding different ways to loop over the corners in a face.
For performance, nearly every piece of code that deals with `Mesh` is
slightly impacted. Any algorithm that is memory bottle-necked should
see an improvement. For example, here is a comparison of interpolating
a vertex float attribute to face corners (Ryzen 3700x):
**Before** (Average: 3.7 ms, Min: 3.4 ms)
```
threading::parallel_for(loops.index_range(), 4096, [&](IndexRange range) {
for (const int64_t i : range) {
dst[i] = src[loops[i].v];
}
});
```
**After** (Average: 2.9 ms, Min: 2.6 ms)
```
array_utils::gather(src, corner_verts, dst);
```
That's an improvement of 28% to the average timings, and it's also a
simplification, since an index-based routine can be used instead.
For more examples using the new arrays, see the design task.
Pull Request: https://projects.blender.org/blender/blender/pulls/104424
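The interpolation example in the message above is just an indexed copy once `corner_verts` is a plain integer array; here is that gather written out as a minimal standalone sketch (the function name is illustrative, not Blender's API).

```cpp
#include <cassert>
#include <vector>

/* Gather a per-vertex attribute to face corners through the corner_verts
 * index array: no struct member access, just an indexed copy. */
static std::vector<float> gather_to_corners(const std::vector<float> &src,
                                            const std::vector<int> &corner_verts)
{
  std::vector<float> dst(corner_verts.size());
  for (int i = 0; i < (int)corner_verts.size(); i++) {
    dst[i] = src[corner_verts[i]];
  }
  return dst;
}
```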
2023-03-20 15:55:13 +01:00
|
|
|
const int *corner_verts = BKE_mesh_corner_verts(mesh);
|
2018-05-23 11:11:34 +02:00
|
|
|
const int totloop = mesh->totloop;
|
2016-03-24 11:41:44 +01:00
|
|
|
char layername[MAX_CUSTOMDATA_LAYER_NAME];
|
|
|
|
|
int i;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
BLI_assert(data_color);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2023-07-25 21:15:52 +02:00
|
|
|
if (!CustomData_has_layer(&mesh->loop_data, CD_PROP_BYTE_COLOR)) {
|
2016-03-24 11:41:44 +01:00
|
|
|
return;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2022-04-20 09:10:10 -05:00
|
|
|
CustomData_validate_layer_name(
|
2023-07-25 21:15:52 +02:00
|
|
|
&mesh->loop_data, CD_PROP_BYTE_COLOR, pd->vertex_attribute_name, layername);
|
2023-07-27 13:10:42 +02:00
|
|
|
const MLoopCol *mcol = static_cast<const MLoopCol *>(
|
|
|
|
|
CustomData_get_layer_named(&mesh->loop_data, CD_PROP_BYTE_COLOR, layername));
|
2019-04-22 09:08:06 +10:00
|
|
|
if (!mcol) {
|
2016-03-24 11:41:44 +01:00
|
|
|
return;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
/* Stores the number of MLoops using the same vertex, so we can normalize colors. */
|
2023-07-27 13:10:42 +02:00
|
|
|
int *mcorners = static_cast<int *>(
|
|
|
|
|
MEM_callocN(sizeof(int) * pd->totpoints, "point density corner count"));
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
for (i = 0; i < totloop; i++) {
|
2023-03-20 15:55:13 +01:00
|
|
|
int v = corner_verts[i];
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-10-09 13:16:19 +02:00
|
|
|
if (mcorners[v] == 0) {
|
|
|
|
|
rgb_uchar_to_float(&data_color[v * 3], &mcol[i].r);
|
|
|
|
|
}
|
|
|
|
|
else {
|
|
|
|
|
float col[3];
|
|
|
|
|
rgb_uchar_to_float(col, &mcol[i].r);
|
|
|
|
|
add_v3_v3(&data_color[v * 3], col);
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
++mcorners[v];
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
/* Normalize colors by averaging over mcorners.
|
|
|
|
|
* All the corners share the same vertex, i.e. occupy the same point in space.
|
|
|
|
|
*/
|
|
|
|
|
for (i = 0; i < pd->totpoints; i++) {
|
2019-04-22 09:08:06 +10:00
|
|
|
if (mcorners[i] > 0) {
|
2019-03-25 11:55:36 +11:00
|
|
|
mul_v3_fl(&data_color[i * 3], 1.0f / mcorners[i]);
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
MEM_freeN(mcorners);
|
|
|
|
|
}
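A minimal sketch of the corner-to-vertex averaging strategy used in `pointdensity_cache_vertex_color()` above: accumulate each corner's value into its vertex, count corners per vertex, then divide. Scalar values are used here instead of RGB triples for brevity; the function name is illustrative.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

/* Average per-corner values down to per-vertex values. Corners sharing a
 * vertex occupy the same point in space, so their values are averaged. */
static std::vector<float> average_corners_to_verts(const std::vector<int> &corner_verts,
                                                   const std::vector<float> &corner_values,
                                                   int verts_num)
{
  std::vector<float> vert_values(verts_num, 0.0f);
  std::vector<int> corner_counts(verts_num, 0);
  for (int i = 0; i < (int)corner_verts.size(); i++) {
    vert_values[corner_verts[i]] += corner_values[i];
    corner_counts[corner_verts[i]]++;
  }
  for (int v = 0; v < verts_num; v++) {
    if (corner_counts[v] > 0) {
      vert_values[v] /= corner_counts[v];
    }
  }
  return vert_values;
}
```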
|
|
|
|
|
|
2018-05-23 11:11:34 +02:00
|
|
|
static void pointdensity_cache_vertex_weight(PointDensity *pd,
|
|
|
|
|
Object *ob,
|
|
|
|
|
Mesh *mesh,
|
|
|
|
|
float *data_color)
|
2016-03-24 11:41:44 +01:00
|
|
|
{
|
2018-05-23 11:11:34 +02:00
|
|
|
const int totvert = mesh->totvert;
|
2016-03-24 11:41:44 +01:00
|
|
|
int mdef_index;
|
|
|
|
|
int i;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
BLI_assert(data_color);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2023-07-27 13:10:42 +02:00
|
|
|
const MDeformVert *mdef = static_cast<const MDeformVert *>(
|
|
|
|
|
CustomData_get_layer(&mesh->vert_data, CD_MDEFORMVERT));
|
2019-04-22 09:08:06 +10:00
|
|
|
if (!mdef) {
|
2016-03-24 11:41:44 +01:00
|
|
|
return;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2021-07-13 12:10:34 -04:00
|
|
|
mdef_index = BKE_id_defgroup_name_index(&mesh->id, pd->vertex_attribute_name);
|
2019-04-22 09:08:06 +10:00
|
|
|
if (mdef_index < 0) {
|
2021-07-13 12:10:34 -04:00
|
|
|
mdef_index = BKE_object_defgroup_active_index_get(ob) - 1;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
|
|
|
|
if (mdef_index < 0) {
|
2016-03-24 11:41:44 +01:00
|
|
|
return;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2022-05-13 18:31:29 +02:00
|
|
|
const MDeformVert *dv;
|
2019-09-08 00:12:26 +10:00
|
|
|
for (i = 0, dv = mdef; i < totvert; i++, dv++, data_color += 3) {
|
2016-03-24 11:41:44 +01:00
|
|
|
MDeformWeight *dw;
|
|
|
|
|
int j;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2019-09-08 00:12:26 +10:00
|
|
|
for (j = 0, dw = dv->dw; j < dv->totweight; j++, dw++) {
|
2016-03-24 11:41:44 +01:00
|
|
|
if (dw->def_nr == mdef_index) {
|
|
|
|
|
copy_v3_fl(data_color, dw->weight);
|
|
|
|
|
break;
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
Refactor: Move normals out of MVert, lazy calculation
As described in T91186, this commit moves mesh vertex normals into a
contiguous array of float vectors in a custom data layer, how face
normals are currently stored.
The main interface is documented in `BKE_mesh.h`. Vertex and face
normals are now calculated on-demand and cached, retrieved with an
"ensure" function. Since the logical state of a mesh is now "has
normals when necessary", they can be retrieved from a `const` mesh.
The goal is to use on-demand calculation for all derived data, but
leave room for eager calculation for performance purposes (modifier
evaluation is threaded, but viewport data generation is not).
**Benefits**
This moves us closer to a SoA approach rather than the current AoS
paradigm. Accessing a contiguous `float3` is much more efficient than
retrieving data from a larger struct. The memory requirements for
accessing only normals or vertex locations are smaller, and at the
cost of more memory usage for just normals, they now don't have to
be converted between float and short, which also simplifies code
In the future, the remaining items can be removed from `MVert`,
leaving only `float3`, which has similar benefits (see T93602).
Removing the combination of derived and original data makes it
conceptually simpler to only calculate normals when necessary.
This is especially important now that we have more opportunities
for temporary meshes in geometry nodes.
**Performance**
In addition to the theoretical future performance improvements by
making `MVert == float3`, I've done some basic performance testing
on this patch directly. The data is fairly rough, but it gives an idea
about where things stand generally.
- Mesh line primitive 4m Verts: 1.16x faster (36 -> 31 ms),
showing that accessing just `MVert` is now more efficient.
- Spring Splash Screen: 1.03-1.06 -> 1.06-1.11 FPS, a very slight
change that at least shows there is no regression.
- Sprite Fright Snail Smoosh: 3.30-3.40 -> 3.42-3.50 FPS, a small
but observable speedup.
- Set Position Node with Scaled Normal: 1.36x faster (53 -> 39 ms),
shows that using normals in geometry nodes is faster.
- Normal Calculation 1.6m Vert Cube: 1.19x faster (25 -> 21 ms),
shows that calculating normals is slightly faster now.
- File Size of 1.6m Vert Cube: 1.03x smaller (214.7 -> 208.4 MB),
Normals are not saved in files, which can help with large meshes.
As for memory usage, it may be slightly more in some cases, but
I didn't observe any difference in the production files I tested.
**Tests**
Some modifiers and cycles test results need to be updated with this
commit, for two reasons:
- The subdivision surface modifier is not responsible for calculating
normals anymore. In master, the modifier creates different normals
than the result of the `Mesh` normal calculation, so this is a bug
fix.
- There are small differences in the results of some modifiers that
use normals because they are not converted to and from `short`
anymore.
**Future improvements**
- Remove `ModifierTypeInfo::dependsOnNormals`. Code in each modifier
already retrieves normals if they are needed anyway.
- Copy normals as part of a better CoW system for attributes.
- Make more areas use lazy instead of eager normal calculation.
- Remove `BKE_mesh_normals_tag_dirty` in more places since that is
now the default state of a new mesh.
- Possibly apply a similar change to derived face corner normals.
Differential Revision: https://developer.blender.org/D12770
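The "calculate on demand and cache" pattern this commit describes can be sketched with a dirty flag guarding a cached array: `ensure()` computes once, later calls return the cache, and `tag_dirty()` invalidates it. This is a simplified illustration (placeholder normal computation, hypothetical class name); Blender additionally makes the ensure step thread-safe, which is omitted here.

```cpp
#include <cassert>
#include <vector>

struct Vec3 { float x, y, z; };

/* Lazily computed, cached vertex normals. The logical state of the mesh is
 * "has normals when necessary", so ensure() can be called on demand. */
class MeshNormalsCache {
 public:
  void tag_dirty() { dirty_ = true; }

  const std::vector<Vec3> &ensure(const std::vector<Vec3> &positions)
  {
    if (dirty_) {
      /* Placeholder computation standing in for real normal calculation. */
      normals_.assign(positions.size(), Vec3{0.0f, 0.0f, 1.0f});
      dirty_ = false;
      ++recompute_count_; /* For illustration: shows caching works. */
    }
    return normals_;
  }

  int recompute_count() const { return recompute_count_; }

 private:
  std::vector<Vec3> normals_;
  bool dirty_ = true;
  int recompute_count_ = 0;
};
```

`pointdensity_cache_vertex_normal()` below is a consumer of this pattern: it just asks the mesh for its (possibly cached) vertex normals and copies them.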
2022-01-13 14:37:58 -06:00
|
|
|
static void pointdensity_cache_vertex_normal(Mesh *mesh, float *data_color)
|
2016-03-24 11:41:44 +01:00
|
|
|
{
|
|
|
|
|
BLI_assert(data_color);
|
2023-08-24 13:10:41 -04:00
|
|
|
const blender::Span<blender::float3> normals = mesh->vert_normals();
|
|
|
|
|
memcpy(data_color, normals.data(), sizeof(float[3]) * mesh->totvert);
|
2016-03-24 11:41:44 +01:00
|
|
|
}

/* Insert the vertices of a mesh object into the point density BVH-tree,
 * then fill in the optional per-point color data. */
static void pointdensity_cache_object(PointDensity *pd, Object *ob)
{
  float *data_color;
  int i;
  Mesh *mesh = static_cast<Mesh *>(ob->data);

#if 0 /* UNUSED */
  CustomData_MeshMasks mask = CD_MASK_BAREMESH;
  mask.fmask |= CD_MASK_MTFACE | CD_MASK_MCOL;
  switch (pd->ob_color_source) {
    case TEX_PD_COLOR_VERTCOL:
      mask.lmask |= CD_MASK_PROP_BYTE_COLOR;
      break;
    case TEX_PD_COLOR_VERTWEIGHT:
      mask.vmask |= CD_MASK_MDEFORMVERT;
      break;
  }
#endif

  const float(*positions)[3] = BKE_mesh_vert_positions(mesh); /* Local object space. */
  pd->totpoints = mesh->totvert;
  if (pd->totpoints == 0) {
    return;
  }

  pd->point_tree = BLI_bvhtree_new(pd->totpoints, 0.0, 4, 6);
  alloc_point_data(pd);
  point_data_pointers(pd, nullptr, nullptr, &data_color);

  for (i = 0; i < pd->totpoints; i++) {
    float co[3];

    copy_v3_v3(co, positions[i]);

    switch (pd->ob_cache_space) {
      case TEX_PD_OBJECTSPACE:
        break;
      case TEX_PD_OBJECTLOC:
        mul_m4_v3(ob->object_to_world, co);
        sub_v3_v3(co, ob->loc);
        break;
      case TEX_PD_WORLDSPACE:
      default:
        mul_m4_v3(ob->object_to_world, co);
        break;
    }

    BLI_bvhtree_insert(static_cast<BVHTree *>(pd->point_tree), i, co, 1);
  }

  switch (pd->ob_color_source) {
    case TEX_PD_COLOR_VERTCOL:
      pointdensity_cache_vertex_color(pd, ob, mesh, data_color);
      break;
    case TEX_PD_COLOR_VERTWEIGHT:
      pointdensity_cache_vertex_weight(pd, ob, mesh, data_color);
      break;
    case TEX_PD_COLOR_VERTNOR:
      pointdensity_cache_vertex_normal(mesh, data_color);
      break;
  }

  BLI_bvhtree_balance(static_cast<BVHTree *>(pd->point_tree));
}
|
2015-03-28 23:50:36 +05:00
|
|
|
|
Remove Blender Internal and legacy viewport from Blender 2.8.
Brecht authored this commit, but he gave me the honours to actually
do it. Here it goes; Blender Internal. Bye bye, you did great!
* Point density, voxel data, ocean, environment map textures were removed,
as these only worked within BI rendering. Note that the ocean modifier
and the Cycles point density shader node continue to work.
* Dynamic paint using material shading was removed, as this only worked
with BI. If we ever wanted to support this again probably it should go
through the baking API.
* GPU shader export through the Python API was removed. This only worked
for the old BI GLSL shaders, which no longer exists. Doing something
similar for Eevee would be significantly more complicated because it
uses a lot of multiplass rendering and logic outside the shader, it's
probably impractical.
* Collada material import / export code is mostly gone, as it only worked
for BI materials. We need to add Cycles / Eevee material support at some
point.
* The mesh noise operator was removed since it only worked with BI
material texture slots. A displacement modifier can be used instead.
* The delete texture paint slot operator was removed since it only worked
for BI material texture slots. Could be added back with node support.
* Not all legacy viewport features are supported in the new viewport, but
their code was removed. If we need to bring anything back we can look at
older git revisions.
* There is some legacy viewport code that I could not remove yet, and some
that I probably missed.
* Shader node execution code was left mostly intact, even though it is not
used anywhere now. We may eventually use this to replace the texture
nodes with Cycles / Eevee shader nodes.
* The Cycles Bake panel now includes settings for baking multires normal
and displacement maps. The underlying code needs to be merged properly,
and we plan to add back support for multires AO baking and add support
to Cycles baking for features like vertex color, displacement, and other
missing baking features.
* This commit removes DNA and the Python API for BI material, lamp, world
and scene settings. This breaks a lot of addons.
* There is more DNA that can be removed or renamed, where Cycles or Eevee
are reusing some old BI properties but the names are not really correct
anymore.
* Texture slots for materials, lamps and world were removed. They remain
for brushes, particles and freestyle linestyles.
* 'BLENDER_RENDER' remains in the COMPAT_ENGINES of UI panels. Cycles and
other renderers use this to find all panels to show, minus a few panels
that they have their own replacement for.
2018-04-19 17:34:44 +02:00
|
|
|
static void cache_pointdensity(Depsgraph *depsgraph, Scene *scene, PointDensity *pd)
|
2015-03-29 02:14:06 +05:00
|
|
|
{
|
2023-07-27 13:10:42 +02:00
|
|
|
if (pd == nullptr) {
|
2010-07-26 05:31:31 +00:00
|
|
|
return;
|
2015-03-29 02:14:06 +05:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
* Volumetrics
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that find the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
its at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number, this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works totally consistent with as any other procedural texture, so
previously where I've mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It also works for solid surfaces too, it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
2008-09-28 08:00:22 +00:00
|
|
|
if (pd->point_tree) {
|
2023-07-27 13:10:42 +02:00
|
|
|
BLI_bvhtree_free(static_cast<BVHTree *>(pd->point_tree));
|
|
|
|
|
pd->point_tree = nullptr;
|
* Volumetrics
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that find the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
its at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number, this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works totally consistent with as any other procedural texture, so
previously where I've mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It also works for solid surfaces too, it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
2008-09-28 08:00:22 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
Point Density texture
The Point Density texture now has some additional options for how
the point locations are cached. Previously it was all relative to
worldspace, but there are now some other options that make things
a lot more convenient for mapping the texture to Local (or Orco).
Thanks to theeth for helping with the space conversions!
The new Object space options allow this sort of thing to be possible
- a particle system, instanced on a transformed renderable object:
http://mke3.net/blender/devel/rendering/volumetrics/pd_objectspace.mov
It's also a lot easier to use multiple instances, just duplicate
the renderable objects and move them around.
The new particle cache options are:
* Emit Object space
This caches the particles relative to the emitter object's
coordinate space (i.e. relative to the emitter's object center).
This makes it possible to map the Texture to Local or Orco
easily, so you can easily move, rotate or scale the rendering
object that has the Point Density texture. It's relative to the
emitter's location, rotation and scale, so if the object you're
rendering the texture on is aligned differently to the emitter,
the results will be rotated etc.
* Emit Object Location
This offsets the particles to the emitter object's location in 3D
space. It's similar to Emit Object Space, however the emitter
object's rotation and scale are ignored. This is probably the
easiest to use, since you don't need to worry about the rotation
and scale of the emitter object (just the rendered object), so
it's the default.
* Global Space
This is the same as previously, the particles are cached in global space, so to use this effectively you'll need to map the texture to Global, and have the rendered object in the right global location.
2008-09-29 04:19:24 +00:00
|
|
|
if (pd->source == TEX_PD_PSYS) {
|
* Volumetrics
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that find the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
its at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number, this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works totally consistent with as any other procedural texture, so
previously where I've mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It also works for solid surfaces too, it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
2008-09-28 08:00:22 +00:00
|
|
|
Object *ob = pd->object;
|
2009-11-04 08:44:42 +00:00
|
|
|
ParticleSystem *psys;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2015-03-28 23:50:36 +05:00
|
|
|
if (!ob || !pd->psys) {
|
|
|
|
|
return;
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2023-07-27 13:10:42 +02:00
|
|
|
psys = static_cast<ParticleSystem *>(BLI_findlink(&ob->particlesystem, pd->psys - 1));
|
2015-03-28 23:50:36 +05:00
|
|
|
if (!psys) {
|
|
|
|
|
return;
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2018-04-06 12:07:27 +02:00
|
|
|
pointdensity_cache_psys(depsgraph, scene, pd, ob, psys);
|
* Volumetrics
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that find the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
its at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number, this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works totally consistent with as any other procedural texture, so
previously where I've mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It also works for solid surfaces too, it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
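The core idea described above can be sketched outside Blender as a naive density lookup. This is an illustration only, under loud assumptions: Blender accelerates the range query with a BVH rather than this O(n) loop, and `Point3` / `point_density` are hypothetical names, not Blender API.

```cpp
#include <vector>

struct Point3 {
  float x, y, z;
};

/* Accumulate a linear-falloff density for every cached point within
 * `radius` of the lookup coordinate `co`. */
float point_density(const std::vector<Point3> &points, const Point3 &co, float radius)
{
  const float squared_radius = radius * radius;
  float density = 0.0f;
  for (const Point3 &p : points) {
    const float dx = p.x - co.x;
    const float dy = p.y - co.y;
    const float dz = p.z - co.z;
    const float squared_dist = dx * dx + dy * dy + dz * dz;
    if (squared_dist < squared_radius) {
      /* 1.0 at the point itself, fading to 0.0 at the radius edge. */
      density += 1.0f - squared_dist / squared_radius;
    }
  }
  return density;
}
```

A lookup exactly on a cached point contributes 1.0; a lookup outside the radius contributes nothing, which is why the texture can be instanced cheaply: only the point cache is shared state.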
2008-09-28 08:00:22 +00:00
|
|
|
}
|
2008-09-29 07:56:41 +00:00
|
|
|
else if (pd->source == TEX_PD_OBJECT) {
|
|
|
|
|
Object *ob = pd->object;
|
2019-04-22 09:08:06 +10:00
|
|
|
if (ob && ob->type == OB_MESH) {
|
2018-05-23 11:11:34 +02:00
|
|
|
pointdensity_cache_object(pd, ob);
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2008-09-29 07:56:41 +00:00
|
|
|
}
|
2008-09-28 08:00:22 +00:00
|
|
|
}
|
|
|
|
|
|
Remove Blender Internal and legacy viewport from Blender 2.8.
Brecht authored this commit, but he gave me the honours to actually
do it. Here it goes; Blender Internal. Bye bye, you did great!
* Point density, voxel data, ocean, environment map textures were removed,
as these only worked within BI rendering. Note that the ocean modifier
and the Cycles point density shader node continue to work.
* Dynamic paint using material shading was removed, as this only worked
  with BI. If we ever want to support this again, it should probably go
  through the baking API.
* GPU shader export through the Python API was removed. This only worked
  for the old BI GLSL shaders, which no longer exist. Doing something
  similar for Eevee would be significantly more complicated because it
  uses a lot of multipass rendering and logic outside the shader; it's
  probably impractical.
* Collada material import / export code is mostly gone, as it only worked
for BI materials. We need to add Cycles / Eevee material support at some
point.
* The mesh noise operator was removed since it only worked with BI
material texture slots. A displacement modifier can be used instead.
* The delete texture paint slot operator was removed since it only worked
for BI material texture slots. Could be added back with node support.
* Not all legacy viewport features are supported in the new viewport, but
their code was removed. If we need to bring anything back we can look at
older git revisions.
* There is some legacy viewport code that I could not remove yet, and some
that I probably missed.
* Shader node execution code was left mostly intact, even though it is not
used anywhere now. We may eventually use this to replace the texture
nodes with Cycles / Eevee shader nodes.
* The Cycles Bake panel now includes settings for baking multires normal
and displacement maps. The underlying code needs to be merged properly,
and we plan to add back support for multires AO baking and add support
to Cycles baking for features like vertex color, displacement, and other
missing baking features.
* This commit removes DNA and the Python API for BI material, lamp, world
and scene settings. This breaks a lot of addons.
* There is more DNA that can be removed or renamed, where Cycles or Eevee
are reusing some old BI properties but the names are not really correct
anymore.
* Texture slots for materials, lamps and world were removed. They remain
for brushes, particles and freestyle linestyles.
* 'BLENDER_RENDER' remains in the COMPAT_ENGINES of UI panels. Cycles and
other renderers use this to find all panels to show, minus a few panels
that they have their own replacement for.
2018-04-19 17:34:44 +02:00
|
|
|
static void free_pointdensity(PointDensity *pd)
|
2008-09-28 08:00:22 +00:00
|
|
|
{
|
2023-07-27 13:10:42 +02:00
|
|
|
if (pd == nullptr) {
|
2015-03-29 02:14:06 +05:00
|
|
|
return;
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-09-28 08:00:22 +00:00
|
|
|
if (pd->point_tree) {
|
2023-07-27 13:10:42 +02:00
|
|
|
BLI_bvhtree_free(static_cast<BVHTree *>(pd->point_tree));
|
|
|
|
|
pd->point_tree = nullptr;
|
2008-09-28 08:00:22 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2021-08-06 13:59:38 +10:00
|
|
|
MEM_SAFE_FREE(pd->point_data);
|
2008-11-09 01:16:12 +00:00
|
|
|
pd->totpoints = 0;
|
2008-09-28 08:00:22 +00:00
|
|
|
}
|
|
|
|
|
|
2023-07-27 13:10:42 +02:00
|
|
|
struct PointDensityRangeData {
|
2010-03-22 09:30:00 +00:00
|
|
|
float *density;
|
|
|
|
|
float squared_radius;
|
2016-03-24 11:41:44 +01:00
|
|
|
float *point_data_life;
|
|
|
|
|
float *point_data_velocity;
|
|
|
|
|
float *point_data_color;
|
* New point density update: Turbulence
This addition allows you to perturb the point density with noise, to give
the impression of more resolution. It's a quick way to add detail, without
having to use large, complex, and slower to render particle systems.
Rather than just overlaying noise, like you might do by adding a secondary
clouds texture, it uses noise to perturb the actual coordinate looked up
in the density evaluation. This gives a much better looking result, as it
actually alters the original density.
Comparison of the particle cloud render without, and with added turbulence
(the render with turbulence only renders slightly more slowly):
http://mke3.net/blender/devel/rendering/volumetrics/pd_turbulence.jpg
Using the same constant noise function/spatial coordinates will give a
static appearance. This is fine (and quicker) if the particles aren't
moving, but on animated particle systems, it looks bad, as if the
particles are moving through a static noise field. To overcome this, there
are additional options for particle systems, to influence the turbulence
with the particles' average velocity or average angular velocity. This
information is only available for particle systems at present.
Here you can see the (dramatic) difference between no turbulence, static
turbulence, and turbulence influenced by particle velocity:
http://mke3.net/blender/devel/rendering/volumetrics/turbu_compare.mov
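The turbulence idea from this commit message can be sketched in a few lines: rather than overlaying noise on the result, the *lookup coordinate* is offset by a noise value before the density query. A hedged illustration only — `noise3` is a hypothetical placeholder standing in for Blender's `BLI_noise_generic_turbulence`, and `perturb_lookup` is not Blender API.

```cpp
#include <cmath>

/* Placeholder noise in [0, 1]; a stand-in for a real turbulence
 * function, purely for illustration. */
static float noise3(float x, float y, float z)
{
  return 0.5f + 0.5f * std::sin(x * 12.9898f + y * 78.233f + z * 37.719f);
}

/* Offset the texture lookup coordinate by a noise value before the
 * density query. Feeding the particles' average velocity `vel` into the
 * noise input makes the turbulence move with an animated system instead
 * of appearing as a static field the particles pass through. */
void perturb_lookup(const float texvec[3], const float vel[3], float noise_fac, float co[3])
{
  float turb = noise3(texvec[0] + vel[0], texvec[1] + vel[1], texvec[2] + vel[2]);
  turb -= 0.5f; /* re-center the 0..1 range around 0 so the result isn't offset */
  co[0] = texvec[0] + noise_fac * turb;
  co[1] = texvec[1] + noise_fac * turb;
  co[2] = texvec[2] + noise_fac * turb;
}
```

With `noise_fac` at zero the lookup coordinate is unchanged, which is why the option is a pure add-on over the base density estimate.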
2008-10-06 12:25:22 +00:00
|
|
|
float *vec;
|
2016-03-24 11:41:44 +01:00
|
|
|
float *col;
|
2008-10-22 01:31:46 +00:00
|
|
|
float softness;
|
2010-03-22 09:30:00 +00:00
|
|
|
short falloff_type;
|
2008-10-31 05:29:54 +00:00
|
|
|
short noise_influence;
|
2008-11-09 01:16:12 +00:00
|
|
|
float *age;
|
2023-06-03 08:36:28 +10:00
|
|
|
CurveMapping *density_curve;
|
2011-05-01 03:57:53 +00:00
|
|
|
float velscale;
|
2023-07-27 13:10:42 +02:00
|
|
|
};
|
2008-10-01 07:13:28 +00:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
static float density_falloff(PointDensityRangeData *pdr, int index, float squared_dist)
|
|
|
|
|
{
|
|
|
|
|
const float dist = (pdr->squared_radius - squared_dist) / pdr->squared_radius * 0.5f;
|
|
|
|
|
float density = 0.0f;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
switch (pdr->falloff_type) {
|
|
|
|
|
case TEX_PD_FALLOFF_STD:
|
|
|
|
|
density = dist;
|
|
|
|
|
break;
|
|
|
|
|
case TEX_PD_FALLOFF_SMOOTH:
|
|
|
|
|
density = 3.0f * dist * dist - 2.0f * dist * dist * dist;
|
|
|
|
|
break;
|
|
|
|
|
case TEX_PD_FALLOFF_SOFT:
|
|
|
|
|
density = pow(dist, pdr->softness);
|
|
|
|
|
break;
|
|
|
|
|
case TEX_PD_FALLOFF_CONSTANT:
|
|
|
|
|
density = pdr->squared_radius;
|
|
|
|
|
break;
|
|
|
|
|
case TEX_PD_FALLOFF_ROOT:
|
|
|
|
|
density = sqrtf(dist);
|
|
|
|
|
break;
|
|
|
|
|
case TEX_PD_FALLOFF_PARTICLE_AGE:
|
2019-04-22 09:08:06 +10:00
|
|
|
if (pdr->point_data_life) {
|
2016-03-24 11:41:44 +01:00
|
|
|
density = dist * MIN2(pdr->point_data_life[index], 1.0f);
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
|
|
|
|
else {
|
2016-03-24 11:41:44 +01:00
|
|
|
density = dist;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
break;
|
|
|
|
|
case TEX_PD_FALLOFF_PARTICLE_VEL:
|
2019-04-22 09:08:06 +10:00
|
|
|
if (pdr->point_data_velocity) {
|
2016-03-24 11:41:44 +01:00
|
|
|
density = dist * len_v3(&pdr->point_data_velocity[index * 3]) * pdr->velscale;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
|
|
|
|
else {
|
2016-03-24 11:41:44 +01:00
|
|
|
density = dist;
|
2019-04-22 09:08:06 +10:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
break;
|
|
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (pdr->density_curve && dist != 0.0f) {
|
2020-08-01 13:02:21 +10:00
|
|
|
BKE_curvemapping_init(pdr->density_curve);
|
2019-08-07 03:21:55 +10:00
|
|
|
density = BKE_curvemapping_evaluateF(pdr->density_curve, 0, density / dist) * dist;
|
2016-03-24 11:41:44 +01:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
return density;
|
|
|
|
|
}
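Isolated from the switch in `density_falloff()` above, the non-particle falloff shapes are simple curves over the normalized distance. A sketch for clarity; the function names are mine, not Blender's.

```cpp
#include <cmath>

/* `dist` is the normalized value computed in density_falloff():
 * (squared_radius - squared_dist) / squared_radius * 0.5f,
 * i.e. 0.0 at the radius edge and 0.5 at the point itself. */
float falloff_std(float dist) { return dist; }
float falloff_smooth(float dist) { return 3.0f * dist * dist - 2.0f * dist * dist * dist; }
float falloff_soft(float dist, float softness) { return std::pow(dist, softness); }
float falloff_root(float dist) { return std::sqrt(dist); }
```

For example, at the point itself (`dist == 0.5`) the smooth curve gives 3·0.25 − 2·0.125 = 0.5, the same as the linear falloff, while the root curve gives √0.5 ≈ 0.707, holding density higher away from the center.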
|
|
|
|
|
|
2016-03-19 17:16:50 +11:00
|
|
|
static void accum_density(void *userdata, int index, const float co[3], float squared_dist)
|
2008-10-01 07:13:28 +00:00
|
|
|
{
|
2008-10-02 01:38:12 +00:00
|
|
|
PointDensityRangeData *pdr = (PointDensityRangeData *)userdata;
|
2009-08-26 00:38:43 +00:00
|
|
|
float density = 0.0f;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-19 17:16:50 +11:00
|
|
|
UNUSED_VARS(co);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
if (pdr->point_data_velocity) {
|
|
|
|
|
pdr->vec[0] += pdr->point_data_velocity[index * 3 + 0]; // * density;
|
|
|
|
|
pdr->vec[1] += pdr->point_data_velocity[index * 3 + 1]; // * density;
|
|
|
|
|
pdr->vec[2] += pdr->point_data_velocity[index * 3 + 2]; // * density;
|
2011-06-02 16:59:12 +00:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
if (pdr->point_data_life) {
|
|
|
|
|
*pdr->age += pdr->point_data_life[index]; // * density;
|
2011-06-02 16:59:12 +00:00
|
|
|
}
|
2016-03-24 11:41:44 +01:00
|
|
|
if (pdr->point_data_color) {
|
|
|
|
|
add_v3_v3(pdr->col, &pdr->point_data_color[index * 3]); // * density;
|
2008-11-09 01:16:12 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
density = density_falloff(pdr, index, squared_dist);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-10-06 12:25:22 +00:00
|
|
|
*pdr->density += density;
|
2008-10-01 07:13:28 +00:00
|
|
|
}
|
|
|
|
|
|
2019-03-25 11:55:36 +11:00
|
|
|
static void init_pointdensityrangedata(PointDensity *pd,
|
|
|
|
|
PointDensityRangeData *pdr,
|
|
|
|
|
float *density,
|
|
|
|
|
float *vec,
|
|
|
|
|
float *age,
|
|
|
|
|
float *col,
|
2023-06-03 08:36:28 +10:00
|
|
|
CurveMapping *density_curve,
|
2019-03-25 11:55:36 +11:00
|
|
|
float velscale)
|
2008-11-09 01:16:12 +00:00
|
|
|
{
|
2015-03-28 23:50:36 +05:00
|
|
|
pdr->squared_radius = pd->radius * pd->radius;
|
2008-11-09 01:16:12 +00:00
|
|
|
pdr->density = density;
|
|
|
|
|
pdr->falloff_type = pd->falloff_type;
|
|
|
|
|
pdr->vec = vec;
|
|
|
|
|
pdr->age = age;
|
2016-03-24 11:41:44 +01:00
|
|
|
pdr->col = col;
|
2008-11-09 01:16:12 +00:00
|
|
|
pdr->softness = pd->falloff_softness;
|
|
|
|
|
pdr->noise_influence = pd->noise_influence;
|
2016-03-24 11:41:44 +01:00
|
|
|
point_data_pointers(
|
|
|
|
|
pd, &pdr->point_data_velocity, &pdr->point_data_life, &pdr->point_data_color);
|
2011-05-01 03:57:53 +00:00
|
|
|
pdr->density_curve = density_curve;
|
|
|
|
|
pdr->velscale = velscale;
|
2008-11-09 01:16:12 +00:00
|
|
|
}
|
|
|
|
|
|
2015-07-18 21:42:39 +02:00
|
|
|
static int pointdensity(PointDensity *pd,
|
|
|
|
|
const float texvec[3],
|
|
|
|
|
TexResult *texres,
|
2016-03-24 11:41:44 +01:00
|
|
|
float r_vec[3],
|
2015-07-18 21:42:39 +02:00
|
|
|
float *r_age,
|
2016-03-24 11:41:44 +01:00
|
|
|
float r_col[3])
|
2008-09-28 08:00:22 +00:00
|
|
|
{
|
2009-03-17 05:33:05 +00:00
|
|
|
int retval = TEX_INT;
|
2008-10-02 01:38:12 +00:00
|
|
|
PointDensityRangeData pdr;
|
2018-04-19 17:34:44 +02:00
|
|
|
float density = 0.0f, age = 0.0f;
|
2016-03-24 11:41:44 +01:00
|
|
|
float vec[3] = {0.0f, 0.0f, 0.0f}, col[3] = {0.0f, 0.0f, 0.0f}, co[3];
|
2008-10-12 23:39:52 +00:00
|
|
|
float turb, noise_fac;
|
2015-03-28 23:50:36 +05:00
|
|
|
int num = 0;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2009-03-17 05:33:05 +00:00
|
|
|
texres->tin = 0.0f;
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2016-03-24 11:41:44 +01:00
|
|
|
init_pointdensityrangedata(pd,
|
|
|
|
|
&pdr,
|
|
|
|
|
&density,
|
|
|
|
|
vec,
|
|
|
|
|
&age,
|
|
|
|
|
col,
|
2023-07-27 13:10:42 +02:00
|
|
|
(pd->flag & TEX_PD_FALLOFF_CURVE ? pd->falloff_curve : nullptr),
|
2019-03-25 11:55:36 +11:00
|
|
|
pd->falloff_speed_scale * 0.001f);
|
|
|
|
|
noise_fac = pd->noise_fac * 0.5f; /* better default */
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2011-11-07 01:38:32 +00:00
|
|
|
copy_v3_v3(co, texvec);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-11-09 01:16:12 +00:00
|
|
|
if (point_data_used(pd)) {
|
|
|
|
|
/* does a BVH lookup to find accumulated density and additional point data *
|
|
|
|
|
* stores the particle velocity vector in 'vec' and the particle lifetime in 'age' */
|
2023-07-27 13:10:42 +02:00
|
|
|
num = BLI_bvhtree_range_query(
|
|
|
|
|
static_cast<const BVHTree *>(pd->point_tree), co, pd->radius, accum_density, &pdr);
|
2008-11-09 01:16:12 +00:00
|
|
|
if (num > 0) {
|
|
|
|
|
age /= num;
|
2015-03-28 23:50:36 +05:00
|
|
|
mul_v3_fl(vec, 1.0f / num);
|
2016-03-24 11:41:44 +01:00
|
|
|
mul_v3_fl(col, 1.0f / num);
|
2008-11-09 01:16:12 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2009-03-17 05:33:05 +00:00
|
|
|
/* reset */
|
2016-03-24 11:41:44 +01:00
|
|
|
density = 0.0f;
|
|
|
|
|
zero_v3(vec);
|
|
|
|
|
zero_v3(col);
|
2008-11-09 01:16:12 +00:00
|
|
|
}
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-10-06 12:25:22 +00:00
|
|
|
if (pd->flag & TEX_PD_TURBULENCE) {
|
2020-11-06 10:59:32 +11:00
|
|
|
turb = BLI_noise_generic_turbulence(pd->noise_size,
|
|
|
|
|
texvec[0] + vec[0],
|
|
|
|
|
texvec[1] + vec[1],
|
|
|
|
|
texvec[2] + vec[2],
|
|
|
|
|
pd->noise_depth,
|
2023-07-27 21:49:56 +10:00
|
|
|
false,
|
2020-11-06 10:59:32 +11:00
|
|
|
pd->noise_basis);
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2019-03-25 11:55:36 +11:00
|
|
|
turb -= 0.5f; /* re-center 0.0-1.0 range around 0 to prevent offsetting result */
|
2019-04-17 06:17:24 +02:00
|
|
|
|
2008-10-31 05:29:54 +00:00
|
|
|
/* now we have an offset coordinate to use for the density lookup */
|
|
|
|
|
co[0] = texvec[0] + noise_fac * turb;
|
|
|
|
|
co[1] = texvec[1] + noise_fac * turb;
|
|
|
|
|
co[2] = texvec[2] + noise_fac * turb;
|
2008-11-19 05:30:52 +00:00
|
|
|
}

  /* BVH query with the potentially perturbed coordinates. */
  num = BLI_bvhtree_range_query(
      static_cast<const BVHTree *>(pd->point_tree), co, pd->radius, accum_density, &pdr);
  if (num > 0) {
    age /= num;
    mul_v3_fl(vec, 1.0f / num);
    mul_v3_fl(col, 1.0f / num);
  }

  texres->tin = density;
  if (r_age != nullptr) {
    *r_age = age;
  }
  if (r_vec != nullptr) {
    copy_v3_v3(r_vec, vec);
  }
  if (r_col != nullptr) {
    copy_v3_v3(r_col, col);
  }

  return retval;
}

static void pointdensity_color(
    PointDensity *pd, TexResult *texres, float age, const float vec[3], const float col[3])
{
  copy_v4_fl(texres->trgba, 1.0f);

  if (pd->source == TEX_PD_PSYS) {
    float rgba[4];

    switch (pd->color_source) {
      case TEX_PD_COLOR_PARTAGE:
        if (pd->coba) {
          if (BKE_colorband_evaluate(pd->coba, age, rgba)) {
            texres->talpha = true;
            copy_v3_v3(texres->trgba, rgba);
            texres->tin *= rgba[3];
            texres->trgba[3] = texres->tin;
          }
        }
        break;
      case TEX_PD_COLOR_PARTSPEED: {
        float speed = len_v3(vec) * pd->speed_scale;

        if (pd->coba) {
          if (BKE_colorband_evaluate(pd->coba, speed, rgba)) {
            texres->talpha = true;
            copy_v3_v3(texres->trgba, rgba);
            texres->tin *= rgba[3];
            texres->trgba[3] = texres->tin;
          }
        }
        break;
      }
      case TEX_PD_COLOR_PARTVEL:
        texres->talpha = true;
        mul_v3_v3fl(texres->trgba, vec, pd->speed_scale);
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_CONSTANT:
      default:
        break;
    }
  }
  else {
    float rgba[4];

    switch (pd->ob_color_source) {
      case TEX_PD_COLOR_VERTCOL:
        texres->talpha = true;
        copy_v3_v3(texres->trgba, col);
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_VERTWEIGHT:
        texres->talpha = true;
        if (pd->coba && BKE_colorband_evaluate(pd->coba, col[0], rgba)) {
          copy_v3_v3(texres->trgba, rgba);
          texres->tin *= rgba[3];
        }
        else {
          copy_v3_v3(texres->trgba, col);
        }
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_VERTNOR:
        texres->talpha = true;
        copy_v3_v3(texres->trgba, col);
        texres->trgba[3] = texres->tin;
        break;
      case TEX_PD_COLOR_CONSTANT:
      default:
        break;
    }
  }
}

static void sample_dummy_point_density(int resolution, float *values)
{
  memset(values, 0, sizeof(float[4]) * resolution * resolution * resolution);
}

static void particle_system_minmax(Depsgraph *depsgraph,
                                   Scene *scene,
                                   Object *object,
                                   ParticleSystem *psys,
                                   float radius,
                                   float min[3],
                                   float max[3])
{
  const float size[3] = {radius, radius, radius};
  const float cfra = BKE_scene_ctime_get(scene);
  ParticleSettings *part = psys->part;
  ParticleSimulationData sim = {nullptr};
  ParticleData *pa = nullptr;
  int i;
  int total_particles;
  float mat[4][4], imat[4][4];

  INIT_MINMAX(min, max);
  if (part->type == PART_HAIR) {
    /* TODO(sergey): Not supported currently. */
    return;
  }

  unit_m4(mat);

  sim.depsgraph = depsgraph;
  sim.scene = scene;
  sim.ob = object;
  sim.psys = psys;
  sim.psmd = psys_get_modifier(object, psys);

  invert_m4_m4(imat, object->object_to_world);
  total_particles = psys->totpart + psys->totchild;
  psys_sim_data_init(&sim);

  for (i = 0, pa = psys->particles; i < total_particles; i++, pa++) {
    float co_object[3], co_min[3], co_max[3];
    ParticleKey state;
    state.time = cfra;
    if (!psys_get_particle_state(&sim, i, &state, false)) {
      continue;
    }
    mul_v3_m4v3(co_object, imat, state.co);
    sub_v3_v3v3(co_min, co_object, size);
    add_v3_v3v3(co_max, co_object, size);
    minmax_v3v3_v3(min, max, co_min);
    minmax_v3v3_v3(min, max, co_max);
  }

  psys_sim_data_free(&sim);
}

void RE_point_density_cache(Depsgraph *depsgraph, PointDensity *pd)
{
  Scene *scene = DEG_get_evaluated_scene(depsgraph);

  /* Same matrices/resolution as dupli_render_particle_set(). */
  BLI_mutex_lock(&sample_mutex);
  cache_pointdensity(depsgraph, scene, pd);
  BLI_mutex_unlock(&sample_mutex);
}

void RE_point_density_minmax(Depsgraph *depsgraph,
                             PointDensity *pd,
                             float r_min[3],
                             float r_max[3])
{
  Scene *scene = DEG_get_evaluated_scene(depsgraph);
  Object *object = pd->object;
  if (object == nullptr) {
    zero_v3(r_min);
    zero_v3(r_max);
    return;
  }
  if (pd->source == TEX_PD_PSYS) {
    ParticleSystem *psys;

    if (pd->psys == 0) {
      zero_v3(r_min);
      zero_v3(r_max);
      return;
    }
    psys = static_cast<ParticleSystem *>(BLI_findlink(&object->particlesystem, pd->psys - 1));
    if (psys == nullptr) {
      zero_v3(r_min);
      zero_v3(r_max);
      return;
    }

    particle_system_minmax(depsgraph, scene, object, psys, pd->radius, r_min, r_max);
  }
  else {
    const float radius[3] = {pd->radius, pd->radius, pd->radius};
    const BoundBox *bb = BKE_object_boundbox_get(object);

    if (bb != nullptr) {
      BLI_assert((bb->flag & BOUNDBOX_DIRTY) == 0);
      copy_v3_v3(r_min, bb->vec[0]);
      copy_v3_v3(r_max, bb->vec[6]);
      /* Adjust texture space to include density points on the boundaries. */
      sub_v3_v3(r_min, radius);
      add_v3_v3(r_max, radius);
    }
    else {
      zero_v3(r_min);
      zero_v3(r_max);
    }
  }
}

struct SampleCallbackData {
  PointDensity *pd;
  int resolution;
  float *min, *dim;
  float *values;
};

static void point_density_sample_func(void *__restrict data_v,
                                      const int iter,
                                      const TaskParallelTLS *__restrict /*tls*/)
{
  SampleCallbackData *data = (SampleCallbackData *)data_v;

  const int resolution = data->resolution;
  const int resolution2 = resolution * resolution;
  const float *min = data->min, *dim = data->dim;
  PointDensity *pd = data->pd;
  float *values = data->values;

  if (!pd || !pd->point_tree) {
    return;
  }

  size_t z = size_t(iter);
  for (size_t y = 0; y < resolution; y++) {
    for (size_t x = 0; x < resolution; x++) {
      size_t index = z * resolution2 + y * resolution + x;
      float texvec[3];
      float age, vec[3], col[3];
      TexResult texres;

      copy_v3_v3(texvec, min);
      texvec[0] += dim[0] * float(x) / float(resolution);
      texvec[1] += dim[1] * float(y) / float(resolution);
      texvec[2] += dim[2] * float(z) / float(resolution);

      pointdensity(pd, texvec, &texres, vec, &age, col);
      pointdensity_color(pd, &texres, age, vec, col);

      copy_v3_v3(&values[index * 4 + 0], texres.trgba);
      values[index * 4 + 3] = texres.tin;
    }
  }
}

void RE_point_density_sample(Depsgraph *depsgraph,
                             PointDensity *pd,
                             const int resolution,
                             float *values)
{
  Object *object = pd->object;
  float min[3], max[3], dim[3];

  /* TODO(sergey): Implement some sort of assert() that point density
   * was cached already.
   */

  if (object == nullptr) {
    sample_dummy_point_density(resolution, values);
    return;
  }

  BLI_mutex_lock(&sample_mutex);
  RE_point_density_minmax(depsgraph, pd, min, max);
  BLI_mutex_unlock(&sample_mutex);
  sub_v3_v3v3(dim, max, min);
  if (dim[0] <= 0.0f || dim[1] <= 0.0f || dim[2] <= 0.0f) {
    sample_dummy_point_density(resolution, values);
    return;
  }

  SampleCallbackData data;
  data.pd = pd;
  data.resolution = resolution;
  data.min = min;
  data.dim = dim;
  data.values = values;
  TaskParallelSettings settings;
  BLI_parallel_range_settings_defaults(&settings);
  settings.use_threading = (resolution > 32);
  BLI_task_parallel_range(0, resolution, &data, point_density_sample_func, &settings);

  free_pointdensity(pd);
}

void RE_point_density_free(PointDensity *pd)
{
  free_pointdensity(pd);
}

void RE_point_density_fix_linking() {}