Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3D texture that finds the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now
it's at an early stage and only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number; this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works just like any other procedural texture, so where previously I've
mapped a clouds texture to volume density to make some of those test
renders, now I just map a point density texture to volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It works for solid surfaces too - it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
I'm not sure if this is 'correct', but so far in testing it's been working
pretty well.
This also exposes a new 'Nearest' value, to determine how many
nearby particles are taken into account when determining density.
A greater number is more accurate, but slower.
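To make the idea concrete, here's a rough pure-Python sketch of that kind of density estimate - the function, falloff and defaults are just illustrative, not the actual renderer code: take the 'Nearest' closest cached particle positions to the shaded point and weight them by distance.

import math

def point_density(shade_co, particles, nearest=10, radius=1.0):
    # particles: cached (x, y, z) positions, currently in world space,
    # so the texture needs to be mapped globally for them to line up
    dists = sorted(math.dist(shade_co, p) for p in particles)
    density = 0.0
    for d in dists[:nearest]:
        if d < radius:
            # simple linear falloff: full weight at the point, zero at 'radius'
            density += 1.0 - (d / radius)
    return density

# a small clump of particles near the origin, plus one stray particle far away
cloud = [(0.1, 0.0, 0.0), (0.0, 0.2, 0.1), (-0.1, -0.1, 0.0), (2.0, 2.0, 2.0)]
print(point_density((0.0, 0.0, 0.0), cloud, nearest=3, radius=0.5))

A real implementation would use a spatial tree rather than sorting every distance, but the effect is the same: a higher 'Nearest' value gives a smoother, more accurate estimate at the cost of speed.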
Added Bullet/Gimpact concave collision detection to Blender. If your build system isn't updated yet, please add extern/bullet2/src/BulletCollision/Gimpact/*
This allows moving/dynamic concave triangle meshes (decomposing meshes into compound convex shapes and using 'compound' shapes is still preferred, though).
Initial commit for supporting rendering particles directly as
volume density. It works by looking up how many particles are
within a specified radius of the currently shaded point and using
that to calculate density (which is used just as any other
measure of density would be).
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test01.mov
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test01.blend
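As a rough sketch of that (my own simplification, not the actual code): count the particles within a fixed radius of the shaded point and treat the count as a density.

import math

def density_at(shade_co, particles, radius=0.5):
    # particles within 'radius' of the current shaded point, in world space
    count = sum(1 for p in particles if math.dist(shade_co, p) <= radius)
    # normalise by the sphere's volume so the value behaves like a density
    return count / ((4.0 / 3.0) * math.pi * radius ** 3)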
Right now it's an early implementation, just to see that it can
work - it may end up changing quite a bit. Currently, it's just a
single switch on the volume material - it looks up all particles
in the scene for density at the current shaded point in world
space (so the volume region must enclose the particles in order
to render them).
This will probably change - one idea I have is to make the
particle density estimation a procedural texture with options for:
* the object and particle system to use
* the origin of the co-ordinate system, i.e. object center, world
space, etc.
This would allow you, in a sense, to instance particle systems for
render - you only need to bake one particle system, but you can
render it anywhere.
Anyway, plenty of work to do here, firstly on getting a nice
density evaluation with falloff etc...
set a fake world transform for game soft bodies, based on the center of the AABB, so visibility and some game logic works. note: this world transform is not smooth.
- A check for getting the "better" dm was missing from the boids code. I plan on converting the boids code to use the collision modifier one of these days, but hopefully this quick fix will do for now.
from Dalai Felinto (dfelinto)
Would usually not encourage prints in these cases, except it's often useful to know if an error happened since you last pressed Pkey, and without this you end up needing to manually clear the terminal.
Just getting rid of license_key stuff.
The project files still need to be updated:
projectfiles_vc7/blender/src/BL_src_cre.vcproj
projectfiles_vc7/blender/blendercompactNG.vcproj
Just search for these files in them.
Kent
Improved the rendering of volumes overlapping solids, in front of other volumes, etc. Now there's a 'layer depth'
value that works similarly to refraction depth - a limit for how many
times the view ray will penetrate different volumetric surfaces.
I have it close to being able to return alpha, but it's still not 100%
correct and needs a bit more work. Going to sit on this for a while.
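To illustrate the 'layer depth' idea above (a hypothetical sketch, not the renderer code): each volumetric surface the view ray enters uses up one level, and once the limit is reached any volumes further along the ray are skipped, much like refraction depth limits ray recursion.

def volumes_to_shade(volume_hits, layer_depth=2):
    # volume_hits: volumetric surfaces along the view ray, nearest first
    shaded = []
    for hit in volume_hits:
        if layer_depth == 0:
            break               # limit reached, deeper volumes are ignored
        shaded.append(hit)
        layer_depth -= 1        # entering another volume costs one level
    return shaded

print(volumes_to_shade(['cloud_a', 'cloud_b', 'cloud_c'], layer_depth=2))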
* Now it's possible to render with the camera inside a volume. I'm not sure how this goes with overlapping volumes yet, will look at it. But it allows nice things like this :)
http://mke3.net/blender/devel/rendering/volumetrics/clouds_sky.mov
* Sped up shading significantly by not doing any shading if the density
of the current sample is less than 0.01 - there's nothing to shade there anyway! Speeds up around 200% on that clouds scene (a small sketch of this follows below).
* Fixed a bug in global texture coordinates for volume textures
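Here's a minimal sketch of that speedup (illustrative names only, not the actual code): while stepping along the ray through the volume, any sample whose density falls below the threshold is skipped before any lighting is calculated for it.

DENSITY_CUTOFF = 0.01   # below this there's nothing visible to shade

def march(samples, step):
    # samples: (density, shade_fn) pairs; shade_fn does the expensive lighting
    result = 0.0
    for density, shade_fn in samples:
        if density < DENSITY_CUTOFF:
            continue            # near-empty sample: skip the shading entirely
        result += density * shade_fn() * step
    return result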
Now other objects (and sky) correctly render if they're partially
inside or behind a volume. Previously all other objects were ignored,
and volumes just rendered on black. The colour of surfaces inside or
behind the volume gets correctly attenuated by the density of the
volume in between - i.e. thicker volumes will block the light coming
from behind. However, other solid objects don't receive volume shadows
yet, this is to be worked on later.
http://mke3.net/blender/devel/rendering/volumetrics/vol_inside_behind.png
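The attenuation itself can be pictured with a simple Beer-Lambert style sketch (an assumption on my part about the exact maths used): sum up the density the view ray passes through inside the volume, and scale the colour coming from behind by the resulting transmittance.

import math

def attenuate_background(background_col, densities, step):
    # densities: samples along the view ray between the eye and the background
    optical_depth = sum(d * step for d in densities)
    transmittance = math.exp(-optical_depth)
    # a thicker/denser volume -> lower transmittance -> darker background
    return tuple(c * transmittance for c in background_col)

# sky blue seen through a moderately dense volume
print(attenuate_background((0.4, 0.6, 1.0), [0.8, 1.2, 0.5], step=0.2))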
Currently this uses raytracing to find intersections within the volume,
and rays are also traced from the volume, heading behind into the
scene, to see what's behind it (similar effect to ray transp with IOR
1). Because of this, objects inside or behind the volume will not be
antialiased. Perhaps I can come up with a solution for this, but until
then, for antialiasing, you can turn on Full OSA (warning, this will
incur a slowdown). Of course you can always avoid this by rendering
volumes on a separate renderlayer, and compositing in post, too.
Another idea I've started thinking about is to calculate an alpha
value, then use ztransp to overlay on top of other objects. This won't
accurately attenuate and absorb light coming from objects behind the
volume, but for some situations it may be fine, and faster too.
This problem is caused by discontinuities in the conversion
orientation matrix -> euler angles: the angle sign can
switch and thus the direction of the rotation produced
by the dRot Ipo.
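A tiny numeric illustration of that discontinuity (not game engine code): recovering an angle from an orientation with atan2 wraps at 180 degrees, so two orientations only a couple of degrees apart can come back with opposite signs, and a delta rotation applied on top of them suddenly turns the other way.

import math

def z_angle_from_matrix(cos_a, sin_a):
    # the kind of atan2 used when converting a rotation matrix back to eulers
    return math.degrees(math.atan2(sin_a, cos_a))

for deg in (179.0, 181.0):      # two orientations just 2 degrees apart
    rad = math.radians(deg)
    print(deg, '->', z_angle_from_matrix(math.cos(rad), math.sin(rad)))
# the first comes back as roughly +179, the second as roughly -179:
# the sign switches, so a dRot built relative to these angles reverses direction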
To avoid this bug, the matrix->euler conversion must be
avoided during the game. I took the following approach that
is compatible with Blender (identical effect in the game and
in the 3D view):
- no change in Add mode: Rot and dRot are treated as additional
rotation to the orientation at the start of the Ipo. There is
no matrix->euler conversion and thus no discontinuities.
- Rot Ipo curves are treated as absolute rotation. All 3 axes should
be specified but if they are not, the startup object orientation
will be used to set the unspecified axis. By doing a matrix->
euler conversion once at the start, the discontinuities are
avoided. If there are also dRot curves, they are treated as
delta of the corresponding Rot curve or startup angle.
- dRot Ipo are treated as Add mode in Local axis.
Note about Add mode: Rot and dRot curves are treated identically
during the game. However, only dRot curves make sense because
they don't interfere with the object orientation in the 3D view.
removed calc_curve_subdiv_radius(), curve radius is now calculated the same way as tilt.
Added radius interpolation menu matching tilt interpolation, needed to add "Ease" interpolation type to keep 2.47 curves looking the same.
- In practice this removes the dependency of particle simulations on the update order of objects and different particle systems inside objects.
- As a nice side effect of this we also get fully correct birth positions for "near reactor particles" (previously, for example, smoke trail reactor particles were not born smoothly along the target particles' path).
-The fix is that the particle random size factor should never ever be bigger than 1.0, else negative sizes are possible! Don't know who to blame, but probably myself :)
-The second issue of passing through the deflection and rotating wildly around strange centers is not a bug, but for particles the group and object visualization objects have to be centered on the global origin. I'll probably make an option later to use the object center, but this is how things are for now.
Rather than a single absorption value to control how much light is absorbed as it
travels through a volume, there's now an additional absorption colour. This is
used to absorb different R/G/B components of light at different amounts. For
example, if a white light shines on a volume which absorbs green and blue
components, the volume will appear red.
To make it easier to use, the colour set in the UI is actually the inverse of the
absorption colour, so the colour you set is the colour that the volume will
appear as.
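A rough model of the behaviour described above (my sketch, assuming simple per-channel Beer-Lambert absorption): the colour set in the UI is the colour the volume should appear, so its complement is what actually gets absorbed from the light.

import math

def absorb(light_col, display_col, density, distance):
    # display_col is the colour set in the UI; (1 - display_col) per channel
    # is the amount of that channel absorbed as light crosses the volume
    out = []
    for light, shown in zip(light_col, display_col):
        absorption = 1.0 - shown
        out.append(light * math.exp(-absorption * density * distance))
    return tuple(out)

# white light through a volume set to appear red: green and blue get absorbed
print(absorb((1.0, 1.0, 1.0), (1.0, 0.0, 0.0), density=1.0, distance=2.0))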
Here's an example of how it works:
http://mke3.net/blender/devel/rendering/volumetrics/vol_col_absorption.jpg
And this can be textured too:
http://mke3.net/blender/devel/rendering/volumetrics/vol_absorb_textured.png
Keep in mind, this doesn't use accurate spectral light wavelength mixing (just
R/G/B channels), so in cases where the absorption colour is fully red, green or
blue, you'll get non-physical results.
Todo: refactor the volume texturing internal interface...
this way each edge/segment gets the same number of points matching the resolution value.
before, a nurbs curve would have the same number of points no matter if it was cyclic or not.
This will make slight changes to objects on an animated path, but it's only noticeable if the path has a low resolution.
bug [#11744] NurbCurve Radius incorrect - now doesn't show bad results with order 4 on a non-cyclic curve.
- Fixed a shading bug, due to issues in the raytrace engine where it would ignore
intersections from the starting face (as it should). Disabled this for single
scattering intersections, thanks to Brecht for a hint there. It still shows a
little bit of noise, I think due to raytrace inaccuracy, which will have to be
fixed up later.
before: http://mke3.net/blender/devel/rendering/volumetrics/vol_shaded_old.png
after: http://mke3.net/blender/devel/rendering/volumetrics/vol_shaded_correct.png
Now single scattering shading works very nicely and is capable of things like this:
http://mke3.net/blender/devel/rendering/volumetrics/vol_shaded_clouds.mov
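For reference, single scattering can be sketched roughly like this (illustrative only, not the actual implementation): at every sample along the view ray, light arriving from the lamp is attenuated by the density between the lamp and that sample, scattered towards the camera, and then attenuated again on its way out.

import math

def single_scatter(samples, lamp_intensity, step):
    # samples: (density, density_towards_lamp) pairs along the view ray, where
    # density_towards_lamp is the total density between the sample and the lamp
    result = 0.0
    transmittance = 1.0                  # attenuation towards the camera so far
    for density, density_towards_lamp in samples:
        incoming = lamp_intensity * math.exp(-density_towards_lamp)
        result += transmittance * incoming * density * step
        transmittance *= math.exp(-density * step)
    return result

print(single_scatter([(0.5, 0.2), (1.0, 0.6), (0.8, 1.1)], lamp_intensity=1.0, step=0.1))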
- Added colour emission. Now as well as the overall 'Emit:' slider to control
overall emission strength, there's also a colour swatch for the volume to emit
that colour. This can also be textured, using 'Emit Col' in the map to panel.
This animation was made using a clouds texture, with colour band, mapped to both
emit colour and emit (strength):
http://mke3.net/blender/devel/rendering/volumetrics/vol_col_emit.mov
- Added 'Local' mapping to 'map input' - it's similar to Orco
- Fixed texture 'map input', wasn't using the offsets or scale values.
The constants KX_STATE1 to KX_STATE30 can be used
with setState() to change the object state in a
python controller. The constants are defined in the
GameLogic module so that the full name is
GameLogic.KX_STATE1 to GameLogic.KX_STATE30 but you
can simplify this with the import statement:
from GameLogic import *
cont = getCurrentController()
ob = cont.getOwner()
ob.setState(KX_STATE2) #go to state 2
KX_STATEx constants are defined as (1<<(x-1))
Binary operators |, &, ^ and ~ can be used to combine states:
You can activate more than one state at a time with the | operator:
ob.setState(KX_STATE1|KX_STATE2) #activate state 1 and 2, stop all others
You can add a state to the current state mask with:
state = ob.getState()
ob.setState(state|KX_STATE3) #activate state 3, keep others
You can subtract a state from the current state mask with the & and ~ operators:
state = ob.getState()
ob.setState(state&~KX_STATE2) #stop state 2, keep others
You can invert a state with the ^ operator:
state = ob.getState()
ob.setState(state^KX_STATE2) #invert state 2, keep others
- Button for 'shadow color' was drawn over 'layer shadow' button...
The shadow+spot panel was cramped... the spot shadowbuffer uses all the
space. Moved it to the Lamp panel with a label, clearer now.
Panel reorg is for later :)
- Small fix: Area Lamp 'gamma' slider didn't update preview.
Unfortunately had to move this slider to smaller button...
When an object was linked to more than one scene at once, and some of the objects in one of those scenes were related to it, those related objects were not correctly relinked to the new copy.
This was due to the 2nd check for selected objects:
if( (base->flag & flag)==flag)
meaning that only selected objects would get corrected.