SunSky didn't include skycolor in raytrace.
Note: there seems to be an error in sunsky when looking straight down,
so this option requires raytracing stuff not in outer space. :)
* subsurf code had a lot of unused variables; removed these where they were obviously not needed, and commented them out where they could be useful later.
* some variable declarations hide existing variables (many of these are left as-is), but fixed some that could cause confusion.
* removed unused vars
* fixed an obscure Python memory leak with the colorband.
* make_sample_tables had a loop running that wasn't used.
* #if 0'd functions in arithb.c that are not used yet.
* made many functions static
- removed ugly clamping function (it was dividing XYZ based on max of
one of the values)
- added an option to use Exposure; this only works on brightness (Y).
Results look very pleasant; foggy and hazy results are possible.
With exposure == 0, no exposure happens, keeping the extreme HDR range
of the sky, which is how yafray rendered it. (Sketched below.)
- added a menu for choosing color spaces (CIE = modern LCDs)
Please review! (And yes, I know it's still not in World :)
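A minimal sketch of what the Exposure option amounts to, assuming a simple
exponential exposure curve applied only to the Y (brightness) component of
the XYZ colour; the function name and signature are illustrative, not the
actual sunsky code:

#include <math.h>

/* Illustrative helper: apply exposure to the Y (brightness) component of an
 * XYZ colour. With exposure == 0.0 the value passes through unchanged,
 * preserving the extreme HDR range of the sky. */
static float apply_sky_exposure(float y, float exposure)
{
    if (exposure > 0.0f) {
        /* exponential curve: compresses extreme HDR values towards 0..1,
         * which gives the soft, foggy/hazy look described above */
        return 1.0f - expf(-y * exposure);
    }
    return y;  /* exposure == 0: leave the HDR range untouched */
}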
When doing IPO-cleanup, added two break statements to hopefully optimise the evaluation process a teeny-weeny bit. However, that caused more problems than it was worth!
Recoded pidhash's recent Pad0 (Lastview) commits (r.16802 and r.16810). They were causing major issues with Ortho perspective + rotating the view with the MMB. Setting G.vd->view to -1 was not a valid way to do this (and it also didn't play nicely with smoothview).
This feature should now work correctly, though there are still one or two places where it doesn't always seem totally correct yet.
The basic idea of this feature is that after going into camera mode (Pad0), pressing Pad0 again takes you back to the view as you had it before entering camera mode.
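A rough sketch of that idea; the struct fields and function below are
hypothetical, not the actual View3D members or event handling code:

/* Hypothetical data layout: the previous view is remembered alongside the
 * current one, and a second Pad0 press restores it. */
typedef struct ViewState {
    float viewquat[4];   /* view rotation */
    float dist;          /* view distance */
} ViewState;

typedef struct ViewToggle {
    ViewState cur;       /* the current view */
    ViewState last;      /* the view as it was before entering camera mode */
    int in_camera;
} ViewToggle;

static void toggle_camera_view(ViewToggle *vt, const ViewState *camera_view)
{
    if (!vt->in_camera) {
        vt->last = vt->cur;          /* remember the pre-camera view */
        vt->cur = *camera_view;      /* ideally blended in via smoothview */
        vt->in_camera = 1;
    }
    else {
        vt->cur = vt->last;          /* second Pad0: restore the old view */
        vt->in_camera = 0;
    }
}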
The Point Density texture now has some additional options for how
the point locations are cached. Previously it was all relative to
worldspace, but there are now some other options that make things
a lot more convenient for mapping the texture to Local (or Orco).
Thanks to theeth for helping with the space conversions!
The new Object space options allow this sort of thing to be possible
- a particle system, instanced on a transformed renderable object:
http://mke3.net/blender/devel/rendering/volumetrics/pd_objectspace.mov
It's also a lot easier to use multiple instances, just duplicate
the renderable objects and move them around.
The new particle cache options are:
* Emit Object Space
This caches the particles relative to the emitter object's
coordinate space (i.e. relative to the emitter's object center).
This makes it possible to map the Texture to Local or Orco
easily, so you can easily move, rotate or scale the rendering
object that has the Point Density texture. It's relative to the
emitter's location, rotation and scale, so if the object you're
rendering the texture on is aligned differently to the emitter,
the results will be rotated etc.
* Emit Object Location
This offsets the particles to the emitter object's location in 3D
space. It's similar to Emit Object Space, however the emitter
object's rotation and scale are ignored. This is probably the
easiest to use, since you don't need to worry about the rotation
and scale of the emitter object (just the rendered object), so
it's the default.
* Global Space
This is the same as before: the particles are cached in global space,
so to use this effectively you'll need to map the texture to Global,
and have the rendered object in the right global location. (The space
conversions are sketched in code below.)
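A minimal sketch of what the three cache-space options amount to for each
particle position; the helpers and enum here are illustrative, not the
actual Blender code ('emitter_imat' is assumed to be the inverse of the
emitter's object matrix, 'emitter_loc' its world-space location):

enum { CACHE_GLOBAL, CACHE_EMIT_OB_LOCATION, CACHE_EMIT_OB_SPACE };

/* Illustrative: transform a world-space point by a 4x4 matrix,
 * with the translation stored in m[3]. */
static void mat4_mul_point(float out[3], const float m[4][4], const float p[3])
{
    out[0] = m[0][0]*p[0] + m[1][0]*p[1] + m[2][0]*p[2] + m[3][0];
    out[1] = m[0][1]*p[0] + m[1][1]*p[1] + m[2][1]*p[2] + m[3][1];
    out[2] = m[0][2]*p[0] + m[1][2]*p[1] + m[2][2]*p[2] + m[3][2];
}

static void cache_particle_co(float cached[3], const float co_world[3],
                              const float emitter_imat[4][4],
                              const float emitter_loc[3], int mode)
{
    switch (mode) {
        case CACHE_EMIT_OB_SPACE:
            /* full inverse transform: relative to the emitter's
             * location, rotation and scale */
            mat4_mul_point(cached, emitter_imat, co_world);
            break;
        case CACHE_EMIT_OB_LOCATION:
            /* offset only: the emitter's rotation and scale are ignored */
            cached[0] = co_world[0] - emitter_loc[0];
            cached[1] = co_world[1] - emitter_loc[1];
            cached[2] = co_world[2] - emitter_loc[2];
            break;
        case CACHE_GLOBAL:
        default:
            /* stored as-is in global space */
            cached[0] = co_world[0];
            cached[1] = co_world[1];
            cached[2] = co_world[2];
            break;
    }
}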
1) Anisotropic friction works for static and dynamic objects
2) For soft bodies, assume a triangle mesh if no bounds are chosen
3) Form factor == inertia scaling factor, it was actually hooked up in Bullet
4) Only show 'radius' if sphere is chosen, or no bounds+dynamics (== sphere bounds)
Went through and commented all the code in ipo.c, tidying up formatting and coding style in places, and also rearranging to have a more logical order in some places. There shouldn't be any major issues arising from this commit.
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that finds the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
it's at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number; this is the particle system that will get cached before rendering
and then used for the texture's density estimation.
It works totally consistently, just like any other procedural texture, so
where I previously mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
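As a rough illustration of the idea (not the actual implementation, which
would want a spatial acceleration structure rather than a linear scan over
all points), the density lookup amounts to something like this, using a
simple linear falloff inside a fixed radius:

#include <math.h>

/* Illustrative sketch: estimate density at a texture coordinate by summing a
 * falloff over the cached particle positions within 'radius'. */
static float point_density_lookup(const float co[3],
                                  const float (*points)[3], int totpoint,
                                  float radius)
{
    float density = 0.0f;
    int i;

    for (i = 0; i < totpoint; i++) {
        float dx = points[i][0] - co[0];
        float dy = points[i][1] - co[1];
        float dz = points[i][2] - co[2];
        float dist = sqrtf(dx*dx + dy*dy + dz*dz);

        if (dist < radius)
            density += 1.0f - (dist / radius);  /* linear falloff */
    }

    return density;  /* returned as the texture's intensity value */
}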
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It also works for solid surfaces: it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
Added a -nojoystick command line option: it takes 5 seconds every time to start the game engine while there IS no joystick.
In other words: blender -noaudio -nojoystick improves workflow turnaround times for P - ESC from 7 seconds to 1 second!
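Conceptually the option just skips the (slow to probe) joystick subsystem
initialisation; a minimal sketch, assuming an SDL-based setup and simplified
argument handling rather than Blender's actual startup code:

#include <string.h>
#include <SDL.h>

/* Illustrative: scan the command line for -nojoystick and only initialise
 * the joystick subsystem when the option wasn't passed. */
static void init_joystick_if_wanted(int argc, char **argv)
{
    int nojoystick = 0, i;

    for (i = 1; i < argc; i++) {
        if (strcmp(argv[i], "-nojoystick") == 0)
            nojoystick = 1;
    }

    if (!nojoystick)
        SDL_InitSubSystem(SDL_INIT_JOYSTICK);
}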
Improved Bullet soft body advanced options, still work-in-progress. Make sure to create game Bullet soft bodies from scratch; they are not compatible with last week's builds.
Text, ID names and RGB colours in the interface are now copied to and pasted from the system clipboard, allowing them to be copied from and pasted into the text editor. Colours are encoded as floats in the form [r.rrrrrr, g.gggggg, b.bbbbbb], making them easy to use in Python scripts.
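For the colour buttons, the clipboard round-trip boils down to formatting
and parsing that bracketed float triple; a minimal sketch with illustrative
helper names, not the actual interface code:

#include <stdio.h>

/* Illustrative: encode an RGB colour for the clipboard in the
 * "[r.rrrrrr, g.gggggg, b.bbbbbb]" form. 'buf' must hold at least 64 bytes. */
static void rgb_to_clipboard_string(char *buf, const float rgb[3])
{
    sprintf(buf, "[%.6f, %.6f, %.6f]", rgb[0], rgb[1], rgb[2]);
}

/* Returns 1 if the clipboard string looked like a colour, 0 otherwise. */
static int clipboard_string_to_rgb(const char *buf, float rgb[3])
{
    return sscanf(buf, "[%f, %f, %f]", &rgb[0], &rgb[1], &rgb[2]) == 3;
}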
The header button is great but it didn't function for 2 seconds between clicks due to the old code waiting between modification checks. Fixed that now too :)
* out-of-sync text doesn't automatically pop up a menu anymore, since it was too easy to click on it without intending to; moved this to an alert button on the header.
* "_" character was acting as a delimiter, but in python its not.
* renamed "File" to "Text" (so as not to confuse with blenders file menu)
* added a redraw_alltext function to remove many duplicate loops where every text display is redrawn.
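The delimiter fix amounts to treating '_' as part of an identifier when
scanning words, since Python identifiers may contain underscores; a small
illustrative check, not the actual text editor code:

#include <ctype.h>

/* Illustrative: characters that belong to a Python identifier. Treating '_'
 * as a delimiter would wrongly split names like "my_func" into two words. */
static int is_python_identifier_char(char ch)
{
    return isalnum((unsigned char)ch) || ch == '_';
}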
It would only work when the bezier point had its handles set to auto before changing to a linear IpoCurve, since the handles were being recalculated during transform.
The problem was due to the wrong number of IPO channels getting keyed for the quaternion rotation (3 instead of 4). It was a simple copy+paste error.
Also added a check for the "Limit Distance" constraint when using Visual Keying.
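For the quaternion fix above, the point is simply that a quaternion has four
components, so four channels have to be keyed; an illustrative loop (the
channel identifiers and insert_key callback are made up, not Blender's actual
adrcodes or keying API):

/* Illustrative only: key all four quaternion components (W, X, Y, Z).
 * The original bug keyed only three, as if it were an euler rotation. */
enum { KEY_QUAT_W, KEY_QUAT_X, KEY_QUAT_Y, KEY_QUAT_Z };

static void key_quaternion(const float quat[4],
                           void (*insert_key)(int channel, float value))
{
    const int channels[4] = { KEY_QUAT_W, KEY_QUAT_X, KEY_QUAT_Y, KEY_QUAT_Z };
    int i;

    for (i = 0; i < 4; i++)           /* 4 channels, not 3 */
        insert_key(channels[i], quat[i]);
}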
Now auto-keyframing can be enabled/disabled per scene (with the insertion mode also stored per scene). The flags used when inserting keyframes are still stored in the user-prefs.
New scenes have their auto-keyframing settings initialised from the user-preferences.
* Tidied up code a bit to remove an extra var declaration that may have been causing problems with Visual Keying
* Added buttons to Insert/Delete keyframes for the current frame to the Timeline header. Note that it will preferentially insert keyframes for a 3D View (if one exists); otherwise it "should" take the largest area available.