Finally, here is the basic (functional) prototype of the new animation system which will allow for the infamous "everything is animatable", and which also addresses several of the more serious shortcomings of the old system. Unfortunately, this will break old animation files (especially right now, as I haven't written the version patching code yet); however, this is for the future.
Highlights of the new system:
* Scrapped IPO-Curves/IPO/(Action+Constraint-Channels)/Action system, and replaced it with F-Curve/Action.
- F-Curves (animators from other packages will feel at home with this name) replace IPO-Curves.
- The 'new' Actions act as the containers for F-Curves, so that they can be reused. They are therefore more akin to the old 'IPO' blocks, except they do not have the blocktype restriction, so you can store material/texture/geometry F-Curves in the same Action as Object transforms, etc.
* F-Curves use RNA paths for data access, hence allowing 'every' user-accessible setting from RNA (where sensible/editable, that is) to be animated (see the sketch after this list).
* Drivers are no longer mixed with Animation Data, so rigs will not be broken as easily and several dependency problems can be eliminated. (NOTE: drivers haven't been hooked up yet, but the code is in place)
* F-Curve modifier system allows useful 'large-scale' manipulation of F-Curve values. Implemented so far: envelope deform (similar to lattices, to allow broad-scale reshaping of curves), curve generator (polynomial or py-expression), and cycles (replacing the old cyclic extrapolation modes, giving more control over this). (NOTE: currently this cannot be tested, as there's no access to them yet, but the code is all in place)
* NLA system with 'tracks' (i.e. layers), and multiple strips per track. (NOTE: NLA system is not yet functional, as it's only partially coded still)
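To give an idea of how the RNA-path approach works, here's a minimal, self-contained Python sketch (names and structures are illustrative only, not the actual 2.5 code): an F-Curve stores a data path plus an array index, and an Action is just a container of F-Curves with no blocktype restriction.

class FCurve:
    def __init__(self, data_path, array_index=0):
        self.data_path = data_path      # e.g. "location" or "active_material.diffuse_color"
        self.array_index = array_index  # which component of a vector/colour property
        self.keyframes = []             # sorted (frame, value) pairs

    def evaluate(self, frame):
        # linear interpolation stands in for the real bezier evaluation
        keys = self.keyframes
        if frame <= keys[0][0]:
            return keys[0][1]
        if frame >= keys[-1][0]:
            return keys[-1][1]
        for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
            if f0 <= frame <= f1:
                t = (frame - f0) / (f1 - f0)
                return v0 + t * (v1 - v0)

class Action:
    # container for F-Curves; no blocktype restriction, so object transform
    # and material curves can live in the same Action
    def __init__(self):
        self.fcurves = []

def apply_fcurve(id_data, fcu, frame):
    # resolve the RNA-style path against the owning datablock, then write
    # the evaluated value into the addressed array element
    owner = id_data
    *path, prop = fcu.data_path.split(".")
    for name in path:
        owner = getattr(owner, name)
    value = list(getattr(owner, prop))
    value[fcu.array_index] = fcu.evaluate(frame)
    setattr(owner, prop, value)

So an F-Curve with data path "location" and array index 2 animates an object's Z location, while one with "active_material.diffuse_color" and index 0 animates the red channel of the active material, and both can sit in the same Action.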
There are more nice things that I will be preparing proper docs for soon, but for now, check here for more details:
http://lists.blender.org/pipermail/bf-taskforce25/2009-January/000260.html
So, what currently works:
* I've implemented two basic operators for the 3D-view only, to Insert and Delete Keyframes. These are temporary ones only, and will be replaced in due course with 'proper' code.
* Object Loc/Rot/Scale can be keyframed. Also, the colour of the 'active' material (Note: this should really be for the nth material instead, but that doesn't work yet in RNA) can be keyframed into the same datablock.
* Standard animation refresh (i.e. animation resulting from NLA and Action evaluation) is now done completely separately from drivers, before anything else is done after a frame change. Drivers are handled after this in a separate pass, as dictated by depsgraph flags, etc.
Notes:
* Drivers haven't been hooked up yet
* Only objects and data directly linked to objects can be animated.
* Depsgraph will need further tweaks. Currently, I've only made sure that it will update some things in the most basic cases (i.e. frame change).
* Animation Editors are currently broken (in terms of editing stuff). This will be my next target (priority to get Dopesheet working first, then F-Curve editor - i.e. old IPO Editor)
* I've had to put in large chunks of XXX sandboxing for old animation system code all around the place. This will be cleaned up in due course, as some places need special review.
In particular, the particles and sequencer code have far too many manual calls to calculate + flush animation info, which is really bad (this is a 'please explain yourselves' call to Physics coders!).
Think global, act local!
The old favorite G.scene is gone! Man... that took almost 2 days.
Also removed G.curscreen and G.edbo.
Not everything could get solved; here are some notes.
- modifiers now store the current scene in ModifierData. This is not
meant to be permanent, but it can probably stay there until we've
cleaned up the anim system and depsgraph to cope better with
timing issues.
- Game engine G.scene should become an argument for starting it.
Didn't solve this yet.
- Texture nodes should get scene cfra, but the current implementation
is too tightly wrapped to do it easily.
From the anti-globalization department:
G.obedit terminated!
Wherever possible, use CTX_data_edit_object(C) to get it now.
It's stored in the scene now, and the screen context has
it defined.
svn merge https://svn.blender.org/svnroot/bf-blender/trunk/blender -r12987:17416
Issues:
* GHOST/X11 had conflicting changes. Some code was added in 2.5, which was
later added in trunk also, but partially reverted, specifically in revision
16683. I have left out this reversion in the 2.5 branch since I think the
code is needed there.
http://projects.blender.org/plugins/scmsvn/viewcvs.php?view=rev&root=bf-blender&revision=16683
* Scons had various conflicting changes, I decided to go with trunk version
for everything except priorities and some library renaming.
* In creator.c, there were various fixes and fixes for fixes related to the -w,
-W and -p options. In 2.5, -w and -W are not coded yet, and -p is done
differently. Since this has changed so much, and I don't think those fixes
would be needed in 2.5, I've left them out.
* Also in creator.c: there was code for a python bugfix where the screen was not
initialized when running with -P. I had to disable the code that initializes
the screen there; that can't work in 2.5 anymore, but I left it commented as a
reminder.
Further, I had to disable some new function calls using src/ and python/, as
was done already in this branch. Disabled function calls:
* bpath.c: error reporting
* BME_conversions.c: editmesh conversion functions.
* SHD_dynamic: disabled almost completely, there is no python/.
* KX_PythonInit.cpp and Ketsji/ build files: Mathutils is not there, disabled.
* text.c: clipboard copy call.
* object.c: OB_SUPPORT_MATERIAL.
* DerivedMesh.c and subsurf_ccg, stipple_quarttone.
Still to be done:
* Go over files and functions that were moved to a different location but could
still use changes that were done in trunk.
Robin (Frrr) Allen did a decent job on this, so we can also welcome him
as a member in the svn committers team to maintain it!
I do the first commit with some minor fixes:
- get Makefiles to work
- fix rounding issue with tiles on unit faces
- removed UI includes from tex node
A nice doc in wiki is here:
http://wiki.blender.org/index.php/User:Frr/TexnodeManual
On the todo for Robin is:
- When using one or more Texture-input nodes, you cannot edit them by activating
(as works now for Material nodes).
- The new "output node" option fails on the default case, when only one
output node is active. It then shows often a blank menu. Will get fixed asap.
- When using a NodeTree-Texture as input node, the menu for 'active output'
should not show. NodeTree should ignore other nodetrees to keep things sane
for now.
- On a future todo is proper usage of "Dxt" and "Dyt" texture vectors for
superior antialiasing of checkers/bricks.
General note: I know people are dying to get a fully integrated shader system
with nodes. In theory we could merge this with Material Nodetrees... but I'd
rather wait for a solid and very well thought out design proposal for this,
also including design ideas for unifying with a shader language (GPU, CPU).
For the time being this is a nice extension of current textures. :)
This was a bit complicated to do, but is working pretty well now, and can make shading significantly faster to render.
This option pre-calculates self-shading information into a
3d voxel grid before rendering, then uses and interpolates
that data during the main rendering phase, rather than
calculating shading for each sample. It's an approximation
and isn't as accurate as getting the lighting directly,
but in many cases it looks very similar and renders much faster.
The voxel grid covers the object's 3D screen-aligned bounding box
so this may not be that useful for large volume regions like a
big range of cloud cover, since you'll need a lot of resolution.
The render time speaks for itself here:
http://mke3.net/blender/devel/rendering/volumetrics/vol_light_cache_interpolation.jpg
The resolution is set in the volume panel - it's the resolution
of one edge of the voxel grid. Keep in mind that the higher the
resolution, the more memory needed, like in fluid sim. The
memory requirements increase with the cube of the edge
resolution so be careful. I might try and add a little memory
calculator thing like fluid sim has there later.
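In the absence of that calculator, a rough back-of-envelope estimate in Python
makes the cubic growth obvious (assuming, purely for illustration, one float
per voxel per colour channel; the real storage layout may differ):

def light_cache_mem_mb(res, channels=3, bytes_per_float=4):
    # res**3 voxels, 'channels' floats per voxel (illustrative assumption)
    return res ** 3 * channels * bytes_per_float / (1024.0 ** 2)

print(light_cache_mem_mb(50))    # ~1.4 MB
print(light_cache_mem_mb(100))   # ~11.4 MB
print(light_cache_mem_mb(200))   # ~91.6 MB - doubling the edge costs 8x the memory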
The voxels are interpolated using trilinear interpolation -
here's a comparison image I made during testing:
http://mke3.net/blender/devel/rendering/volumetrics/vol_light_cache_compare.jpg
There might still be a couple of little tweaks I can do to
improve the visual quality, I'll see.
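For reference, this is roughly what a trilinear lookup into such a grid looks
like; a hypothetical Python sketch, not the actual render code ('grid' is
assumed to be a flat list of res**3 pre-shaded values, 'co' a point in 0..1
grid space):

def sample_light_cache(grid, res, co):
    # map the 0..1 coordinate into voxel space and clamp to the grid
    x = min(max(co[0], 0.0), 1.0) * (res - 1)
    y = min(max(co[1], 0.0), 1.0) * (res - 1)
    z = min(max(co[2], 0.0), 1.0) * (res - 1)
    x0, y0, z0 = int(x), int(y), int(z)
    x1, y1, z1 = min(x0 + 1, res - 1), min(y0 + 1, res - 1), min(z0 + 1, res - 1)
    tx, ty, tz = x - x0, y - y0, z - z0

    def v(i, j, k):
        return grid[i + j * res + k * res * res]

    # interpolate along x, then y, then z
    c00 = v(x0, y0, z0) * (1 - tx) + v(x1, y0, z0) * tx
    c10 = v(x0, y1, z0) * (1 - tx) + v(x1, y1, z0) * tx
    c01 = v(x0, y0, z1) * (1 - tx) + v(x1, y0, z1) * tx
    c11 = v(x0, y1, z1) * (1 - tx) + v(x1, y1, z1) * tx
    c0 = c00 * (1 - ty) + c10 * ty
    c1 = c01 * (1 - ty) + c11 * ty
    return c0 * (1 - tz) + c1 * tz

# tiny 2x2x2 example grid with values 0..7
grid = [float(i) for i in range(8)]
print(sample_light_cache(grid, 2, (0.5, 0.5, 0.5)))  # 3.5, the centre value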
- modified point density so that it returns a more consistent
density with regards to search radius. Previously larger radii
would give much higher density but this is equalised out now.
- Added a new volume material option 'density scale'. This is an
overall scale multiplier for density, allowing you to (for
example) crank down the density to a more desirable range if
you're working at a large physical scale. Volume rendering is
fundamentally scale dependent, so this lets you compensate to get
the right visual result.
- Also tweaked a few constants, old files won't render exactly
the same, just minor things though.
limit ray intersections, as for ray transparency). It
remains to be seen if it's even that useful, and was
preventing refracting materials behind volumes from
working easily.
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that finds the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
it's at an early stage and it's only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number; this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
It works consistently, just like any other procedural texture, so where
previously I mapped a clouds texture to volume density to make some of
those test renders, now I just map a point density texture to volume
density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It works for solid surfaces too; it just gets the density at that
particular point on the surface, e.g.:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
sure if this is 'correct' but so far in testing it's been working
pretty well.
This also exposes a new 'Nearest' value, to determine how many
nearby particles are taken into account when determining density.
A greater number is more accurate, but slower.
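As a rough illustration of the idea (hypothetical names, not the renderer's
actual code): density is estimated from the particles found near the shaded
point, and normalising by the search sphere's volume is what keeps the result
consistent when the search radius changes.

import math

def point_density(point, particles, radius, nearest=10):
    # distances to the 'nearest' closest particles only (more = accurate, slower)
    dists = sorted(math.dist(point, p) for p in particles)[:nearest]
    inside = [d for d in dists if d <= radius]
    # normalise by the sphere volume so a larger radius doesn't inflate density
    volume = (4.0 / 3.0) * math.pi * radius ** 3
    return len(inside) / volume

# e.g. two of three particles fall within the search radius of the origin
print(point_density((0.0, 0.0, 0.0),
                    [(0.1, 0.0, 0.0), (0.0, 0.2, 0.0), (2.0, 0.0, 0.0)],
                    radius=0.5))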
solids, in front of other volumes, etc. Now there's a 'layer depth'
value that works similarly to refraction depth - a limit for how many
times the view ray will penetrate different volumetric surfaces.
I have it close to being able to return alpha, but it's still not 100%
correct and needs a bit more work. Going to sit on this for a while.
Rather than a single absorption value to control how much light is absorbed as it
travels through a volume, there's now an additional absorption colour. This is
used to absorb different R/G/B components of light at different amounts. For
example, if a white light shines on a volume which absorbs green and blue
components, the volume will appear red.
To make it easier to use, the colour set in the UI is actually the inverse of the
absorption colour, so the colour you set is the colour that the volume will
appear as.
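A hedged sketch of the idea (assuming a simple Beer-Lambert style falloff,
which may not match the renderer's exact math): the UI colour is inverted to
get the per-channel absorption, so a volume set to red absorbs the green and
blue components.

import math

def transmitted(light_rgb, ui_colour, absorption, distance):
    # the colour set in the UI is the inverse of the absorption colour
    absorb_col = [1.0 - c for c in ui_colour]
    # attenuate each channel independently over the travelled distance
    return [l * math.exp(-absorption * a * distance)
            for l, a in zip(light_rgb, absorb_col)]

# white light through a volume whose UI colour is red: green/blue are absorbed
print(transmitted([1.0, 1.0, 1.0], ui_colour=[1.0, 0.0, 0.0],
                  absorption=2.0, distance=1.0))
# -> [1.0, ~0.14, ~0.14], i.e. the volume appears red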
Here's an example of how it works:
http://mke3.net/blender/devel/rendering/volumetrics/vol_col_absorption.jpg
And this can be textured too:
http://mke3.net/blender/devel/rendering/volumetrics/vol_absorb_textured.png
Keep in mind, this doesn't use accurate spectral light wavelength mixing (just
R/G/B channels), so in cases where the absorption colour is fully red, green or
blue, you'll get non-physical results.
Todo: refactor the volume texturing internal interface...
This is an initial commit to get it in SVN and make it easier to work on.
Don't expect it to work perfectly, it's still in development and there's
plenty of work still needing to be done. So no, I'm not very interested
in hearing bug reports or feature requests at this stage :)
There's some info on this, and a todo list at:
http://mke3.net/weblog/volume-rendering/
Right now I'm trying to focus on getting shading working correctly (there's
currently a problem in which 'surfaces' of the volume facing towards or away
from light sources are getting shaded differently to how they should be),
then I'll work on integration issues, like taking materials behind the volume
into account, blending with alpha, etc. You can do simple testing though,
mapping textures to density or emission on a cube with volume material.
the features that are needed to run the game. Compile-tested with
scons and make, but not cmake, which seems to have an issue not related
to these changes. The changes include:
* GLSL support in the viewport and game engine, enable in the game
menu in textured draw mode.
* Synced and merged part of the duplicated blender and gameengine/
gameplayer drawing code.
* Further refactoring of game engine drawing code, especially mesh
storage changed a lot.
* Optimizations in game engine armatures to avoid recomputations.
* A python function to get the framerate estimate in game.
* An option to take object color into account in materials.
* An option to restrict shadow casters to a lamp's layers.
* Increase from 10 to 18 texture slots for materials, lamps, world.
An extra texture slot shows up once the last slot is used.
* Memory limit for undo, not enabled by default yet because it
needs the .B.blend to be changed.
* Multiple undo for image painting.
* An offset for dupligroups, so not all objects in a group have to
be at the origin.
Fix for bug #7418: texture ipo's didn't show for textures in node materials.
Fix for part of bug #6758: node materials in other node materials could
miss texture coordinates.
=========
- Fix crash in particle transform with the particle system not editable.
- Particle child distribution and caching is now multithreaded.
- Child particles now have a separate Render Amount next to the existing
Amount. The render amount particles are now only distributed and cached
at render time, which should make editing with child particles faster.
- Two new options for diffuse strand shading:
- Surface Diffuse: computes the strand normal taking the normal at
the surface into account.
- Blending Distance: the distance in Blender units over which to
blend in the normal at the surface.
- Special strand rendering for more memory efficient and faster hair and
grass. This is a work in progress, and has a number of known issues,
don't report bugs to me for this feature yet.
More info:
http://www.blender.org/development/current-projects/changes-since-244/particles/
=============
A new "Selected to Active" option in the Bake panel, to (typically) bake
a high poly object onto a low poly object. Code based on patch #7339 by
Frank Richter (Crystal Space developer), thanks!.
Normal Mapping
==============
Camera, World, Object and Tangent space are now supported for baking, and
for material textures. The "NMap TS" setting is replaced with a dropdown
of the four choices in the image texture buttons.
http://www.blender.org/development/current-projects/changes-since-244/render-baking/
In EditButtons, panel "Links and Materials", there's now a browse button
to directly assign a material to selected faces. It does the following
(sketched below):
- check if material was already in one of the 'slots' of the object
- if so, then use this as index to assign
- if not, then add a new slot, and assign the new index
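In plain Python terms the slot handling is roughly this (hypothetical names,
just a sketch of the logic, not the editor code):

class Face:
    def __init__(self):
        self.material_index = 0

class Object:
    def __init__(self):
        self.material_slots = []   # materials ('slots') linked to the object

def assign_material_to_selected_faces(obj, mat, selected_faces):
    if mat in obj.material_slots:
        index = obj.material_slots.index(mat)   # material already in a slot: reuse it
    else:
        obj.material_slots.append(mat)          # otherwise add a new slot...
        index = len(obj.material_slots) - 1     # ...and assign the new index
    for face in selected_faces:
        face.material_index = index

ob = Object()
faces = [Face(), Face()]
assign_material_to_selected_faces(ob, "RedMaterial", faces)
print(ob.material_slots, [f.material_index for f in faces])  # ['RedMaterial'] [0, 0]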
Initial commit of imagebrowser in trunk.
BIG COMMIT!
Main changes:
* completely reworked imasel space
* creation and storage of the preview images for materials, textures, world and lamp
* thumbnails of images and movie files when browsing in the file system
* loading previews from external .blend when linking or appending
* thumbnail caching according to the Thumbnail Managing Standard: http://jens.triq.net/thumbnail-spec/ (see the sketch below)
* for now, imasel access is mostly kept as in the old imgbrowser (CTRL+F4, CTRL+F1), still a bit hidden.
* filtering of file types (images, movies, .blend, py,...)
* preliminary managing of bookmarks ('B' button to add, XKEY while bookmark active to delete)
More detailed info which will be updated here: http://wiki.blender.org/index.php/User:Elubie/PreviewImageBrowser
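For context, the naming scheme in that spec works roughly like this (a small
Python sketch of the standard itself, not of the Blender code): the thumbnail
file name is the MD5 hash of the file's URI, stored under ~/.thumbnails/.

import hashlib
import os

def thumbnail_path(filepath, size_dir="normal"):   # "normal" = 128x128 in the spec
    uri = "file://" + os.path.abspath(filepath)
    name = hashlib.md5(uri.encode("utf-8")).hexdigest() + ".png"
    return os.path.join(os.path.expanduser("~/.thumbnails"), size_dir, name)

print(thumbnail_path("/tmp/render.png"))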
Places that need special review (and probably fixes):
* BLO_blendhandle_get_previews in readblenentry
* readfile.c: do_version and refactorings of do_library_append
* UI integration
TODO and known issues still:
* Accented characters do not display correctly with international fonts
* Crash was reported when browsing in directory with movie files
* Bookmark management still needs some UI work (second scrollbar?), feedback here is welcome!
Credits:
Samir Bharadwaj (samirbharadwaj@yahoo.com) for the icon images.
Many thanks to everyone who gave feedback and helped so far!
Material, but relinked the active. Was an old, confusing annoyance actually.
(And not useful; when do you want 2 material indices with the same material?)
Now 'new' duplicates the material, if there is an active material.
Vertex color node worked only if VCol Paint/Light was enabled. Fixed
that, and removed the vertex color node making it part of the geometry
node instead.
Also, preview.blend had black vertex colors for the sphere, so set them
to white like the other primitives.
Node Editor: selecting Material buttons in the header crashed when no buttons
window was opened. Code didn't check for the proper window it was called from.
Also: the autoname "Cyan" was spelled in Dutch! :)
The new Material "LightGroups" only worked with lamps in visible layers.
Now lamps from the group that are not visible are also included for
rendering, ensuring that a lightgroup always works on that material,
disregarding layer settings (unless the lamp is of type 'layer lamp').
Silly: when using vector blur on a curve or text object, without having a
material assigned to it, the default material didn't get initialized OK
for vector blur, causing random streaks.
- Ztransp looked weird in Node previews, only showing the backfacing pixels
- previous change in preview.blend accidentally set camera clipping too low
for correct display of lamp preview
- refresh issue solved in preview when using Node shaders with ray-mirror
By default it is disabled (depth 0.0), so rendering is as usual.
The meaning of "depth" and "falloff" will be extensively shown in the
release log pages. Coming soon!
(Patch provided by Ed Halley)