* Remove the manual OSA method and instead pass derivatives on to the
textures. This means that at the moment e.g. the bricks node is not
antialiased, but image textures now use mipmaps. Doing oversampling
on the whole nodetree is convenient, but it is really the individual
textures that can do filtering best and quickest (see the sketch after
this list).
* Image textures in a texture node tree were not color corrected and
did not support 2d mapping; the shadeinput is now passed along to
make this possible. Would like to avoid this but not sure how.
* Fix preview not filling in all pixels when scaling or rotating in
the texture nodes.
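Regarding the first point: the individual textures can filter best because they
receive the screen-space derivatives of the texture coordinates and can, for
example, pick a mipmap level directly from them. A minimal sketch of that idea
(hypothetical helper, not the actual image texture code):

    #include <math.h>

    /* dxt/dyt: change of the 2d texture coordinate per pixel step in screen
     * x/y; tex_size: resolution of the image texture (assumed square here). */
    static float mip_level_from_derivatives(const float dxt[2], const float dyt[2],
                                            int tex_size)
    {
        float du = sqrtf(dxt[0] * dxt[0] + dxt[1] * dxt[1]);
        float dv = sqrtf(dyt[0] * dyt[0] + dyt[1] * dyt[1]);
        float footprint = (du > dv ? du : dv) * (float)tex_size;

        if (footprint <= 1.0f)
            return 0.0f;            /* magnification: sample the base level */
        return log2f(footprint);    /* minification: use a coarser mip level */
    }

Procedural nodes like bricks would need their own way of using dxt/dyt, which is
why they are not antialiased at the moment.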
Do this on drawing instead, since SCREEN_OT_animation_step isn't calling the notifier (I assume this is to be more efficient?). This isn't slow, so it is OK to do on drawing.
rename BKE_image_user_calc_imanr to BKE_image_user_calc_frame
After testing and feedback, I've decided to slightly modify the way color
management works internally. While the previous method worked well for
rendering, was a smaller transition and had some advantages over this
new method, it was a bit more ambiguous, and was making things difficult
for other areas such as compositing.
This implementation now considers all color data (with only a couple of
exceptions such as brush colors) to be stored in linear RGB color space,
rather than sRGB as previously. This brings it in line with Nuke, which also
operates this way, quite successfully. Color swatches, pickers, color ramp
display are now gamma corrected to display gamma so you can see what
you're doing, but the numbers themselves are considered linear. This
makes understanding blending modes more clear (a 0.5 value on overlay
will not change the result now) as well as making color swatches act more
predictably in the compositor. However, bringing over color values from
applications like Photoshop or GIMP, which operate in a gamma space,
will not give identical results.
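For reference, a minimal sketch of the display conversion implied above,
assuming the usual sRGB transfer curve (illustration only, not necessarily the
exact code used internally):

    #include <math.h>

    /* linear -> display, used when drawing swatches/ramps/pickers */
    static float linear_to_display(float c)
    {
        if (c <= 0.0031308f)
            return 12.92f * c;
        return 1.055f * powf(c, 1.0f / 2.4f) - 0.055f;
    }

    /* display -> linear, used when reading gamma-space input */
    static float display_to_linear(float c)
    {
        if (c <= 0.04045f)
            return c / 12.92f;
        return powf((c + 0.055f) / 1.055f, 2.4f);
    }

So a swatch storing a linear 0.5 is drawn at roughly 0.735 on screen, while the
blending math keeps operating on the 0.5.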
This commit will convert over existing files saved by earlier 2.5 versions to
work generally the same, though there may be some slight differences with
things like textures. Now that we're set on changing other areas of shading,
this won't be too disruptive overall.
I've made a diagram explaining the pipeline here:
http://mke3.net/blender/devel/2.5/25_linear_workflow_pipeline.png
and some docs here:
http://www.blender.org/development/release-logs/blender-250/color-management/
* Convert all code to use new functions (an illustrative sketch of the
kind of renames follows below).
* Branch maintainers may want to skip this commit, and run this
conversion script instead, if they use a lot of math functions
in new code:
http://www.pasteall.org/9052/python
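For illustration, the kind of before/after renames the conversion covers; these
pairs are examples I assume are representative, not taken from the script:

    #include "BLI_math.h"   /* the new math functions */

    static void example(float v[3], const float a[3], const float b[3])
    {
        add_v3_v3v3(v, a, b);   /* was: VecAddf(v, a, b)  */
        mul_v3_fl(v, 2.0f);     /* was: VecMulf(v, 2.0f)  */
        normalize_v3(v);        /* was: Normalize(v)      */
    }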
After code review and experimentation, this commit makes some changes to the way that volumes are shaded. Previously, there were problems with the 'scattering' component, in that it wasn't physically correct - it didn't conserve energy and was just acting as a brightness multiplier. This has been changed to be more correct, so that as the light is scattered out of the volume, there is less remaining to penetrate through.
Since this behaviour is very similar to absorption but more useful, absorption has been removed and has been replaced by a 'transmission colour' - controlling the colour of light penetrating through the volume after it has been scattered/absorbed. As well as this, there's now 'reflection', a non-physically correct RGB multiplier for out-scattered light. This is handy for tweaking the overall colour of the volume, without having to worry about wavelength dependent absorption, and its effects on transmitted light. Now at least, even though there is the ability to tweak things non-physically, volume shading is physically based by default, and has a better combination of correctness and ease of use.
There's more detailed information and example images here:
http://wiki.blender.org/index.php/User:Broken/VolumeRendering
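For reference, a much-simplified sketch of the energy-conserving behaviour
described above; the names follow the description rather than the real structs,
and this is not the actual shader code:

    #include <math.h>

    typedef struct VolSample {
        float density;             /* sampled density at this step */
        float scattering;          /* how much light is scattered out */
        float transmission_col[3]; /* tint on the light that continues through */
        float reflection_col[3];   /* non-physical multiplier on out-scattered light */
    } VolSample;

    static void volume_march(const VolSample *s, int num, float stepsize,
                             const float light[3], float result[3])
    {
        float tr[3] = {1.0f, 1.0f, 1.0f};  /* remaining transmission along the ray */
        int i, c;

        result[0] = result[1] = result[2] = 0.0f;

        for (i = 0; i < num; i++) {
            /* fraction of the incoming light scattered out over this step */
            float out = 1.0f - expf(-s[i].density * s[i].scattering * stepsize);

            for (c = 0; c < 3; c++) {
                /* what is scattered out adds to the result... */
                result[c] += tr[c] * out * light[c] * s[i].reflection_col[c];
                /* ...and is no longer available to be transmitted further */
                tr[c] *= (1.0f - out) * s[i].transmission_col[c];
            }
            if (tr[0] + tr[1] + tr[2] < 0.001f)
                break;  /* early exit once hardly anything is transmitted */
        }
    }

The point is that the same factor that adds to the out-scattered result also
reduces the remaining transmission, which is what the old brightness-multiplier
behaviour was missing.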
Also did some tweaks/optimisation:
* Removed shading step size (was a bit annoying, if it comes back, it will be in a different form)
* Removed phase function options; now just one asymmetry slider controls the range between back-scattering, isotropic scattering, and forward scattering. (Note: more extreme values give artifacts with the light cache, will fix...)
* Disabled the extra 'bounce lights' from the preview render for volumes, speeds updates significantly
* Enabled voxeldata texture in preview render
* Fixed volume shadows (they were too dark, fixed by avoiding using the shadfac/AddAlphaLight stuff)
More revisions to come later...
Editors Modules
* render/ module added in editors, moved the preview render code there and
also shading related operators.
* physics/ module made more consistent with other modules: renamed files,
made a single physics_ops.c for operators and keymaps, and moved all
particle related operators here now.
* space_buttons/ should now have only operators relevant to the buttons
specifically.
Updates & Notifiers
* Material/Texture/World/Lamp can now be passed to DAG_id_flush_update,
which will go back to a callback in editors (see the sketch after this
list). Eventually these should be in the depsgraph itself, but for now
this gives a unified call for doing updates.
* GLSL materials are now refreshed on changes. There are still various
cases missing.
* Preview icons now hook into this system, solving various update cases
that were missed before.
* Also fixes an issue from my last commit where some previews would not
render; the problem is avoided in the new system.
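As an illustration of the unified call in the first point (the flag value and
surrounding code are assumptions, not the actual editor code):

    #include "DNA_material_types.h"
    #include "BKE_depsgraph.h"

    /* hypothetical helper: called after a material setting was edited */
    static void material_changed(Material *ma)
    {
        /* the ID is flushed through the depsgraph, which calls back into the
         * editors so previews, icons and GLSL materials can refresh */
        DAG_id_flush_update(&ma->id, 0);
    }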
Icon Rendering
* On systems with support for non-power-of-two textures, an OpenGL texture
is now used instead of glDrawPixels (see the sketch after this list). This
avoids problems with icons getting clipped on region borders. On my Linux
desktop, this gives a 1.1x speedup, and on my Mac laptop a 2.3x speedup
overall in redrawing the full window with the default setup. The
glDrawPixels implementation on Mac seems to have a lot of overhead.
* Preview icons are now drawn using proper premul alpha, and never faded so
you can see them clearly.
* Also tried to fix an issue with texture node preview rendering; globals
can't be used reliably with threads.
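Roughly, the texture-based drawing path from the first point looks like this
(illustration only, not the actual icon code; legacy GL, and it requires
non-power-of-two texture support):

    #include <GL/gl.h>

    static GLuint icon_texture_upload(const unsigned char *rect, int w, int h)
    {
        GLuint texid;
        glGenTextures(1, &texid);
        glBindTexture(GL_TEXTURE_2D, texid);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, rect);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        return texid;
    }

    static void icon_texture_draw(GLuint texid, float x, float y, float w, float h)
    {
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, texid);
        glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(x,     y);
        glTexCoord2f(1.0f, 0.0f); glVertex2f(x + w, y);
        glTexCoord2f(1.0f, 1.0f); glVertex2f(x + w, y + h);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(x,     y + h);
        glEnd();
        glDisable(GL_TEXTURE_2D);
    }

Unlike glDrawPixels, the quad is clipped like any other geometry at region
borders.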
Delegates now receive a TexParams* instead of float *coords. This gives texture nodes access to dxt, dyt, cfra as well as coords. This fixes the time node and allows nice sampling to be implemented.
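A sketch of what the parameter block could look like, based only on the fields
mentioned here (the real TexParams may well differ):

    typedef struct TexParams {
        float *co;   /* texture coordinates, what delegates used to receive directly */
        float *dxt;  /* derivative of co per screen x pixel, for filtering */
        float *dyt;  /* derivative of co per screen y pixel, for filtering */
        int cfra;    /* current frame, so e.g. the time node can work */
    } TexParams;

Delegates then take a TexParams pointer instead of a bare float pointer for the
coordinates.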
* Data reorganisation: it now uses its own values instead of reusing surface material properties (i.e. an individual density value, rather than reusing alpha). Files saved with the old system won't load up the same after this.
* improved defaults and ui
Integration is still very rough around the edges and WIP, but it works, and can render smoke (using new Smoke format in Voxel Data texture) --> http://vimeo.com/6030983
More to come, but this makes things much easier to work on for me :)
Missing commits from Peter: 20942, 21165, 21170, 21174, 21597.
These files still need manual merging:
source/blender/makesdna/DNA_sequence_types.h
source/blender/src/sequence.c
source/blender/src/seqeffects.c
source/blender/src/editseq.c
source/blender/include/BSE_sequence.h
Patch by Alfredo de Greef. Considerably improves the quality of bump
mapping, and texture filtering for displacement and warp too. Mainly
this is achieved by getting the texture derivatives just right in
various cases; many thanks to Alfredo for figuring this one out, it works
great.
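For reference, a loose sketch of the finite-difference idea behind using
texture derivatives for bump (hypothetical names; not Alfredo's actual code,
which handles the various cases much more carefully):

    /* 'height_at' stands in for any texture lookup returning an intensity. */
    typedef float (*HeightFunc)(const float co[3]);

    static void bump_height_gradient(HeightFunc height_at, const float co[3],
                                     const float dxt[3], const float dyt[3],
                                     float grad[2])
    {
        float h0 = height_at(co);
        float cox[3] = {co[0] + dxt[0], co[1] + dxt[1], co[2] + dxt[2]};
        float coy[3] = {co[0] + dyt[0], co[1] + dyt[1], co[2] + dyt[2]};

        grad[0] = height_at(cox) - h0;  /* height change along the x derivative */
        grad[1] = height_at(coy) - h0;  /* height change along the y derivative */
        /* the shader then tilts the normal against this gradient, scaled by
         * the normal factor */
    }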
The new bump method is enabled by default now, but still disabled for
existing textures to preserve backwards compatibility. It can be enabled
with the "New Bump" option in the material texture slot in the outliner.
Also, I made the range for the normal factor a bit smaller since this
gives stronger effects, but note that you can still type in larger
values than the slider allows.
Patch by Alfredo de Greef with high quality image texture filters.
This adds 3 new filters:
* SAT: Summed Area Tables. This is like mipmaps, but by using somewhat
more memory it avoids some artifacts (see the sketch below).
* EWA: Elliptical Weighted Average, an anisotropic filter.
* FELINE: Fast elliptical lines for anisotropic texture mapping.
The one change I made to this was to try to fix an alpha/premul
problem; hopefully I didn't break anything, as it looks compatible
with the existing filter now for me.
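For reference on the SAT filter, a minimal single-channel sketch of a summed
area table (illustration only; no border/repeat handling, and not the code from
the patch):

    #include <stdlib.h>

    static float *sat_build(const float *image, int w, int h)
    {
        float *sat = calloc((size_t)w * h, sizeof(float));
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                float s = image[y * w + x];
                if (x > 0)          s += sat[y * w + (x - 1)];
                if (y > 0)          s += sat[(y - 1) * w + x];
                if (x > 0 && y > 0) s -= sat[(y - 1) * w + (x - 1)];
                sat[y * w + x] = s;
            }
        }
        return sat;
    }

    /* average over the inclusive rectangle (x0,y0)-(x1,y1) */
    static float sat_average(const float *sat, int w, int x0, int y0, int x1, int y1)
    {
        float a = sat[y1 * w + x1];
        float b = (x0 > 0) ? sat[y1 * w + (x0 - 1)] : 0.0f;
        float c = (y0 > 0) ? sat[(y0 - 1) * w + x1] : 0.0f;
        float d = (x0 > 0 && y0 > 0) ? sat[(y0 - 1) * w + (x0 - 1)] : 0.0f;
        return (a - b - c + d) / (float)((x1 - x0 + 1) * (y1 - y0 + 1));
    }

Averaging any rectangular footprint becomes four lookups, which is what makes it
usable as a filter at the cost of the extra memory.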
- 1st stage: Linear Workflow
This implements automatic linear workflow in Blender's renderer. With the
new Colour Management option on in the Render buttons, all inputs to the
renderer and compositor are converted to linear colour space before
rendering, and gamma corrected afterwards. In essence, this makes all
manual gamma correction with nodes, etc unnecessary, since it's done
automatically through the pipeline.
It's all explained much better in the notes/doc here, so please have a look:
http://wiki.blender.org/index.php/Dev:Source/Blender/Architecture/Colour_Management
And an example of the sort of difference it makes:
http://mke3.net/blender/devel/rendering/b25_colormanagement_test01.jpg
This also enables Colour Management in the default B.blend, and changes the
default lamp falloff to inverse square, which is more correct, and much
easier to use now it's all gamma corrected properly.
Next step is to look into profiles/soft proofing for the compositor.
Thanks to brecht for reviewing and fixing some oversights!
Checker: FORWARD_NULL (help)
File: base/src/source/blender/render/intern/source/texture.c
Function: do_lamp_tex
Description: Variable "dx" tracked as NULL was dereferenced.
Also found a typo: the 3rd check was checking projx instead of projz.
I also expanded the elses to set dyt as well as dxt.
Kent
Checker: FORWARD_NULL (help)
File: base/src/source/blender/render/intern/source/texture.c
Function: do_lamp_tex
Description: Variable "co" tracked as NULL was dereferenced.
co was set to NULL at the beginning of the function and it could
possibly slip through all the logic above, so let's test it before
we use it blindly.
Kent
svn merge https://svn.blender.org/svnroot/bf-blender/trunk/blender -r19820:HEAD
Notes:
* Game and sequencer RNA, and the sequencer header, are now a bit out of
date after changes in trunk.
* I didn't know how to port these bugfixes; most likely they are
not needed anymore:
* Fix: "duplicate strip" always increases the user count for ipo.
* IPO pinning on sequencer strips was lost during Undo.
Finally, here is the basic (functional) prototype of the new animation system which will allow for the infamous "everything is animatable", and which also addresses several of the more serious shortcomings of the old system. Unfortunately, this will break old animation files (especially right now, as I haven't written the version patching code yet), however, this is for the future.
Highlights of the new system:
* Scrapped IPO-Curves/IPO/(Action+Constraint-Channels)/Action system, and replaced it with F-Curve/Action.
- F-Curves (animators from other packages will feel at home with this name) replace IPO-Curves.
- The 'new' Actions, act as the containers for F-Curves, so that they can be reused. They are therefore more akin to the old 'IPO' blocks, except they do not have the blocktype restriction, so you can store materials/texture/geometry F-Curves in the same Action as Object transforms, etc.
* F-Curves use RNA-paths for Data Access, hence allowing "every" (where sensible/editable that is) user-accessible setting from RNA to be animated (see the sketch below).
* Drivers are no longer mixed with Animation Data, so rigs will not be that easily broken and several dependency problems can be eliminated. (NOTE: drivers haven't been hooked up yet, but the code is in place)
* F-Curve modifier system allows useful 'large-scale' manipulation of F-Curve values, including (I've only included implemented ones here): envelope deform (similar to lattices to allow broad-scale reshaping of curves), curve generator (polynomial or py-expression), cycles (replacing the old cyclic extrapolation modes, giving more control over this). (NOTE: currently this cannot be tested, as there's not access to them, but the code is all in place)
* NLA system with 'tracks' (i.e. layers), and multiple strips per track. (NOTE: NLA system is not yet functional, as it's only partially coded still)
There are more nice things that I will be preparing some docs for soon, but for now, check here for more details:
http://lists.blender.org/pipermail/bf-taskforce25/2009-January/000260.html
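To make the data-access point above a bit more concrete, a very reduced sketch
of the containers (field names are illustrative only, not the actual DNA
definitions):

    typedef struct KeyframeSketch {
        float time, value;          /* plus handles, interpolation, etc. in reality */
    } KeyframeSketch;

    typedef struct FCurveSketch {
        char *rna_path;             /* e.g. "location" on an Object */
        int array_index;            /* which component of an array property */
        KeyframeSketch *keyframes;
        int totkey;
        /* the modifier stack (envelope, generator, cycles, ...) hangs off here */
    } FCurveSketch;

An Action is then just a container of such curves, with no blocktype
restriction, so object and material curves can live side by side.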
So, what currently works:
* I've implemented two basic operators for the 3D-view only, to Insert and Delete Keyframes. These are temporary ones only that will be replaced in due course with 'proper' code.
* Object Loc/Rot/Scale can be keyframed. The colour of the 'active' material (Note: this should really be for the nth material instead, but that doesn't work yet in RNA) can also be keyframed into the same datablock.
* Standard animation refresh (i.e. animation resulting from NLA and Action evaluation) is now done completely separate from drivers before anything else is done after a frame change. Drivers are handled after this in a separate pass, as dictated by depsgraph flags, etc.
Notes:
* Drivers haven't been hooked up yet
* Only objects and data directly linked to objects can be animated.
* Depsgraph will need further tweaks. Currently, I've only made sure that it will update some things in the most basic cases (i.e. frame change).
* Animation Editors are currently broken (in terms of editing stuff). This will be my next target (priority to get Dopesheet working first, then F-Curve editor - i.e. old IPO Editor)
* I've had to put in large chunks of XXX sandboxing for old animation system code all around the place. This will be cleaned up in due course, as some places need special review.
In particular, the particles and sequencer code have far too many manual calls to calculate + flush animation info, which is really bad (this is a 'please explain yourselves' call to Physics coders!).
Think global, act local!
The old favorite G.scene is gone! Man... that took almost 2 days.
Also removed G.curscreen and G.edbo.
Not everything could get solved; here are some notes.
- Modifiers now store the current scene in ModifierData. This is not
meant to be permanent, but it can probably stay there until we have
cleaned up the anim system and depsgraph to cope better with
timing issues.
- Game engine: G.scene should become an argument for starting it.
Didn't solve this yet.
- Texture nodes should get scene cfra, but the current implementation
is too tightly wrapped to do it easily.
This commit introduces a new texture ('Voxel Data'), used to load up saved voxel
data sets for rendering, contributed by Raúl 'farsthary' Fernández Hernández
with some additional tweaks. Thanks, Raúl!
The texture works similarly to the existing point density texture; currently it
only provides intensity information, which can then be mapped (for example) to
density in a volume material. This is an early version, intended to read the
voxel format saved by Raúl's command line simulators; in future revisions
there's potential for making a more full-featured 'Blender voxel file format',
and also for supporting other formats too.
Note: Due to some subtleties in Raúl's existing released simulators, in order
to load them correctly in the voxel data texture, you'll need to raise the
'resolution' value by 2. So if you baked out the simulation at resolution 50,
enter 52 for the resolution in the texture panel. This can possibly be fixed in
the simulator later on.
Right now, the way the texture is mapped is just in the space 0,0,0 <-> 1,1,1
and it can appear rotated 90 degrees incorrectly. This will be tackled; for now,
probably the easiest way to map it is with an empty, using Map Input -> Object.
Smoke test: http://www.vimeo.com/2449270
One more note: trilinear interpolation seems a bit slow at the moment; we'll
look into this.
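For context, the trilinear lookup is conceptually just the following (a sketch
over a res^3 intensity grid mapped to 0..1, with no bounds checking or
interpolation options):

    #include <math.h>

    static float voxel_trilinear(const float *data, int res, const float co[3])
    {
        float x = co[0] * (res - 1), y = co[1] * (res - 1), z = co[2] * (res - 1);
        int x0 = (int)floorf(x), y0 = (int)floorf(y), z0 = (int)floorf(z);
        int x1 = x0 + 1 < res ? x0 + 1 : x0;
        int y1 = y0 + 1 < res ? y0 + 1 : y0;
        int z1 = z0 + 1 < res ? z0 + 1 : z0;
        float tx = x - x0, ty = y - y0, tz = z - z0;

    #define V(X, Y, Z) data[(Z) * res * res + (Y) * res + (X)]
        float c00 = V(x0, y0, z0) * (1 - tx) + V(x1, y0, z0) * tx;
        float c10 = V(x0, y1, z0) * (1 - tx) + V(x1, y1, z0) * tx;
        float c01 = V(x0, y0, z1) * (1 - tx) + V(x1, y0, z1) * tx;
        float c11 = V(x0, y1, z1) * (1 - tx) + V(x1, y1, z1) * tx;
    #undef V

        float c0 = c00 * (1 - ty) + c10 * ty;
        float c1 = c01 * (1 - ty) + c11 * ty;
        return c0 * (1 - tz) + c1 * tz;
    }

Eight fetches and seven lerps per sample, which is where the cost mentioned
above comes from.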
For curiosity, while testing/debugging this, I made a script that exports a mesh
to voxel data. Here's a test of grogan (www.kajimba.com) converted to voxels,
rendered as a volume: http://www.vimeo.com/2512028
The script is available here: http://mke3.net/projects/bpython/export_object_voxeldata.py
* Another smaller thing, brought back early ray termination (was disabled
previously for debugging) and made it user configurable. It now appears as a new
value in the volume material: 'Depth Cutoff'. For some background info on what
this does, check:
http://farsthary.wordpress.com/2008/12/11/cutting-down-render-times/
* Also some disabled work-in-progress code for the light cache
Robin (Frrr) Allen did a decent job on this, so we can also welcome him
as a member in the svn committers team to maintain it!
I do the first commit with some minor fixes:
- get Makefiles to work
- fix rounding issue with tiles on unit faces
- removed UI includes from tex node
A nice doc in wiki is here:
http://wiki.blender.org/index.php/User:Frr/TexnodeManual
On the todo for Robin is:
- When using one or more Texture-input nodes, you cannot edit them by
activating them (as works now for Material nodes).
- The new "output node" option fails on the default case, when only one
output node is active. It then shows often a blank menu. Will get fixed asap.
- When using a NodeTree-Texture as input node, the menu for 'active output'
should not show. NodeTree should ignore other nodetrees to keep things sane
for now.
- On the future todo is proper usage of "Dxt" and "Dyt" texture vectors for
superior antialiasing of checkers/bricks.
General note; I know people are dying to get a full integrated shader system
with nodes. In theory we could merge this with Material Nodetrees... but I'd
rather wait for a solid and very well thought out design proposal for this,
also including design ideas for unifying with a shader language (GPU, CPU).
For the time being this is a nice extension of current textures. :)