Robin (Frrr) Allen did a decent job on this, so we can also welcome him
as a member of the svn committers team to maintain it!
I'm doing the first commit with some minor fixes:
- get the Makefiles to work
- fix rounding issue with tiles on unit faces
- removed UI includes from tex node
A nice doc in the wiki is here:
http://wiki.blender.org/index.php/User:Frr/TexnodeManual
On the todo for Robin is:
- When using one or more Texture-input nodes, you cannot edit them by activating
them (as currently works for Material nodes).
- The new "output node" option fails in the default case, when only one
output node is active. It then often shows a blank menu. Will get fixed asap.
- When using a NodeTree-Texture as input node, the menu for 'active output'
should not show. NodeTree should ignore other nodetrees to keep things sane
for now.
- On a future todo is proper usage of "Dxt" and "Dyt" texture vectors for
superior antialiasing of checkers/bricks.
General note: I know people are dying to get a fully integrated shader system
with nodes. In theory we could merge this with Material Nodetrees... but I'd
rather wait for a solid and very well thought out design proposal for this,
also including design ideas for unifying with a shader language (GPU, CPU).
For the time being this is a nice extension of the current textures. :)
* Fixed a stupid crash caused by last commit that worked fine on the mac
(but never should have...)
* Fix for using child particles with the new particle age color options
This introduces a few new ways of modifying the intensity and colour output
generated by the Point Density texture. Previously, the texture only output
intensity information, but now you can map it to colours along a gradient
ramp, based on information coming out of a particle system.
This lets you do things like colour a particle system based on the individual
particles' age - the main reason I need it is to fade particles out over time.
The colorband influences both the colour and intensity (using the colorband's
alpha value), which makes it easy to map a single point density texture to
both intensity values in the Map To panel (such as density or emit) and colour
values (such as absorb col or emit col). This is how the examples below are
set up; an example .blend file is available here:
http://mke3.net/blender/devel/rendering/volumetrics/pd_test4.blend
The different modes:
* Constant
No modifications to intensity or colour (pure white)
* Particle Age
Maps the color ramp along the particles' lifetimes:
http://mke3.net/blender/devel/rendering/volumetrics/pd_mod_partage.mov
* Particle Speed
Maps the color ramp to the particles' absolute speed per frame (in Blender
units). There's an additional scale parameter that you can use to bring this
speed into a 0.0 - 1.0 range, if your particles are travelling much faster or
slower than that.
http://mke3.net/blender/devel/rendering/volumetrics/pd_mod_speed.mov
* Velocity -> RGB
Outputs the particle XYZ velocity vector as RGB colours. This may be useful
for comp work, or maybe in the future things like displacement. Again, there's
a scale parameter to control it.
http://mke3.net/blender/devel/rendering/volumetrics/pd_mod_velrgb.mov
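A rough sketch of how these modes could map per-particle data onto the colorband
(illustrative only; all names here are made up, not the actual code):

    #include <math.h>

    /* hypothetical mode enum and colour evaluation for the Point Density texture */
    typedef enum { PD_COL_CONSTANT, PD_COL_PARTAGE, PD_COL_PARTSPEED, PD_COL_PARTVEL } PDColorSource;

    static void pd_color_from_particle(PDColorSource source, float age, float lifetime,
                                       const float vel[3], float speed_scale,
                                       void (*eval_colorband)(float in, float out[4]),
                                       float col[4])
    {
        switch (source) {
            case PD_COL_PARTAGE:
                /* particle age normalised over its lifetime: 0.0 at birth, 1.0 at death */
                eval_colorband(age / lifetime, col);
                break;
            case PD_COL_PARTSPEED: {
                /* absolute speed per frame, scaled into a usable 0.0 - 1.0 range */
                float speed = sqrtf(vel[0]*vel[0] + vel[1]*vel[1] + vel[2]*vel[2]);
                eval_colorband(speed * speed_scale, col);
                break;
            }
            case PD_COL_PARTVEL:
                /* XYZ velocity straight to RGB, with the same scale parameter */
                col[0] = vel[0] * speed_scale;
                col[1] = vel[1] * speed_scale;
                col[2] = vel[2] * speed_scale;
                col[3] = 1.0f;
                break;
            default:
                /* Constant: pure white, intensity untouched */
                col[0] = col[1] = col[2] = col[3] = 1.0f;
                break;
        }
    }

The alpha the colorband returns is what scales the intensity output, which is why
a single Point Density texture can feed both the Map To values and the colour channels.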
Replaced 'Sharp' falloff with 'Soft'. This falloff type has
a variable softness, and can get some quite smooth results.
It can be useful to get smooth transitions in density when
you're using particles on a large scale:
http://mke3.net/blender/devel/rendering/volumetrics/pd_falloff_soft.jpg
Also removed 'angular velocity' turbulence source - it
wasn't doing anything useful atm
This addition allows you to perturb the point density with noise, to give
the impression of more resolution. It's a quick way to add detail, without
having to use large, complex, and slower to render particle systems.
Rather than just overlaying noise, like you might do by adding a secondary
clouds texture, it uses noise to perturb the actual coordinate looked up
in the density evaluation. This gives a much better looking result, as it
actually alters the original density.
Comparison of the particle cloud render without, and with added turbulence
(the render with turbulence only renders slightly more slowly):
http://mke3.net/blender/devel/rendering/volumetrics/pd_turbulence.jpg
Using the same constant noise function/spatial coordinates will give a
static appearance. This is fine (and quicker) if the particles aren't
moving, but on animated particle systems, it looks bad, as if the
particles are moving through a static noise field. To overcome this, there
are additional options for particle systems, to influence the turbulence
with the particles' average velocity, or average angular velocity. This
information is only available for particle systems at present.
Here you can see the (dramatic) difference between no turbulence, static
turbulence, and turbulence influenced by particle velocity:
http://mke3.net/blender/devel/rendering/volumetrics/turbu_compare.mov
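To make the idea concrete, here is a minimal sketch with made-up helper names
(noise_vector and density_lookup are stand-ins, not the real functions): the noise
perturbs the coordinate that is actually looked up, and the particles' average
velocity offsets the noise input so the field moves with an animated system.

    /* stand-in prototypes for this sketch */
    extern void noise_vector(float x, float y, float z, float r_ofs[3]);
    extern float density_lookup(const float co[3]);

    static float density_with_turbulence(const float co[3], float turb_size,
                                         float turb_strength,
                                         const float avg_vel[3], float vel_influence)
    {
        float ofs[3], p[3];

        /* sample a vector noise; offsetting the input by the average velocity
         * keeps the noise from looking like a static field the particles move through */
        noise_vector(co[0] / turb_size + avg_vel[0] * vel_influence,
                     co[1] / turb_size + avg_vel[1] * vel_influence,
                     co[2] / turb_size + avg_vel[2] * vel_influence, ofs);

        /* perturb the coordinate that is looked up, rather than layering
         * noise on top of the result */
        p[0] = co[0] + ofs[0] * turb_strength;
        p[1] = co[1] + ofs[1] * turb_strength;
        p[2] = co[2] + ofs[2] * turb_strength;

        return density_lookup(p);
    }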
Replaced the previous KD-tree (for caching points) with a
BVH-tree (thanks to Andre 'jaguarandi' Pinto for help here!).
The BVH is quite a bit faster and doesn't suffer from some of the
artifacts that were apparent with the kd-tree.
I've also added a choice of falloff types: Standard, Smooth, and
Sharp. Standard gives a harder edge, easier to see individual
particles, and when used with a larger radius, Smooth and Sharp
falloffs make a much cloudier appearance possible. See the image
below (note the settings and render times too)
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_bvh.jpg
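Hedged guesses at what the falloff shapes amount to (the actual curves in the code
may differ; "t" runs from 1.0 at the sample point to 0.0 at the search radius, and
a later commit replaces Sharp with Soft, which has a variable softness):

    #include <math.h>

    static float pd_falloff(float dist, float radius, int type, float softness)
    {
        float t = 1.0f - dist / radius;
        if (t <= 0.0f) return 0.0f;

        switch (type) {
            case 1:  return t * t * (3.0f - 2.0f * t);  /* Smooth: smoothstep-like, cloudier */
            case 2:  return powf(t, softness);          /* Sharp/Soft: exponent-shaped */
            default: return t;                          /* Standard: linear, harder edge */
        }
    }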
The Point Density texture now has some additional options for how
the point locations are cached. Previously it was all relative to
worldspace, but there are now some other options that make things
a lot more convenient for mapping the texture to Local (or Orco).
Thanks to theeth for helping with the space conversions!
The new Object space options allow this sort of thing to be possible
- a particle system, instanced on a transformed renderable object:
http://mke3.net/blender/devel/rendering/volumetrics/pd_objectspace.mov
It's also a lot easier to use multiple instances, just duplicate
the renderable objects and move them around.
The new particle cache options are:
* Emit Object space
This caches the particles relative to the emitter object's
coordinate space (i.e. relative to the emitter's object center).
This makes it possible to map the Texture to Local or Orco
easily, so you can easily move, rotate or scale the rendering
object that has the Point Density texture. It's relative to the
emitter's location, rotation and scale, so if the object you're
rendering the texture on is aligned differently to the emitter,
the results will be rotated etc.
* Emit Object Location
This offsets the particles to the emitter object's location in 3D
space. It's similar to Emit Object Space, however the emitter
object's rotation and scale are ignored. This is probably the
easiest to use, since you don't need to worry about the rotation
and scale of the emitter object (just the rendered object), so
it's the default.
* Global Space
This is the same as previously, the particles are cached in global space, so to use this effectively you'll need to map the texture to Global, and have the rendered object in the right global location.
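A rough sketch of how the three cache spaces might convert a particle's world-space
position before it is cached (invert_m4_m4 and mul_v3_m4v3 are assumed matrix
helpers, not necessarily the real ones):

    /* assumed matrix helpers for this sketch */
    extern void invert_m4_m4(float inv[4][4], float mat[4][4]);
    extern void mul_v3_m4v3(float r[3], float mat[4][4], const float vec[3]);

    enum { PD_SPACE_GLOBAL, PD_SPACE_OBJECTLOC, PD_SPACE_OBJECTSPACE };

    static void cache_particle_point(int space, const float world_co[3],
                                     float emitter_obmat[4][4],
                                     const float emitter_loc[3], float r_co[3])
    {
        switch (space) {
            case PD_SPACE_OBJECTSPACE: {
                /* relative to the emitter's full transform: location, rotation, scale */
                float imat[4][4];
                invert_m4_m4(imat, emitter_obmat);
                mul_v3_m4v3(r_co, imat, world_co);
                break;
            }
            case PD_SPACE_OBJECTLOC:
                /* offset by the emitter's location only, rotation/scale ignored */
                r_co[0] = world_co[0] - emitter_loc[0];
                r_co[1] = world_co[1] - emitter_loc[1];
                r_co[2] = world_co[2] - emitter_loc[2];
                break;
            default:
                /* global space: cache the world position unchanged */
                r_co[0] = world_co[0];
                r_co[1] = world_co[1];
                r_co[2] = world_co[2];
                break;
        }
    }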
Removed all the old particle rendering code and options I had in there
before, in order to make way for...
A new procedural texture: 'Point Density'
Point Density is a 3d texture that finds the density of a group of 'points'
in space and returns that in the texture as an intensity value. Right now,
it's at an early stage and only enabled for particles, but it would be
cool to extend it later for things like object vertices, or point cache
files from disk - i.e. to import point cloud data into Blender for
rendering volumetrically.
Currently there are just options for an Object and its particle system
number; this is the particle system that will get cached before rendering,
and then used for the texture's density estimation.
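In rough sketch form (pointcache_range_lookup is a made-up stand-in for the tree
range query over the cached points), the density estimation amounts to gathering
the points within the radius and accumulating a falloff-weighted contribution:

    #include <math.h>

    /* stand-in for the spatial range query over the cached particle points */
    extern int pointcache_range_lookup(const float co[3], float radius,
                                       float (*r_points)[3], int max_points);

    static float pointdensity_intensity(const float co[3], float radius)
    {
        float points[256][3];
        float density = 0.0f;
        int i, n = pointcache_range_lookup(co, radius, points, 256);

        for (i = 0; i < n; i++) {
            float dx = points[i][0] - co[0];
            float dy = points[i][1] - co[1];
            float dz = points[i][2] - co[2];
            float dist = sqrtf(dx*dx + dy*dy + dz*dz);

            if (dist < radius)
                density += 1.0f - dist / radius;  /* simple linear falloff */
        }
        return density;  /* used as the texture's intensity output */
    }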
It works consistently, just like any other procedural texture, so
previously where I've mapped a clouds texture to volume density to make
some of those test renders, now I just map a point density texture to
volume density.
Here's a version of the same particle smoke test file from before, updated
to use the point density texture instead:
http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend
There are a few cool things about implementing this as a texture:
- The one texture (and cache) can be instanced across many different
materials:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png
This means you can calculate and bake one particle system, but render it
multiple times across the scene, with different material settings, at no
extra memory cost.
Right now, the particles are cached in world space, so you have to map it
globally, and if you want it offset, you have to do it in the material (as
in the file above). I plan to add an option to bake in local space, so you
can just map the texture to local and it just works.
- It works for solid surfaces too; it just gets the density at that
particular point on the surface, eg:
http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various
emissions and colours as well. I'd like to investigate using the other
outputs in the texture too (like the RGB or normal outputs), perhaps with
options to colour by particle age, generating normals for making particle
'dents' in a surface, whatever!
the features that are needed to run the game. Compile tested with
scons and make, but not cmake, which seems to have an issue not related
to these changes. The changes include:
* GLSL support in the viewport and game engine, enable in the game
menu in textured draw mode.
* Synced and merged part of the duplicated blender and gameengine/
gameplayer drawing code.
* Further refactoring of game engine drawing code, especially mesh
storage changed a lot.
* Optimizations in game engine armatures to avoid recomputations.
* A python function to get the framerate estimate in game.
* An option to take object color into account in materials.
* An option to restrict shadow casters to a lamp's layers.
* Increase from 10 to 18 texture slots for materials, lamps, world.
An extra texture slot shows up once the last slot is used.
* Memory limit for undo, not enabled by default yet because it
needs the .B.blend to be changed.
* Multiple undo for image painting.
* An offset for dupligroups, so not all objects in a group have to
be at the origin.
* BGE Python API's alignToVect wasn't clamping the align amount to 0.0-1.0
* Generated images aren't animated - used as a test to see if the texture is animated.
Animated textures:
I have added a dependsOnTime function for the Displace modifier which checks
if the displacement texture has IPOs, is a plugin, or uses an animated image.
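A minimal sketch of the idea, using a stand-in struct rather than Blender's actual
DNA types; only the conditions matter here:

    /* stand-in texture data for this sketch, not the real DNA struct */
    typedef struct SketchTexture {
        struct Ipo *ipo;        /* animation curves on the texture */
        int is_plugin;          /* plugin textures can animate themselves */
        int has_animated_image; /* movie/sequence image source */
    } SketchTexture;

    static int displace_depends_on_time(const SketchTexture *tex)
    {
        if (tex == NULL)
            return 0;
        /* time-dependent if any of the three conditions holds */
        return (tex->ipo != NULL) || tex->is_plugin || tex->has_animated_image;
    }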
[#8067] external texture plugin thread-safe modifications
Submitted By: David Anderson (davywavy)
It makes it so the "result" array is passed in, instead of a global var.
I expanded the patch so it will play nice with older plugins that are not
thread safe as well.
I also updated the existing plugins in the release, so they are thread safe.
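Roughly, the change looks like this; the signatures are illustrative, check
plugin.h for the exact ones:

    /* old style, not thread safe: every render thread wrote into the same global */
    float result[8];

    int plugin_tex_doit_old(int stype, void *cast, float *texvec, float *dxt, float *dyt)
    {
        result[0] = 0.5f;  /* ... intensity written to the shared global ... */
        return 0;
    }

    /* new style, thread safe: the caller passes in its own result storage */
    int plugin_tex_doit(int stype, void *cast, float *texvec, float *dxt, float *dyt,
                        float *result_out)
    {
        result_out[0] = 0.5f;  /* ... intensity written to per-call storage ... */
        return 0;
    }

Presumably the backwards-compatibility part detects older plugins (without the extra
argument) and copies from their global into the per-call array, at the cost of not
being thread safe for those plugins.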
--------------- What do people think of this.... ------------------
This should maybe be talked about in the functionality board or something,
but what do people think of adding in default texture/sequence plugins,
or making a separate tree (like lib/) for plugins?
The reason I ask is we have had a couple of upgrades to the plugin system.
(supporting float buffers for sequencer, and this one for textures)
http://www.cs.umn.edu/~mein/blender/plugins does not store revisions of
plugins I just make sure they work with the latest version. This is
getting messy. I haven't upgraded a lot of them to use floats (I know,
I'm lazy, and now this will also make modifications to the plugins)
It would be nice to have some of the standard ones under revision control.
We also seem to be having an explosion of platforms supported. It would
be nice to have platform maintainers compiling plugins as well for releases.
(It's getting to be more work for me to keep up with things...)
I'll go back to my corner now and be quiet. ;)
Kent
Fix for bug #7418: texture ipo's didn't show for textures in node materials.
Fix for part of bug #6758: node materials in other node materials could
miss texture coordinates.
Render Baking
=============
A new "Selected to Active" option in the Bake panel, to (typically) bake
a high poly object onto a low poly object. Code based on patch #7339 by
Frank Richter (Crystal Space developer), thanks!
Normal Mapping
==============
Camera, World, Object and Tangent space is now supported for baking, and
for material textures. The "NMap TS" setting is replaced with a dropdown
of the four choices in the image texture buttons.
http://www.blender.org/development/current-projects/changes-since-244/render-baking/
This option sets the relative scaling factor to the amount set in the
scene "100%/75%/50%/25%" buttons. It's useful when you've got a fixed
background image, and want to do preview renders at a lesser
percentage, so you don't have to go and change the scale node each
time you change the %.
Also removed unnecessary use of a global from texture node.
Initial commit of imagebrowser in trunk.
BIG COMMIT!
Main changes:
* completely reworked imasel space
* creation and storage of the preview images for materials, textures, world and lamp
* thumbnails of images and movie files when browsing in the file system
* loading previews from external .blend when linking or appending
* thumbnail caching according to the Thumbnail Managing Standard: http://jens.triq.net/thumbnail-spec/
* for now, imasel access is mostly kept as in the old imgbrowser (CTRL+F4, CTRL+F1), still a bit hidden.
* filtering of file types (images, movies, .blend, py,...)
* preliminary managing of bookmarks ('B' button to add, XKEY while bookmark active to delete)
More detailed info which will be updated here: http://wiki.blender.org/index.php/User:Elubie/PreviewImageBrowser
Places that need special review (and probably fixes):
* BLO_blendhandle_get_previews in readblenentry
* readfile.c: do_version and refactorings of do_library_append
* UI integration
TODO and known issues still:
* Accented characters do not display correctly with international fonts
* Crash was reported when browsing in directory with movie files
* Bookmark management still needs some UI work (second scrollbar?), feedback here is welcome!
Credits:
Samir Bharadwaj (samirbharadwaj@yahoo.com) for the icon images.
Many thanks to everyone who gave feedback and helped so far!
Nice discovery by Ralf (cheleb): crasher with colorband, caused by NULL
pointer reading. It actually revealed a weakness in the code too: the
buttons-window context was not set appropriately when the buttons window
had no header...
Please read:
http://www.blender3d.org/cms/Imaging.834.0.html
Or in short:
- adding MultiLayer Image support
- recoded entire Image API
- better integration of movie/sequence Images
Was a whole load of work... went down for a week to do this. So, will need
a lot of testing! Will be in irc all evening.
Added a callback instance_init() so that any particular instance of a
texture plugin can initialize data. Updated the clouds2 and tile example
plugins to have a (dummy) callback.
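A sketch of what a plugin might do with the new callback; struct and field names
are approximate, not necessarily what plugin.h uses:

    /* illustrative per-instance data for a texture plugin */
    typedef struct Cast {
        float offset;
    } Cast;

    static void plugin_instance_init(Cast *cast)
    {
        /* set up defaults for this particular instance of the plugin */
        cast->offset = 0.0f;
    }

    /* in the plugin's getinfo function, the callback would be hooked up roughly as:
     *     info->instance_init = (void (*)(void *))plugin_instance_init;
     * (field name assumed) */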
- ImagePaint now uses ImBuf directly, and the rect blending functions
were moved into the imbuf module.
- The brush spacing, timing and sampling was abstracted into brush.c, for
later reuse in other paint modes.
Float ImagePaint support.
Textured Brushes:
- Only the first texture channel is used now.
- Options for size and offset should be added, but need to find some space
in the panel, or add a second one ..
A full detailed description of this will be done later... is several days
of work. Here's a summary:
Render:
- Full cleanup of render code, removing *all* globals and bad level calls
all over Blender. The Render module no longer deserves to be called abusive.
- API-fied calls to rendering
- Full recode of internal render pipeline. Is now rendering tiles by
default, prepared for much smarter 'bucket' render later.
- Each thread now can render a full part
- Renders were tested with 4 threads, goes fine, apart from some lookup
tables in softshadow and AO still
- Rendering is prepared to do multiple layers and passes
- No single 32 bits trick in render code anymore, all 100% floats now.
Writing images/movies
- moved writing images to blender kernel (bye bye 'schrijfplaatje'!)
- made a new Movie handle system, also in kernel. This will enable much
easier use of movies in Blender
PreviewRender:
- Using new render API, previewrender (in buttons) now uses regular render
code to generate images.
- new datafile 'preview.blend.c' has the preview scenes in it
- previews get rendered in exact displayed size (1 pixel = 1 pixel)
3D Preview render
- new; press Pkey in 3d window, for a panel that continuously renders
(pkey is for games, I know... but we don't do that in orange now!)
- this render works nearly identical to buttons-preview render, so it stops
rendering on any event (mouse, keyboard, etc)
- on moving/scaling the panel, the render code doesn't recreate all geometry
- same for shifting/panning view
- all other operations (now) regenerate the full render database still.
- this is WIP... but big fun, especially for simple scenes!
Compositor
- Using the same node system as now in use for shaders, you can composite images
- works pretty straightforwardly... needs many more options/tools and integration
with rendering still
- is not threaded yet, nor is it smart enough to only recalculate changes... will be
done soon!
- the "Render Result" node will get all layers/passes as output sockets
- The "Output" node renders to a builtin image, which you can view in the Image
window. (yes, output nodes to render-result, and to files, is on the list!)
The Bad News
- "Unified Render" is removed. It might come back in some stage, but this
system should be built from scratch. I can't really understand this code...
I expect it is not much needed, especially with advanced layer/passes
control
- Panorama render, Field render, Motion blur, is not coded yet... (I had to
recode every single feature in render, so...!)
- Lens Flare is also not back... needs total revision, might become a composite
effect though (using zbuffer for visibility)
- Part render is gone! (well, that's obvious, it's the default now).
- The render window is only restored with limited functionality... I am going
to check first the option to render to a Image window, so Blender can become
a true single-window application. :)
For example, the 'Spare render buffer' (jkey) doesn't work.
- Render with border now by default creates a smaller image
- No zbuffers are written yet... on the todo!
- Scons files and MSVC will need work to get compiling again
OK... that's what I can quickly recall. Now go compiling!
- New Node: "Mapping". Allows input vector to be translated, rotated and
scaled. And optional be clipped to a range. Works for colors too!
- The button "Normal" now allows incremental input, so a click in the
button won't change the normal anymore
- Connecting wires now show selection state for Nodes, with nice blended
colors. Both colors were added in Themes, but default to black and white
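A rough sketch of what the Mapping node computes per input vector; the real
evaluation order and helper names may differ:

    /* assumed rotation helper for this sketch */
    extern void mul_m3_v3(float mat[3][3], float vec[3]);

    static void mapping_node_exec(float vec[3], const float loc[3], float rot[3][3],
                                  const float size[3], int use_min_max,
                                  const float minv[3], const float maxv[3])
    {
        int i;

        for (i = 0; i < 3; i++)
            vec[i] *= size[i];       /* scale */
        mul_m3_v3(rot, vec);         /* rotate */
        for (i = 0; i < 3; i++) {
            vec[i] += loc[i];        /* translate */
            if (use_min_max) {       /* optional clipping to a range */
                if (vec[i] < minv[i]) vec[i] = minv[i];
                if (vec[i] > maxv[i]) vec[i] = maxv[i];
            }
        }
    }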
********* Node editor work:
- To enable Nodes for Materials, you have to set the "Use Nodes"
button, in the new Material buttons "Nodes" Panel or in header
of the Node editor. Doing this will disable Material-Layers.
- Nodes now execute materials ("shaders"), but still only using the
previewrender code.
- Nodes have (optional) previews for rendered images.
- Node headers allow hiding buttons and/or the preview image
- Nodes can be dragged larger/smaller (right-bottom corner)
- Nodes can be hidden (minimized) with hotkey H
- CTRL+click on an Input Socket gives a popup with default values.
- Changing Material/Texture or Mix node will adjust Node title.
- Click-drag outside of a Node changes the cursor to "Knife" and allows
drawing a rect over Links to cut them.
- Added new node types RGBtoBW, Texture, In/Output, ColorRamp
- Material Nodes have options to output diffuse or specular, or to use
a negative normal. The input socket 'Normal' will force the material
to use that normal, otherwise it uses the normal from the Material
that has the node tree.
- When drawing a link between two not-matching sockets, Blender inserts
a converting node (now only for value/rgb combos)
- When drawing a link to an input socket that's already in use, the
old link will either disappear or flip to another unused socket.
- A click on a Material Node will activate it, and show all its settings
in the Material Buttons. Active Material Nodes draw the material icon
in red.
- A click on any node will show its options in the Node Panel in the
Material buttons.
- Multiple Output Nodes can be used, to sample contents of a tree, but
only one Output is the real one, which is indicated in a different
color and red material icon.
- Added ThemeColors for node types
- ALT+C will convert existing Material-Layers to Node... this currently
only adds the material/mix nodes and connects them. Dunno if this is
worth a lot of coding work to make perfect?
- Press C to call another "Solve order", which will show all possible
cyclic conflicts (if there are any).
- Technical: nodes now use "Type" structs which define the
structure of nodes and in/output sockets. The Type structs store all
fixed info, callbacks, and allow to reconstruct saved Nodes to match
what is required by Blender.
- Defining (new) nodes now is as simple as filling in a fixed
Type struct, plus code some callbacks. A doc will be made!
- Node preview images are by default float
********* Icon drawing:
- Cleanup of how old icons were implemented in new system, making
them 16x16 too, correctly centered *and* scaled.
- Made drawing Icons use float coordinates
- Moved BIF_calcpreview_image() into interface_icons.c, renamed it
icon_from_image(). Removed a lot of unneeded Imbuf magic here! :)
- Skipped scaling and imbuf copying when icons are OK size
********* Preview render:
- Huge cleanup of code....
- renaming BIF_xxx calls that only were used internally
- BIF_previewrender() now accepts an argument for rendering method,
so it supports icons, buttonwindow previewrender and node editor
- Only a single BIF_preview_changed() call now exists, supporting all
signals as needed for buttons and node editor
********* More stuff:
- glutil.c, glaDrawPixelsSafe() and glaDrawPixelsTex() now accept format
argument for GL_FLOAT rects
- Made the ColorBand become a built-in button for interface.c
Was a load of cleanup work in buttons_shading.c...
- removed a load of unneeded glBlendFunc() calls
- Fixed bug in calculating text length for buttons (ancient!)
marker wasn't on the first or last possible position. Caused by clipping.
As a bonus, added a Cardinal interpolation option too, which is just that
little bit different! (Cardinal goes through the control points, bspline does not.)
NOTE: BLI_winstuff.h was meant to be a wrapper around windows.h to handle
undefining various crap that windows.h defines. Platform specific headers
should only have to be included in a few places. This reduces the number
of inclusions of BLI_winstuff.h to 16 which is a much more reasonable
number (than the 144 or whatever it used to be)
Render:
- New; support for dual CPU render (SDL thread)
Currently only works with alternating scanlines, but gives excellent
performance. Implemented for both normal and unified render.
Note the "mutex" locks on z-transp buffer render and imbuf loads.
- This has been made possible by major cleanups in render code, especially
getting rid of globals (example Tin Tr Tg Tb Ta for textures) or struct
OSA or using Materials or Texture data to write to.
- Made normal render fully 4x32 floats too, and removed all the old
optimizations using chars or shorts.
- Made normal render and unified render use same code for sky and halo
render, giving equal (and better) results for halo render. Old render
now also uses PostProcess options (brightness, mul, gamma)
- Added option ("FBuf") in F10 Output Panel, this keeps a 4x32 bits buffer
after render. Using PostProcess menu you will note an immediate re-
display of image too (32 bits RGBA)
- Added "Hue" and "Saturation" sliders to PostProcess options
- Render module is still not having a "nice" API, but amount of dependencies
went down a lot. Next todo: remove abusive "previewrender" code.
The last main global in Render (struct Render) now can be re-used for fully
controlling a render, to allow multiple "instances" of render to open.
- Renderwindow now displays a small bar on top with the stats, and keeps the
stats after render too. Including "spare" page support.
Not only is it easier to see that way, it also removes the awkward code that
was drawing stats in the Info header (extremely slow on some ATIs too)
- Cleaned up blendef.h and BKE_utildefines.h, these two had overlapping
defines.
- I might have forgotten stuff... and will write a nice doc on the architecture!
While going over the code, I found out that the "nabla", the size of the offset
vectors for calculating derivatives of a texture, is a built-in constant.
Even worse, the value was different for the new noise types (musgrave etc).
So I've added a new slider for it in the procedural texture panels, which
by default is set to 0.025, the value of the old constant. Also made sure
it works with equal effect in all procedurals.
NOTE: a small Nabla will give sharper, detailed bump, but the effect also
becomes smaller; correct for that in the Mapping Panel of materials.
For better & compliant control over the bumpmapping, I've also included
the Colorband output in derivatives calculus, so the bump output then
matches the color created. It's also a nice tool to finetune output of
textures for bumpmapping in general.
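In sketch form (texture_intensity is an assumed stand-in for the texture evaluation,
which now includes the colorband so the bump matches the final colour), the
derivatives are simple offset differences along each axis:

    /* assumed stand-in for the texture evaluation at a coordinate */
    extern float texture_intensity(const float co[3]);

    static void texture_derivatives(const float co[3], float nabla, float d[3])
    {
        float base = texture_intensity(co);
        int i;

        for (i = 0; i < 3; i++) {
            float ofs[3] = { co[0], co[1], co[2] };
            ofs[i] += nabla;  /* the slider value, default 0.025 (the old constant) */
            /* note: the difference is not divided by nabla, which is why a smaller
             * Nabla also makes the bump effect weaker, as mentioned above */
            d[i] = texture_intensity(ofs) - base;
        }
    }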
Bug fix: clicking on the rightmost 'item' in ColorBand didn't activate it.
Found out the ColorBand was slightly drawn off (2 pixels).
channels to link texture to.
The amount of code changes seems large, but it is mostly getting rid of
hardcoded values (6 and 8) for channels, replacing it with MAX_MTEX.
Further did some fixes:
- Ipo for Lamp showed too many mapping channels
- Texture MapTo buttons for lamp missed the slider to blend texture color
- Lamp texture mapping "View" only worked for Spot, now it uses lamp-
view vector for all types. (Nice for projections!)
- now more than 31 channels possible for ipos
- added lotsa new channels all over
- Texture block has ipo now too
- recoded getname_ei functions
(Will ask nathan to give release log info when he's back!)
http://www.blender3d.org/cms/Ramp_Shaders.348.0.html
Material color and specular now can be defined by a Colorband. The actual
color then is defined during shading based on:
- shade value (like dotproduct)
- energy value (dot product plus light)
- normal
- result of all shading (useful for adding stuff in the end)
Special request from [A]ndy! :)
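A hedged sketch of the idea: pick a 0..1 factor from the chosen input, run it
through the colorband, and use the result for the diffuse (or specular) colour.
The evaluator name and the exact factor formulas below are illustrative, not the
actual shading code:

    /* assumed colorband evaluator */
    extern void eval_colorband(float in, float col[4]);

    typedef enum { RAMP_SHADE, RAMP_ENERGY, RAMP_NORMAL, RAMP_RESULT } RampInput;

    static void ramp_shade_color(RampInput input, float shade, float light_energy,
                                 const float normal[3], const float shaded_result[3],
                                 float r_col[4])
    {
        float fac;

        switch (input) {
            case RAMP_SHADE:  fac = shade; break;                      /* e.g. dot(N, L) */
            case RAMP_ENERGY: fac = shade * light_energy; break;       /* dot product plus light */
            case RAMP_NORMAL: fac = 0.5f * (normal[2] + 1.0f); break;  /* some mapping of the normal */
            default:          fac = (shaded_result[0] + shaded_result[1] + shaded_result[2]) / 3.0f;
                              break;                                   /* result of all shading */
        }
        eval_colorband(fac, r_col);  /* ramp colour then replaces/blends the material colour */
    }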