images. Found a method that doesn't require the image to be rendered larger.
Note: assembling pre-rendered parts that are the result of FSA renders might
still give minor visible artefacts on edges; however, we should include
such render methods in the render pipeline, so multiple computers can
each render parts, save all samples, and have one computer assemble and
composite everything. This is for another project... :)
Press the R key in the compositor to read back render results and invoke a composite.
This now correctly reads AO (it was sometimes skipped) and makes a correct
composite.
FSA todo:
- hotkey + button for reading all samples back + composite
- solve black border around image
Now bounding boxes are computed per object, and checked first before
zbuffering objects. For strands, bounding boxes are computed per
original face in the mesh. Overall the speed improvement from this
is quite small (zbuffering is rarely the bottleneck), but it seems a
sensible thing to do anyway.
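For reference, a minimal sketch of this kind of bound test, with illustrative
names (not the actual render structs):

#include <stdbool.h>

typedef struct { float xmin, xmax, ymin, ymax; } Bound2D;

static bool bound_overlaps_tile(const Bound2D *bb, const Bound2D *tile)
{
    return !(bb->xmax < tile->xmin || bb->xmin > tile->xmax ||
             bb->ymax < tile->ymin || bb->ymin > tile->ymax);
}

void zbuffer_all(const Bound2D *obj_bounds, int totobj, const Bound2D *tile)
{
    for (int ob = 0; ob < totobj; ob++) {
        if (!bound_overlaps_tile(&obj_bounds[ob], tile))
            continue;   /* whole object misses this tile: skip zbuffering it */
        /* ... zbuffer the object's faces (or a strand's source face) ... */
    }
}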
They now also store a list of samples per pixel, and then get
shaded together with the ztransp samples. This comes with a
slight speed hit, but mainly memory might be a concern. However,
testing some peach scenes I haven't seen problems.
This completes the pipeline make-over started in 2006. With this
option, during rendering, each sample for every layer and pass is
saved to disk (it looks like non-antialiased images). Then the compositing
and color correction happen, then a clip to the 0-1 range, and only at the end
all samples get combined, using sampling filters such as gauss/mitch/catmul.
This results in artefact-free antialiased images. Even Z-combine or
ID masks now work perfectly with it!
This is an unfinished commit btw; Brecht will finish this for strands.
Also Halo doesn't work yet.
To activate FSA: press "Save Buffers" and the new button next to it. :)
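A rough sketch of the final combine step described above; the sample layout and
filter table are assumptions, only the idea (composite and clip every sample
first, filtered accumulation last) is from this commit:

void fsa_combine_pixel(const float (*samples)[4], int totsample,
                       const float *filter_weight, float out[4])
{
    float wsum = 0.0f;
    out[0] = out[1] = out[2] = out[3] = 0.0f;

    for (int s = 0; s < totsample; s++) {
        float w = filter_weight[s];   /* weight from the chosen AA filter */
        out[0] += w * samples[s][0];
        out[1] += w * samples[s][1];
        out[2] += w * samples[s][2];
        out[3] += w * samples[s][3];
        wsum += w;
    }
    if (wsum > 0.0f) {
        for (int i = 0; i < 4; i++)
            out[i] /= wsum;
    }
}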
Problem: the artist wants a character to walk in grass, but still have everything rendered
in separate render layers, for postpro effects and vblur. How to efficiently
create a mask image you can put *over* the character for the grass?
The solution has two parts; this commit allows any layer inside of the render layers
to become a Z-mask (Z values for solid get filled in, but not rendered).
The second part of the commit is the render option "Only render what is in front of
a zbuffer value that was filled in" (saves render time).
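A sketch of that second part, with hypothetical buffer names; only the idea of
skipping samples at or behind the filled-in Z comes from this commit:

#include <limits.h>

void shade_with_zmask(const int *zmask_z,   /* Z filled in by the mask layers */
                      const int *layer_z,   /* Z of the layer being rendered  */
                      int totpixel)
{
    for (int p = 0; p < totpixel; p++) {
        if (zmask_z[p] != INT_MAX && layer_z[p] >= zmask_z[p])
            continue;       /* at or behind the mask: skip, saves render time */
        /* ... shade this pixel for the render layer ... */
    }
}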
of strands changing between frames, vector blur couldn't work. Now
speed vectors are interpolated from the surface. This also means
child particles don't have to be computed in the previous and next
frames, which saves time too.
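Roughly, this amounts to interpolating the source face's per-vertex speed at the
strand's emission point; a minimal sketch, assuming barycentric (u, v)
coordinates on a triangle (names are illustrative):

void strand_speed_from_face(const float spd[3][3],  /* per-vertex speed */
                            float u, float v,
                            float r_speed[3])
{
    float w = 1.0f - u - v;
    for (int i = 0; i < 3; i++)
        r_speed[i] = w * spd[0][i] + u * spd[1][i] + v * spd[2][i];
}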
Also, duplis are now taken into account; the proper way to exclude
them is to set the material to be not traceable.
Removed an unnecessary pointer from the VlakRen struct to save some
memory. Not really that significant, but still, it saves 70 MB for 10
million faces.
Especially for fast moving objects (as we have here in Peach) the
art department demanded nice curved vector blur. This formula uses
a quadratic bezier function, which does not give perfect circles, but
certainly gives useful results.
Also on todo: get this blur code to do nicer accumulation...
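For reference, the quadratic bezier evaluation referred to above; the choice of
control points (previous, current, next position) is an assumption, only the
bezier form is from this commit:

void qbezier(const float p0[2], const float p1[2], const float p2[2],
             float t, float r[2])
{
    float a = (1.0f - t) * (1.0f - t);  /* (1-t)^2 */
    float b = 2.0f * (1.0f - t) * t;    /* 2(1-t)t */
    float c = t * t;                    /* t^2     */
    r[0] = a * p0[0] + b * p1[0] + c * p2[0];
    r[1] = a * p0[1] + b * p1[1] + c * p2[1];
}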
- Particle system distribution wasn't flushed properly for non-edited hair.
- For instances in the renderer, also count their verts and faces in the stats.
- Fix for error in the "surface diffuse" formula for strand shading.
Removed FTYPE from the render output panel - it was some old format that did indexed colors, and wasn't even used anywhere.
Added 2 options to the render output panel that can be used for a really basic local renderfarm (even artists can use it!):
"NoOverwrite" and "Touch".
When both are enabled, rendering one scene across many PCs on a fast network will populate the directory with frames.
Also useful for deleting frames that have errors and re-rendering (without manually re-rendering each frame).
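The idea, sketched with hypothetical names (skip a frame if its file already
exists, and write an empty placeholder before rendering so the other machines
skip it too):

#include <stdio.h>
#include <stdbool.h>

bool claim_frame(const char *filepath, bool no_overwrite, bool touch)
{
    FILE *fp = fopen(filepath, "rb");
    if (fp) {
        fclose(fp);
        if (no_overwrite)
            return false;           /* frame exists: another machine has it */
    }
    if (touch) {
        fp = fopen(filepath, "wb"); /* empty placeholder claims the frame */
        if (fp)
            fclose(fp);
    }
    return true;                    /* go ahead and render this frame */
}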
- the non-OSA case didn't work
- ztransp adding was accidentally using an incorrect alpha value
NOTE: almost all pass types rendered in OSA with a filter (not box!) were
incorrectly added on solid layers, like diffuse, AO, etc.
This is actually just the alpha value as currently calculated
by the mist code. In many cases it is not very useful to have this as
alpha in the shading result, also for postprocessing and compositing.
Note: this pass also works with "Mist" not set in World, of course.
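As an illustration only (the actual mist code has several falloff types, and
the exact orientation of the stored value is not spelled out here), a clamped
linear mist factor looks like this:

float mist_alpha_linear(float dist, float miststart, float mistdist)
{
    float fac = (dist - miststart) / mistdist;  /* 0 at start, 1 at start+depth */
    if (fac < 0.0f) fac = 0.0f;
    else if (fac > 1.0f) fac = 1.0f;
    return 1.0f - fac;  /* assumption: stored as a visibility-style alpha */
}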
=============================
A new approximate ambient occlusion method has been added, next to the
existing one based on raytracing. This method is specifically targeted
at use in animations, since it is inherently noise free, and so will
not flicker across frames.
http://www.blender.org/development/current-projects/changes-since-244/approximate-ambient-occlusion/
http://peach.blender.org/index.php/approximate-ambient-occlusion/
Further improvements are still needed, but it can be tested already. There
are still a number of known issues:
- Bias errors on backfaces.
- For performance, instanced objects do not occlude currently.
- Sky textures don't work well, the derivatives for texture evaluation
are not correct.
- Multiple passes do not work entirely correctly (they are not accurate
  to begin with, but could be better).
Lamp buffers require painful bias tweaking (to prevent aliasing or to
get shadow detail). Sometimes you want this different per object; for
grass you want less shadow detail, but for the ground you want high
detail. This feature allows tweaking it.
The new "LBias" slider is in the shader panel, at the bottom. Ugly! But that's for
later...
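A sketch of how such a per-material factor could combine with the lamp's bias;
the field names and the multiply are assumptions, only the idea of a
per-object/material tweak is from this commit:

typedef struct { float bias; }  LampShadowBuf;  /* lamp's shadow buffer bias  */
typedef struct { float lbias; } MaterialBias;   /* the new per-material slider */

float effective_shadow_bias(const LampShadowBuf *lamp, const MaterialBias *ma)
{
    /* assumption: 0.0 means "use the lamp's bias unchanged"; lower factors
       give more shadow detail (ground), higher ones suppress aliasing at
       the cost of detail (grass) */
    if (ma->lbias > 0.0f)
        return lamp->bias * ma->lbias;
    return lamp->bias;
}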