For compositing sky behind alpha, a gamma-corrected alpha_under was used,
but Blender didn't make the gamma tables unless OSA was set.
Old bug!
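For reference, a rough sketch of the 'alpha under' idea (illustrative only, not the actual compositing code; the function name is made up): the sky only contributes where the rendered pixel isn't already opaque, and the gamma-corrected variant does this addition via the gamma tables mentioned above.

    /* Illustrative alpha-under: add sky behind a premultiplied RGBA pixel. */
    static void sky_under_pixel(float pixel[4], const float sky[3])
    {
        float mul = 1.0f - pixel[3];   /* remaining transparency */
        if (mul > 0.0f) {
            pixel[0] += mul * sky[0];
            pixel[1] += mul * sky[1];
            pixel[2] += mul * sky[2];
            pixel[3] = 1.0f;           /* sky fills in, result is opaque */
        }
    }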
Changed: while rendering with a "Backbuf" that doesn't exist, Blender still
renders, but without the backbuf now. It used to stop the render and return,
with only a print in the console... confusing.
Gaussian sampling/rendering now supported too!
Also corrected the gamma-corrected adding of colors, which gives better alpha
and blending with sky than the normal render does. The latter is something I
could check once too...
Thanks to Brecht & Valgrind, found 2 cases of uninitialized variables in
the render code. Both were for 2D texture input (Window and Sticky), which
didn't initialize the third coordinate. That goes fine for 2D textures, but not
for 3D ones :)
Crash in this situation:
- one blender file with multiple scenes
- render image
- go to another Scene with no camera and a larger output size for the image
- render
The render buffer has to be freed then :)
- Ztransp material didn't raytrace at all (now it just traces it entirely;
remember to set the transp depth for it)
- transparent material reflected wrong in mirror material, due to
specular being added without alpha.
- Cleaned up some code to improve raytrace speed a bit. The old conventions
from before the AA recode were still there; these allowed coherence for
octree traversal. The current AA doesn't allow this anymore.
Added an improved check for 'first hit' on shadow render; this is now
stored per lamp.
All in all, render with raytrace improved about 10-15%.
Two accumulating errors were causing 'scanline' errors too, this time based on
using different filtering values for transparent shadows.
There were another 2 cases of unused variables in the render as well. :)
with the zblur plugin for faster DoF (or other sequence plugins that need the
zbuffer).
I don't think the conversion to a Blender zbuffer value is quite correct,
but at least it produces usable results for zblur without too many
differences from the Blender render (at least not in the short tests I
could do in this short time...)
- onlyshadow material defaulted to black when no shadow calc was used; it is
now 100% transparent
- AO 'shadows' were not included in onlyshadow material
- alpha render appeared to be wrong since 2.32... it was gamma corrected,
giving a difference in OSA render with 'Gamma' on
That alpha issue I am going to tackle once; it is not functioning well, and
might be combined with the new 'transmission' colors idea.
In DisplayButtons, Panel "Output", a new slider "Dither" allows adding
random noise dither to the rendered output. It works on sky as well as on solid
and transparent parts. Note however that in OSA render, the Unified Render gives
much better results, since that render nicely delivers full scanlines of
high-definition color. The old render mode isn't well suited for this
postprocess.
A dither value of '1.0' will add a maximum of exactly 1.0/256.0 to the
pixels.
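To illustrate the idea (a minimal sketch with a hypothetical helper, not the actual Blender code): per channel, a random value scaled by the slider is added, so '1.0' adds at most one 8-bit step.

    #include <stdlib.h>

    /* Hypothetical helper: add random dither to one 8-bit channel.
       With dither = 1.0, at most 1.0/256.0 of the full range is added. */
    static unsigned char dither_channel(unsigned char in, float dither)
    {
        float r = (float)rand() / ((float)RAND_MAX + 1.0f);  /* random in [0, 1) */
        float out = (float)in + r * dither;                  /* at most 'dither' steps */
        if (out > 255.0f) out = 255.0f;
        return (unsigned char)out;
    }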
Potential improvements for next releases;
- regular patterns
- dither per color channel
- not only add, but also subtract dither
Also note that this gives the best results for print work or stills. Animating it
gives slightly visible noise. Also, run-length compression won't really work, and
JPEG needs to be given a higher quality too.
This was something users have found since tracing got into Blender: sometimes
small 'dots' or bright pixels, or missing reflection rays in an image. Thanks
to the very simple sample file I could dissect it... it appeared to be an
incomplete check for all numerical exceptions when traversing the octree
nodes. Very technical, but clear comments are in the code to explain ;)
raytracer. Instead of only tracing the current subpixel it did all
(or most) of them.
Solves reports on slow AO in 2.34, but will also affect ray_mir and transp.
Extended the range of the depth and cdepth parameters as requested by leope.
Bumpmapping should now be a bit more similar to the Blender render.
Added support for all remaining lightsources in yafray; tried to make use of
as many of the existing Blender parameters as possible.
Blender Lamp: added a switch to enable rendering with a shadowbuffer ('softlight' in yafray).
All other parameters are similar to the Blender settings; for yafray, both the
bias parameter and the shadowbuffer size can be lower than the equivalent Blender
settings, since the yafray buffer is floating point. Remember that 6 shadowmaps
are created in this case, so this can use quite a bit of memory with large
buffer settings.
When 'ray shadow' is enabled for this lamp type, it is possible to set a light
radius to create a spherical arealight source ('spherelight' in yafray);
when this is 0, it is exported as a pointlight instead.
Blender Spot: as in Blender, it now supports 'halo' rendering.
Halo spots always use shadowbuffers, so when enabled, the buttons for shadowmap
settings will appear. The 'ray shadow' button can still be used to disable
shadows cast onto other objects, independent of halo shadows.
One thing to remember: halos don't work with empty backgrounds, something must
be behind the spotlight for it to be visible.
And finally, the photonlight:
probably the most confusing (as with more things related to yafray), the photonlight
is not a real lightsource, it is only used as a source to shoot photons from.
Since indirect lighting is already supported (and looks better as well),
only caustics mode is supported.
So to be able to use this properly, other lightsources must be used with it.
For the photon lighting to be 'correct', similar light settings as for the 'source'
light are needed.
Probably the best way to do this: when you are happy with the lighting setup
you have and want to add caustics, copy the light you want to enable for
caustics (shift-D), leave everything as is, then change the mode to
'Photon'.
To not waste any photons, the photonlight behaves similar to the spotlight;
you can set the width of the beam with the 'angle' parameter. Make sure
that any object that needs to cast caustics is within that beam, and make
the beam width as small as possible to tightly fit the object.
The following other parameters can be set:
-photons: the number of photons to shoot.
-search: the number of photons to search for when rendering; the higher,
the blurrier the caustics.
-depth: the number of photon bounces allowed; since the primary use is for
caustics, you probably best set this to the same level as the 'ray depth'
parameter.
-Blur: this controls the amount of caustics blur (in addition to the search
parameter); very low values will cause very sharp caustics, which, when used
with a low photon number, will probably lead to only some noisy specks being rendered.
-Use QMC: use quasi-Monte Carlo sampling; can lead to cleaner results, but
can also sometimes cause patterns.
Since the photonlight has no meaning to Blender, when using photonlights and
switching back to the internal render, the light doesn't do anything, and no
type button will be selected. The lightsource can still be selected, but unless
you switch to yafray, no parameters can be set.
Apologies to Anexus, I had no time to really do something with your code,
I'll still look at it later, to see if I can improve anything in my implementation.
old files still use the old fast OSA, and when you want a specific
material to have specular/shader/texture AA you can set this individually.
When rendering ray_mir, ray_transp or ray_shadow, the new OSA will be
effective by default however.
Still to do: make this switch work for transparent faces and unified...
The problem was in the calculation of oversampling vectors for correct AA. With
the new AA method this is less necessary, so the code now doesn't use
mipmapped or filtered images when refracted. For reflected rays it
does still use the filter though; there the error is hardly noticeable.
For all tests and reported .blend files it looks much better.
However, a real mathematical solution is still preferable.
This is another extremely old one; from before NaN days even!
The issue is that shadowbuffers have a bias to prevent faces from shadowing themselves.
To make the bias smarter, code was added to adjust it based on the light angle.
This correction allowed a factor of 10 smaller bias, which in many
cases was much too strong, causing frontally lit faces to become too dark.
The new correction only halves the bias on frontal light, which looks quite
a bit more convincing and pretty.
This problem appeared to be a famous one, with some fun reading to be found
on the web. The solution as I commit it here is described on this site:
http://www.blender3d.org/cms/Misc_improvements.355.0.html
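As a rough sketch of the new heuristic (illustrative only, not the literal shadowbuffer code; the function name is made up): the bias is scaled by the normal/light angle, so it is halved for frontal light and kept in full at grazing angles.

    static float corrected_bias(float bias, float inp)  /* inp = dot(normal, light dir) */
    {
        if (inp < 0.0f) inp = 0.0f;
        if (inp > 1.0f) inp = 1.0f;
        return bias * (1.0f - 0.5f * inp);   /* frontal light (inp = 1): half bias */
    }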
As an extra (I needed it quite a bit!), added the requested feature to have the
renderwindow display in the titlebar whether the spare page is shown (JKEY)
although it rendered the submitted bug file fine...
Note to self again: always also check if the code even works in general! :)
Note to self 2: don't fix things ad hoc when you're not coding
When using xml export, yafray will now render the alpha channel as well when the 'RGBA' button in Blender is enabled (the plugin does this automatically).
In plugin code, fixed smooth shading bug for non-mesh objects.
Relative paths for textures are now recognized (plugin & xml).
Fixed problem with duplicate objects (plugin & xml).
Really old bug, sun position is now correct (plugin & xml).
World background can now also be a regular image texture (jpeg & tga), but for now it always assumes spheremapping, which is not the same as in Blender either. In yafray the texture is assumed to be a full 360 degree (panorama type) map.
convertBlenderScene.c cleanup, the identity transform 'hack' is removed.
THIS AFFECTS ALL EXTERNAL RENDERERS (Aqsis and others) WHICH RELY ON THE RENDERDATA OUTPUT, VERTICES AND LAMPCOORDINATES/VECTORS NOW NEED TO BE TRANSFORMED BACK TO WORLD COORDINATES. See yafray plugin/export code.
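As a minimal sketch of what exporters now have to do themselves (illustrative only, see the yafray plugin/export code for the real implementation): apply the appropriate 4x4 matrix to each vertex to get back to world coordinates.

    static void transform_to_world(const float mat[4][4], float co[3])
    {
        float x = co[0], y = co[1], z = co[2];
        co[0] = mat[0][0]*x + mat[1][0]*y + mat[2][0]*z + mat[3][0];
        co[1] = mat[0][1]*x + mat[1][1]*y + mat[2][1]*z + mat[3][1];
        co[2] = mat[0][2]*x + mat[1][2]*y + mat[2][2]*z + mat[3][2];
    }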
Edges in Mesh
- added automatically when you use creases. For other situations, call
make_edges(Mesh *me) in mesh.c (see the sketch after this list). Of course, once
in editmode the edges are automatically recreated.
- in the F9 buttons you can add/remove edges too
- both for Mesh and DisplistMesh, so it speeds up drawing quite a bit in
wireframe
- render for edges can't work... edges have no material nor tface nor col...
so here the faces are still rendered in wire
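A usage sketch for the call mentioned above (assuming the make_edges() declaration from mesh.c is in scope; the header name is an assumption):

    #include "BKE_mesh.h"   /* assumed location of the make_edges() prototype */

    static void build_edges_for_mesh(Mesh *me)
    {
        /* for meshes created outside editmode, fill the edge data manually;
           entering/leaving editmode recreates edges automatically */
        make_edges(me);
    }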
Creases in Subsurf
- based on the code by Chris McFarlen
- the main change is that edges are now used, saving quite some data in the file
- use SHIFT+E in editmode to set edge sharpness; values go from 0-1
- in the F9 buttons you can set the draw-crease mode. It now draws blended from
the wire color to the edge-select color (as provided in the Theme)
Known issue: setting sharpness on 1 cube (subdiv 2) gives weird results
with some values... Chris, can you check?
Further: code cleanups, changing 0 to NULL where needed, no warnings, etc etc
- textures: added support for the new mixers (div, diff, etc.) to work on the
other map-to channels too, like ref or spec.
This also works on lamp and world textures.
- brought back (uncommented) a line of code that was removed by leon, to have
particle motion based on textures
- recoded the glPolygonOffset hack to be a nice function, this for future
testing.
Kent Mein. So next to Mix, Mult, Add, Sub we now have (sketched below):
- Div: divides by the texture color
- Screen: is like Mult, but works the opposite way (makes lighter)
- Diff: the difference between texture color and material
- Light: if the texture is lighter it shows (per component)
- Dark: if the texture is darker it shows (per component)
Next step: add this for specular and mirror, and the other channels...
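For clarity, rough per-channel formulas for some of these mixers (a sketch only, not the exact render code; 'fac' is the texture blend factor):

    #include <math.h>

    static float tex_screen(float mat, float tex, float fac)
    {
        /* like Mult, but inverted: the result is never darker than the material */
        return 1.0f - (1.0f - fac*tex) * (1.0f - mat);
    }

    static float tex_difference(float mat, float tex, float fac)
    {
        return (1.0f - fac)*mat + fac*fabsf(mat - tex);
    }

    static float tex_lighten(float mat, float tex, float fac)
    {
        float t = fac*tex;
        return (t > mat) ? t : mat;   /* the lighter component wins */
    }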
I commit it now because it also fixes an error in the previous commit.
Please note the following:
- pictures need to be saved as 'premul' sky render if you want to use them
in Blender as a texture
- but for alpha-over in the sequencer it has to be 'key alpha'...
This inconsistency needs to be solved... for example as an option for both
texture and sequencer.
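To make the difference concrete (a sketch only, not Blender code): 'premul' stores the color already multiplied by alpha, while 'key' stores the unassociated color. Converting key alpha to premul is just:

    static void key_to_premul(float col[4])
    {
        col[0] *= col[3];
        col[1] *= col[3];
        col[2] *= col[3];
    }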
Division by zero in calculating render coords... this only happens for
Wire material AND having face-less edges. Then the normal is zero, and
some calculations can't happen correctly.
(The error was in rendercore.c; the other files committed were because of removed
and cleaned-up line breaks.)
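The kind of guard this needs, sketched (not the literal rendercore.c fix): bail out before the coordinate calculation when the normal has zero length.

    #include <math.h>

    static int normal_is_degenerate(const float n[3])
    {
        float len = sqrtf(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
        return (len < 1e-9f);   /* true for face-less wire edges */
    }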
http://www.blender3d.org/cms/Ramp_Shaders.348.0.html
Material color and specular can now be defined by a Colorband. The actual
color is then defined during shading, based on:
- shade value (like the dot product)
- energy value (dot product plus light)
- normal
- result of all shading (useful for adding stuff at the end)
Special request from [A]ndy! :)
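The underlying idea, sketched (illustrative struct and function, not the actual ramp shader code): the shading-derived factor, for example the dot product, indexes into the colorband, interpolating between its knots.

    typedef struct { float pos; float col[3]; } RampKnot;

    static void ramp_lookup(const RampKnot *knots, int nknots, float fac, float out[3])
    {
        int i;
        if (fac <= knots[0].pos) {
            out[0] = knots[0].col[0]; out[1] = knots[0].col[1]; out[2] = knots[0].col[2];
            return;
        }
        for (i = 1; i < nknots; i++) {
            if (fac <= knots[i].pos) {
                float t = (fac - knots[i-1].pos) / (knots[i].pos - knots[i-1].pos);
                out[0] = (1.0f - t)*knots[i-1].col[0] + t*knots[i].col[0];
                out[1] = (1.0f - t)*knots[i-1].col[1] + t*knots[i].col[1];
                out[2] = (1.0f - t)*knots[i-1].col[2] + t*knots[i].col[2];
                return;
            }
        }
        out[0] = knots[nknots-1].col[0];
        out[1] = knots[nknots-1].col[1];
        out[2] = knots[nknots-1].col[2];
    }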
New is that objects can have a force field, and Meshes can even deflect
(collide) particles. This is in a new sub-menu in the Object buttons (F7).
The full instructions are on the web; Leon mailed them to me and I will put
them in the CMS tomorrow. For those who like to play with it now, here are demo
files:
http://download.blender.org/demo/test/
Quite some changes were needed in the integration though... so previously created
particle deflectors will not work. Changes to mention now are:
- gravity is renamed to 'force field'
- force field and deflector options are in Object now, not in Mesh
- the options also have their own struct, and don't add to Object by default
- force fields are possible for all object types, but only work on the object
center, so Empty objects are typical for it (see the sketch after this list).
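A toy sketch of the center-only force field idea mentioned above (illustrative only, not the actual particle code):

    #include <math.h>

    static void apply_force_field(const float center[3], float strength,
                                  const float particle_co[3], float force[3])
    {
        float d[3], len;
        d[0] = particle_co[0] - center[0];
        d[1] = particle_co[1] - center[1];
        d[2] = particle_co[2] - center[2];
        len = sqrtf(d[0]*d[0] + d[1]*d[1] + d[2]*d[2]);
        if (len > 1e-6f) {
            /* positive strength pushes particles away from the object center */
            force[0] += strength * d[0] / len;
            force[1] += strength * d[1] / len;
            force[2] += strength * d[2] / len;
        }
    }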
Work to do:
- add a draw method in the 3D window to denote force field objects
- check the UI (panel with a different size?)
- add a 'recalc' button in the deflector panel
The work I did at the end of May on render normals (displacemap especially)
caused the refraction code to work wrong... it took a while to find out, but
I had just removed a couple of lines too many.
Added a clear comment there on what it is, and what the danger of removing it is!
Needs the latest yafray; you can get it from cvs, but I also have binaries
for OS X here:
http://www.coala.uniovi.es/~jandro/noname/downloads/yafray-0.0.6-3.pkg.zip
To use it, go to the yafray panels (global settings) and uncheck the "xml" button.
That tells the export code to skip xml export and use the yafray plugin
instead. You'll see the render being drawn while it runs, and you can even stop it
with the ESC key.
Since I'm sure problems will appear, expect updates soon.
Remember: does not work on win32