The idea is to avoid a roundtrip from byte to float and back to a byte buffer,
and instead use the render result's byte buffer to store the result of
sequencer rendering. This matches what the regular render pipeline is doing
and gives around a 2-3x speedup of sequencer export on simple scenes.
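A minimal sketch of the idea (numpy stands in for the actual image buffers; the function names are hypothetical):

    import numpy as np

    def export_old(rect):
        # Old path: byte buffer -> float buffer -> effects -> byte buffer.
        rgba = rect.astype(np.float32) / 255.0
        # ... sequencer effects would operate on the float buffer here ...
        return np.clip(rgba * 255.0 + 0.5, 0, 255).astype(np.uint8)

    def export_new(rect):
        # New path: the sequencer renders straight into the render result's
        # byte buffer, so both conversions above are skipped.
        # ... sequencer effects operate on the byte buffer in place here ...
        return rect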
The issue was caused by the render pipeline freeing render parts prior to
finishing EXR file writing, which resulted in unfinished parts not being
written into the file by save_empty_result_tiles().
As a temporary solution we now explicitly write unfinished parts as empty
tiles to the EXR file prior to freeing the parts.
Not an ideal solution, but it should work for the release.
Official Documentation:
http://www.blender.org/manual/render/workflows/multiview.html
Implemented Features
====================
Builtin Stereo Camera
* Convergence Mode
* Interocular Distance
* Convergence Distance
* Pivot Mode
Viewport
* Cameras
* Plane
* Volume
Compositor
* View Switch Node
* Image Node Multi-View OpenEXR support
Sequencer
* Image/Movie Strips 'Use Multiview'
UV/Image Editor
* Option to see Multi-View images in Stereo-3D or its individual images
* Save/Open Multi-View (OpenEXR, Stereo3D, individual views) images
I/O
* Save/Open Multi-View (OpenEXR, Stereo3D, individual views) images
Scene Render Views
* Ability to have an arbitrary number of views in the scene
Missing Bits
============
First rule of Multi-View bug reports: if something is not working as it should *when Views is off*, this is a severe bug; do mention this in the report.
Second rule: if something works *when Views is off* but doesn't work (or crashes) *when Views is on*, this is an important bug; do mention this in the report.
Everything else is likely a small todo, and may wait until we are sure none of the above is happening.
Apart from that, there are these known issues:
* Compositor Image Node works poorly with Multi-View OpenEXR
  (this was working perfectly before the 'Use Multi-View' functionality)
* Selecting camera from Multi-View when looking from camera is problematic
* Animation Playback (ctrl+F11) doesn't support stereo formats
* Wrong filepath when trying to play back animated scene
* Viewport Rendering doesn't support Multi-View
* Overscan Rendering
* Fullscreen display modes need to warn the user
* Object copy should be aware of views suffix
Acknowledgments
===============
* Francesco Siddi for the help with the original feature specs and design
* Brecht Van Lommel for the original review of the code and design early on
* Blender Foundation for the Development Fund to support the project wrap up
Final patch reviewers:
* Antony Riakiotakis (psy-fi)
* Campbell Barton (ideasman42)
* Julian Eisel (Severin)
* Sergey Sharybin (nazgul)
* Thomas Dinges (dingto)
Code contributors of the original branch in github:
* Alexey Akishin
* Gabriel Caraballo
Simply add an option to the render settings to save an EXR cache just when
the render is finished. Also changed RE_ReadRenderResult() to read the cache
instead of temp sample files (those are fully volatile now anyway).
The path to save cached render results is a User Preferences setting.
Also added a 'Reload render' feature to the Image Editor (so one can now re-open a blend,
and in an Image Editor hit Ctrl-R to (try to) reload the last render from the cache).
Reviewers: campbellbarton, sergey
Differential Revision: https://developer.blender.org/D553
These assumptions are now made:
- Internally, float buffers always store linear, alpha-premultiplied colors.
- Readers should take care to deliver float buffers matching that
  assumption.
- There's an input image setting to say whether it's stored with
  straight/premul alpha on disk.
- Byte buffers are now assumed to have straight alpha, and readers should
  deliver straight alpha.
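An illustrative sketch of these conventions (not the actual Blender code; numpy stands in for the image buffers, and the sRGB <-> linear conversion that also applies to float buffers is omitted here):

    import numpy as np

    def byte_straight_to_float_premul(rect):
        # Byte buffer (straight alpha, 0..255) -> float buffer (premultiplied).
        rgba = rect.astype(np.float32) / 255.0
        rgba[..., :3] *= rgba[..., 3:4]   # premultiply RGB by alpha
        return rgba

    def float_premul_to_byte_straight(rgba):
        # Float buffer (premultiplied alpha) -> byte buffer (straight alpha).
        out = rgba.astype(np.float32, copy=True)
        alpha = out[..., 3:4]
        np.divide(out[..., :3], alpha, out=out[..., :3], where=alpha > 0)
        return np.clip(out * 255.0 + 0.5, 0, 255).astype(np.uint8)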
Some implementation details:
- Removed the scene's color unpremultiply setting, which was very
  confusing and was wrong for the default settings.
  All renderers are now assumed to deliver premultiplied alpha.
- IMB_buffer_byte_from_float will now linearize alpha when
  converting from the float buffer.
- The sequencer's effects were changed to assume bytes have straight
  alpha. Most effects will still work with bytes; however, for glow it
  was trickier to avoid data loss, so there's a commented-out glow
  implementation which converts the byte buffer to floats first, operates
  on floats and returns bytes back (see the sketch after this list). It's
  slower and it's not clear whether it should actually be used -- who
  uses glow on alpha anyway?
- Sequencer modifiers should also work correctly with straight
  bytes now.
- GLSL preview will predivide float textures to get correct shading;
  shading with byte textures already worked fine (GLSL was assuming
  straight alpha).
- Blender Internal will set alpha=1 for the whole sky. The same
  happens in Cycles, and there's no way to avoid this -- sky is
  neither straight nor premul and doesn't fit the color pipeline well.
- Straight alpha mode for render result was also eliminated.
- Conversion to the correct alpha mode needs to be done before
  linearizing the float buffer.
- TIFF will now load and save files with the proper alpha mode setting
  in the file metadata header.
- Removed Use Alpha from texture mapping and replaced it with an image
  datablock setting.
  This behaves much more predictably, is clearer from the code point of
  view, and avoids possible regressions when non-premultiplied images
  were used as textures with the alpha channel ignored.
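A rough sketch of the byte -> float -> byte approach mentioned for glow above (illustrative only; 'glow_float' is a placeholder for any effect that needs float precision):

    import numpy as np

    def effect_on_bytes(rect, glow_float):
        # Byte buffers carry straight alpha, so the effect sees straight-alpha
        # floats and the result is converted back to bytes afterwards.
        rgba = rect.astype(np.float32) / 255.0
        rgba = glow_float(rgba)
        return np.clip(rgba * 255.0 + 0.5, 0, 255).astype(np.uint8)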
The issue was caused by the completely different way multi-layer EXRs are
loaded; they bypass the general image buffer loading functions.
Solved by running the color space transformation on render result construction
from the multi-layer EXR image.
Also fixed an issue with wrong display buffer computation for buffers with
fewer than 4 channels. The issues were:
- The display buffer is always expected to be RGBA.
- OpenColorIO can not apply color space transformations to non-{RGB, RGBA}
  pixels.
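For illustration, a buffer with fewer than 4 channels has to be expanded to RGBA before such a transformation can run on it (a sketch only, not the actual code):

    import numpy as np

    def to_rgba(buf):
        # Expand a 1- or 3-channel float buffer to RGBA so a color space
        # transform that only handles RGB/RGBA pixels can be applied.
        h, w, ch = buf.shape
        rgba = np.empty((h, w, 4), dtype=np.float32)
        rgba[..., 0:3] = buf if ch == 3 else buf[..., 0:1]  # broadcast grayscale
        rgba[..., 3] = 1.0                                  # opaque alpha
        return rgba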
Replace the old color pipeline, which supported only linear/sRGB color
spaces, with an OpenColorIO-based pipeline.
This introduces two configurable color spaces:
- Input color space for images and movie clips. This space is used to convert
  images/movies from the color space in which the file is saved to Blender's
  linear space (for float images; byte images are not converted internally,
  only the input space is stored for such images and used later).
  This setting can be found in the image/clip data block settings.
- Display color space, which defines the space in which a particular display
  is working.
  This setting can be found in the scene's Color Management panel.
When the render result is displayed on the screen, apart from converting the
image to the display space, some additional conversions can happen.
These conversions are:
- View, which defines a tone curve applied before the display transformation.
  These are different ways to view the image on the same display device.
  For example, it can be used to emulate a film view on an sRGB display.
- Exposure affects the image exposure before the tone map is applied.
- Gamma is a post-display gamma correction and can be used to match a
  particular display's gamma.
- RGB curves are user-defined curves which are applied before the display
  transformation and can be used for different purposes.
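Roughly, the order of these conversions looks like this (a pseudocode sketch with hypothetical helper names, not the actual color management code):

    def linear_to_display(pixels, exposure, rgb_curves, view_transform,
                          display_transform, gamma):
        pixels = pixels * (2.0 ** exposure)   # exposure, before the tone map
        pixels = rgb_curves(pixels)           # RGB curves, before display transform
        pixels = view_transform(pixels)       # view tone curve, before display transform
        pixels = display_transform(pixels)    # convert to the display color space
        return pixels ** (1.0 / gamma)        # post-display gamma correction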
All these settings by default apply only to the render result and do not
affect other images. If some particular image needs to be affected by these
transformations, the "View as Render" setting of the image data block should
be enabled. Movie clips are always affected by all display transformations.
This commit also introduces a configurable color space in which the sequencer
works. This setting can be found in the scene's Color Management panel and
should be used if things such as grading need to be done in a color space
different from sRGB (i.e. when a Film view on an sRGB display is used, using
the VD16 space as the sequencer's internal space would make grading work in
a space which is close to the space used for display).
Some technical notes:
- The image buffer's float buffer is now always in linear space, even if it
  was created from 16-bit byte images (see the sRGB -> linear sketch after
  this list).
- The space of the byte buffer is stored in the image buffer's
  rect_colorspace property.
- The profile of the image buffer was removed since it's no longer meaningful.
- OpenGL and GLSL are supposed to always work in sRGB space. It is possible
  to support other spaces, but that's quite a large project which isn't so
  important.
- The legacy Color Management disabled option is emulated by using the None
  display. It could have some regressions, but there's no clear way to avoid
  them.
- If OpenColorIO is disabled at build time, Blender should behave in the same
  way as the previous release with color management enabled.
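For reference, the kind of sRGB -> linear conversion implied by the notes above (the standard sRGB formula, shown as a sketch rather than the actual code path):

    import numpy as np

    def srgb_to_linear(c):
        # Piecewise sRGB -> linear conversion, applied per channel.
        c = np.asarray(c, dtype=np.float32)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)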
More details can be found at this page (more details will be added soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.64/Color_Management
--
Thanks to Xavier Thomas and Lukas Toene for the initial work on OpenColorIO
integration, and to Brecht van Lommel for some further development and
code/use-case review!
Regular rendering now works tiled, and supports save buffers to save memory
during render and to cache render results.
Brick texture node by Thomas.
http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles/Nodes/Textures#Brick_Texture
Image texture Blended Box Mapping.
http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles/Nodes/Textures#Image_Texture
http://mango.blender.org/production/blended_box/
Various bug fixes by Sergey and Campbell.
* Fix for reading freed memory in some node setups.
* Fix incorrect memory read when synchronizing mesh motion.
* Fix crash appearing when direct light usage is different on different layers.
* Fix for vector pass giving wrong results in some circumstances.
* Fix for wrong resolution used for rendering Render Layer node.
* Option to cancel rendering when doing initial synchronization.
* No more texture limit when using CPU render.
* Many fixes for new tiled rendering.