Replace the old color pipeline, which supported only linear/sRGB color spaces,
with an OpenColorIO-based pipeline.
This introduces two configurable color spaces:
- Input color space for images and movie clips. This space is used to convert
  images/movies from the color space the file is saved in to Blender's linear
  space (for float images; byte images are not converted internally, only the
  input space is stored for them and used later).
  This setting can be found in the image/clip data block settings.
- Display color space, which defines the space a particular display device works in.
  This setting can be found in the scene's Color Management panel.
When the render result is displayed on the screen, some additional transformations
can happen apart from converting the image to display space.
These transformations are:
- View, which defines a tone curve applied before the display transformation.
  Views are different ways to look at the image on the same display device;
  for example, they can be used to emulate a film look on an sRGB display.
- Exposure, which affects the image exposure before the tone map is applied.
- Gamma, a post-display gamma correction which can be used to match a particular
  display's gamma.
- RGB curves, user-defined curves applied before the display transformation,
  which can be used for various purposes.
By default, all these settings apply only to the render result and do not
affect other images. If a particular image needs to be affected by these
transformations, the "View as Render" setting of the image data block should
be enabled. Movie clips are always affected by all display transformations.
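A minimal sketch of where these settings live in the Python API (the display,
view and color space names below are only examples and depend on the
OpenColorIO configuration in use; the image path is hypothetical):

  import bpy

  scene = bpy.context.scene

  # Display device and the view/exposure/gamma applied to the render result.
  scene.display_settings.display_device = 'sRGB'
  scene.view_settings.view_transform = 'Film'
  scene.view_settings.exposure = 0.5
  scene.view_settings.gamma = 1.0

  # Input color space of an image data block and the "View as Render" toggle.
  img = bpy.data.images.load("//textures/plate.exr")
  img.colorspace_settings.name = 'Linear'
  img.use_view_as_render = True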
This commit also introduces a configurable color space in which the sequencer
works. This setting can be found in the scene's Color Management panel and
should be used when things such as grading need to be done in a color space
different from sRGB (e.g. when the Film view is used on an sRGB display, using
the VD16 space as the sequencer's internal space makes grading happen in a
space which is close to the space used for display).
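For reference, a hedged Python sketch of this setting (the 'VD16' name assumes
the default OpenColorIO configuration):

  import bpy

  bpy.context.scene.sequencer_colorspace_settings.name = 'VD16'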
Some technical notes:
- The image buffer's float buffer is now always in linear space, even if it was
  created from 16-bit byte images.
- The color space of the byte buffer is stored in the image buffer's
  rect_colorspace property.
- The image buffer's profile was removed since it's no longer meaningful.
- OpenGL and GLSL are assumed to always work in sRGB space. It is possible
  to support other spaces, but that is quite a large project and not so
  important at the moment.
- The legacy behavior of having the Color Management option disabled is
  emulated by using the None display. This could cause some regressions, but
  there's no clear way to avoid them.
- If OpenColorIO is disabled at build time, Blender should behave the same way
  as the previous release with color management enabled.
More details can be found on this page (more will be added soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.64/Color_Management
--
Thanks to Xavier Thomas and Lukas Toene for the initial work on OpenColorIO
integration, and to Brecht van Lommel for further development and code/
use-case review!
This implements basic color grading modifiers in the sequencer, supporting
color balance, RGB curves and hue correction.
The implementation is close to object modifiers; some details are here:
http://wiki.blender.org/index.php/User:Nazg-gul/SequencerModifiers
Modifiers support multi-threaded calculation, masks and instant
parameter changes.
Also added a cache of pre-processed image buffers for the current frame,
so changing strip properties does not require re-rendering the original
sequence (rendering the scene, loading the file from disk and so on).
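A minimal Python sketch of adding such modifiers, assuming the scene has a
sequence editor with an active strip (modifier names and values are examples):

  import bpy

  strip = bpy.context.scene.sequence_editor.active_strip

  # Color balance modifier with lift/gamma/gain tweaks.
  cb = strip.modifiers.new(name="Grade", type='COLOR_BALANCE')
  cb.color_balance.lift = (1.0, 0.95, 0.9)
  cb.color_balance.gamma = (1.0, 1.0, 1.05)
  cb.color_balance.gain = (1.05, 1.0, 1.0)

  # Curves and hue correction modifiers.
  strip.modifiers.new(name="Curves", type='CURVES')
  strip.modifiers.new(name="Hue", type='HUE_CORRECT')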
Note about long lines: I did not touch two pieces of code (because I don't see
any way to keep the code nicely formatted and compact with shorter lines):
* The node type definitions in rna_nodetree_types.h
* The vgroup name functions in rna_particle.c
effect for a render engine using new shading nodes. In short:
* No longer uses images assigned to faces in the UV layer; rather, the active
  image texture node is what is edited/painted/drawn.
* Textured draw type now shows the active image texture node, with solid
lighting.
* Material draw mode shows GLSL shader of a simplified material node tree,
using solid lighting.
* Textures for modifiers, brushes, etc. are now available from a dropdown in
  the texture tab in the properties editor. These do not use the new shading
  nodes yet.
http://wiki.blender.org/index.php/Dev:2.6/Source/Render/TextureWorkflow
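A hedged sketch of driving this from Python (the material name and image path
are assumptions, and the exact node type identifier may differ between Blender
versions):

  import bpy

  mat = bpy.data.materials["Material"]
  mat.use_nodes = True
  nodes = mat.node_tree.nodes

  # Create an Image Texture node and load an image into it.
  img = bpy.data.images.load("//paint_target.png")
  tex = nodes.new("ShaderNodeTexImage")
  tex.image = img

  # The active node is what texture paint, the image editor and textured
  # draw mode operate on.
  nodes.active = tex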
to be used by Cycles. For testing there's a panel in the node editor if you set
debug to 777; didn't enable it by default because I'm not sure it's very useful there.
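A hedged note: assuming this is gated on the application debug value, it can be
set from Python as

  import bpy

  bpy.app.debug_value = 777  # assumption: the hidden node editor panel checks this value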
===========================
Committing the camera tracking integration GSoC project into trunk.
This commit includes:
- Bundled version of the libmv library (with some changes against the official
  repo; will re-sync with the libmv repo a bit later)
- New ID datatype called MovieClip which is optimized for working with movie
  clips (both movie files and image sequences) and doing camera/motion
  tracking operations.
- New editor called Clip Editor which is currently used for motion/tracking
  stuff only, but which can easily be extended to work with masks too.
This editor supports:
* Loading movie files/image sequences
  * Building proxies of different sizes for the loaded movie clip; also supports
    building undistorted proxies to increase playback speed in
    undistorted mode.
  * Manual lens distortion calibration using grid and grease pencil
  * Supervised 2D tracking using two different algorithms: KLT and SAD.
* Basic algorithm for feature detection
  * Camera motion solving and scene orientation
- New constraints to "link" scene objects with motion solved from the clip
  (see the sketch after this list):
  * Follow Track (makes an object follow the 2D motion of the track with the
    given name, or parents the object to the track's reconstructed 3D position)
  * Camera Solver (makes the camera move in the same way as the reconstructed
    camera)
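A minimal Python sketch of setting up these constraints (the clip path and
track name are hypothetical):

  import bpy

  clip = bpy.data.movieclips.load("//footage/shot.mov")

  # Make an object follow a named track from the clip.
  obj = bpy.context.active_object
  follow = obj.constraints.new('FOLLOW_TRACK')
  follow.use_active_clip = False
  follow.clip = clip
  follow.track = "Track.001"

  # Make the scene camera move like the reconstructed camera.
  cam = bpy.context.scene.camera
  solver = cam.constraints.new('CAMERA_SOLVER')
  solver.use_active_clip = False
  solver.clip = clip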
This commit does NOT include changes from the tomato branch:
- New nodes (they'll be committed as a separate patch)
- Automatic image offset guessing for the image input node and image editor
  (need to do more tests and gather more feedback)
- Code cleanup in libmv-capi. It's not a critical cleanup, just increasing
  readability and understandability of the code. Better to make this change
  when Keir finishes his current patch.
More details about this project can be found on this page:
http://wiki.blender.org/index.php/User:Nazg-gul/GSoC-2011
Further development of small features will be done in trunk; bigger/experimental
features will first be implemented in the tomato branch.
update signal for this was missing. The problem is that the operator properties
RNA update callback doesn't know the associated keymap item; worked around it
with a UI template for now.
from Andy Braham (andybraham)
This adds support for empties to reference images and draw them in the 3D view.
Modifications from the original patch:
- use an empty 'image' draw type
- use the image aspect ratio for non-square pixels
- when the image is not found, still draw the frame.
Also use const char in many other parts of Blender's code.
Currently this gives warnings for setting the operator id, label and description,
since these are an exception and are allocated beforehand.
- made interface, windowmanager, readfile build without unused warnings.
- re-arranged CMake's source/blender build order so less frequently changed
  libs are built later, eg: IK, avi
- set_frame() --> frame_set() (see the sketch below)
- set_context_pointer() --> context_pointer_set()
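A quick sketch of the renamed Python calls (the frame number is just an example):

  import bpy

  # frame_set() replaces set_frame():
  bpy.context.scene.frame_set(42)

  # context_pointer_set() replaces set_context_pointer(); it is called on a
  # UILayout inside a panel's draw(), e.g.:
  #   layout.context_pointer_set("active_object", context.object)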
material adding works for curves and metaballs, new function to remove materials.
materials.link() didn't fit well with how this is used elsewhere:
- order matters
- a material can be linked more than once
- remove(material) isn't that useful since you need to manage indices
... use list-style functions instead: materials.append(mat) / materials.pop(index)
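A minimal sketch of the list-style API, assuming an active mesh/curve/metaball
object (the material name is just an example):

  import bpy

  ob = bpy.context.active_object
  mat = bpy.data.materials.new("Example")

  # Order matters and the same material can be appended more than once.
  ob.data.materials.append(mat)
  ob.data.materials.append(mat)

  # Remove by index instead of by material.
  ob.data.materials.pop(index=0)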
eg:
row.prop_search_self(scene, "active", "keying_sets", text="")
...becomes
row.prop_search(scene.keying_sets, "active", scene, "keying_sets", text="")
This is more flexible since it works for other UI functions too.
layout.prop_search_self(), the same as layout.prop_search() except it uses an attribute of the collection.
A number of collections have an 'active' member which couldn't be used with prop_search(), which meant we had a mix of active properties living in collections and added directly as properties.