Velocity is measured in pixels per frame. It is basically the difference
between the track's coordinate at the current frame and at the previous one
(no future prediction happens).
It's not really the most intuitive place for such a thing, but historically
the node was called this way.
Track velocity can be used to fake effects like motion blur by piping it
into the Vector Blur node.
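For reference, a minimal Python sketch of the same per-frame difference,
computed from a track's markers (the clip and track names are placeholders,
not part of the actual change):

import bpy

# Sketch of the velocity definition above: difference of the marker position
# at the current and previous frame, scaled from normalized clip coordinates
# to pixels. No future prediction. Clip/track names are examples.
clip = bpy.data.movieclips["shot.mov"]
track = clip.tracking.tracks["Track"]
width, height = clip.size

def velocity_px(frame):
    cur = track.markers.find_frame(frame)
    prev = track.markers.find_frame(frame - 1)
    if cur is None or prev is None:
        return (0.0, 0.0)
    return ((cur.co[0] - prev.co[0]) * width,
            (cur.co[1] - prev.co[1]) * height)

print(velocity_px(bpy.context.scene.frame_current))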
Reviewers: campbellbarton
Reviewed By: campbellbarton
Subscribers: hype, sebastian_k
Differential Revision: https://developer.blender.org/D1591
Nodes have a feature for moving existing links to unoccupied sockets when connecting
to an already used input. This is based on the standard legacy socket types (value/float,
vector, color/rgba) and works reasonably well for shader, compositor and texture nodes.
For new pynode systems, however, the hardcoded nature of that feature has major drawbacks:
* It does not take different type systems into account, leading to meaningless connections
when sockets are swapped and making the feature useless or outright debilitating.
* Advanced socket behaviors, e.g. extensible input lists that move existing
connections down to make room for a new link, would only be possible with a
registerable callback.
Now any handling of new links is done via the 'insert_links' callback, which can also be
registered through the RNA API. For the legacy shader/compo/tex nodes the behavior is the
same, using a C callback.
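For custom pynodes, a minimal sketch of what handling a new link could look
like on the Python side (assuming the callback is exposed to registered node
classes as an insert_link method; the class and socket names are illustrative):

import bpy

# Sketch: a custom node that, instead of swapping links based on hardcoded
# socket types, grows its input list when a link targets an occupied socket.
# Assumes the RNA callback is exposed to Python nodes as 'insert_link'.
class ExtensibleInputNode(bpy.types.Node):
    bl_idname = "CustomExtensibleInputNode"
    bl_label = "Extensible Input (sketch)"

    def init(self, context):
        self.inputs.new('NodeSocketFloat', "Value")
        self.outputs.new('NodeSocketFloat', "Result")

    def insert_link(self, link):
        # Called when a new link to/from this node is created.
        if link.to_node == self and link.to_socket.is_linked:
            self.inputs.new('NodeSocketFloat', "Value")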
Note on the 'use_swap' flag: it has been removed because it was meaningless.
It was disabled only for the insert-node-on-link feature, which works only for
completely unconnected nodes anyway, so there would be nothing to swap in the
first place.
The issue was caused by non-continuous tangent space calculated for triangles.
This commit adds a Tangent input to the Hair BSDF node, which can be used to
hook up a tangent calculated from UV as an input to the node in order to make
sure the tangent space is continuous.
This is done as an input instead of using the default tangent layer from UV
for several reasons:
- This way it's really easy to preserve compatibility with existing setups.
- The default UV map does not necessarily give a continuous space; one might
want to use other tangent space sources or distort the space for some artistic
decision.
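A minimal Python sketch of hooking a UV-based tangent into the new input
(assumes a node-based material; the material, UV map and socket names are
placeholders):

import bpy

# Sketch: wire a UV-map based Tangent node into the Hair BSDF's Tangent input.
mat = bpy.data.materials["Hair"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

hair = nodes.new('ShaderNodeBsdfHair')
tangent = nodes.new('ShaderNodeTangent')
tangent.direction_type = 'UV_MAP'
tangent.uv_map = "UVMap"

links.new(tangent.outputs['Tangent'], hair.inputs['Tangent'])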
Reviewers: juicyfruit, dingto
Reviewed By: dingto
Differential Revision: https://developer.blender.org/D1428
- Add blentranslation `BLT_*` module.
- Move & split `BLF_translation.h` into (`BLT_translation.h`, `BLT_lang.h`).
- Move `BLF_*_unifont` functions from `blf_translation.c` to new source file `blf_font_i18n.c`.
The issue was caused by the following construction:
defs = env['SOMETHING']
defs.append('SOMETHING_MORE')
Since the first assignment was actually referencing the environment option, the
option got totally polluted, having weird and wonderful side effects on all
other areas of Blender.
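In other words, the assignment aliased the list stored in the build environment
instead of copying it. A small sketch of the problem and the usual fix (plain
Python standing in for the SCons environment, not the actual build-script change):

# The aliasing problem described above, and the usual fix: copy the option.
env = {'SOMETHING': ['A', 'B']}    # stands in for the SCons environment

defs = env['SOMETHING']            # aliases the list stored in the environment
defs.append('SOMETHING_MORE')      # pollutes env['SOMETHING'] as a side effect
assert 'SOMETHING_MORE' in env['SOMETHING']

env = {'SOMETHING': ['A', 'B']}
defs = list(env['SOMETHING'])      # copy the option instead of referencing it
defs.append('SOMETHING_MORE')      # the environment is left untouched
assert 'SOMETHING_MORE' not in env['SOMETHING']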
This commit fixes issues with the wrong socket type being added to the Cycles
debug pass compositor operation, which led to crashes with non-value pass types.
It also reverts the socket renaming because, while it behaved OK at runtime,
file reload might have lost the links, which is annoying.
Currently this only works correctly with a single float output; RGBA and vector
passes are not supported yet, so anyone who needs those will have to wait a bit
still. It is coming, don't worry.
This commit implements a point density texture for Cycles shading nodes.
It's done by creating a voxel texture at shader compilation time. This is not
totally memory efficient, but it avoids adding sampling code to the kernel
(which keeps render time as low as possible). In the future this will be
compensated by using OpenVDB for more efficient storage of sparse volume data.
Sampling of the voxel texture happens on the Blender side, using the same code
as Blender Internal's renderer.
The texture is controlled only by object, particle system and radius. A linear
falloff is used and there's no turbulence. This is because falloff is expected
to be done with the Curve Mapping node, and turbulence will be done as a
distortion of the input coordinate. It's already possible to fake it using
noise textures, and in the future we can add a proper turbulence distortion
node, which could then also be used for 2D texture mapping.
Particle color support is done by Lukas, thanks!
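A minimal Python sketch of setting up the node for a particle system, with the
falloff shaped by a curves node as described above (assumes a node-based
material; the object, material and particle system names are placeholders):

import bpy

# Sketch: add a Point Density texture node driven by a particle system and
# shape its falloff with a curves node instead of a built-in falloff.
mat = bpy.data.materials["Smoke"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

density = nodes.new('ShaderNodeTexPointDensity')
density.point_source = 'PARTICLE_SYSTEM'
density.object = bpy.data.objects["Emitter"]
density.particle_system = density.object.particle_systems[0]
density.radius = 0.2

curves = nodes.new('ShaderNodeRGBCurve')
links.new(density.outputs['Density'], curves.inputs['Color'])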
With this patch "Particle Info" node from Cycles works in GLSL and BI
Alexander (Blend4Web Team)
Reviewers: psy-fi
Note: moved particle info to object render instance instead of
shadeinput during review - Antony.
Differential Revision: https://developer.blender.org/D1313
Revert "Nodes: Remove hardcoded BLENDER_MAX_THREADS number of threads"
This reverts commit fdc653e8ce.
The threads override is not affected by the scene, and hence the thread limit
was not giving correct results. Need to reconsider some things here.
Use the actual available number of threads now, which will make it easier to
increase the maximum number of threads without sloppy memory usage and without
redundant checks on thread data that was never used.
Official Documentation:
http://www.blender.org/manual/render/workflows/multiview.html
Implemented Features
====================
Builtin Stereo Camera
* Convergence Mode
* Interocular Distance
* Convergence Distance
* Pivot Mode
Viewport
* Cameras
* Plane
* Volume
Compositor
* View Switch Node
* Image Node Multi-View OpenEXR support
Sequencer
* Image/Movie Strips 'Use Multiview'
UV/Image Editor
* Option to see Multi-View images in Stereo-3D or its individual images
* Save/Open Multi-View (OpenEXR, Stereo3D, individual views) images
I/O
* Save/Open Multi-View (OpenEXR, Stereo3D, individual views) images
Scene Render Views
* Ability to have an arbitrary number of views in the scene
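As a small illustration of the last point, render views can also be set up from
Python (a minimal sketch; the property names assume the RNA exposes
use_multiview, views_format and render.views, and the view name is just an
example):

import bpy

# Sketch: enable multi-view and add an extra, non-stereo render view.
scene = bpy.context.scene
scene.render.use_multiview = True
scene.render.views_format = 'MULTIVIEW'   # arbitrary number of views, not just stereo

view = scene.render.views.new("side")
view.camera_suffix = "_side"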
Missing Bits
============
First rule of Multi-View bug reporting: if something is not working as it should *when Views is off*, this is a severe bug; do mention this in the report.
Second rule: if something works *when Views is off* but doesn't (or crashes) *when Views is on*, this is an important bug. Do mention this in the report.
Everything else is likely small todos, and may wait until we are sure none of the above is happening.
Apart from that there are those known issues:
* Compositor Image Node poorly working for Multi-View OpenEXR
(this was working perfectly before the 'Use Multi-View' functionality)
* Selecting camera from Multi-View when looking from camera is problematic
* Animation Playback (ctrl+F11) doesn't support stereo formats
* Wrong filepath when trying to play back animated scene
* Viewport Rendering doesn't support Multi-View
* Overscan Rendering
* Fullscreen display modes need to warn the user
* Object copy should be aware of views suffix
Acknowledgments
===============
* Francesco Siddi for the help with the original feature specs and design
* Brecht Van Lommel for the original review of the code and design early on
* Blender Foundation for the Development Fund to support the project wrap up
Final patch reviewers:
* Antony Riakiotakis (psy-fi)
* Campbell Barton (ideasman42)
* Julian Eisel (Severin)
* Sergey Sharybin (nazgul)
* Thomas Dinges (dingto)
Code contributors of the original branch in github:
* Alexey Akishin
* Gabriel Caraballo
This way re-mapping scene nodes to EXR files becomes much easier; no extra
trickery with separate RGBA setups is needed.
Plus it makes it more consistent with regular EXR files.
This uses the RGBA pass to get alpha from.
This attribute indicates how "pointy" the geometry surface is, which makes it
possible to do effects like dirt maps and wear-off effects on render geometry.
The attribute is calculated for the final mesh, which means no baking (which
would imply a UV unwrap) is needed. Apart from this, the behavior is quite
close to how vertex dirty colors work.
The new attribute is available as an output socket of the Geometry node.
There's no penalty on render time, only some delay in scene preparation (the
delay is linear in the mesh complexity).
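A minimal sketch of using the new socket for a dirt-map style mask (assumes a
node-based material; the material name is a placeholder and the socket name
assumes the output is labeled "Pointiness"):

import bpy

# Sketch: feed the Geometry node's Pointiness output through a ColorRamp to
# build a dirt/wear mask.
mat = bpy.data.materials["Rock"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

geom = nodes.new('ShaderNodeNewGeometry')
ramp = nodes.new('ShaderNodeValToRGB')     # remaps raw pointiness to a usable mask

links.new(geom.outputs['Pointiness'], ramp.inputs['Fac'])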
Reviewers: brecht, juicyfruit
Subscribers: eyecandy, venomgfx
Differential Revision: https://developer.blender.org/D1086
Quite a straightforward change, and in theory we can even try supporting motion
blur for the Corner Pin node (which is tricky because the coordinates actually
come from sockets, but with some black magic it should be doable).
There were differences between how Cycles and BI treat the Normal shader:
- Different normal direction assumption
- Different policy about vector normalization
The previous idea of trying to use a single function and flip the output if
needed became more tricky, so I've just added a new GLSL function which
corresponds to how Cycles deals with the Normal shader.
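For reference, a minimal sketch of the part of the Normal node where those two
conventions matter (plain Python rather than the actual GLSL; the normalization
and sign handling shown here are assumptions, not the exact Cycles or BI code):

# Sketch: the Normal node outputs a fixed direction and the dot product of
# that direction with the incoming normal. Whether the vectors are normalized
# and how the result is signed is exactly where the two engines differed.
def normal_node(direction, incoming_normal, normalize_inputs=True, flip_sign=False):
    def norm(v):
        length = sum(c * c for c in v) ** 0.5
        return tuple(c / length for c in v) if length else tuple(v)

    d = norm(direction) if normalize_inputs else tuple(direction)
    n = norm(incoming_normal) if normalize_inputs else tuple(incoming_normal)
    dot = sum(a * b for a, b in zip(d, n))
    return d, -dot if flip_sign else dot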