This prevents access to non-existent typeinfo during type initialization,
when node types have been removed and such nodes are deleted from older files.
All blenkernel functions now only set the node->update flag instead of directly
calling the update function. All operators etc. that call blenkernel functions
to modify nodes should make an ntreeUpdate call afterward (they already did that
anyway).
Editor/RNA/renderer/etc. high-level functions can still do immediate updates by
using nodeUpdate and nodeUpdateID (replacing NodeTagChanged/NodeTagIDChanged
respectively). These old functions were previously used only for setting
compositor node needexec flags and clearing cached data, but have become generic
update functions that require type-specific functionality (i.e. a valid typeinfo
struct).
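As a rough illustration of the tag-then-update pattern described above (a sketch
only; the NODE_UPDATE flag and the ntreeUpdateTree() entry point are assumed
from BKE_node.h, not taken from this log):

    /* Hedged sketch: blenkernel-level code only tags nodes, the caller
     * flushes the tags afterward. Verify names against BKE_node.h. */
    #include "DNA_node_types.h"
    #include "BKE_node.h"

    static void operator_modify_nodes(bNodeTree *ntree)
    {
        bNode *node;

        for (node = ntree->nodes.first; node; node = node->next) {
            /* blenkernel level: tag only, never call the update function directly */
            node->update |= NODE_UPDATE;
        }

        /* operator level: flush all tagged updates afterward */
        ntreeUpdateTree(ntree);
    }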
Files created in Blender before this revision should render exactly the same
way they did before.
Patch by Morten Mikkelsen, finished by Ton and me.
Threading no longer sleeps 50ms for each object; a work queue is used now.
Also, it was showing SSS preprocessing while actually doing Volume precaching,
fixed as well.
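A minimal sketch of the work-queue approach, assuming the ThreadQueue API from
BLI_threads.h (not the actual preprocessing code):

    /* Hedged sketch: worker threads pop objects from a queue instead of the
     * main thread sleeping 50ms per object. */
    #include "BLI_threads.h"

    static void *precache_worker(void *queue_v)
    {
        ThreadQueue *queue = queue_v;
        void *job;

        /* pop blocks until work arrives; it returns NULL after the queue has
         * been marked finished with BLI_thread_queue_nowait() */
        while ((job = BLI_thread_queue_pop(queue))) {
            /* ... preprocess one object ... */
            (void)job;
        }
        return NULL;
    }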
- move some larger vars into a nested scope.
- replace memset with zero initializer.
- replace VECCOPY macros with copy_v3v3 (see the sketch after this list).
- change function args to give the float array length.
- converting nurbs to mesh was casting the material to unsigned char.
- subsurf was casting to char, then int -> short in a loop.
- have material functions take & return shorts.
- define __BIG_ENDIAN__ or __LITTLE_ENDIAN__ with cmake & scons.
- ENDIAN_ORDER is now a define rather than a global short.
- replace checks like this with a single ifdef (see below): #if defined(__sgi) || defined (__sparc) || defined (__sparc__) || defined (__PPC__) || defined (__ppc__) || defined (__hppa__) || defined (__BIG_ENDIAN__)
- remove BKE_endian.h which isn't used
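Hedged examples of the copy and endian items above (call sites are illustrative
only, not taken from the actual changes):

    /* before: VECCOPY(dst, src);  an untyped macro copy */
    /* after: explicit float[3] copy from the BLI math library */
    #include "BLI_math.h"

    static void copy_example(float dst[3], const float src[3])
    {
        copy_v3v3(dst, src);
    }

    /* endian checks collapse to a single define set by the build system */
    #ifdef __BIG_ENDIAN__
    /* big-endian only code */
    #endif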
Volume pre-caching altered shared data simultaneously in multiple threads, causing invalid scattering results when the "Asymmetry" value was used. The view vector is now passed as a function argument.
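An illustrative sketch of the fix pattern, with hypothetical names rather than
the actual volumetric render code: the view vector becomes a per-call parameter
instead of a field on shared pre-cache state, so threads no longer overwrite
each other's value.

    /* Hypothetical names; only the thread-safety pattern matches the fix.
     * Before: shared->viewvec was written by every thread (a data race).
     * After: each thread passes its own view vector down the call chain. */
    #include <math.h>

    static float scatter_phase(const float view[3], const float light[3], float asymmetry)
    {
        /* Henyey-Greenstein-style phase term (normalization omitted, unit
         * vectors assumed), shown only so the caller-provided view vector
         * has a use. */
        const float g = asymmetry;
        const float cos_t = view[0] * light[0] + view[1] * light[1] + view[2] * light[2];

        return (1.0f - g * g) / powf(1.0f + g * g - 2.0f * g * cos_t, 1.5f);
    }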