blender-archive/source/blender/render/intern/source/pointdensity.c


Commit notes from this file's history:

2008-09-28: Volumetrics

Removed all the old particle rendering code and options I had in there before, in order to make way for a new procedural texture: 'Point Density'. Point Density is a 3D texture that finds the density of a group of 'points' in space and returns that in the texture as an intensity value. Right now it's at an early stage and only enabled for particles, but it would be cool to extend it later for things like object vertices, or point cache files from disk - i.e. to import point cloud data into Blender for rendering volumetrically. Currently there are just options for an Object and its particle system number; this is the particle system that will get cached before rendering, and then used for the texture's density estimation. It works consistently with any other procedural texture, so previously where I've mapped a clouds texture to volume density to make some of those test renders, now I just map a point density texture to volume density. Here's a version of the same particle smoke test file from before, updated to use the point density texture instead: http://mke3.net/blender/devel/rendering/volumetrics/smoke_test02.blend

There are a few cool things about implementing this as a texture:

- The one texture (and cache) can be instanced across many different materials: http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_instanced.png This means you can calculate and bake one particle system, but render it multiple times across the scene, with different material settings, at no extra memory cost. Right now the particles are cached in world space, so you have to map it globally, and if you want it offset, you have to do it in the material (as in the file above). I plan to add an option to bake in local space, so you can just map the texture to Local and it just works.
- It also works for solid surfaces; it just gets the density at that particular point on the surface, e.g.: http://mke3.net/blender/devel/rendering/volumetrics/pointdensity_solid.mov
- You can map it to whatever you want, not only density but the various emissions and colours as well. I'd like to investigate using the other outputs in the texture too (like the RGB or normal outputs), perhaps with options to colour by particle age, generating normals for making particle 'dents' in a surface, whatever!

2008-09-29: Point Density texture

The Point Density texture now has some additional options for how the point locations are cached. Previously it was all relative to world space, but there are now some other options that make things a lot more convenient for mapping the texture to Local (or Orco). Thanks to theeth for helping with the space conversions! The new object-space options make this sort of thing possible - a particle system, instanced on a transformed renderable object: http://mke3.net/blender/devel/rendering/volumetrics/pd_objectspace.mov It's also a lot easier to use multiple instances: just duplicate the renderable objects and move them around.

The new particle cache options are:

* Emit Object Space: caches the particles relative to the emitter object's coordinate space (i.e. relative to the emitter's object center). This makes it possible to map the texture to Local or Orco easily, so you can move, rotate or scale the rendering object that has the Point Density texture. It's relative to the emitter's location, rotation and scale, so if the object you're rendering the texture on is aligned differently to the emitter, the results will be rotated, etc.
* Emit Object Location: offsets the particles to the emitter object's location in 3D space. It's similar to Emit Object Space, except that the emitter object's rotation and scale are ignored. This is probably the easiest to use, since you don't need to worry about the rotation and scale of the emitter object (just the rendered object), so it's the default.
* Global Space: the same as before - the particles are cached in global space, so to use this effectively you'll need to map the texture to Global, and have the rendered object in the right global location.
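As a rough illustration of that density estimate (this sketch is not part of the original file; the helper name density_at and the brute-force loop are made up for clarity, while the 3t^2 - 2t^3 falloff matches what pointdensitytex() below applies per neighbour):

    #include <math.h>

    /* Illustrative only: a brute-force version of the lookup that
     * pointdensitytex() performs via a kd-tree. 't' runs from 1 at the point
     * itself down to 0 at the edge of the search radius, and 3t^2 - 2t^3 gives
     * a smooth falloff. The real code also normalises the accumulated sum by
     * the neighbour count and by the radius. */
    static float density_at(const float co[3], const float (*points)[3], int totpoint, float radius)
    {
        float density = 0.0f;
        int i;

        for (i = 0; i < totpoint; i++) {
            float dx = points[i][0] - co[0];
            float dy = points[i][1] - co[1];
            float dz = points[i][2] - co[2];
            float dist = sqrtf(dx*dx + dy*dy + dz*dz);

            if (dist < radius) {
                float t = 1.0f - dist / radius;
                density += 3.0f*t*t - 2.0f*t*t*t;
            }
        }
        return density;
    }
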
/*
* ***** BEGIN GPL LICENSE BLOCK *****
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software Foundation,
* Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
*
* The Original Code is Copyright (C) 2001-2002 by NaN Holding BV.
* All rights reserved.
*
* Contributors: Matt Ebb
*
* ***** END GPL LICENSE BLOCK *****
*/
#include <stdlib.h>
#include <stdio.h>
#include "BLI_arithb.h"
#include "BLI_kdtree.h"
#include "BKE_DerivedMesh.h"
#include "BKE_global.h"
#include "BKE_main.h"
#include "BKE_object.h"
#include "BKE_particle.h"
#include "DNA_texture_types.h"
#include "DNA_particle_types.h"
#include "render_types.h"
#include "renderdatabase.h"
#include "texture.h"
static void pointdensity_cache_psys(Render *re, PointDensity *pd, Object *ob, ParticleSystem *psys)
{
    DerivedMesh *dm;
    ParticleKey state;
    float cfra = bsystem_time(ob, (float)G.scene->r.cfra, 0.0);
    int i, childexists;
    float partco[3];
    float obview[4][4];

    /* init crap */
    if (!psys || !ob || !pd) return;

    Mat4MulMat4(obview, re->viewinv, ob->obmat);

    /* Just to create a valid rendering context */
    psys_render_set(ob, psys, re->viewmat, re->winmat, re->winx, re->winy, 0);

    dm = mesh_create_derived_render(ob, CD_MASK_BAREMESH|CD_MASK_MTFACE|CD_MASK_MCOL);
    dm->release(dm);

    if (!psys_check_enabled(ob, psys)) {
        psys_render_restore(ob, psys);
        return;
    }

    /* in case ob->imat isn't up-to-date */
    Mat4Invert(ob->imat, ob->obmat);

    pd->point_tree = BLI_kdtree_new(psys->totpart + psys->totchild);

    if (psys->totchild > 0 && !(psys->part->draw & PART_DRAW_PARENT))
        childexists = 1;

    for (i = 0; i < psys->totpart + psys->totchild; i++) {
        state.time = cfra;
        if (psys_get_particle_state(ob, psys, i, &state, 0)) {
            VECCOPY(partco, state.co);

            if (pd->psys_cache_space == TEX_PD_OBJECTSPACE)
                Mat4MulVecfl(ob->imat, partco);
            else if (pd->psys_cache_space == TEX_PD_OBJECTLOC) {
                float obloc[3];
                VECCOPY(obloc, ob->loc);
                VecSubf(partco, partco, obloc);
            } else {
                /* TEX_PD_WORLDSPACE */
            }

            BLI_kdtree_insert(pd->point_tree, i, partco, NULL);
        }
    }

    BLI_kdtree_balance(pd->point_tree);

    psys_render_restore(ob, psys);
}
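
/* Build the point cache from a renderable object's vertices. Vertex
 * coordinates in the render database are in view space, so they are first
 * transformed back to world space with re->viewinv, and additionally into the
 * object's local space when TEX_PD_OBJECTSPACE is selected. */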
static void pointdensity_cache_object(Render *re, PointDensity *pd, ObjectRen *obr)
{
    int i;

    if (!obr || !pd) return;
    if (!obr->vertnodes) return;

    /* in case ob->imat isn't up-to-date */
    Mat4Invert(obr->ob->imat, obr->ob->obmat);

    pd->point_tree = BLI_kdtree_new(obr->totvert);

    for (i = 0; i < obr->totvert; i++) {
        float ver_co[3];
        VertRen *ver = RE_findOrAddVert(obr, i);

        VECCOPY(ver_co, ver->co);

        if (pd->ob_cache_space == TEX_PD_OBJECTSPACE) {
            Mat4MulVecfl(re->viewinv, ver_co);
            Mat4MulVecfl(obr->ob->imat, ver_co);
        } else {
            /* TEX_PD_WORLDSPACE */
            Mat4MulVecfl(re->viewinv, ver_co);
        }

        BLI_kdtree_insert(pd->point_tree, i, ver_co, NULL);
    }

    BLI_kdtree_balance(pd->point_tree);
}
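
/* (Re)build the cache for one Point Density texture: free any existing
 * kd-tree, then fill a new one from either a particle system (TEX_PD_PSYS) or
 * an object's render vertices (TEX_PD_OBJECT), depending on pd->source. */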
static void cache_pointdensity(Render *re, Tex *tex)
{
    PointDensity *pd = tex->pd;

    if (pd->point_tree) {
        BLI_kdtree_free(pd->point_tree);
        pd->point_tree = NULL;
    }

    if (pd->source == TEX_PD_PSYS) {
        ParticleSystem *psys;
        Object *ob = pd->object;
        int i;

        if (!ob) return;

        /* walk to the psysindex-th particle system on the object */
        for (psys = ob->particlesystem.first, i = 0; psys && i < pd->psysindex - 1; i++)
            psys = psys->next;
        if (!psys) return;

        pointdensity_cache_psys(re, pd, ob, psys);
    }
    else if (pd->source == TEX_PD_OBJECT) {
        Object *ob = pd->object;
        ObjectRen *obr;
        int found = 0;

        /* find the obren that corresponds to the object */
        for (obr = re->objecttable.first; obr; obr = obr->next) {
            if (obr->ob == ob) {
                found = 1;
                break;
            }
        }
        if (!found) return;

        pointdensity_cache_object(re, pd, obr);
    }
}
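
/* Free the kd-tree cached for one Point Density texture. */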
static void free_pointdensity(Render *re, Tex *tex)
{
    PointDensity *pd = tex->pd;

    if (pd->point_tree) {
        BLI_kdtree_free(pd->point_tree);
        pd->point_tree = NULL;
    }
}
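
/* Called before rendering (except for preview renders): build the point cache
 * for every Point Density texture that is in use. */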
void make_pointdensities(Render *re)
{
    Tex *tex;

    if (re->scene->r.scemode & R_PREVIEWBUTS)
        return;

    re->i.infostr = "Caching Point Densities";
    re->stats_draw(&re->i);

    for (tex = G.main->tex.first; tex; tex = tex->id.next) {
        if (tex->id.us && tex->type == TEX_POINTDENSITY) {
            cache_pointdensity(re, tex);
        }
    }
}
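
/* Counterpart of make_pointdensities(): free the caches again after rendering. */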
void free_pointdensities(Render *re)
{
    Tex *tex;

    if (re->scene->r.scemode & R_PREVIEWBUTS)
        return;

    for (tex = G.main->tex.first; tex; tex = tex->id.next) {
        if (tex->id.us && tex->type == TEX_POINTDENSITY) {
            free_pointdensity(re, tex);
        }
    }
}
#define MAX_POINTS_NEAREST 25
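
/* Texture evaluation: look up the pd->nearest closest cached points around the
 * shading coordinate and accumulate a smoothstep falloff of their distances,
 * normalised by the neighbour count and the search radius, as the texture
 * intensity. Only intensity is returned for now; the RGB/normal outputs are
 * still commented out below. */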
int pointdensitytex(Tex *tex, float *texvec, TexResult *texres)
{
    int rv = TEX_INT;
    PointDensity *pd = tex->pd;
    KDTreeNearest nearest[MAX_POINTS_NEAREST];
    float density = 0.0f;
    int n, neighbours = 0;

    if ((!pd) || (!pd->point_tree)) {
        texres->tin = 0.0f;
        return 0;
    }

    neighbours = BLI_kdtree_find_n_nearest(pd->point_tree, pd->nearest, texvec, NULL, nearest);

    for (n = 1; n < neighbours; n++) {
        if (nearest[n].dist < pd->radius) {
            float dist = 1.0f - (nearest[n].dist / pd->radius);
            density += 3.0f*dist*dist - 2.0f*dist*dist*dist;
        }
    }

    /* guard against an empty lookup before normalising */
    if (neighbours > 0) {
        density /= neighbours;
        density *= 1.0f / pd->radius;
    }

    texres->tin = density;

    /*
    texres->tr = 1.0f;
    texres->tg = 1.0f;
    texres->tb = 0.0f;
    BRICONTRGB;

    texres->ta = 1.0;

    if (texres->nor!=NULL) {
        texres->nor[0] = texres->nor[1] = texres->nor[2] = 0.0f;
    }
    */

    BRICONT;

    return rv;
}