Hair object - Node types design #78515

Open
opened 2020-07-02 01:08:02 +02:00 by Daniel Bystedt · 47 comments

IMPORTANT: Please note that this document is subject to change and will most likely not represent the final product of the new hair object in Blender

Hair object - Node types design


Hair object - project description
https://developer.blender.org/T68981
Hair object design
https://developer.blender.org/T78606

Commissioner: @DanielBystedt d.bystedt@gmail.com, https://www.artstation.com/dbystedt
Project leader: @sebbas
Project members: @ideasman42, @lichtwerk

[Thread on devtalk](https://devtalk.blender.org/t/future-particle-hair-nodes-discussion-and-everything-wrong-with-hair-particles/9576/5) about scattering objects (eg Instances) by [BD3D](https://devtalk.blender.org/u/BD3D/summary)


NOTES

  • Please note that all mockups are rather simplified at the moment. For example, the sphere in most images will never be part of the actual hair curve geometry, attributes are visualized with black/white colors, etc.
  • Socket colors use @JacquesLucke's simulation editor in the daily builds of Blender as a reference.

NODE GRAPH - EXAMPLES

Example 1
Open image in new tab to view details
{F8679279, height = 500}


NODES - DATA FLOW

  - Node: Input
  - Type: Data flow

Input type (Upper drop down menu)

  • Guide curves

    • Curves that exist in the current hair object
  • Alembic

    • Reads (potentially) animated curves from Alembic. The list is populated from .abc files in the blend file
  • Object

    • An object that exists in the blend file. Used for geometry masking or for instancing an object per hair
  • Collection content

    • Objects that exist in a collection in the blend file. Used for scattering random objects on a surface; a random object from the collection is used per hair “instance”
  • Collection

    • A collection that exists in the blend file. Used for scattering objects on a surface; the result of the entire collection is used per hair "instance"

{F8673998, height = 400}

Example of input node where Input type = Alembic
{F8674005, height = 400}


  - Node: Delete
  - Type: Data flow

Deletes vertices if the attribute value in the Fac socket is higher than 0.5. "Floating" hairs, where the root is missing, should be avoided.

Note on Fac
The image is a visualisation of the input mask. In a node tree this is of course represented by nodes (for example an image texture).

{F8674022, height = 400}
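
Below is a minimal Python sketch of the intended behaviour, assuming a hair curve is simply an ordered list of vertex positions (root first) and Fac is a matching per-vertex float mask. This is illustrative pseudocode only, not actual Blender API.

```python
# Hypothetical illustration of the Delete node: drop vertices whose mask value
# exceeds the threshold, and drop the whole curve if its root vertex goes,
# so no "floating" hair remains.

def delete_node(curves, fac, threshold=0.5):
    """curves: list of curves, each a list of (x, y, z) vertices (root first).
    fac: matching list of per-vertex float masks in [0, 1]."""
    result = []
    for verts, mask in zip(curves, fac):
        if mask[0] > threshold:      # root deleted -> whole curve is removed
            continue
        kept = [v for v, m in zip(verts, mask) if m <= threshold]
        if len(kept) >= 2:           # a curve needs at least one segment
            result.append(kept)
    return result

# Example: the second curve loses its tip, the third curve disappears entirely.
curves = [[(0, 0, 0), (0, 0, 1)],
          [(1, 0, 0), (1, 0, 1), (1, 0, 2)],
          [(2, 0, 0), (2, 0, 1)]]
fac = [[0.0, 0.0], [0.0, 0.2, 0.9], [0.8, 0.1]]
print(delete_node(curves, fac))
```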


  - Node: Split
  - Type: Data flow

Splits hair curves into two separate "groups". A hair curve can only belong to one group or the other (i.e. avoid hair curves that are split partway between root and tip).

Note on Fac
The image is a visualisation of the input mask. In a node tree this is of course represented by nodes (for example an image texture).

{F8658118, height = 400}
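
A quick sketch of the grouping rule, using the same simplified curve representation as above: each curve goes to one group or the other as a whole, based on the mask value at its root, so no curve is ever split partway.

```python
# Illustrative only: partition whole hair curves into two groups by root mask.

def split_node(curves, fac, threshold=0.5):
    """curves: list of curves (lists of vertex positions, root first).
    fac: per-vertex float mask per curve; only the root value is evaluated."""
    group_a, group_b = [], []
    for verts, mask in zip(curves, fac):
        (group_a if mask[0] <= threshold else group_b).append(verts)
    return group_a, group_b

a, b = split_node([[(0, 0, 0), (0, 0, 1)], [(1, 0, 0), (1, 0, 1)]],
                  [[0.0, 0.0], [1.0, 1.0]])
print(len(a), len(b))  # 1 1
```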


  - Node: Merge
  - Type: Data flow

Merges two hair curve inputs into one output. It would also be nice to allow merging for instanced geometry.

{F8658184, height = 400}


  - Node: Switch
  - Type: Data flow

Hair 1 or 2 is selected as the hair output via a drop-down menu. This could just as well be a slider, radio buttons, etc.

{F8658187, height = 400}


  - Node: Hair curves output
  - Type: Data flow

Outputs hair curves to the viewport and render. Multiple outputs can exist in the node graph, but only one can be active (just like in the shader editor).

{F8674220, height = 400}

  - Node: Object output
  - Type: Data flow

Output node for instances (rocks, trees etc) and deformed instances (game hair, feathers).

IMPORTANT: I would love developer input on whether this is the best way to handle instances and deformed instances. "Deformed instances" might not be the best term, but I hope you understand what I mean.

{F8674235, height = 400}


NODES - CREATE


  - Node: Scatter hair
  - Type: Create

Influence radius (float)
Controls how wide a radius the input hair uses to control the direction/interpolation of the output hair. Hair is also scattered within the area of the input hair roots. Using this could make the children = Simple type of hair redundant.

Vertices (integer)
Vertex count per hair curve

Relax iteration (integer)
Relaxes the distribution of root vertices. In the current implementation of hair, the distribution is pretty uneven. Possibly skip this input if it complicates coding (e.g. if sampling from a texture is hard to implement).

Density (float)
Sets the density of the hair curves

Length (float)
Length should also be represented by an individual node. For simpler node networks it could be nice to have length included in the scatter hair node, mostly because it's convenient.

Width (float)
Sets the width of the hair curves

{F8674432, height = 400}
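
As a rough illustration of the Density input only, here is a Python sketch that scatters root points on a single triangle of a hypothetical growth mesh; guide interpolation, relaxation, length and width are deliberately left out.

```python
import math, random

# Illustrative only: scatter hair root points on one triangle with an
# approximate density (points per unit area).

def triangle_area(a, b, c):
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    cross = [ab[1] * ac[2] - ab[2] * ac[1],
             ab[2] * ac[0] - ab[0] * ac[2],
             ab[0] * ac[1] - ab[1] * ac[0]]
    return 0.5 * math.sqrt(sum(x * x for x in cross))

def scatter_on_triangle(a, b, c, density, seed=0):
    rng = random.Random(seed)
    count = round(density * triangle_area(a, b, c))
    points = []
    for _ in range(count):
        # uniform barycentric sample inside the triangle
        u, v = rng.random(), rng.random()
        if u + v > 1.0:
            u, v = 1.0 - u, 1.0 - v
        w = 1.0 - u - v
        points.append(tuple(u * a[i] + v * b[i] + w * c[i] for i in range(3)))
    return points

roots = scatter_on_triangle((0, 0, 0), (2, 0, 0), (0, 2, 0), density=10.0)
print(len(roots), roots[:2])
```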


  - Node: Resample
  - Type: Create

Vertex amount (upper drop down menu)

  • Count
    • The number of vertices per hair curve is decided by the value of the variable Fac
  • Length
    • The number of vertices per hair curve is decided by the distance in object space between each vertex/sample. The distance is set in the variable Fac

Fac
Float value used for the variable vertex amount. When Vertex amount = Count, the value is rounded down to the nearest integer (i.e. floor(Fac))

Resample method
When upsampling a low res hair curve it is useful to have the result as a smooth curve.

  • Nurbs
  • Linear

{F8663697, height = 400}
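
A small sketch of the two Vertex amount modes, assuming linear interpolation only (a NURBS mode would produce a smoothed result instead):

```python
import math

def resample_count(verts, count):
    """Redistribute 'count' vertices evenly along a polyline (count >= 2)."""
    seg = [math.dist(verts[i], verts[i + 1]) for i in range(len(verts) - 1)]
    total = sum(seg)
    out = []
    for k in range(count):
        target = total * k / (count - 1)
        i, acc = 0, 0.0
        # find the segment containing the target arc length
        while i < len(seg) - 1 and acc + seg[i] < target:
            acc += seg[i]
            i += 1
        t = 0.0 if seg[i] == 0 else (target - acc) / seg[i]
        a, b = verts[i], verts[i + 1]
        out.append(tuple(a[j] + t * (b[j] - a[j]) for j in range(3)))
    return out

def resample_length(verts, fac):
    """Vertex count follows from the curve length and a spacing of 'fac'."""
    total = sum(math.dist(verts[i], verts[i + 1]) for i in range(len(verts) - 1))
    count = max(2, math.floor(total / fac) + 1)
    return resample_count(verts, count)

curve = [(0, 0, 0), (0, 0, 1), (0, 0.5, 2)]
print(resample_count(curve, 5))
print(resample_length(curve, 0.25))
```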

  - Node: Instance
  - Type: Create

Instance node socket (left side)
Accepts an input node of type:

  • Object
  • Collection content
  • Collection

Deform

  • False: the instanced objects are not deformed
  • True: The instanced objects are deformed along the shape of each hair curve

Object up (1st drop down menu)
Decides which axis (in object space) of the source object should be treated as object up (pointing in the direction of the hair curve's tip vertex)

  • +X
  • -X
  • +Y
  • -Y
  • +Z
  • -Z

Object front (2nd drop down menu)
Decides which axis (in object space) of the source object should be treated as object front.

Front aim vector (3rd drop down menu)
Decides which vector "Object front" should aim at.

  • Tangent (Cross product calculated from position of root vertex and second vertex)
  • +X
  • -X
  • +Y
  • -Y
  • +Z
  • -Z

Tilt

  • If Deform = True: Rotates per component (vertex) around the axis of the vector between the root and tip hair vertices.
  • If Deform = False: Rotates per hair curve around the axis of the vector between the root and tip hair vertices.

NOTE: The word "Tilt" is used since it is already used for curves in Blender

Non-deforming example. The instanced objects' up vectors are aligned from root to tip. Used for trees, rocks, etc.
{F8674511, height = 400}

Deforming example. Used for game hair, feathers etc
{F8674541, height = 400}
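
For the non-deforming case, here is a hedged sketch of how a per-hair transform could be built, assuming Object up = +Z and a front reference fixed to +X; the conventions and names are illustrative, not the node's exact options:

```python
import math

def _norm(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def instance_basis(root, tip, front_ref=(1.0, 0.0, 0.0)):
    """Return (origin, rotation columns) placing an instance at the hair root,
    with its local +Z aimed at the tip and its local +Y aimed at front_ref.
    The degenerate case front_ref parallel to the hair is not handled."""
    up = _norm(tuple(t - r for t, r in zip(tip, root)))
    # project the front reference onto the plane perpendicular to 'up'
    d = _dot(front_ref, up)
    front = _norm(tuple(f - d * u for f, u in zip(front_ref, up)))
    side = _cross(front, up)
    # columns map the object's local X, Y, Z axes into world space
    rotation = (side, front, up)
    return root, rotation

origin, rot = instance_basis((0, 0, 0), (0, 0.3, 1.0))
print(origin, rot)
```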


NODES - DEFORM


  - Node: Deform
  - Type: Create

Direction
Along the direction of the hair (per segment/vertex)

Normal
Inherited/transferred from the growth mesh

Read more about normals transferred from the growth mesh to guide curves in #78606 (Hair object - design proposal)

{F8658219, height = 400}


  - Node: Frizz
  - Type: Create

{F8658223, height = 400}


  - Node: Rotate
  - Type: Deform

{F8658230, height = 200}
{F8658225, height = 400}


  - Node: Curl
  - Type: Deform

{F8658233, height = 400}


  - Node: Length
  - Type: Deform

Cut/Extend
Cut = shorten, Extend = grow along the vector between the last two vertices at the tip

Scale
Scales per hair from the hair root vertex

{F8658235, height = 400}
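
A minimal sketch of Cut/Extend on one curve, interpreting negative values as trimming arc length from the tip and positive values as growing along the last segment's direction; Scale is omitted:

```python
import math

def length_node(verts, cut_extend):
    """verts: hair curve as a list of (x, y, z), root first."""
    verts = [tuple(v) for v in verts]
    if cut_extend > 0.0:
        # extend: continue along the direction of the last segment
        a, b = verts[-2], verts[-1]
        d = math.dist(a, b)
        direction = tuple((b[i] - a[i]) / d for i in range(3))
        verts.append(tuple(b[i] + direction[i] * cut_extend for i in range(3)))
        return verts
    remaining = -cut_extend
    while remaining > 0.0 and len(verts) >= 2:
        seg = math.dist(verts[-2], verts[-1])
        if seg > remaining:
            # shorten the last segment and stop
            t = (seg - remaining) / seg
            a, b = verts[-2], verts[-1]
            verts[-1] = tuple(a[i] + t * (b[i] - a[i]) for i in range(3))
            break
        remaining -= seg
        verts.pop()
    return verts

curve = [(0, 0, 0), (0, 0, 1), (0, 0, 2)]
print(length_node(curve, 0.5))   # extended along the last segment
print(length_node(curve, -1.5))  # cut back into the first segment
```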


  - Node: Clump
  - Type: Deform

Fac
Amount of clumping. The user can edit the attribute that is connected to this socket in order to clump more/less at the root/tip of the hair curves.

Frequency
The primary clump frequency/size

Fractal iterations (integer)
Creates smaller (recursive) clumps per clump so that the clumping does not look as uniform

{F8674575, height = 400}


  - Node: Width
  - Type: Deform

Width is also controllable through the UI of the hair object (outside of the node network) and acts as a multiplier on the width inside the node network. There should be separate widths for viewport and render.

{F8663656, height = 400}


  - Node: Shrinkwrap
  - Type: Deform

This node is very useful for creating the look of wet hair

Object
Shrinkwrap target. Collection or geometrical object. See the node Input in the data flow section

Fac
The amount/blend of the shrinkwrap deform

{F8674660, height = 400}
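
Purely as an illustration, a sketch that shrinkwraps hair vertices onto a sphere (standing in for an arbitrary target surface) and blends with the original positions by Fac:

```python
import math

def shrinkwrap_to_sphere(verts, center, radius, fac):
    """Move each vertex toward its closest point on a sphere, blended by fac."""
    out = []
    for v in verts:
        offset = tuple(v[i] - center[i] for i in range(3))
        dist = math.sqrt(sum(c * c for c in offset)) or 1e-9
        on_surface = tuple(center[i] + offset[i] / dist * radius for i in range(3))
        out.append(tuple(v[i] + fac * (on_surface[i] - v[i]) for i in range(3)))
    return out

curve = [(0, 0, 1.0), (0, 0, 1.5), (0, 0, 2.0)]
print(shrinkwrap_to_sphere(curve, center=(0, 0, 0), radius=1.0, fac=0.5))
```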


  - Node: Smooth
  - Type: Deform

Smooth vector (upper drop down menu)

  • Hair tip vector
  • Normal vector

Curve length (lower drop down menu)

  • Keep length
  • Lock tip

Examples of combined settings:

normal vector + keep length
In this example the vector that the hair smooths/conforms to is the normal direction from the growth mesh surface. The length of the original hair curve is kept as it is smoothed
{F8663673, height = 200}

Hair tip vector + lock tip
In this example the vector that the hair smooths/conforms to is the vector from the root to the tip of the hair curve (pre smooth). The length of the hair curve is shortened so that the tip of the hair stays in place.
{F8663674, height = 200}

Iterations (integer)
Smoothing iterations

{F8674688, height = 400}
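
A hedged sketch of the smoothing itself, assuming a simple Laplacian relaxation with the root and tip pinned; "Keep length" is approximated by restoring the original segment lengths from the root outward, and the tip-lock variant is omitted:

```python
import math

def smooth_curve(verts, iterations, keep_length=True):
    """verts: hair curve as a list of (x, y, z), root first. The root and tip
    stay pinned during smoothing; keep_length restores the pre-smooth segment
    lengths by walking from the root toward the tip."""
    verts = [list(v) for v in verts]
    original = [math.dist(verts[i], verts[i + 1]) for i in range(len(verts) - 1)]
    for _ in range(iterations):
        # move each interior vertex toward the midpoint of its neighbours
        verts = [verts[0]] + [
            [(verts[i - 1][c] + verts[i + 1][c]) * 0.5 for c in range(3)]
            for i in range(1, len(verts) - 1)
        ] + [verts[-1]]
    if keep_length:
        for i in range(1, len(verts)):
            d = [verts[i][c] - verts[i - 1][c] for c in range(3)]
            norm = math.sqrt(sum(x * x for x in d)) or 1e-9
            verts[i] = [verts[i - 1][c] + d[c] / norm * original[i - 1]
                        for c in range(3)]
    return [tuple(v) for v in verts]

curve = [(0, 0, 0), (0.2, 0, 0.5), (-0.2, 0, 1.0), (0, 0, 1.5)]
print(smooth_curve(curve, iterations=3))
```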


  - Node: Simulation influence
  - Type: Deform?

IMPORTANT: Updated after feedback from @JacquesLucke. Feel free to give further feedback.

Reference to a simulation network where the actual simulation happens. The simulation network simulates the hair, and the new position per hair curve vertex is output from the right socket. It is useful to simulate a sparse set of hair curves and then use a scatter hair node afterwards in order to generate more hair curves that conform to the shape of the sparse set of simulated hair curves.

Simulation network (top drop down menu)
Link to a node network with simulation setup

{F8674704, height = 400}


NODES - ATTRIBUTE

Attributes on hair objects are created in the hair object UI, much like vertex groups on mesh objects. Perhaps these attributes could be picked up during shading in the future? See #78606 (Hair object - design proposal) for more info


  - Node: Object weight transfer
  - Type: Attribute

Input weight (top drop down menu)
Gets the value of a vertex weight from the input object(s). Useful for dynamic weights controlled by modifiers. The input can be a mesh object (see the Input node in the Data flow section of this document).

{F8674777, height = 400}
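
A hypothetical sketch of the transfer described above: each hair root picks the weight of the nearest mesh vertex, and that value is copied to every vertex of the curve. A real implementation would more likely interpolate over the surface rather than use a nearest-vertex lookup.

```python
import math

def transfer_weights(mesh_verts, mesh_weights, hair_curves):
    """mesh_verts: list of (x, y, z); mesh_weights: matching floats in [0, 1];
    hair_curves: list of curves (lists of vertex positions, root first)."""
    out = []
    for curve in hair_curves:
        root = curve[0]
        nearest = min(range(len(mesh_verts)),
                      key=lambda i: math.dist(mesh_verts[i], root))
        # copy the root's weight down the whole curve
        out.append([mesh_weights[nearest]] * len(curve))
    return out

mesh = [(0, 0, 0), (1, 0, 0)]
weights = [1.0, 0.0]
curves = [[(0.1, 0, 0), (0.1, 0, 1)], [(0.9, 0, 0), (0.9, 0, 1)]]
print(transfer_weights(mesh, weights, curves))  # [[1.0, 1.0], [0.0, 0.0]]
```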


  - Node: UV transfer
  - Type: Attribute

I've been thinking about how to solve multiple UV sets in conjunction with the generation of hair curves. This node is one solution and is rather flexible, since we can sample UVs from any object. Another route would be to decide that we only source the UVs from the growth mesh. Feel free to give input regarding what would be easiest in terms of development/coding.

{F8674782, height = 400}


  - Node: Hair root proximity
  - Type: Attribute

The hair root proximity node is very useful when scattering ground vegetation with the Instance node. The root mask can be used to make sure that, for instance, bushes do not grow close to trees.

Radius
The radius in object space of the mask produced from each root vertex of the hair curves

{F8674896, height = 400}
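
A minimal sketch of the mask, using a hard radius threshold (an actual node would presumably offer a falloff):

```python
import math

def root_proximity_mask(sample_points, hair_roots, radius):
    """1.0 for sample points within 'radius' of any hair root, else 0.0."""
    return [
        1.0 if any(math.dist(p, r) <= radius for r in hair_roots) else 0.0
        for p in sample_points
    ]

roots = [(0, 0, 0), (5, 0, 0)]
samples = [(0.5, 0, 0), (2.5, 0, 0), (4.8, 0, 0)]
print(root_proximity_mask(samples, roots, radius=1.0))  # [1.0, 0.0, 1.0]
```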

  - Node: Attribute get
  - Type: Attribute

Whenever the user needs to remap an attribute (multiply, add, clamp, etc.) they can break out the attribute with an attribute get node, do operations, and then use an attribute set node to write it back to the hair curves later in the node graph.

{F8674889, height = 400}


  - Node: Attribute set
  - Type: Attribute

Whenever the user needs to remap an attribute (multiply, add, clamp, etc.) they can break out the attribute with an attribute get node, do operations, and then use an attribute set node to write it back to the hair curves later in the node graph. Usually the attribute is used right after the node, so it's convenient to have an output socket with the factor.

{F8679290, height = 400}
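
To make the get → remap → set round trip concrete, here is a small sketch that stores attributes as named per-vertex float lists; the attribute names used are purely illustrative:

```python
def attribute_get(curve_attributes, name):
    """Read a named per-vertex float attribute as a plain list."""
    return list(curve_attributes[name])

def attribute_set(curve_attributes, name, values):
    """Write a named per-vertex float attribute and return it for reuse."""
    curve_attributes[name] = list(values)
    return curve_attributes[name]

attributes = {"clump_fac": [0.1, 0.4, 0.9]}
fac = attribute_get(attributes, "clump_fac")
fac = [min(1.0, v * 2.0) for v in fac]            # multiply and clamp
attribute_set(attributes, "clump_fac_remapped", fac)
print(attributes)
```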


  - Node: Geometry intersection
  - Type: Attribute

NOTE: I will add further visual examples of how this works later

Position (top drop down menu)

  • Inside
    • Masks the hair curve vertices that are positioned inside the geometry object.
  • Outside
    • Masks the hair curve vertices that are positioned outside the geometry object.

Hair component (bottom drop down menu)

  • Root
    • For each hair curve: if the hair curve root is positioned inside/outside (depending on the variable Position), all of the vertices of that hair are masked
  • Vertex
    • For each hair curve: if a vertex is positioned inside/outside (depending on the variable Position), that vertex and all of the vertices with a higher index number in the hair curve ("child vertices") will get a value of 1 for the hair curve attribute specified in the node.

{F8674898, height = 400}
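
A sketch of both drop-down combinations against a sphere standing in for the geometry object; the propagation toward the tip in Vertex mode follows the description above:

```python
import math

def inside_sphere(p, center=(0, 0, 0), radius=1.0):
    return math.dist(p, center) <= radius

def geometry_intersection(curves, position="Inside", component="Vertex"):
    masks = []
    for verts in curves:
        hit = [inside_sphere(v) if position == "Inside" else not inside_sphere(v)
               for v in verts]
        if component == "Root":
            # root decides the mask for the whole curve
            masks.append([1.0 if hit[0] else 0.0] * len(verts))
        else:
            # Vertex: a hit also marks every vertex after it toward the tip
            mask, found = [], False
            for h in hit:
                found = found or h
                mask.append(1.0 if found else 0.0)
            masks.append(mask)
    return masks

curves = [[(0, 0, 2.0), (0, 0, 0.8), (0, 0, 0.2)]]
print(geometry_intersection(curves, "Inside", "Vertex"))  # [[0.0, 1.0, 1.0]]
print(geometry_intersection(curves, "Inside", "Root"))    # [[0.0, 0.0, 0.0]]
```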


USEFUL NODES FROM THE SHADING EDITOR

Almost all nodes in the shading editor would be very useful for creating masks for hair. It would be nice to reuse them so the functionality is instantly recognized by the user. I need input from developers such as @sebbas on whether it's even possible to reuse existing nodes from the shading editor.

Shading editor nodes are evaluated after the hair is deformed, and we will likely want to evaluate these nodes BEFORE the final deformation of the hair. This could cause issues and I'm not sure it's possible to work around somehow. It seems tedious to rewrite so many nodes.

A lot of these nodes would require an object/geometry input socket in order to define the source, such as:

  • Vertex Color
  • Attribute
  • Texture Coordinate

{F8663900, height = 700}

Hair simulation should be done in another node tree indeed. We have the Simulation data block now, that owns a simulation node tree, that will contain simulation nodes for particles and hair (and later hopefully other simulation types). At least that is my understanding.

It's good to see that the other nodes fit quite well into a data-flow-graph. So they have inputs, compute some stuff, and output new/modified stuff. I think those nodes should exist in the same node tree as mesh/volume/modifier nodes. We don't really have an agreed upon design for that yet. We might need a design-sprint for that at some point, similar to what we had for simulation nodes.

One thing that is not entirely clear yet is the relationship between meshes and hair:
Does your "Hair" socket include hair+mesh? If yes, what are Split and Merge doing to the mesh?
Does the hair object somehow reference a base mesh? Can individual strands reference different base meshes? Or maybe hair should only be loosely coupled to a mesh somehow.

Also, what exactly is the pink socket labelled "Instance" and "Output"? Is it a mesh, or any other object (including hair, volume, ...)? How is it different from the black socket?

I'm not sure if the "Weight" node can work like that. It outputs a float and it is not clear whether it is a float per position, per vertex, per polygon, ... Alternatively, the node could take a mesh and hair as input and then transfer some weight attribute from the mesh over to the hair object.

The "Root Mask" node has a similar problem, but the other way around.

Does the "Geometry Mask" node "select" a bunch of hairs (creates a "strand group" that contains these hairs) or create new hair object containing only the masked strands?

I'm using some of the general nodes (like Combine XYZ) from the shader editor in the simulation node editor as well. Those "function nodes" can be used in hair nodes as well. Nodes like "Hair Info", "UV Map", "Vertex Color", ... probably can't be used without changes, because they have the same problem as the "Weight" and "Root Mask" nodes: the domain of their output value is not clear. In the shader editor, all values are conceptually in the same domain (at a single point on a surface or inside a volume).


Thanks for the feedback @JacquesLucke. Your questions are a really good reference for how I should make my design task more readable and clear. I'll start off by addressing your questions.

In #78515#973976, @JacquesLucke wrote:
Hair simulation should be done in another node tree indeed. We have the Simulation data block now, that owns a simulation node tree, that will contain simulation nodes for particles and hair (and later hopefully other simulation types). At least that is my understanding.

Sounds great. Then perhaps it's best to completely remove

  • Physics velocity
  • Damp
  • Gravity
  • Force fields

from the node and only keep the node tree dropdown?
{F8663740, height = 400}

It's good to see that the other nodes fit quite well into a data-flow-graph. So they have inputs, compute some stuff, and output new/modified stuff. I think those nodes should exist in the same node tree as mesh/volume/modifier nodes. We don't really have an agreed upon design for that yet. We might need a design-sprint for that at some point, similar to what we had for simulation nodes.

Great! I'm very open to changing design. I only care about getting the functionality from the nodes

One thing that is not entirely clear yet is the relationship between meshes and hair:
Does your "Hair" socket include hair+mesh? If yes, what are Split and Merge doing to the mesh.

The hair data (orange dots) that is passed through the node tree only contains hair geometry (i.e. vertices and edges). The sphere growth mesh is only used for visualisation in the node examples. Perhaps I should do a node connection mockup example somehow so that it becomes even more clear. The Split node only separates hair curves based on an input mask.

Does the hair object somehow reference a base mesh? Can individual strands reference different base meshes? Or maybe hair should only be loosely coupled to a mesh somehow.

Yes, the hair object references a base mesh. My current idea is to set the growth mesh in the hair object properties UI. It felt easier for me to handle one growth mesh per hair object in cases where the user needs to temporarily detach the hair for remodeling or changing the growth mesh and then attach the hair again. See growth mesh in the image below

![image.png](https://archive.blender.org/developer/F8664056/image.png)

Also what exactly is the pink socket labelled "Instance" and "Output". Is it a mesh, or any other object (including hair, volume, ...). How is it different from the black socket?

Instances are sourced from the input (data flow) node. It can reference

  • mesh objects
  • Entire collection
  • Collection content (e.g. random objects that belong to a specific collection)

When I did the mockup I thought that perhaps I needed to define instances created in the hair object node tree with other colors, but I don't have any strong feeling about this. Perhaps it's better with black sockets? I guess they are sort of the same.

I'm not sure if the "Weight" node can work like that. It outputs a float and it is not clear whether it is a float per position, per vertex, per polygon, ... Alternatively, the node could take a mesh and hair as input and then transfer some weight attribute from the mesh over to the hair object.

It outputs a float value between 0-1 per hair curve vertex. The weight node sources the weight from an object via the input node (or multiple objects if the input node contains a collection and each object has a weight with a matching weight name). Weights are stored as float values per vertex on the source object. Therefore there needs to be a "transfer process" where the weight from the object's vertices is transferred to the root of each hair curve and then copied to the remaining vertices of the curve. I'll try to add a visual example so it's a bit more clear.

The "Root Mask" problem has a similar problem, but the other way around.

It outputs a float value between 0-1 per hair curve vertex. I think that I need to do another visual example of this if my node mockup is not explanation enough. It's basically the same as the process of the weight node, but in this case the source geometry is a mesh containing each root vertex of the input hair mesh.

Does the "Geometry Mask" node "select" a bunch of hairs (creates a "strand group" that contains these hairs) or create new hair object containing only the masked strands?

Ah. My bad. The output socket should be a Fac (i.e. a float value between 0-1 per hair curve vertex).
For example: very short and curly hair that ends up inside a character's body can be masked and deformed so that it is displaced outside of the body (I did something like this in Houdini a while back and found it really useful).

I'm using some of the general nodes (like Combine XYZ) from the shader editor in the simulation node editor as well. Those "function nodes" can be used in hair nodes as well. Nodes like "Hair Info", "UV Map", "Vertex Color", ... probably can't be use without changes, because they have the same problem as the "Weight" and "Root Mask" node: the domain of their output value is not clear. In the shader editor, all values are in the same domain conceptually (at a single point on a surface or inside a volume).

Yeah, I suspected this. I hate to load a lot of work onto developers, but it would be amazing to have access to those nodes in the hair object node graph as well.

My only concern is to have once again the object/collection scattering tools in the middle of the hair tools.

It makes sense with the "deform along guide" option, but I think it should have its own place somewhere else and keep this only for hair grooming, in my opinion.


In #78515#974201, @Peine_Perdue wrote:
My only concern is to have once again the object/collection scattering tools in the middle of the hair tools.

It makes sense with the "deform along guide" option but i think it should have it's own part somewhere else and keep this only for hair grooming in my opinion.

That is a valid point and I will take note. I would need input from the developers on what would be the best option regarding how to handle instances. My main concern is to make sure instance scattering (rocks, trees etc) and instance deform (game hair, feathers) can be implemented at the same time as the new hair system so that users don't lose functionality.

For now I will continue with the current design for instances, but make adjustments once I get developer feedback regarding how to handle instances and deformed instances


I hope I will be able to have this kind of workflow:
In another app I created sparse basic guides.
From these guides, I generated hair curves.
And converted these curves to new guides that I'm able to sculpt, cut, add new guides to...
From these guides, I will be able to generate the final hair.


I'd humbly propose a third option for the 'Smooth'-Node (other than 'Hair Tip Vector' and 'Normal Vector'):
It might be called e.g. 'Bezier' or 'Interpolated' (or whatever seems appropriate).

Anyway, the idea is it'd make the hair conform to the shape of a 2-Vertices-Bezier-Spline, which basically interpolates between the (pre-smoothing) first and last (hair-) edge's longitudinal vectors.
In other words, imagine one vertex of said spline to be co-located with the hair root-vertex, the other with the hair tip-vertex.
Furthermore (in Blender terminology) imagine said spline's handles to be of type 'Aligned' and, well, one handle being aligned with the vector from the hair's root vertex towards its neighbour vertex (pre-smoothing), and accordingly the other handle aligned to the vector from the second-last (if we start counting at the root) hair vertex towards the tip vertex.

The obvious question which remains here is how to determine the scaling of this imaginary Bezier spline's handles (as this would obviously influence the 'pointiness' of the interpolation).
I suppose ideally it could default to some value such as to make the effective resulting spline's length equal to the (pre-smoothing) length of the hair.
![image.png](https://archive.blender.org/developer/F10208022/image.png)
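
For what it's worth, here is a small Python sketch of the interpolation described above, using an arbitrary fixed handle scale (one third of the root-tip distance); matching the pre-smoothing hair length, as suggested, would require an additional search over that scale. Purely illustrative.

```python
import math

def _norm(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def _lerp(a, b, t):
    return tuple(a[i] + t * (b[i] - a[i]) for i in range(3))

def bezier_conform(verts, handle_scale=None):
    """Replace a hair curve with points on a cubic Bezier whose handles follow
    the first and last edge directions of the original curve."""
    root, tip = verts[0], verts[-1]
    scale = handle_scale if handle_scale is not None else math.dist(root, tip) / 3.0
    d_root = _norm(tuple(verts[1][i] - root[i] for i in range(3)))
    d_tip = _norm(tuple(tip[i] - verts[-2][i] for i in range(3)))
    p0, p3 = root, tip
    p1 = tuple(root[i] + d_root[i] * scale for i in range(3))
    p2 = tuple(tip[i] - d_tip[i] * scale for i in range(3))
    out = []
    for k in range(len(verts)):
        t = k / (len(verts) - 1)
        # de Casteljau evaluation of the cubic Bezier at parameter t
        a, b, c = _lerp(p0, p1, t), _lerp(p1, p2, t), _lerp(p2, p3, t)
        d, e = _lerp(a, b, t), _lerp(b, c, t)
        out.append(_lerp(d, e, t))
    return out

print(bezier_conform([(0, 0, 0), (0.3, 0, 0.4), (0.1, 0, 0.9), (0, 0, 1.4)]))
```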


I hope hair stuff gets worked on sometime soon.


The Rotate node needs to be extended to create vector fields for short fur or hair. Houdini and the Ornatrix plugin for Maya have this function.
https://www.sidefx.com/docs/houdini/shelf/sop_groom_curveadvect.html
https://ephere.com/plugins/autodesk/maya/ornatrix/docs/3/Surface_Comb_Operator.html

Reference: blender/blender#78515