NPR Design #120403

Open
opened 2024-04-08 15:48:56 +02:00 by Miguel Pozo · 20 comments
Member

Design Goals

  • Provide first-class support for stylized shading and rendering, unconstrained by PBR rules.
  • Be expressive enough to allow arbitrary art styles, instead of just providing built-in support for a fixed set of common effects.
  • Extend (instead of replace) the already existing physically-based material/render workflows and implementations.
  • PBR and Stylized materials should interact and be composed together automatically in a visually consistent and coherent way.

NPR Challenges

  • NPR requires way more customization support than PBR, and requires exposing parts of the rendering process that are typically kept behind the render engine "black box". At the same time, we need to keep the process intuitive, artist-friendly, and fail-safe.
  • Some common techniques require operating on converged, noiseless data (eg. applying thresholds or ramps to AO and soft shadows), but renderers typically rely on stochastic techniques that converge over multiple samples.
  • Filter-based effects that require sampling contiguous surfaces are extremely common. However, filter-based effects are prone to temporal instability (perspective aliasing and disocclusion artifacts), and not all render techniques can support them (blended/hashed transparency, raytracing).

For these reasons and because of the usual lack of built-in support, NPR is often done as a post-render compositing process, but this has its own issues:

  • Material/object-specific stylization requires a lot of extra setup and render targets.
  • It only really works for primary rays on opaque surfaces. Applying stylization to transparent, reflected, or refracted surfaces is practically impossible.
  • The image is already converged, so operating on it without introducing aliasing issues can be challenging.
  • Much useful data is already lost (eg. lights and shadows are already applied/merged).

However, these are mostly workflow and integration issues rather than fundamental issues at the core rendering level.

Proposal

This proposal focuses on solving the core technical and integration aspects, taking EEVEE integration as the initial target.
The details about how to expose and implement specific shading features are orthogonal to this proposal.
For an overview of the features intended to be supported, see this board by Dillon Goo Studios: https://miro.com/app/board/uXjVNRq8YR4=/
The main goal here is to provide a technically sound space inside Blender where these features can be developed.

Big Picture

The NPR engine would be a new type of engine halfway between a regular render engine and a compositing engine.
Instead of operating on final images once the main renderer has finished its job, it works as the final steps of each render sample, acting as a second deferred shading step, before assembling the final image sample and handling sample accumulation.

This leaves the space required to solve the issues mentioned above, without having to build a whole new "full" render engine, and, at the same time, imposing as few hard requirements as possible on the "core" engine.

On the user side, this introduces a new NPR node system, separate from the regular Material nodes, where custom shading and filtering can be applied.
Each NPR node tree generates a Closure-like node that can be used from Material nodes.

[Images: an example Material Nodes tree and the corresponding NPR Nodes tree]

The images are just for illustrative purposes; don't take the specifics of the nodes themselves as part of the proposal.

This approach provides several advantages:

  • Material nodes stay compatible between engines.
  • There's a clearly communicated distinction between always-available features and NPR-only features.
  • Imposes a clear Material > NPR evaluation order, avoiding cyclic dependency issues where shading nodes could act as inputs of surface properties.
  • Allows supporting filter nodes, texture sockets, and any other required NPR-specific feature without having to fight against the material nodes design.

Implementation

Each NPR material generates a list of its required "G-Buffer" input data.

The "core" engine is responsible for generating this G-Buffer and passing it to the NPR engine at the end of each sample. This works similarly to the already existing compositing passes.

In addition, the NPR engine is also capable of running its own scene synchronization to generate its own engine-specific data when needed.

Then the NPR engine runs separate screen-space passes for each NPR material type (only on the pixels covered by that material) and assembles the final image.
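
To make the intended order of operations concrete, here is a minimal C++ sketch of how one render sample could flow through the NPR engine. Everything in it is hypothetical (the types, function names, and the stencil-based masking are assumptions for illustration, not an existing Blender API):

```cpp
#include <vector>

/* Hypothetical sketch of one render sample passing through the NPR engine.
 * None of these types or functions exist in Blender today. */

struct GBuffer {};     /* Per-sample data requested by the NPR materials (normals, IDs, shadow masks, ...). */
struct Framebuffer {}; /* Color target for the current sample. */

struct NPRMaterial {
  /* One deferred, screen-space pass, restricted (e.g. via a stencil/ID test)
   * to the pixels covered by this material. */
  void run_screen_space_pass(const GBuffer & /*gbuffer*/, Framebuffer & /*out*/) const {}
};

struct Accumulator {
  /* The NPR engine, not the core engine, owns sample accumulation. */
  void add_sample(const Framebuffer & /*sample*/) {}
};

/* Called by the core engine at the end of each render sample, right after its
 * own shading, instead of accumulating the sample itself. */
void npr_engine_end_of_sample(const GBuffer &gbuffer,
                              const std::vector<NPRMaterial> &npr_materials,
                              Framebuffer &sample_result,
                              Accumulator &accumulator)
{
  for (const NPRMaterial &material : npr_materials) {
    material.run_screen_space_pass(gbuffer, sample_result);
  }
  accumulator.add_sample(sample_result);
}
```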

G-Buffer packing

Something to keep in mind is that since we are working with per-sample data, render targets can be much more compressed than in regular compositing.

For example, storing one shadow mask for each overlapping light might seem too expensive, but we only need ceil(log2(shadow_rays+1)) bits for each shadow mask. In the case of EEVEE that's 1 bit by default, and 3 bits at worst.
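
As a rough illustration of that packing math only (the function names and the single-word layout are assumptions for this sketch, not the actual G-Buffer format):

```cpp
#include <cmath>
#include <cstdint>

/* Bits needed for one shadow mask: it stores the number of un-occluded shadow
 * rays, which ranges from 0 to shadow_rays inclusive. */
int shadow_mask_bits(int shadow_rays)
{
  return int(std::ceil(std::log2(double(shadow_rays) + 1.0)));
}

/* Pack the shadow masks of several overlapping lights into a single 32-bit word.
 * As noted above, for EEVEE this is 1 bit per light by default and 3 bits at worst. */
uint32_t pack_shadow_masks(const int *unoccluded_rays, int light_count, int shadow_rays)
{
  const int bits = shadow_mask_bits(shadow_rays);
  uint32_t packed = 0;
  for (int i = 0; i < light_count; i++) {
    packed |= uint32_t(unoccluded_rays[i]) << (i * bits);
  }
  return packed;
}
```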

Tier System

Any stochastically rendered surface (jittered transparency, rough reflections/refractions, ray-traced DoF and motion blur) prevents the NPR system from applying filters to it.
For this reason, and as a fallback for when a true BSDF is required, we need a tier system.

There would be 3 tiers:

  • Full: All NPR features supported.
    For opaque and layered transparency surfaces, and for 0 roughness reflections and refractions.
  • Per-Pixel: Custom shading and compositing of shading features are supported, but filters and sample accumulation nodes are not, and are executed as a simple no-op pass-through.
    For stochastically rendered surfaces.
  • Fallback: The regular output from the Material nodes.
    For features where a true energy-conserving BSDF is needed (eg. diffuse GI, rough reflection/refractions).

This is handled automatically, so users only need to be aware that in some contexts filters will be skipped, and that the base material output can still be used.
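
A minimal sketch of how that automatic selection could look (the enum, flags, and names are hypothetical; the actual criteria would be derived from how the core engine renders each surface):

```cpp
enum class NPRTier {
  Full,     /* All NPR features. */
  PerPixel, /* Custom shading only; filter/accumulation nodes become pass-throughs. */
  Fallback, /* Regular Material node output. */
};

/* Hypothetical per-surface flags describing how the surface is being rendered. */
struct SurfaceContext {
  bool needs_true_bsdf;       /* e.g. only seen through diffuse GI or rough reflections/refractions. */
  bool stochastically_traced; /* Jittered transparency, rough ray tracing, ray-traced DoF/motion blur. */
};

NPRTier npr_tier(const SurfaceContext &ctx)
{
  if (ctx.needs_true_bsdf) {
    return NPRTier::Fallback;
  }
  if (ctx.stochastically_traced) {
    return NPRTier::PerPixel;
  }
  return NPRTier::Full;
}
```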

EEVEE-side features

Other than outputting the required render targets, there are some features that, even if they're only used on the NPR side, would have to be implemented on the core engine side (or would be more practical to implement there):

  • Depth offset. Requires modifying the fragment depth in the material surface shaders.
  • Self-Shadows. Requires rendering an object ID texture alongside the shadow map.
  • Shading position offset. Needs to be taken into account for shadow tagging.

There are also some required features that could be implemented on the NPR engine side, but are already planned for EEVEE anyway (light linking and light nodes).

Layered Transparency

Supporting the "Full" level features on transparent surfaces would require a layered OIT solution at the core engine level, or implementing it at the NPR engine level and passing the depth buffer of each layer to the core engine.

Reflections & Refractions

To support correct interaction between PBR and NPR materials, NPR shading should also be applied to NPR surfaces reflected and refracted onto other materials (including PBR ones). This implies that the NPR engine should handle the final composition of PBR materials as well.

A "generic" solution could be requesting render passes for secondary rays too (so each ray depth level has its own render target) and applying NPR shading in reverse order.

[Images: PBR Render, AOV (Depth 2), AOV (Depth 1), AOV (Depth 0)]

However, since EEVEE only supports screen-space tracing for now (and probes should already have the NPR shading applied at bake time), a reflection and refraction UV may be enough. And since horizon scanning is meant to be used for diffuse and rough reflections, it should be fine to use them as-is without any NPR post-processing.
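
As a sketch of that screen-space idea, assuming a hypothetical "reflection UV" render target that stores, per pixel, the screen coordinate of the reflected hit (nothing here is EEVEE's actual ray-tracing pipeline, and the additive blend is deliberately simplistic):

```cpp
struct float2 { float x, y; };
struct float4 { float r, g, b, a; };

/* Hypothetical texture wrapper. The stub stands in for a bilinear GPU fetch. */
struct Texture2D {
  float4 sample(float2 /*uv*/) const { return {0.0f, 0.0f, 0.0f, 0.0f}; }
};

/* Compose a screen-space reflection after NPR shading has already run: instead
 * of re-shading the reflected surface with a PBR BSDF, fetch its stylized color
 * at the hit's screen coordinate (assumed stored in the first two channels). */
float4 compose_reflection(const Texture2D &npr_shaded_color,
                          const Texture2D &reflection_hit_uv,
                          float2 pixel_uv,
                          float reflection_weight)
{
  const float4 hit = reflection_hit_uv.sample(pixel_uv);
  const float4 reflected = npr_shaded_color.sample({hit.r, hit.g});
  const float4 base = npr_shaded_color.sample(pixel_uv);
  return {base.r + reflected.r * reflection_weight,
          base.g + reflected.g * reflection_weight,
          base.b + reflected.b * reflection_weight,
          1.0f};
}
```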

Custom sample accumulation and resolve

[Images: Threshold before sample accumulation (incorrect) vs. threshold after sample accumulation (correct but aliased)]

A possible way to solve the "operating on converged, noiseless data" issue is to accumulate samples in a "geometrically-aliased" way, but store per-sample weights (MSAA style) so anti-aliasing can still be applied as a last step.
This allows the user to operate on converged shading data without having to worry about aliasing issues.
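
A minimal sketch of the idea, assuming a hypothetical per-pixel store where shading only converges within a surface (keeping edges geometrically aliased) and per-surface weights drive the final anti-aliased resolve. The structure and names are illustrative only; a real implementation would live in GPU storage buffers, not per-pixel C++ objects:

```cpp
#include <array>
#include <cstdint>

/* One accumulation slot per surface seen by this pixel. Shading converges per
 * surface, so edges stay "geometrically aliased" until resolve. */
struct SubSample {
  uint32_t surface_id = 0;
  float shading = 0.0f; /* Converged shading data, e.g. a soft shadow factor. */
  float weight = 0.0f;  /* How many render samples hit this surface (MSAA-style coverage). */
};

struct PixelAccum {
  static constexpr int MAX_SURFACES = 4;
  std::array<SubSample, MAX_SURFACES> slots{};

  /* Accumulate one render sample: only average it with samples of the same surface. */
  void add(uint32_t surface_id, float shading)
  {
    for (SubSample &s : slots) {
      if (s.weight == 0.0f || s.surface_id == surface_id) {
        s.surface_id = surface_id;
        s.shading = (s.shading * s.weight + shading) / (s.weight + 1.0f);
        s.weight += 1.0f;
        return;
      }
    }
    /* All slots taken: this sketch simply drops the sample. */
  }

  /* User operations (here a threshold) run on the converged per-surface data,
   * and only the final resolve blends surfaces by weight, restoring anti-aliasing. */
  float resolve_threshold(float cutoff) const
  {
    float result = 0.0f, total = 0.0f;
    for (const SubSample &s : slots) {
      const float value = (s.shading > cutoff) ? 1.0f : 0.0f;
      result += value * s.weight;
      total += s.weight;
    }
    return (total > 0.0f) ? result / total : 0.0f;
  }
};
```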

[Images: Single sample, Accumulation, Threshold, Resolve (WIP), shown for two example scenes]

However, applying this method to every NPR material input may be too costly, and there are legitimate reasons for wanting to operate directly on per-sample data anyway.
So a possible solution could be to expose this functionality as a node, and use different socket types for per-sample vs. converged data (for example, using the same socket color but a different shape).

Open questions

  • Name for the project?

  • Do we implement the NPR engine as a separate draw engine?
    As a library that EEVEE can use?
    By inheriting EEVEE classes with virtual functions where needed?

  • Is a future Cycles integration a possibility we should consider at all?

  • Can/Should we re-use parts of the material and compositing nodes implementation?

  • How does Grease Pencil fit here?
    Is the planned EEVEE-GP integration enough, so the NPR engine can just treat GP as any other surface?

Roadmap

The initial version of NPR nodes would work as an improvement of the ShaderToRGB workflow, with access to separate shading features and the ability to perform custom shading, but without filter nodes.
We can implement "filters" at the shader level at this stage though, as long as they operate on "global" inputs (like depth, or normal) and the whole node tree can be compiled to a single shader pass.

From there we can iterate on adding all the required shading features, and finally implement filtering nodes and the custom sample accumulation/resolve solution.

  1. Design.
  2. Base framework and EEVEE integration (First preview release).
    1. Node System (Per-Pixel tier support).
    2. Custom Nodes.
    3. Built-in Nodes.
  3. EEVEE-side features
    1. Light linking.
    2. Self-Shadows.
    3. Shading position offset.
    4. Depth offset.
    5. Light nodes.
  4. Filter Nodes (Single Pass)
  5. Filter Nodes (Full-Tier support) (Multiple Passes)
    1. Custom sample accumulation/resolve solution.
    2. Custom filter nodes.
    3. Built-in filter nodes.
  6. Layered OIT.
Miguel Pozo added this to the EEVEE & Viewport project 2024-04-08 15:48:59 +02:00
Member

@pragma37 Hey :) I really hope that Grease Pencil can become a part of this proposal!

How does Grease Pencil fit here?

Many of the current NPR issues apply to Grease Pencil as well. On top of this, shading is very limited (Grease Pencil has its own materials, which are not node based and have few options).
Since GPv3 will turn Grease Pencil essentially into CurvesGeometry, to me, the question becomes "how do we render curves in NPR?". So the same thing could be asked about hair for example.

Ideally, we don't enforce a meshing step and have a native way of rendering curves somehow.

Author
Member

@filedescriptor The idea is that the NPR renderer doesn't render the meshes itself, it just receives a G-Buffer with the geometry already rendered.
So the NPR renderer doesn't care about meshes vs sculpt meshes vs curves, they're just pre-rendered surfaces.

So if the EEVEE-Grease Pencil integration means GP objects can have a regular EEVEE Material and act like any other surface, then GP doesn't need to be a special case on the NPR side.
But I'm not sure exactly what the plan is for EEVEE-GP.

Member

The NPR engine would be a new type of engine halfway between a regular render engine and a compositing engine. Instead of operating on final images once the main renderer has finished its job, it works as the final steps of each render sample, acting as a second deferred shading step, before assembling the final image sample and handling sample accumulation.

I really love this part. This is one step closer to ideal flexibility in which artists could decide what exactly something needs to look like!

use different socket types for per-sample vs converged data.

Sampling aliasing is indeed something we have to worry about. Also, not all properties need to be sampled to give a smooth result. A hybrid approach might work best, and yes, giving choices is good.

Overall I'm just not very sure how this in-between stage could be implemented in Blender. Like, you could have a direct G-buffer pass-through, or "do some filters over the G-buffer and pass along", and you might want the filter result of some passes to affect other passes (like blurring the normal, but masked on top with the object ID). This might take some time I think, but I'm really looking forward to it!

Since GPv3 will turn Grease Pencil essentially into CurvesGeometry, to me, the question becomes "how do we render curves in NPR?". So the same thing could be asked about hair for example.

I think curves/lines are a very different topic. The renderer here is mostly about how we turn individual pixels into what we want (although through some filters we can get differentials, which are lines), but lines IMO take a different approach: sampling might not conceptually align with how a lot of types of lines are perceived naturally (a lot of the time it's about visually smooth edges, which are very hard to threshold through a pixel-based method to pinpoint where the line should be). But I guess we can stay in the loop and see if we can come up with some ideas on how to do things like this.

Author
Member

@ChengduLittleA I'm not sure I follow your reasoning.

The engine always receives a per-render-sample G-Buffer, not a converged one.
It's actually the task of the NPR engine to generate the converged image.
It works the same as any deferred renderer (including EEVEE-Next), except the deferred pass shader is generated from nodes.

So, if the current order is (simplified):
EEVEE GBuffer -> EEVEE Shading -> EEVEE Sample accumulation -> Compositor
The order with NPR would be:
EEVEE GBuffer -> EEVEE Shading -> NPR Shading -> NPR Sample accumulation -> Compositor

Aliasing would only be an issue in the sense that you can't generate a correct ramp/threshold from a single sample due to shading aliasing (like in this image), that's why the custom sample accumulation and resolve is proposed.

Contributor

LOVE this initiative! LOVE IT.

As an NPR user myself, I vote to build this directly into EEVEE for a smoother UX and adoption. Many NPR artists already use EEVEE, so extending it and making it more flexible (optionally as a user opt-in if necessary) would be great. Similar nodes, similar workflows, extra opt-in flexibility and creativity.

This allows for the following UX:

  1. Easy migration of existing NPR projects ("Let's use these new nodes to do X now, no need to migrate/change settings on everything!")
  2. Less render engine bloat and decision making ("Which render engine should I use for X project?")
  3. Less overhead ("As a dev, let's maintain one engine with more features rather than build a new one from the ground up")
  4. More user customizability without multi-engine use cases ("I want it to be somewhat realistic, like EEVEE Next, but have this X feature for NPR - and do less compositing")
Member

@pragma37 Thanks, I see.

Author
Member

@filedescriptor I've just checked with Clément about the EEVEE-GP plans.
There's stuff to figure out, but in any case, I think the right way to go is to solve it at the EEVEE level first.

If we try to do it at the NPR level we're going to have trouble integrating GP and EEVEE objects together, which is the same reason why I wanted to avoid having separate NPR and EEVEE objects.
So I would solve the EEVEE side first, then we can see if there are GP-specific features that should be exposed to the NPR-side.

Still, I think it would make sense to include the EEVEE-GP integration as part of the NPR project planning.


@AndresStephens This is meant to work alongside EEVEE (and maybe other engines in the future).
And NPR nodes are meant to work in tandem with Material Nodes.
One of the reasons to implement NPR nodes as a separate node system is precisely to avoid having Material nodes working only under certain circumstances.
Also, we have not discussed this yet, but I would expect the NPR nodes (especially before implementing filters) to be pretty much the current Material nodes with some nodes removed and others added.

Contributor

One of the reasons to implement NPR nodes as a separate node system is precisely to avoid having Material nodes working only under certain circumstances.

This is quite weird to be honest. I get the reason it's desirable to have nodes work between Cycles and EEVEE, but what are the practical examples of materials used in NPR needing to work in other conditions? Like who would decide mid-production that "eh, I don't want NPR anymore, let's go photoreal", AND expect the material to work without any adjustments?

A new node system adds so much more complexity at the user level; you have to learn, think about, and work with an entire new concept of an "in-between" editor. It's the worst design from a user POV imho, and the cases it solves are pretty much zero or very few. Having the nodes in the EEVEE material editor would make this much, much easier, and the only situations where it would break are when the user is doing ridiculous tasks. The majority of the user base shouldn't have to endure complexity for those single-digit cases.


This is quite weird to be honest. I get the reason it's desirable to have nodes work between Cycles and EEVEE, but what are the practical examples of materials used in NPR needing to work in other conditions? Like who would decide mid-production that "eh, I don't want NPR anymore, let's go photoreal", AND expect the material to work without any adjustments?

I don't imagine this is something that has to be black and white. Some parts of an image can be photorealistic/physically based, whereas other parts of the image can be explicitly NPR. Consider elements like magical effects and superpowers. Even in a largely photoreal setting, you have a lot of leeway here stylistically; what a "portal" looks like is almost certainly going to be determined on an artistic basis rather than a physical one.

It's probably important to consider this is a task for furthering NPR design in Blender, but that doesn't mean that Miguel's suggestions are exclusively useful for NPR. There could be applications of this technology to improve physically based looks as well, assuming it wasn't made a unique engine. I think changing the name "NPR Nodes" to "Render Nodes" in the future would be a good idea so that it's clearer to people that it isn't exclusive. Or another, better-fitting name instead, I'm just using the terminology that was in Malt.

A new node system adds so much more complexity at the user level; you have to learn, think about, and work with an entire new concept of an "in-between" editor. It's the worst design from a user POV imho, and the cases it solves are pretty much zero or very few. Having the nodes in the EEVEE material editor would make this much, much easier, and the only situations where it would break are when the user is doing ridiculous tasks. The majority of the user base shouldn't have to endure complexity for those single-digit cases.

This will almost certainly end up very similar to the case of Geometry Nodes. Has Geometry Nodes made Blender more complex? Sure. However, individual users do not have to learn every in and out of Geometry Nodes to reap the benefits of the system, and with their modifier-based design being expanded to include the concept of tools, they have become more and more accessible to those without the technical know-how. This design methodology would slot right in, I think.

To be completely honest, I think that this will make NPR dramatically less complex to perform in Blender, as there is so much finagling you have to do to get around existing design limitations at the moment, there is quite a bit of prerequisite knowledge necessary to get the look you want at times.

Member

@pragma37 Thanks for the explanation :D

The order with NPR would be:
EEVEE GBuffer -> EEVEE Shading -> NPR Shading -> NPR Sample accumulation -> Compositor

What I was thinking is: could we send the results from NPR Sample accumulation back to e.g. the material, to affect shading that way? (Well, probably not, because I guess the structure would be way more complex, but that's what I intended to describe 😅 )

Author
Member

@nickberckley

I get the reason it's desirable to have nodes work between Cycles and EEVEE

Nope, the reasons are way more involved than that. This is from a previous design mailing thread.
The context was discussing how to support the features from the Miro board:


(...)

Textures/Filters (Screen-Space normal effects, Custom Refraction, Screen-Space shadow filters)
Do we add texture sockets and filters to material nodes? Do we only support hard-coded built-in effects?
Material nodes are not well prepared to build custom filters (and probably shouldn’t? surface fragment shaders are a bad place to do so from a performance standpoint).

Pre-pass dependency (Cavity, Curvature, Rim, Screen-Space normal effects)
These effects assume there’s a pre-pass, but that’s not always the case.
For example:

  • Transparent materials (would require layer based OIT).
  • Ray-tracing (alternative implementation possible, but it would be either very noisy or very expensive)
  • Shadow maps (see dependency issues)

Dependency issues (almost all features)
Right now material nodes follow (for the most part) a surface properties -> shading -> output order.
However, a material nodes approach where shading features are available as color inputs would mean that hierarchy is completely lost.
For example:

  • You can use shadows to drive transparency or displacement, which should affect shadows, and you get a dependency loop.
  • You can use GI to drive shading which would again cause a dependency loop.
    This means that things that can be easily expressed with nodes will result in “undefined behavior” that doesn’t translate to the actual output, and could easily change based on implementation details.

Energy conservation (GI)
NPR shading is usually non energy conserving, which, at best will convert some objects into lamps, and at worst would cause the whole lighting to blow out.

For these reasons, I’m leaning towards a solution with an explicit distinction between material nodes and NPR nodes. A separate node system (halfway between materials and compositing nodes) with their inputs exposed to material nodes.

(...)

These would provide several advantages:

  • Material nodes are still (almost) fully compatible between engines.
  • No dependency issues. Shading nodes are inputs that only affect the NPR output, and this is exposed in an obvious/intuitive way to the user.
  • Transparency and displacement are configured in the regular material nodes.
  • Features that require a true BSDF(GI) can just use the regular material.
  • Can implement proper support for filter nodes and texture sockets, or any other NPR specific feature we need without having to fight against the material nodes design.
  • Easier to support custom (glsl) nodes, since we are operating in a simpler context with just plain values and texture data.
  • Future proof, would work with ray-tracing or any other method that can write several layers of AOVs.
  • Doesn’t put too many constraints on the underlying engine, so could share most of its implementation with EEVEE.
  • Leaves the door open for eventually working with Cycles and render engine addons.

Cons:

  • It’s an extra step that requires writing extra texture data and memory to store it.
  • Can’t use UVs or Vertex Colors without writing them to extra render targets or rendering the mesh again.

This thread is for discussing the technical design, I deliberately left as many UX decisions as possible out of this proposal because figuring out the technical integration/implementation and its implications is already complex enough.
We will ask for user feedback once we get into the UX side, so please, let's keep the bikeshedding aside for now.

Author
Member

@ChengduLittleA

What I was thinking is that could we send the results from NPR Sample accumulation back to e.g. material to affect shading that way? (Well probably not cause I guess the structure would be way more complex, but that's what I was intended to describe 😅 )

It's not just a complexity issue. If you do something like that, the image won't ever converge.
That's why the split between aliased/converged sockets would be needed; you can't just mix aliased and converged data.


How much of the shader pipeline would be considered to be implemented?


If the current goal is toon shading in some recent games (Genshin, etc.), using the shading features in PBR renderers such as EEVEE is enough. This should be the main goal since I don't see many other practically used styles of NPR on the market.

One pass for each material instance is against the idea of deferred shading. However, tagging pixels with material/object ID can improve performance.

A proper NPR render engine will eventually handle more than GBuffers, such as rigging, extracting features such as curves from the mesh, and user interactions like sketching.
Otherwise, I don't think such an engine has any advantage over Unity, which is very convenient for iterating on different shading styles and already has many existing solutions & projects.

Author
Member

@WangZiWei-Jiang
The goal is to support arbitrary stylized shading in a way that works consistently with all render features, not just for primary rays.

Rigging and tool-based features are not the job of render engines in Blender, such features should follow the already established Blender workflows.

Geometry-based features would still be possible, but they should work consistently with the rest of the rendering features.

I don't agree with your Unity statement, but that's completely off-topic.

Author
Member

@HannahFantasia Sorry, I don't understand the question.


@pragma37 Would it be possible with this approach to do something like a Gouraud shader?

Author
Member

@HannahFantasia Ah, now I see what you meant.
No, the geometry is not supposed to be rendered by the NPR engine, so it can't support custom vertex shaders.


@pragma37 Could it be possible to have something similar to the simulation zone, like an NPR zone?

Author
Member

@Traslev Maybe? We could allow reading the render targets and AOVs from the previous frame.
I can see a few use cases and it shouldn't be that hard to support.
