Realtime Viewport Compositor #91293

Closed
opened 2021-09-09 20:38:30 +02:00 by Clément Foucault · 55 comments

The project is still in design phase.


Team
Commissioner: @fclem
Project leader: ?
Project members: -

Description
Big picture:
This feature will enable the user to run the compositor on the viewport output and have realtime feedback.

Use cases:

  • Simple vignetting and color correction.
  • Adding Bloom (Glare) and lens-flare effects.
  • Masking and VFX compositing.
  • NPR adjustment to scene shading and multi scene composition.

Design:
From the user's perspective this will be as simple as a new checkbox in the shading popover that enables the viewport compositor. The compositor will be renderer agnostic and should work on top of EEVEE as well as Cycles or any other renderer.
The compositor will only work in camera view for now. This is needed to convert absolute positions to relative positions, which is necessary when the viewport does not have the same aspect ratio as the camera.
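To illustrate the aspect-ratio issue, here is a minimal sketch of mapping a normalized viewport coordinate to a camera-relative one. All names are hypothetical (not Blender API), and it assumes the camera frame is centered and letter-/pillar-boxed inside the viewport:

```c
/* Hypothetical sketch, not Blender API. */
typedef struct { float x, y; } float2;

/* Relative size of the camera rectangle inside the viewport. */
static float2 camera_region_scale(float viewport_aspect, float camera_aspect)
{
  float2 scale = {1.0f, 1.0f};
  if (viewport_aspect > camera_aspect) {
    scale.x = camera_aspect / viewport_aspect; /* Pillar-boxed. */
  }
  else {
    scale.y = viewport_aspect / camera_aspect; /* Letter-boxed. */
  }
  return scale;
}

/* Convert a viewport coordinate in [0..1] to a coordinate in [0..1]
 * relative to the camera frame. */
static float2 viewport_to_camera_relative(float2 co,
                                          float viewport_aspect,
                                          float camera_aspect)
{
  float2 scale = camera_region_scale(viewport_aspect, camera_aspect);
  float2 result;
  result.x = (co.x - 0.5f * (1.0f - scale.x)) / scale.x;
  result.y = (co.y - 0.5f * (1.0f - scale.y)) / scale.y;
  return result;
}
```

For a 2:1 viewport showing a 1:1 camera, the camera frame occupies the middle half of the viewport width, so viewport x = 0.25 maps to camera-relative x = 0.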

Engineer plan:
This feature needs to be based upon the eevee-rewrite branch in order to use the rewritten gpu_codegen.cc, which will ease the implementation.

Work plan

The third and fourth milestones are independent from each other and could be tackled in any order.

Note that the time estimates come from @fclem and assume the developer is familiar with the codebase.

Milestone 1 - Single Pass Compositing
The first step is basic support for a single-pass compositing node tree. This means supporting only compositing nodes that modify pixels in place. Nodes that move pixels (e.g. Distortion) or scatter them (e.g. Blur) will not be supported. Only one input image will be supported.

![Capture d’écran du 2021-02-28 18-30-20.png](https://archive.blender.org/developer/F10391589/Capture_d_écran_du_2021-02-28_18-30-20.png)

    • Create the Compositor engine and inject it at the right place in the drawing pipeline.
    • Enable it when the compositor is enabled and the node tree is compatible (always report the tree as compatible at this stage). Add a toggle in the shading popover to enable the compositor.
    • Have GPUMaterial support compositor node trees. Add an equivalent of ntreeGPUMaterialNodes (ntreeGPUCompositeNodes) in node_composite_tree.c. Create GPU functions for some easy nodes. The GPUMaterials need to be stored somewhere (scene?).
    • Display an error/info message in the viewport if the node tree is incompatible (unsupported nodes). We need this error message because there is no way to show compilation errors as a pink shader in the scene.
    • Could have stubs for all nodes and tag the material as invalid during evaluation of unsupported nodes. This would make it possible to differentiate between compilation errors and unsupported nodes.
    • Disable support if more than one input is used.
    • The Render Layer input could be as simple as defining a uniform sampler in the shader lib file and binding the render color to it during rendering.
    • Add support for most single-pass nodes.

Time estimate: 2-3 weeks (+1 week for all listed nodes)

Milestone 2 - Multi Layer Input
Next, we need to figure out how to request multiple view layers, scenes, or render passes from the realtime engines. These should serve as inputs to the compositing node tree. The draw manager should manage this somehow.

One limitation we will directly hit is the sampler limit. Only 16 to 24 texture slots are available on modern hardware. This limitation could be lifted during stage 3 or by using the bindless texture extension in the GPU module (still not planned).
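A sketch of what hitting this limit looks like at tree-validation time; the constant and the counting are illustrative only (real code would query the driver limit, e.g. via GL_MAX_TEXTURE_IMAGE_UNITS):

```c
/* Illustrative budget check, not Blender code. Assumes each render
 * layer and each image input consumes one texture slot. */
#define COMPOSITOR_MAX_SAMPLERS 16

static int compositor_tree_is_bindable(int num_render_layers,
                                       int num_image_inputs)
{
  return (num_render_layers + num_image_inputs) <= COMPOSITOR_MAX_SAMPLERS;
}
```

With a conservative 16-slot budget, a tree with ten render layers plus eight image inputs would already have to be rejected or split.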

External engine support should be as simple as creating one RenderEngine instance per Render Layer.

![Capture d’écran du 2021-02-28 18-31-05.png](https://archive.blender.org/developer/F10391592/Capture_d_écran_du_2021-02-28_18-31-05.png)

    • Gather the needed inputs during node tree evaluation. Might need another type of image node link that stores the scene name and layer name.
    • Declare the uniform samplers in the generated GLSL. Make use of GPU_vertformat_safe_attr_name to get a GLSL-friendly name.
    • We should check for each sampler that it does not get optimized out by the compiler, to avoid querying data that will not be used. Note that our node tree evaluation function should only visit nodes that are linked to the output node, so this might not be necessary in practice. But in some cases a checkbox option could disable some paths, making some resources unused.
    • The draw manager should call the compositor engine to allocate the needed textures and framebuffers before rendering. Then the draw manager loops over the needed scenes / render layers and binds the correct framebuffer to draw each scene onto. There might be some tricks needed (like rendering the active view layer last) to ensure the overlays get the right depth to composite with. We might also need to swap the textures in DefaultTextureList.
    • Depsgraph management is yet to be worked out. My initial guess is that we need to manage multiple depsgraphs.

Time estimate: 1-2 weeks once the design is set in stone.

Milestone 3 - Image Texture Support

This is the extended data input stage. Until now we could only use the Render Layer input node. We need to implement other input nodes like Masks, Bokeh Textures, Images, and Tracks. The implementation design for Masks and Track Position is still unfinished.

The most useful ones (Image and Movie Clip) are trivial to support since GPUTextures already exist for them.

Masks, on the other hand, would be better rendered on the GPU using something like https://www.shadertoy.com/view/4sKyzW. This could be done directly in the fragment shader using many uniform parameters, one set per bezier segment. Or, using the stage 4 improvements, with a dedicated shader and buffers. Using a signed distance field allows anti-aliasing and arbitrary feathering.

We also need to support render passes. This is a big part of EEVEE's rewrite, which needs to be done in order to support them in the viewport. Cycles could do it too; it just needs some changes in the external draw engine to draw the multiple passes. Either way, the Render Layer node should be able to have GLSL outputs for these and be able to request them from the draw manager / render engine.

    • Image/Movie Clip support.
    • Mask support could be done well after the initial release. No need to rush it.
    • Bokeh textures should be easy as a separate buffer computation (needs stage 4 first).
    • Render passes do not need EEVEE's rewrite for testing. One could just clear the buffers to a special color and make sure they are fed through correctly.

Time estimate: 1 week (without Masks or Bokeh).

Milestone 4
The last stage is to support multi-pass rendering and the rest of the nodes. This is one of the trickiest parts, as the GPUMaterial would need to be split into multiple passes using unique shaders. We also have to minimize VRAM/bandwidth usage by allocating the minimal number of intermediate buffers.

Another difference is that links now represent full buffers. So node inputs need a fallback texture when no link is connected to them, or a shader code variation taking float/vec4 inputs instead of sampler2D.

Almost all possible setups would be possible after implementing this.
Note that some nodes might themselves expand to multiple passes to improve performance (e.g. box blur, gaussian blur).
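The reason blur nodes expand to multiple passes is separability: a 2D box (or gaussian) blur equals a horizontal 1D pass followed by a vertical 1D pass, reducing the samples per pixel from O(r²) to O(r) per pass. A plain-C emulation of the two GPU passes, using a radius-1 box filter with clamped edges:

```c
/* One horizontal box-blur pass (radius 1) on a w*h float image. */
static void blur_pass_x(const float *src, float *dst, int w, int h)
{
  for (int y = 0; y < h; y++) {
    for (int x = 0; x < w; x++) {
      int xl = (x > 0) ? x - 1 : 0;         /* Clamp at left edge. */
      int xr = (x < w - 1) ? x + 1 : w - 1; /* Clamp at right edge. */
      dst[y * w + x] = (src[y * w + xl] + src[y * w + x] + src[y * w + xr]) / 3.0f;
    }
  }
}

/* The matching vertical pass, run on the output of the first pass. */
static void blur_pass_y(const float *src, float *dst, int w, int h)
{
  for (int y = 0; y < h; y++) {
    int yd = (y > 0) ? y - 1 : 0;
    int yu = (y < h - 1) ? y + 1 : h - 1;
    for (int x = 0; x < w; x++) {
      dst[y * w + x] = (src[yd * w + x] + src[y * w + x] + src[yu * w + x]) / 3.0f;
    }
  }
}
```

Running the two passes back to back on a 3x3 image containing a single bright pixel spreads its energy evenly over all nine pixels, the same result as a full 3x3 box kernel. On the GPU, each pass is a separate shader with its own intermediate buffer, which is exactly why the GPUMaterial must be split.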

![Capture d’écran du 2021-02-28 21-49-10.png](https://archive.blender.org/developer/F10391594/Capture_d_écran_du_2021-02-28_21-49-10.png)

    • Split GPUMaterial into smaller trees. Make GPUMaterial a “meta” type that can contain a list of GPUMaterials. The data flow between them also needs to be stored. No new nodes need to be implemented; only splitting the tree and keeping the data flowing. Splitting needs to happen in ntreeGPUCompositeNodes. A simple but not very efficient way to do this is to just consider the multi-pass nodes and split the trees at their inputs and outputs.
    • Implement most remaining nodes. Some difficult nodes might be left to be implemented later (e.g. Defocus).
    • Make it efficient: remove unneeded buffer writes. A buffer write only needs to happen if the node's input is sampled at a different location, or if the size of the output differs (Scale node).
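The buffer-elision rule from the last point can be stated as a small predicate. The enum and struct below are illustrative, not Blender types:

```c
/* Illustrative sketch of the rule: an intermediate buffer is only
 * required when the downstream node samples its input somewhere other
 * than the current pixel, or when the output size changes. */
typedef enum {
  SAMPLING_CURRENT_PIXEL, /* e.g. Color Balance: can be fused into one shader. */
  SAMPLING_NEIGHBORHOOD,  /* e.g. Blur, Distort: needs its input in a buffer. */
} eNodeSampling;

typedef struct {
  eNodeSampling sampling;
  int output_size_differs; /* e.g. the Scale node. */
} CompositorNodeInfo;

static int node_needs_input_buffer(const CompositorNodeInfo *node)
{
  return node->sampling == SAMPLING_NEIGHBORHOOD || node->output_size_differs;
}
```

Chains of nodes for which this returns false can keep being fused into a single pass, exactly as in milestone 1; only the remaining links turn into intermediate buffers.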

Time estimate: 1-2 weeks to split the tree + 1 week to implement simple blur nodes + 1 week for simple deformation nodes + 1 week for optimisation

Branch: - Existing patches or branches.


Relevant links:

  • https://www.shadertoy.com/view/4sKyzW for bezier distance field
Changed status from 'Needs Triage' to: 'Confirmed'

How is the status of this project? Is there a branch to test it?
Does it allow for transforms (scale, rotate etc.) already?

> Is there a branch to test it?

There is a branch available at https://builder.blender.org/download/experimental/archive/ but I guess there is not much that you can do with it yet.


> The project is still in design phase.

The first sentence of this task states it.


![image.png](https://archive.blender.org/developer/F13122012/image.png)

So, are there any updates?
Changed status from 'Confirmed' to: 'Archived'
This is now superseded by #99210.
Reference: blender/blender#91293