OpenXR: Virtual Monitor experiment #102182

Open
opened 2022-10-31 13:01:30 +01:00 by Jeroen Bakker · 27 comments
Member

During the Blender Conference there was a discussion about a limitation of using Blender in a virtual production.
Most of the time was spent explaining what should be done; what was lacking was a look at how Blender is actually being used in such a setup.

This task will gather some feedback from users to find out what is needed to integrate a virtual monitor inside a scene.
The proposed solution was to render the virtual camera to a texture, and after that render the main scene camera, which could include the texture from the virtual camera.

During the conference I thought of several ways to do it (from a coding perspective):

  1. Just wait until the Viewport Compositor can handle multiple scenes and "non-blocking rendering".
  2. Have a special shading node that can input the texture of the virtual camera. The material can then be put on any geometry.

Although both options are possible and would lead to similar performance, I think that only the first option would be acceptable for Blender in the longer run. The second option could be added to an experimental branch for testing purposes, as it doesn't seem to be that complicated.

The main issue I still have is getting an overview of the exact problem. What I got out of it so far:

  • The virtual camera is controlled by a controller via OpenXR. It shows the visible frustum of the scene.
  • This part is rendered and displayed in the same scene on an image plane. This image plane will then act as a virtual monitor.
  • The main camera is rendered and displayed on the physical display(s).

Use cases for this:

  • View the output of a virtual camera while still having a VR headset on, e.g. look at the camera controller and see what that camera is recording while inside an XR session.

This is an experiment to see if this is possible and to demonstrate the consequences of this from a user and developer perspective. After this experiment it should be possible to use it in an experimental branch. I don't think it should be in master until the consequences have been discussed and further evaluated.

For discussion of this task use the #xr channel on Blender Chat.
The branch is called `temp-xr-virtual-camera-experiment`.

Technical details:

  • Draw manager instance data is currently stored in the 3D viewport; each virtual camera could have its own GPUOffScreen... (see the sketch after this list)
  • The virtual camera texture will only exist as a GPUTexture. It will not be available on the CPU, for performance reasons. The GPUTexture is in linear color space.
  • Image planes are only visible in Eevee, as the approach uses a regular texture and material. To hide internal complexity a new Virtual Camera node will be available in the material shader graph.
  • When rendering the virtual camera, any geometry using a Virtual Camera node will not be drawn.
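
To make the data flow concrete, here is a minimal Python sketch of the same idea using the `bpy`/`gpu` API. The actual experiment lives in the C++ draw manager; the camera name, resolution, and function below are illustrative placeholders, not the branch's real code:

```python
# Sketch: render a scene camera into a GPU-only texture, assuming this
# runs inside a View3D draw handler (so context.space_data is a SpaceView3D).
import bpy
import gpu

SIZE = 512  # virtual monitor resolution, arbitrary for this sketch
offscreen = gpu.types.GPUOffScreen(SIZE, SIZE)

def draw_virtual_camera(context, camera_name="VirtualCamera"):
    """Render `camera_name` into the offscreen; the result stays on the GPU."""
    depsgraph = context.evaluated_depsgraph_get()
    camera = bpy.data.objects[camera_name]
    view_matrix = camera.matrix_world.inverted()
    projection_matrix = camera.calc_matrix_camera(depsgraph, x=SIZE, y=SIZE)
    offscreen.draw_view3d(
        context.scene,
        context.view_layer,
        context.space_data,  # SpaceView3D hosting the draw handler
        context.region,
        view_matrix,
        projection_matrix,
        do_color_management=False,  # keep the texture in linear color space
    )
    return offscreen.texture_color  # GPUTexture, never read back to the CPU
```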

[0001-1813.mp4](https://archive.blender.org/developer/F13891417/0001-1813.mp4)

TODO:

  - [x] Add Virtual Camera node.
  - [x] Add runtime data structures to Camera.
  - [x] Detect existence of a Virtual Camera node and update its texture (see the sketch after this list). We can do this outside the draw manager for clarity, as it is a hack.
  - [x] Hook the generated GPU texture to the Virtual Camera node.
  - [x] Create a temporary View3D for correct view/window matrices.
  - [x] Enable this for OpenXR sessions.
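
A small sketch of the detection step, in Python for illustration. The node idname below is hypothetical; the real identifier only exists in the experimental branch:

```python
# Sketch: find every (hypothetical) Virtual Camera node currently in use,
# so its texture can be updated outside the draw manager.
import bpy

VIRTUAL_CAMERA_NODE_IDNAME = "ShaderNodeVirtualCamera"  # assumed idname

def materials_with_virtual_camera():
    """Yield (material, node) pairs for every Virtual Camera node."""
    for mat in bpy.data.materials:
        if mat.use_nodes:
            for node in mat.node_tree.nodes:
                if node.bl_idname == VIRTUAL_CAMERA_NODE_IDNAME:
                    yield mat, node

# Geometry using these materials must be skipped while rendering the
# virtual camera itself, to avoid a feedback loop (see technical details).
```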

Known limitations:

  • Performance.
  • Only Eevee is supported. Cycles could be supported and perhaps even without rendering twice.
  • The virtual camera texture is visible in reflections/refractions and shadows. It should be turned off by the user.
Author
Member

Changed status from 'Needs Triage' to: 'Confirmed'

Author
Member

Added subscriber: @Jeroen-Bakker

Jeroen Bakker changed title from Virtual Studio to OpenXR: Virtual Camera integration [Draft] 2022-10-31 13:06:00 +01:00

Added subscriber: @Skyfish


Could you add a link to where at the Blender Conference the VR was discussed?

What is meant by a virtual camera in this context?
I understand it to mean a camera object that you can grab and handle in VR, and that displays what it sees in the VR space as though using a real-life camera?
You can see examples of this in practice in other VR applications like Oculus/Adobe Medium, where you can take a screenshot with a camera-like thing.

Author
Member

Added subscriber: @sebastian_k

Author
Member

@Skyfish it was a discussion that took place outside the official agenda. I added this ticket as I was asked to come up with some ideas on how this could work. Currently I am looking into clearing up the design and what everything means; therefore this is set as a draft design.


Added subscriber: @JacobMerrill-1


Realtime composition in AR is also important, to overlay the rendered scene on top of the incoming pass-through video.

Jeroen Bakker changed title from OpenXR: Virtual Camera integration [Draft] to OpenXR: Virtual Camera experiment 2022-11-02 08:05:21 +01:00

Added subscriber: @SimeonConzendorf


Added subscriber: @OtherRealms

Contributor

Added subscriber: @vnapdv


Added subscriber: @brecht


For reference here is an explanation of terminology used in virtual production:
https://www.vpglossary.com/

What's a bit confusing here is that the term "virtual camera" in this task is used both for the camera object and the camera feed. These may come from e.g. the same mobile device, but not necessarily, or there may be no camera feed at all. I suggest using the term "camera feed" when referring to the image coming from a real camera.

Contributor

Added subscriber: @RedMser


Can the XR camera passthrough be exposed as an image source for AR as well?

The Meta Quest 2 has black-and-white video; the Quest Pro has HD color passthrough.


In #102182#1440430, @brecht wrote:
For reference here is an explanation of terminology used in virtual production:
https://www.vpglossary.com/

What's a bit confusing here is that the term "virtual camera" in this task is used both for the camera object and the camera feed. These may come from e.g. the same mobile device, but not necessarily, or there may be no camera feed at all. I suggest using the term "camera feed" when referring to the image coming from a real camera.

I agree, more precise terminology is needed.
For example, as far as I know, in the goal for this specific case there wouldn't even be a "real camera", i.e. no physical one. Not to say that this would not also be useful :)
Though according to that glossary the term "virtual camera" is actually accurate:

A camera in a real-time engine which behaves the same way a real-world camera would with respect to optics, aspect ratio, etc. A Vcam can be manipulated using a tracked device such as a mobile device, tablet, game controller, or a physical object with a tracking reference attached such as a real-world tripod, dolly, crane, drone, etc.

But which term do we use for the viewport camera, the one that is rendered to the VR headset?


My understanding is that "virtual camera" usually refers to the analog of a physical camera on set. Not a VR headset where someone arbitrarily walks and looks around, but a camera that is framing the shot the way the director wants it to be filmed.

And so if you're making a film for TV or cinema, the output of the virtual camera is usually displayed on a regular monitor. No VR needed.

You can involve a VR headset in this as well, but to me it seems mostly orthogonal to the concept of a virtual camera. You can position the virtual camera with various controller devices, and you can use the virtual camera to render for various display devices.

Re-reading the task, it sounds like the idea is to have some kind of "virtual monitor" that you can place in the scene, in case you want to work fully in a VR headset? That virtual monitor would display the output of a camera in the scene, but it does not need to make assumptions about how that camera is controlled. For example the camera motion could have been keyframed in advance, or the same virtual camera may be used for display on a virtual monitor in VR and a physical monitor at the same time.

What got me confused is that the first few sentences of the task seem to imply this is about solving fundamental issues for virtual production, whereas this seems to be about a very specific workflow that I'm guessing is not the norm.

Member

Added subscriber: @Blendify


Added subscriber: @satishgoda1

Author
Member

I agree with the term "virtual monitor". It took me some time to understand the case and what is needed, and I agree that we should use consistent and correct names for these terms.

Jeroen Bakker changed title from OpenXR: Virtual Camera experiment to OpenXR: Virtual Monitor experiment 2022-11-09 14:58:46 +01:00

I agree with the term as well. It's a virtual monitor attached to a virtual camera. :)


So cool that you started working on this so fast. Thanks to the released Windows build I could test it. I just started with a small scene, "a virtual monitor showing the default cube :)", and it worked. But trying to do more, it crashed a lot.

Two spontaneous thoughts:
Should I report bugs/crashes here (or somewhere else), or is it just too early?
Important (and independent) settings for the virtual camera would be aspect ratio and resolution. Changing the FOV is one of the things that crashes.

I'm looking forward to testing and helping where I can.


Here's a compact crash report. If I can gather more useful information, tell me.
build: blender-3.5.0-alpha+temp-xr-virtual-camera-experiment.f7cdba506b0b-windows.amd64-release

It crashes when I...

  • browse the materials
  • want to show material properties
  • want to change the FOV of the camera (and other settings too)
  • stop the VR session

Here's a video showing it. All of this only starts to crash after the first usage of the virtual monitor node.
[record_22-11-2022_10-55-07.mp4](https://archive.blender.org/developer/F13959510/record_22-11-2022_10-55-07.mp4)

Member

Added subscriber: @muxed-reality

Member

In #102182#1449649, @SimeonConzendorf wrote:
It crashes when I...

  • browse the materials
  • want to show material properties
  • want to change the fov of the camera (and other settings too)
  • stop the vr session

The crash when stopping the VR session should be fixed now with db28a8b3d1.
The other crashes (browsing materials, showing material properties, changing camera settings) are due to a failed assert in `gpu_framebuffer.cc` L602, although I'm not sure why this occurs.

Here's a crash report using the [xr-dev branch](https://builder.blender.org/download/experimental/xr-dev/), which now includes the features from `temp-xr-virtual-camera-experiment`:
[blender.crash.txt](https://archive.blender.org/developer/F13963035/blender.crash.txt)


Hi, I'm very interested in this experiment, but for virtual production display.
In this use case the problem is to get the projection of the main camera render onto arbitrary geometry (in general just planes or curved planes) and output the result unfolded (to display this image on the wall).
It's a bit like a realtime baking of the current render projected onto simple geometry.

For this purpose I need 2-pass rendering like in this demo (see the sketch below).

Where can I find the diff of this experiment, to try to understand how it's made?
Thank you very much for all the stuff you did, guys.
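
A rough sketch of the two-pass idea described above, using Blender's Python `gpu` module. It assumes the pass-1 camera render is already available as a `GPUTexture` (e.g. from the experiment's offscreen) and that `view_proj` is the main camera's projection × view matrix; `unfolded_wall_image` and its parameter names are hypothetical:

```python
# Pass 2 of the "unfold" idea: draw the wall mesh in its UV layout,
# sampling the pass-1 camera render with camera-projected coordinates.
import gpu
from gpu_extras.batch import batch_for_shader

def unfolded_wall_image(context, wall, camera_texture, view_proj, size=1024):
    """Re-project a camera render into `wall`'s UV layout (a realtime bake)."""
    depsgraph = context.evaluated_depsgraph_get()
    mesh = wall.evaluated_get(depsgraph).to_mesh()
    mesh.calc_loop_triangles()
    uv_data = mesh.uv_layers.active.data

    positions = []  # UV layout, remapped from [0, 1] to clip space [-1, 1]
    texcoords = []  # where each vertex lands in the main camera's image
    for tri in mesh.loop_triangles:
        for loop_index in tri.loops:
            u, v = uv_data[loop_index].uv
            positions.append((u * 2.0 - 1.0, v * 2.0 - 1.0))
            vertex = mesh.vertices[mesh.loops[loop_index].vertex_index]
            ndc = view_proj @ (wall.matrix_world @ vertex.co).to_4d()
            # Per-vertex perspective divide; good enough for a sketch.
            texcoords.append((ndc.x / ndc.w * 0.5 + 0.5,
                              ndc.y / ndc.w * 0.5 + 0.5))

    shader = gpu.shader.from_builtin('IMAGE')
    batch = batch_for_shader(shader, 'TRIS',
                             {"pos": positions, "texCoord": texcoords})

    offscreen = gpu.types.GPUOffScreen(size, size)
    with offscreen.bind():
        gpu.state.active_framebuffer_get().clear(color=(0.0, 0.0, 0.0, 0.0))
        shader.bind()
        shader.uniform_sampler("image", camera_texture)
        batch.draw(shader)
    return offscreen.texture_color  # the unfolded wall image, on the GPU
```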

Member

@sebastientricoire Hi, you can use these diffs as a base, but they may not apply cleanly as they were based on Blender 3.5-alpha and the [xr-dev](https://projects.blender.org/blender/blender/src/branch/xr-dev) experimental branch:
https://projects.blender.org/blender/blender/commit/3e725b55cf87287cdd65f99b2d654306b6766c9c
https://projects.blender.org/blender/blender/commit/786aad68b2c210bcb2f56e2ae2bf5daa92b669eb