OpenXR: Virtual Monitor experiment #102182
Reference: blender/blender#102182
During the Blender Conference there was a discussion explaining a limitation of using Blender in a virtual production.
Most of the time was spent on explaining what should be done and what was lacking, rather than looking at how Blender is actually being used in such a setup.
This task will gather some feedback from users to find out what is needed to integrate a virtual monitor inside a scene.
The proposed solution was to render the virtual camera to a texture, and afterwards render the main scene camera, which could include the texture from the virtual camera.
During the conference I thought of several ways to do it (from a coding perspective).
Although both options are possible and would lead to similar performance, I think that only the first option would be acceptable for Blender in the longer run. The second option could be added to an experimental branch for testing purposes, as it doesn't seem to be that complicated.
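To make the render-to-texture idea above concrete, here is a minimal NumPy sketch of the two passes: pass 1 renders the scene from the virtual camera into an offscreen buffer, pass 2 renders the main camera and pastes that buffer into the region covered by the virtual monitor. This is an illustration of the data flow only, not Blender code; all function and parameter names are made up for this sketch.

```python
# Hypothetical two-pass "virtual monitor" sketch (not Blender API).
import numpy as np

def render_virtual_camera(width, height):
    """Pass 1: stand-in for rendering the scene from the virtual camera."""
    img = np.zeros((height, width, 3), dtype=np.float32)
    img[:, :, 1] = 1.0  # pretend the virtual camera sees a green scene
    return img

def render_main_camera(width, height, monitor_tex, monitor_rect):
    """Pass 2: render the main view, sampling the offscreen texture
    wherever the virtual monitor surface is visible (a rectangle here)."""
    img = np.full((height, width, 3), 0.2, dtype=np.float32)  # background
    x0, y0, x1, y1 = monitor_rect
    # Resample the monitor texture to the on-screen rectangle
    # (nearest-neighbour for brevity).
    th, tw = monitor_tex.shape[:2]
    ys = np.linspace(0, th - 1, y1 - y0).astype(int)
    xs = np.linspace(0, tw - 1, x1 - x0).astype(int)
    img[y0:y1, x0:x1] = monitor_tex[np.ix_(ys, xs)]
    return img

tex = render_virtual_camera(64, 48)            # offscreen pass
frame = render_main_camera(320, 240, tex, (40, 30, 200, 150))  # main pass
```

The key ordering constraint is visible here: the virtual camera pass must complete before the main camera pass samples its result.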
The main issue I still have is getting an overview of the exact problem. What I got out of it so far:
Use cases for this:
This is an experiment to see if this is possible and to demonstrate the consequences from a user and developer perspective. After this experiment it should be possible to use it in an experimental branch. I don't think it should be in master until the consequences have been discussed and further evaluated.
For discussion of this task use #xr channel on blender chat.
The branch is called temp-xr-virtual-camera-experiment.
Technical details:
0001-1813.mp4
TODO:
Known limitations:
Changed status from 'Needs Triage' to: 'Confirmed'
Added subscriber: @Jeroen-Bakker
Changed title from 'Virtual Studio' to 'OpenXR: Virtual Camera integration [Draft]'
Added subscriber: @Skyfish
Could you add a link to where at the Blender Conference the VR topic was discussed?
What is meant by a virtual camera in this context?
I understand it to mean a camera object that you can grab and handle in VR, that displays what it sees in the VR space as though using a real life camera?
You can see examples of this in practice in other VR applications like Oculus/Adobe medium, where you can take a screenshot with a camera-like thing.
Added subscriber: @sebastian_k
@Skyfish it was a discussion that took place outside the official agenda. I added this ticket as I was asked to come up with some ideas on how this could work. Currently I am looking into clarifying the design and what everything means; therefore this is set as a draft design.
Added subscriber: @JacobMerrill-1
Realtime compositing in AR is also important, to overlay the rendered scene on top of the incoming pass-through video.
Changed title from 'OpenXR: Virtual Camera integration [Draft]' to 'OpenXR: Virtual Camera experiment'
Added subscriber: @SimeonConzendorf
Added subscriber: @OtherRealms
Added subscriber: @vnapdv
Added subscriber: @brecht
For reference here is an explanation of terminology used in virtual production:
https://www.vpglossary.com/
What's a bit confusing here is that the term "virtual camera" in this task is used both for the camera object and the camera feed. These may come from e.g. the same mobile device, but not necessarily, or there may be no camera feed at all. I suggest using the term "camera feed" when referring to the image coming from a real camera.
Added subscriber: @RedMser
Can the XR camera passthrough also be exposed as an image source for AR?
The Meta Quest 2 has black-and-white video; the Quest Pro has HD color passthrough.
I agree, more precise terminology is needed.
For example, as far as I know, in the goal for this specific use case there wouldn't even be a "real camera", as in no physical one. Not to say that this would not also be useful :)
Though according to that glossary the term "virtual camera" is actually accurate:
But which term do we use for the viewport camera, the one that is rendered to the VR headset?
My understanding is that "virtual camera" usually refers to the analog of a physical camera on set. Not a VR headset where someone arbitrarily walks and looks around, but a camera that is framing the shot the way the director wants it to be filmed.
And so if you're making a film for TV or cinema, the output of the virtual camera is usually displayed on a regular monitor. No VR needed.
You can involve a VR headset in this as well, but to me it seems mostly orthogonal to the concept of a virtual camera. You can position the virtual camera with various controller devices, and you can use the virtual camera to render for various display devices.
Re-reading the task, it sounds like the idea is to have some kind of "virtual monitor" that you can place in the scene, in case you want to work fully in a VR headset? That virtual monitor would display the output of a camera in the scene, but it does not need to make assumptions about how that camera is controlled. For example the camera motion could have been keyframed in advance, or the same virtual camera may be used for display on a virtual monitor in VR and a physical monitor at the same time.
What got me confused is that the first few sentences of the task seem to imply this is about solving fundamental issues for virtual production, whereas this seems to be about a very specific workflow that I'm guessing is not the norm.
Added subscriber: @Blendify
Added subscriber: @satishgoda1
I agree with the term "virtual monitor". It took me some time to understand what is needed, and I agree that we should use consistent and correct names for these concepts.
Changed title from 'OpenXR: Virtual Camera experiment' to 'OpenXR: Virtual Monitor experiment'
I agree with the term as well. It's a virtual monitor attached to a virtual camera. :)
So cool that you started working on this so fast. Because of the released Windows build I could test it. I just set up a small scene, "A virtual monitor showing the default cube :)", and it worked. But when trying to do more, it crashed a lot.
Two spontaneous thoughts:
Should I report bugs/crashes here (or somewhere else) or is it just too early?
Important (and independent) settings for the virtual camera would be aspect ratio and resolution. Changing the FOV is one of the things that crashes.
I'm looking forward to test and help, where I can.
Here's a compact crash report. If I can gather more useful information, let me know.
build: blender-3.5.0-alpha+temp-xr-virtual-camera-experiment.f7cdba506b0b-windows.amd64-release
It crashes when I...
Here's a video showing it. All of this only starts crashing after the first use of the virtual monitor node.
record_22-11-2022_10-55-07.mp4
Added subscriber: @muxed-reality
The crash when stopping the VR session should be fixed now with db28a8b3d1.
The other crashes (browsing materials, showing material properties, changing camera settings) are due to a failed assert in gpu_framebuffer.cc L602, although I'm not sure why this occurs.
Here's a crash report using the xr-dev branch, which now includes the features from temp-xr-virtual-camera-experiment:
blender.crash.txt
Hi, I'm very interested in this experiment, but for virtual production display.
In this use case the problem is to get the projection of the main camera render onto arbitrary geometry (in general just planes or a curved plane) and output the result unfolded (to display this image on the wall).
It's a bit like realtime baking of the current render projected onto simple geometry.
For this purpose I need 2-pass rendering like in this demo.
Where can I find the diff of this experiment, to try to understand how it's made?
Thank you very much for all the work you did, guys.
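The "unfold" step described in this use case — taking the region of the main render that covers a plane and resampling it into the plane's own texture space — can be sketched as an inverse projective warp. The following is a minimal NumPy illustration (single-channel image, nearest-neighbour sampling), not anything from the experiment's diffs; the `homography` and `unfold` names are made up here.

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 projective transform mapping src quad corners to dst
    via the standard DLT system (nullspace of an 8x9 matrix)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.array(A, dtype=np.float64))
    return vt[-1].reshape(3, 3)

def unfold(render, quad, out_w, out_h):
    """Resample the quad region of `render` into an (out_h, out_w) texture,
    i.e. 'unfold' the projected image into the plane's own space."""
    # Map texture corners -> render-space quad corners, then sample inversely.
    tex_corners = [(0, 0), (out_w - 1, 0), (out_w - 1, out_h - 1), (0, out_h - 1)]
    H = homography(tex_corners, quad)
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    mapped = H @ pts  # project every output texel back into the render
    mx = (mapped[0] / mapped[2]).round().astype(int).clip(0, render.shape[1] - 1)
    my = (mapped[1] / mapped[2]).round().astype(int).clip(0, render.shape[0] - 1)
    return render[my, mx].reshape(out_h, out_w)
```

For a curved wall the same inverse-sampling idea applies per texel, but the mapping comes from the surface's UV parameterization instead of a single homography.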
@sebastientricoire Hi, you can use these diffs as a base but they may not apply cleanly as they were based on Blender 3.5-alpha and the xr-dev experimental branch:
3e725b55cf
786aad68b2