Virtual Reality (XR/VR/AR/MR) #68998

Open
opened 2019-08-21 16:30:51 +02:00 by Dalai Felinto · 49 comments

NOTE: This section is a WIP and will be updated shortly.

Status:

Team
Commissioner: ?
Project leader:
Project members:
Big picture:

Description
Use cases:

Design:

Engineer plan: -

Work plan

Milestone 1 - Scene Inspection #71347
Time estimate: Mostly done, 2-3 weeks of work left (state: mid-February 2020)

Milestone 2 - Continuous Immersive Drawing #71348
Time estimate: ?

Later

Notes: -


The description here and in the sub-tasks is based on a design document by @JulianEisel: [VR/XR Big Picture Kickoff](https://dev-files.blender.org/file/data/nqh5c22ahvruutmdna5l/PHID-FILE-4nynef6hmrfcwaimvf3h/XR_Big_Picture_Kickoff_%28V2%29). Please don't take it as an official document.

Goals and Overall Vision

XR enables an entirely new way to work with computer-generated 3D worlds. Instead of trying to take the UIs we’ve already implemented for traditional monitor use and make them work in XR, we should establish an experience on entirely new grounds. Otherwise, what we’ll end up with is just a different, more difficult way to use Blender.
So to enter the new immersive world, we should think about which workflows could benefit the most from XR, and carefully craft a new, improved experience for them. Don’t try to enable huge amounts of features for XR, risking a bad outcome - or even total failure - but find out which features matter most and focus on them first.

Long term, the experience can then be enriched further, with more features added as smaller, more concise projects. Eventually, XR could allow creating 3D content with unprecedented interactivity: the entirety of Blender’s content creation capabilities, available in an immersive 3D experience. However, this should remain the long-term vision for XR, not the goal of this initial project.

OpenXR

All VR support in Blender will be based on [OpenXR](https://www.khronos.org/openxr), the brand new specification for VR, AR, MR, etc. - or, for short: XR. It is created by the Khronos Group and has a huge number of supporters from all over the VR industry. It’s likely going to become the standard for VR/AR/MR/… (XR).

OpenXR splits the workload between the application (Blender) and the OpenXR runtime. The runtime handles all the device-specific parts, provides many common features (e.g. time warping) and can add extensions to the OpenXR API. So the devices Blender will support are defined by the available OpenXR(-compatible) runtimes (a minimal sketch follows the list below). Currently:

  • Windows Mixed Reality
  • Oculus (though, not officially released)
  • Monado (FOSS Linux OpenXR runtime)
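
To make that split concrete, here is a minimal sketch using the third-party pyopenxr Python bindings - an illustration only, not Blender's actual implementation (which is C code in GHOST). The application merely creates an instance and asks for a system; everything device-specific happens inside whichever runtime is installed:

```python
import xr  # third-party pyopenxr bindings, not part of Blender (assumption)

# The application only creates an OpenXR instance; the installed runtime
# (WMR, Oculus, Monado, ...) owns all device-specific handling behind it.
instance = xr.create_instance(xr.InstanceCreateInfo())

# Ask the runtime which head-mounted display system it drives.
system_id = xr.get_system(
    instance,
    xr.SystemGetInfo(form_factor=xr.FormFactor.HEAD_MOUNTED_DISPLAY),
)
props = xr.get_system_properties(instance, system_id)
print("Runtime-provided system:", props.system_name)

xr.destroy_instance(instance)
```

Swapping the runtime (say, Windows Mixed Reality for Monado) changes which devices this finds, without any change to the application code.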

For information on how to test our current OpenXR driven implementation, see https://wiki.blender.org/wiki/User:Severin/GSoC-2019/How_to_Test

State of the Project

While there are multiple initiatives related to Blender and VR (e.g. [MPX](https://vimeo.com/channels/mpx) and [Marui’s Blender XR](https://www.marui-plugin.com/blender-xr/)), the first important milestone for XR support in mainline Blender is being finished as part of an ongoing Google Summer of Code project. It aims to bring stable and well-performing OpenXR-based VR rendering support into the core of Blender. It’s not supposed to enable any interaction; it’s solely focused on the foundation of HMD support. For more info, refer to #71365. Further work is being done to complete the first milestone: #71347 (Virtual Reality - Milestone 1 - Scene Inspection).

The one important part that’s missing is controller support - or, more precisely, input and haptics support. Although we can do some experiments, it would be helpful to first get a better idea of how VR interaction should eventually work before much effort is put into supporting this.

This is the foundation that we can build on. It should allow people with experience in the field to join efforts to bring rich immersive experiences to Blender.

Development Process

The proposal is to follow a use-case-driven development process. That means we define a number (say 10-20) of specific use cases. These should be chosen wisely, based on what we think has the most potential with XR.
First, just vaguely describe what requirements the system has for a use case; a bit later on, define what tools and options are needed to get the use case done (possibly by a specific [persona](https://www.interaction-design.org/literature/article/personas-why-and-how-you-should-use-them)).

On top of that, the process should be iterative. That means we define short-term goals for an iteration, work on them, and evaluate afterwards before we get into the next iteration. The iterations may include implementation work. So we don’t do waterfall-like “design first”, but there should be enough design work done before the first implementations are made.

Any implementation should be driven by a use-case (or multiple) that the team agrees is drafted out well enough.

Julian Eisel was assigned by Dalai Felinto 2019-08-21 16:30:51 +02:00

#47899 was marked as duplicate of this issue

Dalai and I agreed on merging #47899 into this. The old task can still be referred to, but development took new paths that are better discussed here and in the sub-tasks. We'd also like to discuss things on a broader level - the "big picture" for VR in Blender.
Will publish more info soon.



I think having collapsible 'shelves' which hold icons or folders is the way to go.

One could grab an 'icon' that is a 2D thumbnail on a shelf -> pull it into VR where it becomes a 3D icon, set it on a desk -> inspect a 'folder' inside it containing all its attributes (brush radius, feather, color, etc.).

Further down the line, the actual Python running the object would be visible inside one of these 'nests' or logic noodles.

This way tools can be shared in a .blend (some work will need to be done on infrastructure later to make more flexible 'chunks' used to build tools in Python calling compiled code).

Maybe also have 'active tool button bindings' in a folder (so when a tool is 'bound' to a VR handle, the tool itself carries the shortcuts for how to interact with it).


Another important thing to consider is that gesture recognition has gone open source.
Sign language can become input gestures.

https://venturebeat.com/wp-content/uploads/2019/08/image2-3.gif?w=341&resize=341%2C400&strip=all

https://venturebeat.com/2019/08/19/google-open-sources-gesture-tracking-ai-for-mobile-devices/


On the topic of UI:

This has been solved by tools such as [Tvori](https://www.youtube.com/watch?v=gyGL6x0aPrc) and [AnimVR](https://nvrmind.io/#features). I would recommend looking at what they have done to solve this issue. Things that would be helpful would be the ability to pin, uniformly scale, and move panels such as the timeline, asset management window, etc. in 3D space.

Now on the topic of animating rigged characters: it would be great to be able to pose and animate characters in VR using motion controllers.

Something like [this example](https://www.youtube.com/watch?v=_oxrQDBr6Mo) would be most welcome.

Hope this post has been helpful in some way.


@Mr_SquarePeg the idea is to support different use cases, and different UI/UX experiments via our Python Add-on API. This way interested developer teams can maintain complete solutions without us having to pick and choose a single definitive VR user interface.

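As a hedged illustration of that direction: current Blender builds with OpenXR enabled already expose a built-in `wm.xr_session_toggle` operator, so a minimal add-on experiment can be as small as a panel that calls it. The panel and class names below are made up for the sketch; only the operator is real:

```python
import bpy

class VIEW3D_PT_vr_sketch(bpy.types.Panel):
    """Hypothetical sidebar panel exposing the VR session toggle."""
    bl_label = "VR Session (sketch)"
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_category = "VR"

    def draw(self, context):
        # Built-in operator that starts/stops the OpenXR session.
        self.layout.operator("wm.xr_session_toggle")

def register():
    bpy.utils.register_class(VIEW3D_PT_vr_sketch)

def unregister():
    bpy.utils.unregister_class(VIEW3D_PT_vr_sketch)

if __name__ == "__main__":
    register()
```

Everything beyond the toggle - tools, widgets, interaction models - would live in add-ons like this one, maintained by interested teams.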

@dfelinto That is understandable. I just want the default UI Solution for VR at least for animation and Scene Layout to be awesome! Which I have no doubt will be accomplished. Hence why I am making these suggestions early on.



@JacobMerrill-1 regarding gesture input:
The link you posted is something different though: there, "gesture" means estimating a hand shape from a picture (for example, making a "peace" sign).
But in VR you usually don't have cameras tracking your hands, and gestures are more like "drawing a 3D shape".
A more closely related example would be the gesture feature in Maya / MARUI: https://www.youtube.com/watch?v=Z2tI3CgG264

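For what it's worth, a common way to recognize such "drawn 3D shape" gestures is to resample the recorded controller path to a fixed point count and compare it against stored templates (a 3D take on the classic $1 recognizer). A minimal sketch - all names here are hypothetical, nothing below is part of Blender:

```python
from math import dist  # Python 3.8+

def resample(points, n=16):
    """Resample a recorded 3D controller path to n evenly spaced points."""
    total = sum(dist(a, b) for a, b in zip(points, points[1:]))
    if total == 0.0:
        return [points[0]] * n
    step = total / (n - 1)
    out, acc, prev = [points[0]], 0.0, points[0]
    for cur in points[1:]:
        d = dist(prev, cur)
        # Emit interpolated points each time the accumulated arc length
        # crosses the next multiple of `step`.
        while acc + d >= step and len(out) < n:
            t = (step - acc) / d
            prev = tuple(p + t * (c - p) for p, c in zip(prev, cur))
            out.append(prev)
            d = dist(prev, cur)
            acc = 0.0
        acc += d
        prev = cur
    while len(out) < n:  # guard against floating-point drift
        out.append(points[-1])
    return out

def gesture_distance(path_a, path_b, n=16):
    """Mean point-to-point distance between two resampled gesture paths;
    the smallest distance over a set of templates picks the gesture."""
    a, b = resample(path_a, n), resample(path_b, n)
    return sum(dist(p, q) for p, q in zip(a, b)) / n
```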
@makx - https://www.youtube.com/watch?v=2VkO-Kc3vks

> In #68998#820748, @JacobMerrill-1 wrote:
> @makx - https://www.youtube.com/watch?v=2VkO-Kc3vks

As far as I am aware, the Oculus Quest [does not have OpenXR support yet](https://uploadvr.com/oculus-rift-basic-openxr-support/), which is what Blender's VR support is based on. That said, it would be better to use the controllers included with the Quest and Rift S because they are more accurate at this time, use the Leap Motion, or use the Valve Index controllers when they release their OpenXR runtime.

Julian Eisel removed their assignment 2019-12-09 15:30:57 +01:00

This task is used as parent of all other VR tasks which are planned to be implemented by multiple teams. No reason to have this assigned to anybody.



Since I haven't seen it discussed, I should add that live capture of head, controllers, and additional trackers is very useful for animating bipeds. A simple IK setup and basic python script could be made by the user, so long as additional trackers are accessible within the blender XR system.

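As a hedged sketch of the Python side of that idea: recent Blender builds expose the HMD pose through `WindowManager.xr_session_state`, so per-frame capture onto a placeholder object (named "Head" here purely for illustration; an IK rig could target it) might look roughly like this. Additional trackers would need similar access exposed, which is exactly the point raised above:

```python
import bpy

def capture_viewer_pose(scene, depsgraph=None):
    """Keyframe a placeholder object to the HMD pose on each frame change."""
    state = bpy.context.window_manager.xr_session_state
    if state is None:  # no VR session running
        return
    ob = bpy.data.objects["Head"]  # hypothetical capture target
    ob.location = state.viewer_pose_location
    ob.rotation_mode = 'QUATERNION'
    ob.rotation_quaternion = state.viewer_pose_rotation
    ob.keyframe_insert("location")
    ob.keyframe_insert("rotation_quaternion")

bpy.app.handlers.frame_change_post.append(capture_viewer_pose)
```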


"Since I haven't seen it discussed, I should add that live capture of head, controllers, and additional trackers is very useful for animating bipeds. A simple IK setup and basic python script could be made by the user, so long as additional trackers are accessible within the blender XR system."

  • yeah animating agents in real time may be kinda neat too- like avatar.
    Glycon is a software a friend of mine makes that does this using this same stuff ($paid$)
"Since I haven't seen it discussed, I should add that live capture of head, controllers, and additional trackers is very useful for animating bipeds. A simple IK setup and basic python script could be made by the user, so long as additional trackers are accessible within the blender XR system." - yeah animating agents in real time may be kinda neat too- like avatar. Glycon is a software a friend of mine makes that does this using this same stuff ($paid$)

Philipp Oeser removed the Interest: EEVEE & Viewport label 2023-02-09 15:15:20 +01:00
Reference: blender/blender#68998