Addresses T77127 (Controller Drawing).
Adds VR controller visualization and custom drawing via draw
handlers. Add-ons can draw to the XR surface (headset display) and
mirror window by adding a View3D draw handler of region type 'XR' and
draw type 'POST_VIEW'. Controller drawing and custom overlays can be
toggled individually as XR session options, which will be added in a
future update to the VR Scene Inspection add-on.
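As a rough sketch (untested; assumes only what is stated above), an
add-on could register such a handler like this:

```python
import bpy

def draw_xr_overlay():
    # Placeholder callback: custom drawing for the headset display and
    # mirror window would go here (e.g. using the gpu module).
    pass

# Register a View3D draw handler for the XR surface ('XR' region type,
# 'POST_VIEW' draw type), as described above.
handle = bpy.types.SpaceView3D.draw_handler_add(
    draw_xr_overlay, (), 'XR', 'POST_VIEW')

# To unregister later:
# bpy.types.SpaceView3D.draw_handler_remove(handle, 'XR')
```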
For the actual drawing, the OpenXR XR_MSFT_controller_model extension
is used to load a glTF model provided by the XR runtime. The model's
vertex data is then used to create a GPUBatch in the XR session
state. Finally, this batch is drawn via the XR surface draw handler
mentioned above.
For runtimes that do not support the controller model extension, a
simple fallback shape (sphere) is drawn instead.
Reviewed By: Severin, fclem
Differential Revision: https://developer.blender.org/D10948
Integrates XR input actions with the WM event system. With this commit,
all VR action functionality (operator execution, pose querying, haptic
application), with the exception of custom drawing, is enabled.
By itself, this does not bring about any changes for regular users;
however, it is necessary for the upcoming VR add-on update that will
expose default controller actions to users.
For add-on developers, this updates the Python API with access to XR
event data (input states, controller poses, etc.), which can be
obtained via the "xr" property added to the bpy.types.Event struct.
For XR events, this property will be non-null and the event will have
the type XR_ACTION.
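As a rough sketch (field names on event.xr, such as "action", are
assumptions for illustration), an operator invoked by an XR event
could read that data like this:

```python
import bpy

class VIEW3D_OT_xr_log_action(bpy.types.Operator):
    """Sketch: report the XR action that triggered this operator"""
    bl_idname = "view3d.xr_log_action"
    bl_label = "Log XR Action"

    def invoke(self, context, event):
        if event.type == 'XR_ACTION' and event.xr is not None:
            # "action" is an assumed property of the XR event data.
            self.report({'INFO'}, "XR action: %s" % event.xr.action)
        return {'FINISHED'}

bpy.utils.register_class(VIEW3D_OT_xr_log_action)
```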
Further details:
XR-type window events are queued to the regular window queues after
updating and interpreting VR action states. An appropriate window is
found by either using the window the VR session was started in or a
fallback option.
When handling XR events, mouse-specific processing is skipped and
instead a dedicated XR offscreen area and region (see 08511b1c3d) is
used to execute XR event operators in the proper context.
Reviewed By: Severin
Differential Revision: https://developer.blender.org/D10944
This adds an offscreen View3D window area for the VR view in order to
execute XR events/operators in the proper context. The area is created
as runtime data before XR events are dispatched and set as the active
area during XR event handling.
Since the area is runtime-only, it will not be saved in files, and
since the area is offscreen, it will not interfere with regular
window areas.
The area is removed with the rest of the XR runtime data on exit, file
read, or when stopping the VR session.
Note: This also adds internal types (EVT_DATA_XR, EVT_XR_ACTION) and
structs (wmXrActionData) for XR events.
Reviewed By: Severin
Differential Revision: https://developer.blender.org/D12472
Addresses T76082.
Since the DirectX backend does not work for AMD GPUs
(wglDXRegisterObjectNV() fails to register the shared OpenGL-DirectX
render buffer, displaying a pink screen to the user), the original
solution was to use SteamVR's OpenGL backend, which, as tested
recently, seems to work without any issues on AMD hardware.
However, the SteamVR OpenGL backend (on Windows) was disabled in
fe492d922d since it resulted in crashes with NVIDIA GPUs (and still
crashes, as tested recently), so SteamVR would always use the
AMD-incompatible DirectX backend (on Windows).
This patch restores use of the SteamVR OpenGL backend for non-NVIDIA
(AMD, etc.) GPUs while maintaining the DirectX workaround for NVIDIA
GPUs. In this way, issues are still resolved on the NVIDIA side but
AMD users can once again use the SteamVR runtime, which may be their
only viable option for using Blender in VR.
Reviewed By: Julian Eisel
Differential Revision: https://developer.blender.org/D12409
This reverts 151eed752b. Originally thought it was necessary to
initialize selected/active indices to -1 to prevent out-of-bounds
list access, but this is not needed since null checks are already
performed after obtaining list members via BLI_findlink().
In addition, leaving indices zero-initialized facilitates use of the
Python API, for example when displaying action map information in a
UI list.
Add null check for runtime data since it could already have been
freed via wm_xr_exit() (called on file read) prior to the session
exit callback.
Also, fix potential memory leak by freeing session data in
wm_xr_runtime_data_free() instead of session exit callback.
This fixes two memory leaks related to XR action maps.
1. Freeing of action maps needs to be moved from wm_xr_exit() to
wm_xr_runtime_data_free() since the runtime may have already been
freed when calling wm_xr_exit().
2. Action bindings for action map items were not being freed. This
was mistakenly left out of e844e9e8f3 since the patch needed to be
updated after d3d4be1db3.
This addresses reduced visibility of scenes (as displayed in the VR
headset) that can result from the 8-bit color depth format currently
used for XR swapchain images.
By switching to a swapchain format with higher color depth (RGB10_A2,
RGBA16, RGBA16F) for supported runtimes, visibility in VR should be
noticeably improved.
However, a current limitation is the lack of support for these higher
color depth formats in some XR runtimes, especially for OpenGL.
Also important to note that GPU_offscreen_create() now explicitly
takes in the texture format (eGPUTextureFormat) instead of a
"high_bitdepth" boolean.
Reviewed By: Julian Eisel, Clément Foucault
Differential Revision: http://developer.blender.org/D9842
Although the relevant structs (wmXrRuntime/XrActionMap/
XrActionMapItem) are zero-allocated, the selected and active action
map indices need to be initialized to -1 to prevent potential
out-of-bounds list access.
Addresses the remaining portions of T77137 (Python API for Controller
Interaction), which was partially completed by D10942.
Adds an XR "action maps" system for loading XR action data from a
Python script. Action maps are accessible via the Python API, and are used
to pass default actions to the VR session during the
xr_session_start_pre() callback.
Since action maps are stored only as runtime data, they will be
cleaned up with the rest of the VR runtime data on file read or exit.
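A minimal sketch of hooking into that callback from a script (the
helper body is a placeholder; the concrete action map creation calls
are not shown here):

```python
import bpy
from bpy.app.handlers import persistent

@persistent
def create_default_actionmaps(*args):
    # Placeholder: build the XR action maps (items, bindings) here so
    # they are passed to the VR session when it starts.
    print("VR session starting, creating action maps")

bpy.app.handlers.xr_session_start_pre.append(create_default_actionmaps)
```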
Reviewed By: Julian Eisel, Hans Goudey
Differential Revision: https://developer.blender.org/D10943
Provides two key improvements to runtime controller data.
1. Separates controller poses into two components, "grip" and "aim",
which are both required to accurately represent the controllers
without manual offsets.
Following their OpenXR definitions, the grip pose represents the
user's hand when holding the controller, and the aim pose represents
the controller's aiming source.
2. Runtime controller data is now stored as a dynamic array instead
of a fixed array. This makes the API/functionality more adaptable to
different systems.
Does not bring about any changes for users since only internal
runtime functionality is currently affected.
Reviewed By: Julian Eisel
Differential Revision: http://developer.blender.org/D12073
Provides several important improvements to the runtime action
bindings operation and internal API.
Moves input-specific action data (input thresholds, input regions,
pose offsets/spaces) from actions to more granular action bindings.
This allows a single action to be mapped to a variety of inputs,
without having to share a single input threshold, region, or space.
Also removes the need for an action space creation API, as spaces for
pose actions will be automatically created with the bindings.
The correct action data for the current inputs is set by calling
xrGetCurrentInteractionProfile() to get the current profile and then
retrieving the corresponding mapped data.
Does not bring about any changes for users since only internal
runtime functionality is currently affected.
Reviewed By: Julian Eisel
Differential Revision: http://developer.blender.org/D12077
Addresses T76003. When using VR with Eevee and viewport denoising,
scene geometry could sometimes be occluded for one eye. Solution is
to use a separate GPUViewport/GPUOffscreen for each VR view instead
of reusing a single one for rendering.
Reviewed By: Julian Eisel, Clément Foucault
Differential Revision: http://developer.blender.org/D11858
Improves control over the XR reference space by using the stage ref
space (user-defined tracking bounds) instead of local ref space
(position at application launch), if available. Also adds an
"absolute tracking" session option to skip applying eye offsets that
are normally added for placing users exactly at landmarks.
By enabling absolute tracking, users can define the tracking origin
in a way that is not linked to the headset position. Instead, the
tracking values given by the XR runtime are left unadjusted and a
user can manually calibrate an "origin" landmark object to adjust to
their real world space.
Can be useful for applications that use external tracking systems
and those that primarily need to use controllers and not the
headset (e.g. motion capture).
The absolute tracking option requires an update to the VR Scene
Inspection add-on to be accessible by regular users.
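In the meantime, a hedged sketch of enabling it from Python (the
"use_absolute_tracking" property name on the XR session settings is
an assumption about how the option is exposed):

```python
import bpy

# Assumed property name for the "absolute tracking" session option.
wm = bpy.context.window_manager
wm.xr_session_settings.use_absolute_tracking = True
```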
Reviewed By: Julian Eisel
Differential Revision: http://developer.blender.org/D10946
Adds internal API for creating and managing OpenXR actions at the
GHOST and WM layers. Does not bring about any changes for users since
XR action functionality is not yet exposed in the Python API (will be
added in a subsequent patch).
OpenXR actions are a means to communicate with XR input devices and
can be used to retrieve button/pose states or apply haptic feedback.
Actions are bound to device inputs via a semantic path binding
(https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#semantic-path-interaction-profiles),
which serves as an XR version of keymaps.
Main features:
- Abstraction of OpenXR action management functions to GHOST-XR,
WM-XR APIs.
- New "xr_session_start_pre" callback for creating actions at
appropriate point in the XR session.
- Creation of name-identifiable action sets/actions.
- Binding of actions to controller inputs.
- Acquisition of controller button states.
- Acquisition of controller poses.
- Application of controller haptic feedback.
- Carefully designed error handling and useful error reporting
(e.g. action set/action name included in error message).
Reviewed By: Julian Eisel
Differential Revision: http://developer.blender.org/D10942
During viewport rendering the color values were clamped in order to
apply the overlay on top of them. This clamping made the scene colors
look washed out.
This patch adds a workaround to skip the clamping when the overlays
are turned off.
Partial fix for {T77909}
This replaces `GPU_clear()` with `GPU_clear_color()` and `GPU_clear_depth()`.
Since we always set the clear value before clearing, it is unnecessary
to track the clear color state.
Moreover, it makes it clearer what we clear the framebuffer to.
Split the depsgraph allocation into a separate function
`BKE_scene_ensure_depsgraph()`. Parameters are only passed to those
functions that actually need them. This removes the "if that boolean
is `false` this pointer is allowed to be `NULL`" logic and more cleanly
decouples code.
No functional changes.
On VR session start with positional tracking disabled, the pose would
have an offset applied but it was supposed to start exactly at the
landmark position.
The issue is that we incorrectly applied the offset meant to cancel
out the position offset reported by the OpenXR runtime. We only want
to do that if positional tracking is enabled, because if it's not, we
don't even apply the runtime's position offset. So we'd cancel out
something that wasn't there.
Proper handling of View Layers for the VR session was never implemented.
Now the View Layer of the VR session follows the window the session was
started in.
Note that if this window is closed, we fall back to another window. This
is done to avoid the overhead it would take to maintain a separate
depsgraph for the VR view. Instead we always share some already visible
View Layer (and hence the depsgraph).
We want the session to start exactly at the landmark position, with
no additional offset. Some runtimes (e.g. Windows Mixed Reality) may
give an initial non-[0,0,0] position at session start though.
Also add a comment explaining the purpose of the eye offset variable.
There would always be an unintended offset applied. Per design there
should not be any offset when changing VR Landmarks, the view should
just jump exactly to the Landmark.
Due to the recent changes, we don't have to add, but subtract, the
eye offset we apply to get the wanted behavior.
Mistake in 607d745a79.
This addresses warnings from Clang-Tidy's `readability-else-after-return`
rule in the `source/blender/blenlib` module. Not all warnings are
addressed in this commit.
No functional changes.
This replaces header include guards with `#pragma once`.
A couple of include guards are not removed yet (e.g. `__RNA_TYPES_H__`),
because they are used in other places.
This patch has been generated by P1561 followed by `make format`.
Differential Revision: https://developer.blender.org/D8466
Once the base pose was changed (e.g. by changing the active landmark), we'd always run the logic to reset to the base pose. That would mess up the final viewer pose.
Think this only got exposed through 607d745a79.
* Changing to a landmark moves the view exactly to it, rather than
keeping the current position offset.
* Disabling positional tracking moves the viewer back to the landmark
position.
This is a more predictable and practical way to use landmarks. See
feedback in T71347.
On the code side, I did some cleanup so the logic flow is more clear.
Note: This is entirely untested. I currently don't have access to a
device. There might be issues; tomorrow I'll hopefully get feedback.
Draw-manager mutex has to be set before activating OpenGL/GPU context.
Otherwise, parallel jobs (like preview rendering) may try to activate
the context from another thread.
Also: Use WM wrappers for activating/releasing OpenGL context, which
have an additional assert check.
Suggest backporting this to 2.83.1.