The XR action map

Godot has an action map feature as part of the XR system. At this time, this system is part of the OpenXR module. There are plans to extend it to encompass WebXR in the near future, hence we call it the XR action map system in this document. It implements the built-in action map system of OpenXR largely as it is offered.

The XR action map system exposes input, positional data and output for XR controllers to your game/application. It does this by exposing named actions that can be tailored to your game/application and binding these to the actual inputs and outputs on your XR devices.

As the XR action map is currently part of the OpenXR module, OpenXR needs to be enabled in your project settings to expose it:

../../_images/openxr_settings.png

You will then find the XR Action Map interface at the bottom of the screen:

../../_images/xr_action_map.webp

Note

Godot's built-in input system has many things in common with the XR action map system. In fact our original idea was to add functionality to the existing input system and expose the data to the OpenXR action map system. We may revisit that idea at some point but as it turns out there were just too many problems to overcome. To name a few:

  • Godot's input system mainly centers around button inputs, while XR adds triggers, axes, poses and haptics (output) into the mix. This would greatly complicate the input system with features that won't work for normal controllers or that conflict with the current approach. It was felt this would lead to confusion for the majority of Godot users.

  • Godot's input system works with raw input data that is parsed and, in turn, triggers the emission of actions. This raw input data is made available to the end user. OpenXR completely hides raw data and does all the parsing for us; we only get access to already-parsed action data. This inconsistency is likely to lead to bugs when an unsuspecting user tries to use an XR device as a normal input device.

  • Godot's input system allows changing which inputs are bound to actions at runtime; OpenXR does not.

  • Godot's input system is based on device IDs, which are meaningless in OpenXR.

This does mean that a game/application that mixes traditional inputs with XR controllers will have a separation. For most applications either one or the other is used and this is not seen as a problem. In the end, it's a limitation of the system.

The default action map

Godot will automatically create a default action map if no action map file is found.

Warning

This default map was designed to help developers port their XR games/applications from Godot 3 to Godot 4. As a result, this map essentially binds all known inputs on all controllers supported by default to actions, one to one. This is not a good example of how to set up an action map, but it does give new developers a starting point when becoming familiar with Godot XR, without having to design a proper action map for their game/application first.

For this walkthrough we're going to start with a blank action map. You can simply delete the "Godot action set" entry at the top by pressing the trash can icon. This will clear out all actions. You may also want to remove the controllers that you do not wish to set up; more on this later.

Action sets

Note

Before we dive in, you will see the term XR runtime used throughout this document. By XR runtime we mean the software that controls and interacts with the AR or VR headset, and which exposes this functionality to us through an API such as OpenXR. So:

  • for Steam this is SteamVR,

  • for Meta on desktop this is the Oculus Client (including when using Quest link),

  • for Meta on Quest this is the Quest's native OpenXR client,

  • on Linux this could be Monado, etc.

The action map allows us to organize our actions in sets. Each set can be enabled or disabled on its own.

The concept here is that you could have different sets that provide bindings in different scenarios. You could have:

  • a Character control set for when you're walking around,

  • a Vehicle control set for when you're operating a vehicle,

  • a Menu set for when a menu is open.

Only the action set applicable to the current state of your game/application can then be enabled.

This is especially important if you wish to bind the same input on a controller to a different action. For instance:

  • in your Character control set you may have an action Jump,

  • in your Vehicle control set you may have an action Accelerate,

  • in your Menu set you may have an action Select.

All are bound to the trigger on your controller.

OpenXR will only bind an input or output to a single action. If the same input or output is bound to multiple actions, the one in the active action set with the highest priority will be the one updated/used. In our above example it is thus important that only one action set is active at a time.
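Action sets can be toggled from code as the state of your game/application changes. As a minimal sketch, assuming action sets named character_control and menu exist in your action map, and using the set_action_set_active method on OpenXRInterface (Godot 4.x):

```gdscript
var xr_interface: OpenXRInterface

func _ready() -> void:
	xr_interface = XRServer.find_interface("OpenXR")

func open_menu() -> void:
	# Disable character controls and enable menu controls so that, for
	# example, the trigger now drives "select" instead of "jump".
	xr_interface.set_action_set_active("character_control", false)
	xr_interface.set_action_set_active("menu", true)
```

The action set names here are placeholders; use the internal names you defined in your own action map.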

For your first XR game/application we highly recommend starting with just a single action set and to not over-engineer things.

For our walkthrough in this document we will thus create a single action set called my_first_action_set. We do this by pressing the Add action set button:

../../_images/xr_my_first_action_set.webp

The columns in our table are as follows:

  • Column 1 (my_first_action_set): the internal name of the action set. OpenXR doesn't impose specific restrictions on this name other than its size, however some XR runtimes will not accept spaces or special characters.

  • Column 2 (My first action set): a human-readable name for the action set. Some XR runtimes will display this name to the end user, for example in configuration dialogs.

  • Column 3 (0): the priority of the action set. If multiple active action sets have actions bound to the same controller inputs or outputs, the action set with the highest priority value determines which action is updated.

Actions

In the XR action map, actions are the entities that your game/application will interact with. For instance, we can define an action Shoot and the input bound to that action will trigger the button_pressed signal on the relevant XRController3D node in your scene with Shoot as the name parameter of the signal.

You can also poll the current state of an action. XRController3D for instance has an is_button_pressed method.
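Both approaches can be sketched as follows, assuming a script on an XRController3D node and an action with the internal name shoot in your action map (the action name is a placeholder):

```gdscript
extends XRController3D

func _ready() -> void:
	# React to the action via the button_pressed signal.
	button_pressed.connect(_on_button_pressed)

func _on_button_pressed(action_name: String) -> void:
	if action_name == "shoot":
		print("Shoot!")

func _process(_delta: float) -> void:
	# Or poll the current state of the action each frame.
	if is_button_pressed("shoot"):
		pass # Keep firing while held.
```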

Actions can be used for both input and output and each action has a type that defines its behavior.

  • The Bool type is used for discrete input like buttons.

  • The Float type is used for analogue input like triggers.

These two are special as they are the only ones that are interchangeable. OpenXR will handle conversions between Bool and Float inputs and actions. You can get the value of a Float type action by calling the method get_float on your XRController3D node. It emits the input_float_changed signal when changed.
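As a minimal sketch of working with a Float action, assuming an action with the internal name trigger_value of type Float in your action map (the name is a placeholder):

```gdscript
extends XRController3D

func _ready() -> void:
	# Emitted whenever the analogue value of a Float action changes.
	input_float_changed.connect(_on_float_changed)

func _on_float_changed(action_name: String, value: float) -> void:
	if action_name == "trigger_value":
		print("Trigger at ", value)

func _process(_delta: float) -> void:
	# Or poll the current value (typically 0.0 to 1.0) directly.
	var squeeze := get_float("trigger_value")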

Note

Where analogue inputs are queried as buttons, a threshold is applied. This threshold is currently managed exclusively by the XR runtime. There are plans to extend Godot to provide some level of control over these thresholds in the future.

The Vector2 type defines the input as an axis input. Touchpads, thumbsticks and similar inputs are exposed as vectors. You can get the value of a Vector2 type action by calling the method get_vector2 on your XRController3D node. It emits the input_vector2_changed signal when changed.
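As a minimal sketch of working with a Vector2 action, assuming an action with the internal name movement of type Vector2 bound to a thumbstick (the name is a placeholder):

```gdscript
extends XRController3D

func _ready() -> void:
	# Emitted whenever the axis value of a Vector2 action changes.
	input_vector2_changed.connect(_on_vector2_changed)

func _on_vector2_changed(action_name: String, value: Vector2) -> void:
	if action_name == "movement":
		print("Thumbstick: ", value)

func _process(_delta: float) -> void:
	# Or poll the current value directly.
	var dir := get_vector2("movement")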

The Pose type defines a spatially tracked input. Multiple "pose" inputs are available in OpenXR: aim, grip and palm. Your XRController3D node is automatically positioned based on the pose action assigned to the pose property of this node. More about poses later.
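The pose property is normally set in the inspector, but can also be assigned from code. A minimal sketch, assuming your action map contains a Pose action with the internal name aim_pose (the name is a placeholder):

```gdscript
extends XRController3D

func _ready() -> void:
	# The node will now track the pose reported by the "aim_pose" action.
	pose = "aim_pose"
```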

Note

The OpenXR implementation in Godot also exposes a special pose called Skeleton. This is part of the hand tracking implementation. This pose is exposed through the skeleton action, which works outside of the action map system. It is thus always available when hand tracking is supported; you don't need to bind actions to this pose to use it.

Finally, the only output type is Haptic and it allows us to set the intensity of haptic feedback, such as controller vibration. Controllers can have multiple haptic outputs and support for haptic vests is coming to OpenXR.
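As a minimal sketch of triggering a Haptic action, assuming an action with the internal name haptic of type Haptic in your action map (the name is a placeholder), using the trigger_haptic_pulse method on XRController3D:

```gdscript
extends XRController3D

func rumble() -> void:
	# trigger_haptic_pulse(action_name, frequency, amplitude, duration_sec, delay_sec)
	# A frequency of 0.0 lets the XR runtime choose a sensible default.
	trigger_haptic_pulse("haptic", 0.0, 0.5, 0.1, 0.0)
```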

So let's add an action for our aim pose. We do this by clicking the + button for our action set:

../../_images/xr_aim_pose.webp

The columns in our table are as follows:

  • Column 1 (aim_pose): the internal name of the action. OpenXR doesn't impose specific restrictions on this name other than its size, however some XR runtimes will not accept spaces or special characters.

  • Column 2 (Aim pose): a human-readable name for the action. Some XR runtimes will display this name to the end user, for example in configuration dialogs.

  • Column 3 (Pose): the type of this action.

OpenXR defines a number of bindable input poses that are commonly available for controllers. There are no rules for which poses are supported for different controllers. The poses OpenXR currently defines are:

  • The aim pose on most controllers is positioned slightly in front of the controller and aims forward. This is a great pose to use for laser pointers or for aligning the muzzle of a weapon.

  • The grip pose on most controllers is positioned where the grip button is placed on the controller. The orientation of this pose differs between controllers and can differ for the same controller on different XR runtimes.

  • The palm pose on most controllers is positioned in the center of the palm of the hand holding the controller. This is a new