
Making Electron-Based Apps Steam Deck Compatible

Last updated on October 3, 2024


Some time ago, our game industry client asked us to implement support for the Steam Deck in their Electron-based game launcher. Our first article in the Steam Deck series explains the details of the Steam Deck compatibility review process and the general requirements that had to be met to earn the desired Verified status from Valve.

Except for a few minor bugs, our application was initially functional on Steam Deck using the touchscreen. However, to meet the requirements, it also had to support Steam Deck's physical controls without needing an external mouse or keyboard and without the user changing any application settings.

This article focuses on sharing our experiences with implementing controller support, providing insights into the necessary steps for this process.

Mapping gamepad controls

The Navigator.getGamepads() method from the Gamepad API provides detailed insight into the state of connected game controllers (including the Steam Deck), particularly through the buttons and axes properties of each Gamepad object. The state of a button is indicated by its pressed property (true or false) and its value property, a floating-point number representing the intensity of the press. The values in the axes array are floating-point numbers representing the position along each axis. Values other than 0 mean that the control was pressed or moved, and the corresponding handlers should be called.

To identify the specific control pressed, we needed to map the button and axis indices reported by Navigator.getGamepads() to the corresponding physical controls. A mapping designed for the Xbox One controller ensures compatibility and consistency with the Steam Deck controls.
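
For illustration, a minimal polling loop based on this API could look like the sketch below. The buttons map, the pollGamepads helper, and the callback shape are assumptions made for the example; the indices follow the standard (Xbox-style) gamepad layout rather than the launcher's exact mapping.

// Hypothetical button-index mapping following the standard (Xbox-style) gamepad layout.
const buttons = {
	A: 0,
	B: 1,
	X: 2,
	Y: 3,
	DPAD_UP: 12,
	DPAD_DOWN: 13,
	DPAD_LEFT: 14,
	DPAD_RIGHT: 15,
};

const previouslyPressed = {};

// Poll the connected gamepads on every animation frame and report new button presses.
const pollGamepads = (onButtonPressed) => {
	const [gamepad] = navigator.getGamepads().filter(Boolean);

	if (gamepad) {
		Object.values(buttons).forEach((buttonIndex) => {
			const isPressed = gamepad.buttons[buttonIndex]?.pressed ?? false;

			// Call the handler only on the transition from "not pressed" to "pressed".
			if (isPressed && !previouslyPressed[buttonIndex]) {
				onButtonPressed(buttonIndex);
			}
			previouslyPressed[buttonIndex] = isPressed;
		});
	}

	requestAnimationFrame(() => pollGamepads(onButtonPressed));
};

pollGamepads((button) => {
	if (button === buttons.A) {
		console.log('A pressed');
	}
});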

Focusable components

In the application, only certain components are designed to be focusable - specifically, those that can be triggered, such as buttons or dropdowns. All the focusable elements needed to be identified, and handlers for the supported controls were defined for each focusable component so it could respond to these triggers. This means that if, for example, the A button on the Steam Deck is pressed, a corresponding action is executed:

// Pressing the A button triggers a click on the currently focused element.
const gamepadButtonHandlers = {
	[buttons.A]: { event: (elementRef) => elementRef.current.click() },
};

Since many components shared the same gamepad button handlers (for instance, button B, which is conventionally used for navigating up or returning to a higher-level menu), we decided to use a shared context:

import { createContext, useContext, useMemo } from 'react';

const GamepadHandlersContext = createContext(null);

// Each provider merges its own handlers with the ones inherited from any parent provider,
// so nested components can override or extend the shared handlers.
const GamepadHandlersProvider = ({ children, gamepadButtonHandlers }) => {
	const inheritedContext = useContext(GamepadHandlersContext) || {};

	const newContext = useMemo(
		() => ({
			handlers: {
				...inheritedContext?.handlers,
				...gamepadButtonHandlers,
			},
		}),
		[gamepadButtonHandlers, inheritedContext?.handlers],
	);

	return (
		<GamepadHandlersContext.Provider value={newContext}>
			{children}
		</GamepadHandlersContext.Provider>
	);
};
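
To illustrate how the merged handlers might be consumed, here is a minimal sketch. The useGamepadHandlers hook and its call signature are assumptions made for this example, not part of the launcher's actual API.

// Hypothetical consumer: a focused component looks up the handler registered for
// the pressed button in the shared context and executes it.
const useGamepadHandlers = () => {
	const { handlers } = useContext(GamepadHandlersContext) || { handlers: {} };

	return (pressedButton, elementRef) => {
		handlers[pressedButton]?.event(elementRef);
	};
};

A component that currently has focus can then call the returned function with the pressed button and its own ref.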

According to the Steam Deck requirements, when using Steam Deck's physical controls, on-screen glyphs must match either the Steam Deck button names or the Xbox 360/One button names. To meet this requirement, we added a footer that lists the available controls and the effects they trigger when a particular element is focused.
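
Such a footer could, for instance, be driven by the same handlers context. The GamepadFooter component and its glyphs and descriptions props are hypothetical and only illustrate the idea.

// Hypothetical footer: for each handler available in the current context, render the
// matching button glyph (Steam Deck or Xbox-style name) and a short description of its effect.
const GamepadFooter = ({ glyphs, descriptions }) => {
	const { handlers } = useContext(GamepadHandlersContext) || { handlers: {} };

	return (
		<footer>
			{Object.keys(handlers).map((button) => (
				<span key={button}>
					{glyphs[button]} {descriptions[button]}
				</span>
			))}
		</footer>
	);
};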

Navigating between focusable components

One of the major challenges we faced was devising a method to find focusable components located in the direction indicated by the Steam Deck D-pad, relative to the currently focused element. Let's trace the evolution from our initial concept to the final solution.

Two-level relatives

We initially implemented a two-step process: first considering elements directly aligned to the right, left, up, or down from the reference element, then expanding to elements within a 45-degree angle if no match was found. The corner of a candidate element had to fall within the designated area. In the attached schema, the 'corners' of elements located in various directions from the reference element are marked with circles. The component closest to the reference element was then selected, based on its size and position obtained from the Element.getBoundingClientRect() method.
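
To make the idea concrete, here is a simplified sketch of such a directional search based on Element.getBoundingClientRect(). The findClosestInDirection helper and its center-to-center distance scoring are illustrative assumptions, not the exact heuristic described above (which also considered elements at a 45-degree angle).

// Simplified directional search: among the candidates lying in the requested direction
// from the reference element, pick the one whose bounding-box center is closest.
const findClosestInDirection = (referenceElement, candidates, direction) => {
	const ref = referenceElement.getBoundingClientRect();
	const refCenter = { x: ref.left + ref.width / 2, y: ref.top + ref.height / 2 };

	const isInDirection = {
		left: (rect) => rect.right <= ref.left,
		right: (rect) => rect.left >= ref.right,
		up: (rect) => rect.bottom <= ref.top,
		down: (rect) => rect.top >= ref.bottom,
	}[direction];

	let closest = null;
	let closestDistance = Infinity;

	candidates.forEach((candidate) => {
		const rect = candidate.getBoundingClientRect();
		if (!isInDirection(rect)) {
			return;
		}

		const center = { x: rect.left + rect.width / 2, y: rect.top + rect.height / 2 };
		const distance = Math.hypot(center.x - refCenter.x, center.y - refCenter.y);

		if (distance < closestDistance) {
			closestDistance = distance;
			closest = candidate;
		}
	});

	return closest;
};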

Why, then, did this concept fall short of expectations? Despite appearing promising in theory, in practice, the element that a user would intuitively identify as the most logical choice wasn't always the one selected. This was particularly true for groups of elements serving similar functions. Hence, we had to find a better solution, considering not only the distance between the elements but also their functions. That is how we came up with the idea of component groups.

Component groups

To grasp the concept of component groups, consider the following example. Let's say that we are navigating left. Without grouping, there are three potential focusable components: C1, C2, and C3. With grouping, there is only one potential focus target, G1, which acts as a focus catcher. If the group has not been previously focused, the first element in the group will be focused, unless another element within the group is designated to always be focused first. If the group was previously focused, the previously focused element regains focus.

When the current focus is within a group, navigation first attempts to focus a component inside that group. Only if no suitable component is found do we try to navigate away from the group.
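
The focus-catcher behavior described above can be sketched roughly as follows; the group shape and the focusGroup helper are assumptions made for this example.

// Hypothetical focus catcher: when a group receives focus, decide which of its
// members should actually be focused.
const focusGroup = (group) => {
	// If the group was focused before, the previously focused member regains focus.
	if (group.lastFocusedElement) {
		return group.lastFocusedElement;
	}

	// Otherwise, an element designated to always be focused first takes priority...
	if (group.defaultFocusElement) {
		return group.defaultFocusElement;
	}

	// ...and if there is none, the first element in the group is focused.
	return group.elements[0];
};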

Layer stack

The introduction of component groups successfully addressed the problem of navigating between the components in a two-dimensional plane. Nevertheless, since our application contains various modals and dropdowns, we had to find a way to allow only components in the current context to be focused, and not elements outside a specific area of the application, even if they were rendered on the page. This led us to introduce the concept of an application layer stack.

Let's consider an example. Opening a modal from the homepage (L1) should restrict focus only to the elements within that modal (L2). If the modal contains a dropdown (L3), opening the dropdown should allow focusing only on the dropdown's elements. Pressing button B in the dropdown should close the dropdown, returning the focus to the modal. Pressing button B again should close the modal and shift the focus back to the homepage as the current layer.

We solved this by providing the context to all the elements outside the main layer and saving the layer stack in the application state. As a result, each time a component belonging to a new layer is opened, an action adding this layer to the stack is dispatched. Accordingly, when a component is closed, its layer is popped from the stack, so the active layer is always the last element in the stack.

// Push the layer onto the stack when the component becomes active.
useEffect(() => {
	if (enabled) {
		focusDispatch({
			type: "SET_LAYER",
			payload: layerName,
		});
	}
}, [focusDispatch, layerName, enabled]);

// Pop the layer when the component is disabled.
useEffect(() => {
	if (!enabled) {
		focusDispatch({
			type: "POP_LAYER",
		});
	}
}, [focusDispatch, enabled]);

// Cleanup (on unmount or when dependencies change): pop the layer if it is still on top of the stack.
useEffect(
	() => () => {
		const topLayer = layerStack.slice(-1)[0];
		if (topLayer === layerName) {
			focusDispatch({
				type: "POP_LAYER",
			});
		}
	},
	[layerStack, focusDispatch, layerName],
);
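
On the receiving side of these dispatches, the layer stack can be maintained with a simple reducer. The sketch below assumes the focus state holds a plain array of layer names, which may differ from the launcher's actual state shape.

// Minimal reducer sketch for the layer stack.
const focusReducer = (state, action) => {
	switch (action.type) {
		case "SET_LAYER":
			// Push the newly opened layer on top of the stack.
			return { ...state, layerStack: [...state.layerStack, action.payload] };
		case "POP_LAYER":
			// Remove the topmost layer; the previous one becomes active again.
			return { ...state, layerStack: state.layerStack.slice(0, -1) };
		default:
			return state;
	}
};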

Debugging the application

In the process of implementing Steam Deck support in our application, we used PS5 controllers connected to our computers to simulate the user experience. By pressing various buttons on the controller, we replicated actions such as navigating through menus, selecting options, and interacting with UI elements, as if interacting directly with the Steam Deck. Although the final testing happened on the actual device with the dedicated mapping, we found this approach much more convenient during development than working directly on the Steam Deck. To make working with the controllers easier, we also created a custom tool that displays the state of all the controls on the screen.

To simulate a Steam Deck environment on our personal computers, it was necessary to set the SteamOS environment variable to 1 when running the application.
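
For example, in the application code the flag could be read roughly like this; the isSteamDeckEnvironment name and the place where the check happens are assumptions for the sketch and will differ from app to app.

// Hypothetical check: treat the app as running in a Steam Deck (SteamOS) environment
// when it is launched with the SteamOS environment variable set to 1.
const isSteamDeckEnvironment = process.env.SteamOS === '1';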

A valuable starting point

Implementing controller support was both challenging and incredibly interesting. Nevertheless, it's essential to understand that this article deliberately focuses on specific aspects rather than providing an exhaustive exploration of every nuance in controller integration. The challenges we faced, especially those linked to our app's structure, may not be universally applicable due to the unique setups of different applications. Given the increasing interest in the Steam Deck, more software engineers may encounter the need to implement controller support in their apps, and that's why we believe the insights shared in this article can offer a valuable starting point for integrating controllers seamlessly.



Authors

Anna Czekajło-Kozłowska
JavaScript Software Engineer

A Fullstack JavaScript Engineer with a strong focus on solving real business problems through code. She's experienced in both web and desktop application development (Electron).
