Unreal Engine Desktop Environment

C++ | Unreal Engine

About


A personal project that is still under development. I wanted to dive deeper into the underlying system behind Unreal's "UserWidgets", specifically in C++. My goal was to familiarize myself with the structure of this UI system and its further applications in C++, such as creating, changing and destroying elements at runtime.

I wanted to use abstraction to the largest degree possible by creating reusable elements that were decoupled from their parent objects and had self-contained logic.

I have always been fascinated by games that take place in what is essentially an operating system for some kind of device, like "Hypnospace Outlaw" and "Simulacra", which is what inspired me to take a similar approach for this project.

 

Adding UI to the viewport in C++

One of my goals with this project was to spend as little time as possible inside the blueprint editor assigning references. I also decided that I was completely forbidden from using the blueprint graph in any way, to see just how much I could get done in C++. To my initial surprise, quite a lot. At this point I would even say it is my preferred way of working with the Widget framework in Unreal.

The first issue I wanted to tackle was how to add widgets to the viewport from C++, something I had only done through blueprint graphs before.

To create a widget and add it to the viewport, I needed a reference to the blueprint class derived from the C++ parent. I decided to keep these references in the HUD class, which is globally accessible.

All of the derived blueprint classes were based on a common C++ base class that contained no UI code. Instead, this class held declarations and references useful for every widget I would create, one of which was the reference to the HUD. This was important for the desktop, since it is responsible for creating windows and therefore needs access to the window blueprint class.
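Condensed into a rough sketch (the names AOSHUD, UOSWidgetBase, UDesktopWidget and WindowWidgetClass are placeholders of mine, not necessarily what the project uses), the setup looks something like this:

#include "GameFramework/HUD.h"
#include "Blueprint/UserWidget.h"

// The HUD stores the blueprint classes derived from the C++ widget parents,
// so they only need to be assigned once, on the HUD blueprint.
UCLASS()
class AOSHUD : public AHUD
{
    GENERATED_BODY()

public:
    UPROPERTY(EditDefaultsOnly, Category = "Widgets")
    TSubclassOf<UUserWidget> WindowWidgetClass;
};

// Common C++ base for every widget: no UI code, just shared declarations.
UCLASS()
class UOSWidgetBase : public UUserWidget
{
    GENERATED_BODY()

public:
    // Set once after creation so every widget can reach the HUD.
    UPROPERTY()
    AOSHUD* OwningHUD = nullptr;
};

// The desktop derives from the shared base and uses the HUD reference
// to create windows and add them to the viewport at runtime.
UCLASS()
class UDesktopWidget : public UOSWidgetBase
{
    GENERATED_BODY()

public:
    void OpenWindow()
    {
        if (UUserWidget* Window = CreateWidget<UUserWidget>(GetOwningPlayer(), OwningHUD->WindowWidgetClass))
        {
            Window->AddToViewport();
        }
    }
};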

Another use of the HUD class was holding references to Texture2D pointers (essentially images), so that I didn't have to declare individual pointers in each source file. It also meant I didn't need to open each derived blueprint and assign the correct texture to the correct variable for every widget.

Using the AssetManager, I collect all Texture2Ds in a directory into a TMap (dictionary), with each asset's filename as its key. When I want to assign a Texture2D somewhere, such as to the button for closing a window, I simply look up the value stored under the "CloseButton" key.
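A rough sketch of that loading step, assuming a TMap<FName, UTexture2D*> named Textures on the HUD and a hypothetical "/Game/UI/Icons" folder; the AssetManager exposes the asset registry, which can list every Texture2D under a path:

#include "Engine/AssetManager.h"
#include "Engine/Texture2D.h"

// Collect every Texture2D under a content folder into a TMap keyed by asset name.
void AOSHUD::LoadDesktopTextures()
{
    IAssetRegistry& AssetRegistry = UAssetManager::Get().GetAssetRegistry();

    TArray<FAssetData> Assets;
    AssetRegistry.GetAssetsByPath(TEXT("/Game/UI/Icons"), Assets, /*bRecursive=*/true);

    for (const FAssetData& Asset : Assets)
    {
        if (UTexture2D* Texture = Cast<UTexture2D>(Asset.GetAsset()))
        {
            // The key is the asset's filename, e.g. "CloseButton".
            Textures.Add(Asset.AssetName, Texture);
        }
    }
}

// Later, from any widget that can reach the HUD:
//   CloseIcon->SetBrushFromTexture(OwningHUD->Textures.FindRef(FName("CloseButton")));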

Generating a desktop environment

Every interactable "file" on the desktop is generated from a list of structs that is created before the game starts. The struct has multiple parameters that determine the behaviour and appearance of the file, such as its name and file-type. The struct is declared in the main C++ parent so that all created widgets know about it; as I was building the systems I realized it was getting harder to predict where I would need to use the struct, so this was the solution I arrived at.

The structs are created in the desktop component, which acts as a root. So far a file can be either an image or a folder, and assigning an image to an image-type file is as simple as referencing it in the HUD by filename.
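A stripped-down sketch of what such a struct might look like; the names FDesktopFile, EDesktopFileType and TextureKey are placeholders, and the real struct carries more parameters:

// File types supported so far.
UENUM(BlueprintType)
enum class EDesktopFileType : uint8
{
    Image,
    Folder
};

// One interactable "file" on the desktop; a list of these is built before the game starts.
USTRUCT(BlueprintType)
struct FDesktopFile
{
    GENERATED_BODY()

    // Display name shown under the icon.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FString FileName;

    // Determines the file's behaviour and appearance when clicked.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    EDesktopFileType FileType = EDesktopFileType::Image;

    // For image files: key into the HUD's texture map, matching the texture's filename.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FName TextureKey;
};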

After the structs are created and assigned to FileWidgets, the resulting widgets are added to a list that is looped through to place them in the desktop screen space. A GridPanel is created dynamically and the widgets are added to it with appropriate padding on each side.

The function that does this is static, and it is also used to create the GridPanel and populate any folder that is opened by clicking a FileWidget with the folder file-type.
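A sketch of how such a static helper could look; UFileWidget, the column count and the padding value are assumptions of mine:

#include "Components/GridPanel.h"
#include "Components/GridSlot.h"
#include "Blueprint/WidgetTree.h"

// Build a GridPanel at runtime and fill it with the given FileWidgets.
// Declared static in the class so opened folders can reuse it as well.
UGridPanel* UDesktopWidget::BuildFileGrid(UUserWidget* Owner, const TArray<UFileWidget*>& Files)
{
    UGridPanel* Grid = Owner->WidgetTree->ConstructWidget<UGridPanel>();

    const int32 Columns = 4; // assumed layout
    for (int32 Index = 0; Index < Files.Num(); ++Index)
    {
        UGridSlot* GridSlot = Grid->AddChildToGrid(Files[Index], Index / Columns, Index % Columns);
        GridSlot->SetPadding(FMargin(8.f)); // equal padding on each side
    }
    return Grid;
}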

Creating self-contained Widgets

I wanted a system where the logic for each element was self-contained. I achieved this with a hierarchy where a component such as the window holds references to its subcomponents, like the buttons to close or minimize the window.


These subcomponents do not hold a reference to their parent; instead they communicate "upwards" through delegates. Most of these delegates pass the broadcasting widget itself as an argument. The window component, for example, passes itself so the desktop knows which window has been interacted with.

Another example of how the widgets communicate is the activity bar at the bottom of the desktop. When a window is opened by clicking an icon, the activity bar shows that the window is open. When the button to close the window is pressed, the button notifies its owning window, and the window then broadcasts its own delegate to the desktop, which owns the activity bar, requesting to be removed.
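As an illustrative sketch of that chain (the delegate and class names here are mine, not necessarily the project's): the window exposes a dynamic multicast delegate that the desktop binds to, while the close button only ever talks to the window that owns it.

#include "Blueprint/UserWidget.h"
#include "Components/Button.h"

class UWindowWidget;

// Broadcast by a window when it wants to be removed; passes the window itself.
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnWindowCloseRequested, UWindowWidget*, Window);

UCLASS()
class UWindowWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    // The desktop binds to this; the window never stores a pointer to the desktop.
    UPROPERTY(BlueprintAssignable)
    FOnWindowCloseRequested OnCloseRequested;

protected:
    virtual void NativeConstruct() override
    {
        Super::NativeConstruct();
        // The close button is a subcomponent the window holds a reference to.
        CloseButton->OnClicked.AddDynamic(this, &UWindowWidget::HandleCloseClicked);
    }

    UFUNCTION()
    void HandleCloseClicked()
    {
        // Pass this window upwards so the desktop knows which window to remove.
        OnCloseRequested.Broadcast(this);
    }

    // Bound to a Button named CloseButton in the derived blueprint.
    UPROPERTY(meta = (BindWidget))
    UButton* CloseButton;
};

// In the desktop, when a window is created:
//   Window->OnCloseRequested.AddDynamic(this, &UDesktopWidget::HandleWindowClosed);
// The handler removes the window and tells the activity bar to drop its entry.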

This also keeps the UI very performant: there are no update loops constantly polling for state changes; everything is updated through delegate calls, and only one fires at a time. The one exception is the function that drags a window to reposition it on the canvas. Even then, other delegates can't be broadcast at the same time, since moving the window depends on the mouse position and the button being held.
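For completeness, a rough sketch of what that drag override could look like, assuming the window lives on a canvas panel, a bDragging flag is toggled in the mouse button down/up overrides, and DPI scaling is ignored for brevity; this is not necessarily how the project implements it:

#include "Components/CanvasPanelSlot.h"
#include "InputCoreTypes.h"

FReply UWindowWidget::NativeOnMouseMove(const FGeometry& InGeometry, const FPointerEvent& InMouseEvent)
{
    // Only reposition while the left mouse button is held after grabbing the title bar.
    if (bDragging && InMouseEvent.IsMouseButtonDown(EKeys::LeftMouseButton))
    {
        if (UCanvasPanelSlot* CanvasSlot = Cast<UCanvasPanelSlot>(Slot))
        {
            // Move the window by the cursor delta since the last mouse event.
            CanvasSlot->SetPosition(CanvasSlot->GetPosition() + InMouseEvent.GetCursorDelta());
        }
        return FReply::Handled();
    }
    return Super::NativeOnMouseMove(InGeometry, InMouseEvent);
}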