Chapter 15. Creating GUIs in Unity

Games are software, and all software needs a user interface. Even if it’s as simple as a button that starts a new game or a label that shows the player’s current score, your game still needs a way to present the more mundane, “nongame” elements for the user to interact with.

The good news is that Unity has a really great UI system. Introduced in Unity 4.6, the UI system is extremely flexible and powerful, and is designed for the situations that games typically encounter. For example, the UI system supports PC, console, and mobile platforms; allows a single UI to scale to multiple screen sizes; is capable of responding to input from the keyboard, mouse, touchscreen, and game controllers; and supports displaying the UI in both screen space and world space.

In short, it’s a pretty incredible toolkit. While we’ve been building GUIs in the games discussed in Parts II and III, we’d like to look at some finer points of the GUI system, so that you’re ready to take full advantage of the features that it offers.

How GUIs Work in Unity

Fundamentally, a GUI in Unity is not terribly different from the other visible objects in your scene. A GUI is a mesh that’s constructed at runtime by Unity, with textures applied to it; additionally, the GUI contains scripts that respond to mouse movement, keyboard events, and touches to update and modify that mesh. The mesh is displayed via the camera.
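To make this concrete, here is a minimal sketch (our own illustration, not code from the book) of how a script participates in that system: a component that listens for a uGUI Button’s click events and updates a Text label in response. The class and field names are ours, and the Button and Text are assumed to already exist on a Canvas in the scene, assigned via the Inspector:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative component: the UI system delivers mouse clicks and
// touches to the Button, which invokes our callback; we then change
// the Text, and Unity rebuilds the UI mesh to reflect the new string.
public class ScoreDisplay : MonoBehaviour
{
    // Assumed to be wired up in the Inspector.
    public Button addPointButton;
    public Text scoreLabel;

    private int score;

    void Start()
    {
        // Subscribe to the Button's click event.
        addPointButton.onClick.AddListener(AddPoint);
        scoreLabel.text = "Score: 0";
    }

    void AddPoint()
    {
        score += 1;
        scoreLabel.text = "Score: " + score;
    }
}
```

Note that the script never draws anything itself; it only changes component state, and the UI system regenerates the underlying mesh and renders it through the camera.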

The GUI system in Unity has several different pieces that work together. ...
