Since I wanted to support running the application on a Microsoft Surface device, it was important to cater for multi-directional interaction. User interface (UI) components can either be directional or not. For example, the camera view is non-directional in the sense that it is valid for all users regardless of direction (it is perfectly acceptable to be "upside down" when rolling through 180° in a 3D environment). Text labels, on the other hand, are directional and most beneficial when directly facing a user.
The following screenshots illustrate how the user interface supports multiple directions and multiple users.
Figure 2 shows a menu and reticle for each of two users on opposite sides. While there can be many instances of menus and reticles at any one time, a given instance is typically used by one user at a time. It is therefore possible to orient them to the relevant user, either by using the orientation of a Surface tag, or by using the orientation reported by the API for a touch event.
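The selection logic can be sketched as follows. This is a minimal illustration, not the Surface SDK API: the function name and parameters are hypothetical, and the angles are assumed to arrive in degrees (a tag's orientation, when present, takes precedence over the orientation reported for a touch event).

```python
def user_orientation(tag_angle=None, touch_angle=None, default=0.0):
    """Pick the rotation (in degrees) for a per-user component such as
    a menu or reticle.  Prefer the orientation of a physical Surface
    tag; otherwise fall back to the orientation reported for the touch
    event, and finally to a default."""
    angle = tag_angle if tag_angle is not None else touch_angle
    if angle is None:
        angle = default
    return angle % 360.0  # normalise to the range [0, 360)
```

The normalisation step simply keeps equivalent angles (for example -90° and 270°) from producing different rotations for the same physical direction.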
For items such as background labels which are shared between multiple users, it is necessary to pick a consistent orientation for all instances. This "system orientation" can either be determined in code (e.g. by examining the orientation of other directional UI components) or by a user via a menu setting. In Figure 2 the orientation has been chosen to face one of the two users.
While the system orientation is an analogue value (the background labels, for example, can face any consistent direction), it makes sense to axis-align the orientation of items such as the clock to a side of the screen.
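Axis-aligning amounts to snapping the analogue angle to the nearest multiple of 90°; a minimal sketch (the function name is my own):

```python
def axis_align(angle):
    """Snap an analogue orientation (degrees) to the nearest side of
    the screen, i.e. the nearest multiple of 90 degrees."""
    return (round(angle / 90.0) * 90) % 360
```

So a system orientation of, say, 50° would place the clock against the 90° side, while 350° would snap it back to 0°.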