Virtual Earth Geospatial Telemetry Visualisation

6 Mar 2009 00:14
Last Modified
13 Jan 2013 18:24

This post discusses a sample I put together to allow geospatial telemetry data to be visualised using Virtual Earth. The data itself was collected by driving an Aston Martin DB8 Vantage around a track with a GPS receiver. In addition to the location of the car, basic engine telemetry was captured and synchronised with the position data.

The basic idea was to take the data and "play back" the drive of the car around the track, layering information such as vehicle position, speed, and braking points onto a map. Multiple data sets can be overlaid on the map for comparison. To show the vehicle position, a basic 3D car model was chosen. Virtual Earth supports both 2D and 3D map views, the latter of which gave an opportunity to implement a "virtual helicopter" camera that could follow the vehicle around the track.

Video 1. Virtual Earth geospatial telemetry visualisation.

This video shows a couple of laps of telemetry data. The path taken on each lap is drawn on the map (each in a different colour), and each has its own 3D car model (labelled "A" and "B" respectively). The buttons along the bottom of the screen control the "virtual helicopter" camera position and which car the camera is pointing at; they can be seen in more detail in Figure 1 below.

Camera Positions

Figure 1. "Virtual Helicopter" Camera Positions

Follow

These angles follow the car a short distance above and in front of or behind it respectively.

Overhead

This angle is directly above the car at a low or high altitude respectively.

Fixed

This setting fixes the camera at its current point in space, but keeps it pointed at the selected car.

In Car

This setting places the camera "inside" the car and points it in the direction the car is travelling.

Free

This setting allows the user to move the camera freely.
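The core of a chase camera like this is a small piece of vector maths. The sketch below is in Python purely for illustration (the original sample was built against the Virtual Earth 3D API, so the function name and parameters here are my own assumptions): given the car's position and heading, it places the camera a fixed distance behind (or in front of) the car and raised above it.

```python
import math

def chase_camera(car_x, car_y, heading_deg, distance=20.0, height=8.0, behind=True):
    """Place the camera a fixed distance behind (or in front of) the car,
    raised above it; the camera then looks at (car_x, car_y, 0)."""
    heading = math.radians(heading_deg)
    # Unit vector in the direction of travel (0 degrees = "north").
    dx, dy = math.sin(heading), math.cos(heading)
    sign = -1.0 if behind else 1.0
    cam_x = car_x + sign * distance * dx
    cam_y = car_y + sign * distance * dy
    return cam_x, cam_y, height
```

Recomputing this every frame from the latest telemetry sample is what gives the "virtual helicopter" effect.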

As an aside, during development of this sample I initially only had access to a couple of models in .x format. Until I managed to find a suitable model for the car, I had to use the following:

Initial Actors

Figure 2. Initial Actors

Initially it was helpful to add some axes to the models so I could ensure they were oriented correctly - you can see these in Figure 2. I also experimented with transparency for "ghosting" the model(s) which didn't have focus:

Ghost Actors

Figure 3. Ghost Actor(s)

The cube shown in Figure 3 was used as a visual marker (also with axes) to show the camera position when I was in a "Free" camera mode. This was really helpful in ensuring the camera was positioned and tracking objects correctly.

Surface Physics Demo Part 3

19 Feb 2009 23:52
Last Modified
13 Jan 2013 18:13

In Part 1 and Part 2 I focussed on 2D features. This makes a lot of sense for Surface applications, as fundamentally items move in two dimensions; however, there are particular scenarios that lend themselves to 3D, one of which I'll describe later in this post.

Video 1. Surface physics demo.

The video is divided into the following sections:


Preset Patterns

In some cases it is desirable to arrange the items into preset patterns, for example as part of an "Attract Mode" application, or when interacting with physical objects placed on the Surface. This screen defines some basic patterns and "locks" each item to the nearest position in the pattern from its original location. Selecting an item releases the "lock".
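The "lock to nearest position" step can be sketched as a greedy assignment; the Python below is illustrative only (the sample itself is C#/WPF, and its actual assignment strategy isn't described), mapping each item to the closest free slot in the pattern.

```python
def lock_to_pattern(items, slots):
    """Greedily assign each item (x, y) to the nearest free pattern slot.

    Returns a dict of item index -> slot position. Assumes there are at
    least as many slots as items.
    """
    free = list(slots)
    assignment = {}
    for i, (ix, iy) in enumerate(items):
        nearest = min(free, key=lambda s: (s[0] - ix) ** 2 + (s[1] - iy) ** 2)
        assignment[i] = nearest
        free.remove(nearest)  # each slot holds one item
    return assignment
```

A greedy pass is not globally optimal (a full assignment solver would be), but it is cheap and visually plausible, which is all an attract mode needs.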

3D Rolling Spheres

Spheres lend themselves to an intuitive motion on a 2D plane, such as when playing marbles, pool, etc. When a texture is added to the sphere, it is important to ensure that the sphere "rolls" correctly when moved. Several examples of textures are shown in the video.
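The rolling behaviour comes down to one relationship: the sphere must rotate about an in-plane axis perpendicular to its motion, through an angle equal to the distance travelled divided by its radius. A minimal Python sketch of that calculation (the sample itself applies the equivalent as a WPF 3D rotation transform):

```python
import math

def roll_rotation(dx, dy, radius):
    """For a sphere translated by (dx, dy) on a 2D plane, return the
    rotation axis (in-plane, perpendicular to the motion) and the angle
    in radians needed so the sphere rolls rather than slides."""
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0, 1.0), 0.0  # no motion, arbitrary axis
    # Right-hand rule: rotating about this axis moves the top of the
    # sphere in the direction of travel.
    axis = (-dy / dist, dx / dist, 0.0)
    angle = dist / radius  # arc length / radius
    return axis, angle
```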

Here are some further screenshots.


Figure 1. Marbles


Figure 2. Pool balls

Surface Physics Demo Part 2

18 Feb 2009 22:47
Last Modified
13 Jan 2013 18:11

In Part 1 I introduced a generic framework I have produced for a Surface-enabled WPF layout control which has basic physical interactions.

As well as demonstrating some additional physical behaviour, I wanted to focus this post on some Surface-specific features. One of the key tenets of Surface development is multidirectional applications. This is often overlooked, even when developing for Surface, as the developer typically uses a standard development PC with a vertically-oriented screen. I should say that the radio buttons down either side of the demo aren't part of the framework I describe here; they are merely present to allow me to illustrate different features over several pages, and so should be ignored in any discussion of multidirectional UI.

Let's jump straight into a video.

Video 1. Surface physics demo.

The video is divided into the following sections:

Further Materials

As well as adding some black and white "plastic" rounded tiles, I've included some "crystal" materials. These are 3D models of a typical faceted gem, with suitable lighting and transparency. The colours are randomly generated each time the page is selected.

Spring Forces

I'd wanted to add these from the start, as they are great fun to play with. Whenever a "spring" tile (another poker chip) is placed on the Surface, any selected items are joined to it by a spring, or piece of elastic. Multiple springs can be connected to multiple objects. When combined with directional forces, the springs will "swing" accordingly. A basic spring algorithm is used, with a configurable spring constant and length (quite "loose" and "short" respectively in this sample).
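A "basic spring algorithm" of this kind is usually Hooke's law with a rest length. The Python sketch below is illustrative (names and default values are my own; the post only specifies that the constant and length are configurable): it returns the force a spring exerts on one endpoint.

```python
import math

def spring_force(ax, ay, bx, by, k=0.5, rest_length=30.0):
    """Force exerted on point A by a spring from A to B (Hooke's law).
    A small k and rest_length give the 'loose' and 'short' feel the
    sample describes."""
    dx, dy = bx - ax, by - ay
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0, 0.0  # degenerate: endpoints coincide
    stretch = dist - rest_length  # positive when extended
    magnitude = k * stretch
    # Force points along the spring, towards B when stretched.
    return magnitude * dx / dist, magnitude * dy / dist
```

Applying this force each tick, together with the existing directional forces and drag, produces the swinging behaviour in the video.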

360° Directional Forces

This section illustrates how a "dial" object (you guessed it, another poker chip) can be used to control the direction of a force. When placed on the Surface, the current direction is indicated and can be changed by rotating the object.
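Mapping the dial to a force is a one-line trigonometric conversion from the tag's reported orientation to a force vector. A hedged Python sketch (the Surface SDK reports tag orientation; the function name and magnitude here are assumptions):

```python
import math

def dial_to_force(orientation_deg, magnitude=9.8):
    """Convert a tagged object's orientation (0 degrees = 'up' on the
    table) into a directional force vector applied to every item."""
    theta = math.radians(orientation_deg)
    return magnitude * math.sin(theta), magnitude * math.cos(theta)
```

Rotating the poker chip just re-evaluates this mapping, so the force direction tracks the physical object continuously.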

360° Directional Lighting

In a similar approach to the directional forces above, a "dial" object is used to control the direction of the dominant light source in the model.

Here are some further screenshots.


Figure 1. Springs

Force direction

Figure 2. Force direction

Gem lighting

Figure 3. Gem lighting

Note that these screenshots are from the Surface Simulator, so the physical objects (i.e. spring object, force, and lighting dials) are necessarily virtual.

In the next article I'll discuss 3D features.

Surface Physics Demo Part 1

18 Feb 2009 00:11
Last Modified
13 Jan 2013 18:06

I've recently been doing some work on a physics engine sample for Microsoft Surface. The principal purpose of this work was to investigate how adding physical characteristics to virtual items adds to the realism of a Surface experience. The work had three main areas of focus:

  1. To add physical behaviour to virtual items in a Surface experience
  2. To combine physical behaviour with 3D WPF templates, textures and lighting sources
  3. To demonstrate how the object recognition capabilities of Microsoft Surface can be used to interact with these items

In addition to these points, I wanted to provide this functionality within a WPF layout control closely analogous to the Surface ScatterView control. In this way, it should be relatively easy to swap out a ScatterView implementation for a physics-based alternative.

Here's a video to illustrate the key features for this post.

Video 1. Surface physics demo.

The video is divided into the following sections:

Basic Interactions

Basic collision detection and response between circles, rectangles, and polygons. Notice that one can hold onto a given shape and flick other items into it; the held shape is affected by the collision but "springs" back into place. The items are all set to the same density, so size is proportional to mass. The walls are "soft" and produce a springy collision, rather than a hard collision between objects.
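The circle-circle case of this collision response is the easiest to show. The Python below is a minimal impulse-based sketch under stated assumptions (the sample's actual C# solver isn't shown in this post, and it also handles rectangles and polygons): it resolves velocities along the line of centres with a configurable restitution.

```python
import math

def resolve_circle_collision(p1, v1, m1, p2, v2, m2, restitution=0.8):
    """Impulse along the line of centres for two colliding circles.
    Returns the new velocity tuples for each body."""
    nx, ny = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(nx, ny)
    nx, ny = nx / dist, ny / dist  # collision normal
    # Relative velocity along the normal; positive means separating.
    rel = (v2[0] - v1[0]) * nx + (v2[1] - v1[1]) * ny
    if rel > 0:
        return v1, v2
    j = -(1 + restitution) * rel / (1 / m1 + 1 / m2)
    return ((v1[0] - j * nx / m1, v1[1] - j * ny / m1),
            (v2[0] + j * nx / m2, v2[1] + j * ny / m2))
```

Because the impulse is divided by each mass, the equal-density rule (mass proportional to area) automatically makes big items shrug off small ones.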

Basic Forces

Illustration of directional forces (basic acceleration) and point forces (a gravity algorithm). Notice that point forces with low drag result in a chaotic rotational motion: heavier items accrete towards the gravity source, and lighter items rotate further away. Multiple point forces can be added, and combinations of directional and point forces are also possible, e.g. "hanging" an object of suitable mass below a point force. Note also that the physical tags (poker chips) have a virtual presence and result in collisions.
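A point force of this kind is typically an inverse-square attraction towards the tag's position. The sketch below is a guess at the shape of that algorithm (the post only says "a gravity algorithm"; the strength constant and distance clamp are my assumptions), in Python for brevity:

```python
import math

def point_force(px, py, item_x, item_y, item_mass, strength=1000.0, min_dist=5.0):
    """Inverse-square attraction on an item towards a point source at
    (px, py). min_dist clamps the distance to avoid the singularity
    when an item sits directly on the source."""
    dx, dy = px - item_x, py - item_y
    dist = max(math.hypot(dx, dy), min_dist)
    magnitude = strength * item_mass / dist ** 2
    return magnitude * dx / dist, magnitude * dy / dist
```

Since the force scales with item mass, acceleration is the same for all items; it is the interplay with drag that separates heavy and light items into the accretion pattern seen in the video.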

3D Objects and Materials

This section illustrates some lighting and textures, e.g. "wooden" and "marble" tiles. The lighting is consistent between objects, in this case pointing from lower left to upper right with respect to the camera position. Pattern tiles have image textures generated in code. In this example they simply use black and white brushes, but one of the great things about WPF is that I could just as easily use different image brushes and, for example, "inlay" different wood textures to form the same pattern.
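Generating a pattern from interchangeable brushes can be sketched very simply; the Python below stands in for the WPF drawing code (which would render with DrawingBrush or ImageBrush objects rather than returning a grid), building a checkerboard where the two "brushes" are just parameters.

```python
def checker_pattern(size, cells, brushes=("black", "white")):
    """Build a size x size checkerboard 'texture' as a grid of brush
    names, with cells x cells squares. Swapping the brushes for two
    wood images would 'inlay' the same pattern."""
    cell = size // cells
    return [[brushes[((x // cell) + (y // cell)) % 2] for x in range(size)]
            for y in range(size)]
```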

More Interactions

This section illustrates the use of "fixed" objects, with and without directional forces (in fact the "fixed" objects are just non-selectable items of very large mass). It also illustrates some of the object recognition of Microsoft Surface. Ordinary business cards placed on the Surface become part of the "virtual" world. One can also use other physical objects (e.g. brochures, hands, etc.) to "sweep" items.
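The "very large mass" trick has a common idiom in physics engines: store inverse mass, with zero meaning immovable. This isn't necessarily how the sample does it, but a minimal Python sketch shows why it is convenient: impulses simply have no effect on fixed bodies, with no special-casing in the solver.

```python
class Body:
    """Rigid body storing inverse mass; mass=None marks a 'fixed'
    (immovable) object with inverse mass zero."""

    def __init__(self, mass=None):
        self.inv_mass = 0.0 if mass is None else 1.0 / mass
        self.vx = self.vy = 0.0

    def apply_impulse(self, jx, jy):
        # Fixed bodies (inv_mass == 0) are unaffected by any impulse.
        self.vx += jx * self.inv_mass
        self.vy += jy * self.inv_mass
```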

Configuration and Debugging

This screen allows configuration of some of the properties of the physical environment, such as the "bounciness" (restitution) of items and walls, directional and angular drag, directional forces, maximum directional and angular velocities, etc. It was also an invaluable tool in debugging the build, as it shows positions and vectors graphically.

I've included some screenshots below.

Miscellaneous shapes

Figure 1. Miscellaneous shapes

Wood textures

Figure 2. Wood textures

Pattern textures

Figure 3. Pattern textures

Visual debugging

Figure 4. Visual Debugging

In the next article I'll describe some more physics-related and Surface-specific features of this framework.