Jetty to Jetty app

I released an app 🙂 – for iOS and Android.

It’s a self-guided audio tour of historic sites in Broome, Western Australia, including beautiful stories told by locals. Nyamba Buru Yawuru developed the concept, curated the media, engaged local stakeholders, and were product owners for the app.

Jetty to Jetty screenshots

This work was exciting for its value to the Broome and Yawuru community, but also because it was an opportunity to innovate under the constraint of building the simplest thing possible. The simplest thing possible was in stark contrast to the technical whizbangery (though lean delivery) of my previous app project – Fireballs in the Sky.

I had fun working on the interaction and visual design challenges under the constraints, and I think the key successes were:

  • Simplifying the presentation of real-world and in-app navigation as a hand-rolled map (drawn in Inkscape) that shows all the sites and scrolls in a single direction.
  • Hiding everything unnecessary during playback of stories, to allow the user to focus on the place and the story.
  • Consistent playback control behaviour across the individual sites and the main map.
  • Not succumbing to the temptation to add geo-location, background audio, or anything else that could have added to the complexity!

My colleague Nathan Jones laid the technical foundations – Phonegap/Cordova wrapping a static site built with Middleman, using CoffeeScript, knockout.js, HAML, Sass and an HTML5/Cordova plugin for media. He later extended and open-sourced this framework (as Jila) for the Yawuru Ngan-ga language app. Most of the development work by Nathan and me was done in early 2014.

While intended to be used in Broome (and yet another reason to visit Broome), the app and its beautiful stories can be enjoyed anywhere.

Fireballs in the Sky wins iAward

Very exciting news for the Fireballs in the Sky app team from Curtin University and ThoughtWorks: the app won the iAwards education category in 2015!

The AIIA iAwards are the premier awards program for innovation in the Australian digital economy.

Believe me, I am excited, even though Phil gets all the speaking parts in this heavily edited interview 🙂

I have previously posted about the app in:


Seeing Stars – Bespoke AR for Mobiles

I presented on the development of the awesome Fireballs in the Sky app (iOS and Android) at YOW! West with some great app developers. See the PDF. (NB. there were a lot of transitions)

Abstract

We’ll explore the development of the Fireballs in the Sky app, designed for citizen scientists to record sightings of meteorites (“fireballs”) in the night sky. We’ll introduce the maths for AR on a mobile device, using the various sensors, and we’ll throw in some celestial mechanics for good measure.

We’ll discuss the prototyping approach in Processing. We’ll describe the iOS implementation, including: libraries, performance tuning, and testing. We’ll then do the same for the Android implementation. Or maybe the other way around…

Augmented/Virtual Reality with Horizontal Coordinates in iOS and Android

Augmented reality star maps

So, you want your mobile or tablet to know where in the world you’re pointing it for a virtual reality or augmented reality application?

To draw 3D geometry on the screen in OpenGL, you can use the rotation matrices returned by the respective APIs (iOS/Android). The APIs will also give you roll, pitch and yaw angles for the device.

What’s not easy to do through the APIs is to get three angles that tell you in general where the device is pointing – that is, the direction in which the rear camera is pointing. You might want this information to capture the location of something in the real world, or to draw a virtual or augmented view of a world on the screen of the phone. The Fireballs in the Sky app (iOS, Android) does both, allowing you to capture the start and end point of a “fireball” (meteor/ite) by pointing your phone at the sky, while drawing a HUD and stars on the phone screen during the capture process, so you’re confident you’ve got the right part of the sky.

Azimuth and elevation

Roll, pitch and yaw tell you how the device sees itself – they are rotations around lines that go through the device (device axes). But in this case we want to know how the device sees the world – we need rotations around lines fixed in the real world (world axes). To know where the device is pointing, we actually want azimuth, elevation and tilt, as shown.

Azimuth and elevation together are commonly known as a horizontal coordinate system.

Tilt angle

The azimuth, elevation pair of angles gives you enough information to define a direction, and hence capture objects in the real world (assuming the distance to the object does not need to be specified). However, if you want to draw something on the screen of your device, you need to know whether the device is held in landscape orientation, portrait orientation, or somewhere in-between; thus a third angle – tilt – is required.

Azimuth is defined as the compass angle of the direction the device is pointing. Elevation is the angle above horizontal of the direction the device is pointing. Tilt is the angle the device is rotated around the direction in which it is pointing (the direction defined by azimuth and elevation angles).

We can get azimuth, elevation and tilt with the following approach:

  1. Define a world reference frame
  2. Obtain the device’s rotation matrix with respect to this frame
  3. Calculate the azimuth, elevation and tilt angles from the rotation matrix

From here on, it will really help to be familiar with the mathematical concept of a vector (three numbers defining a point or direction in 3D space), and to be able to convert between radians and degrees. Sample code may be published in future.

Define a World Reference Frame

World reference frame

We’re somewhere in the world, defined by latitude, longitude and altitude. We’ll define a reference frame with its origin at this point. For convenience, we’d like Z to point straight up into the sky, and X to point to true north. Therefore, Y points west (for a right-handed frame), as shown here. We define unit vectors i, j, k in the principal directions (or axes) X, Y, Z, and we’ll use them later.

\[ \newcommand{\vect}[1]{\mathbf{#1}}
\vect{i} = \left[1,0,0\right], \vect{j} = \left[0,1,0\right], \vect{k} = \left[0,0,1\right]\]

Obtain Device Rotation Matrix

Device rotation with respect to world frame

What we want eventually is a rotation matrix made up of the components of the device axes a, b, c (also unit vectors) with reference to the world frame we defined. This matrix will allow us to convert a direction in the device frame into a direction in the world frame, and vice versa. This gives us all the information we need to derive azimuth, elevation and tilt angles.

We’ll describe the device axes as:

  • a is “screen right”, the direction from the centre to the right of the screen with the device in portrait
  • b is “screen top”, the direction from the centre to the top of the screen with the device in portrait
  • c is “screen normal”, the direction straight out of the screen (at right angles to the screen, towards the viewer’s eye)

We can write each device axis as a vector sum of the components in each of the principal world frame directions, or we can use the shorthand of a list of numbers:

\[\vect{a} = a_i\vect{i}+a_j\vect{j}+a_k\vect{k} = \left[a_i,a_j,a_k\right]\]

The rotation matrix then has the form:

\[\mathbf{A} = \left[\begin{array}{ccc}
a_i & b_i & c_i \\
a_j & b_j & c_j \\
a_k & b_k & c_k \end{array}\right]\]
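As an illustration (my own sketch, not from the original post), the matrix can be assembled column by column from the device axes; `rotation_matrix` and the example axis values are hypothetical:

```python
def rotation_matrix(a, b, c):
    """Build A with columns a ("screen right"), b ("screen top") and
    c ("screen normal"), each given as [i, j, k] world components."""
    return [[a[0], b[0], c[0]],
            [a[1], b[1], c[1]],
            [a[2], b[2], c[2]]]

# Device flat on a table, screen up, top of screen pointing north:
# a = east = [0, -1, 0] (Y is west), b = north = [1, 0, 0], c = up = [0, 0, 1]
A = rotation_matrix([0, -1, 0], [1, 0, 0], [0, 0, 1])
```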

To get a matrix of this form in iOS, use the reference frame CMAttitudeReferenceFrameXTrueNorthZVertical and read off the rotation matrix. However, the returned matrix will be the transpose of the matrix above, so you will need to transpose the result of the API call.

In Android, you will need to correct for magnetic declination and a default frame that uses Y as magnetic north, and therefore X as east. Both corrections are rotations about the Z axis. The matrix will similarly be transposed.
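To make the Android correction concrete, here is a minimal sketch (my own, not from the post) of a rotation about the Z axis by the magnetic declination; `declination_rotation` and `matmul3` are hypothetical helpers, and the sign convention should be checked for your location:

```python
import math

def declination_rotation(declination_deg):
    """Rotation about the world Z (up) axis that turns a magnetic-north
    frame into a true-north frame. Sketch only; verify the sign of the
    declination for your hemisphere."""
    d = math.radians(declination_deg)
    return [[math.cos(d), -math.sin(d), 0.0],
            [math.sin(d),  math.cos(d), 0.0],
            [0.0,          0.0,         1.0]]

def matmul3(p, q):
    """3x3 matrix product, used to apply the correction to the raw matrix."""
    return [[sum(p[r][m] * q[m][c] for m in range(3)) for c in range(3)]
            for r in range(3)]
```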

Calculate View Angles

Device elevation angle

We can calculate the view angles with some vector maths. The easiest angle is elevation, so let’s start there. We find the angle that the screen normal (c) makes with the vertical (k) using the dot product cosine relationship.

\[-\vect{c} \cdot \vect{k} = \cos\left(\frac{\pi}{2}-e\right)\]
\[e = \frac{\pi}{2} - \arccos\left(-\vect{c} \cdot \vect{k}\right)\]

Elevation is in the range [-90, 90]. Note also from the definitions above that such dot products can be extracted directly from the rotation matrix, as we can write:

\[\vect{c} \cdot \vect{k} = c_k \]
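As a concrete check, the elevation formula takes only a couple of lines of Python (a sketch of my own; `elevation` is a hypothetical helper, with c_k read straight from the rotation matrix):

```python
import math

def elevation(c_k):
    """e = pi/2 - arccos(-c . k), where c_k is the vertical component
    of the screen normal, read straight from the rotation matrix."""
    return math.pi / 2 - math.acos(-c_k)

# Device held upright (screen normal horizontal, c_k = 0): elevation 0.
# Device flat on a table, screen up (camera pointing down, c_k = 1): -pi/2.
```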

Device azimuth angle

Next, we calculate azimuth, for which we need the horizontal projection (cH) of the screen normal (c). We use Pythagoras’ theorem to calculate cH:

\[1 = c_H^2 + c_V^2\]
\[c_H = \sqrt{1 - c_k^2}\]

We then define a vector cP in the direction of c, such that the horizontal projection of this vector is always equal to 1, so we can use this horizontal projection to calculate angles with the horizontal vectors i & j.

\[\vect{c}_P = \frac{\vect{c}}{c_H}\]

Horizontal projection of device screen normal

We then calculate the angle that the horizontal projection of the screen normal (cP) makes with the north axis (i). We get the magnitude of this angle from its dot product with i, and we get the direction (E or W of north) from its dot product with the west axis (j).

\[\cos{\alpha} = -\vect{c}_P \cdot \vect{i} = \frac{-\vect{c} \cdot \vect{i}}{c_H}\]
\[\alpha' = \arccos\left(-\frac{c_i}{c_H}\right)\]
\[\newcommand{\sgn}{\text{sgn}}
\alpha = \sgn\left({c_j}\right) \times \alpha'\]

Note that because we’ve only used screen normal direction up until now, we don’t care how the phone is tilted between portrait and landscape.
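Putting these steps together, the azimuth calculation might look like this in Python (a sketch of my own; `azimuth` is a hypothetical helper, and it assumes |c_k| < 1 so the horizontal projection is non-zero):

```python
import math

def azimuth(c_i, c_j, c_k):
    """alpha = sgn(c_j) * arccos(-c_i / c_H), positive east of north
    in this sign convention. Assumes |c_k| < 1."""
    c_h = math.sqrt(1.0 - c_k ** 2)   # horizontal projection of c
    return math.copysign(math.acos(-c_i / c_h), c_j)

# Camera pointing true north (c = -i, so c_i = -1): azimuth 0.
```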

Device tilt angle

Last, we calculate tilt. For this calculation we again normalise by cH: as above, we divide a by cH, so that the projection of the resulting vector aP onto the vertical axis (k) spans the full range [-1, 1].

\[\vect{a}_P = \frac{\vect{a}}{c_H}\]

We take the angle between aP and the world frame vertical axis k.

\[\cos{\tau} = -\vect{a}_P \cdot \vect{k} = \frac{-\vect{a} \cdot \vect{k}}{c_H}\]
\[\tau' = \arccos\left(-\frac{a_k}{c_H}\right)\]
\[\tau = \sgn\left({b_k}\right) \times \tau'\]
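The tilt formula follows the same pattern as azimuth; a minimal Python sketch (my own illustration; `tilt` is a hypothetical helper, again assuming |c_k| < 1):

```python
import math

def tilt(a_k, b_k, c_k):
    """tau = sgn(b_k) * arccos(-a_k / c_H), with a_k and b_k read from
    the rotation matrix. Assumes |c_k| < 1."""
    c_h = math.sqrt(1.0 - c_k ** 2)   # horizontal projection of c
    return math.copysign(math.acos(-a_k / c_h), b_k)
```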

Note that as the elevation gets closer to +/-90, both the azimuth value and the tilt value will become less accurate because the horizontal projection of the screen normal approaches zero, and the vertical projection of the screen right direction approaches zero. How to handle elevation +/-90 is left as an exercise to the reader.

Sample Code

Sample code may be available in future. However, these calculations have been verified in iOS and Android.
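In the meantime, here is a minimal Python sketch of the complete calculation (my own illustration of the derivation above, not the verified iOS/Android code; `view_angles` is a hypothetical helper):

```python
import math

def view_angles(A):
    """Azimuth, elevation and tilt (radians) from a rotation matrix A
    whose columns are the device axes a, b, c in the world frame
    (X north, Y west, Z up). Assumes |c_k| < 1, i.e. the device is not
    pointing straight up or down."""
    a_k, b_k = A[2][0], A[2][1]
    c_i, c_j, c_k = A[0][2], A[1][2], A[2][2]
    c_h = math.sqrt(1.0 - c_k ** 2)               # horizontal projection of c
    elevation = math.pi / 2 - math.acos(-c_k)
    azimuth = math.copysign(math.acos(-c_i / c_h), c_j)
    tilt = math.copysign(math.acos(-a_k / c_h), b_k)
    return azimuth, elevation, tilt

# Upright portrait, rear camera pointing true north:
# a = east = [0, -1, 0], b = up = [0, 0, 1], c = south = [-1, 0, 0]
# -> azimuth 0, elevation 0.
```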