Archive for the ‘AR’ Category

Pokémon Go Home – Why Pokémon is not what the heritage sector really needs

Gently I edged toward the beast. It had four long semi-transparent wings, the same length as its tube-like body. The body was iridescent in the light, changing colour from blue to green. Its face was incredibly ugly… a mixture of large bug-like eyes on the sides of its head and a gaping mouth filled with prehistoric fangs. It fluttered its wings gently in the breeze as it cleaned itself, rubbing its long spindly white legs all over its body. I reached down quietly in readiness to capture it and brought my phone up to look through the camera feed. Just as I was about to swipe the screen it became alarmed, its wings became a blur of movement and it took off – flitting away out of sight. I attempted to chase it – but it was gone in the blink of an eye. Another two years would pass before my friend and I got another tip-off about the location of the quasi-mythical beast.

I could be talking about my hunt for the Yanma, the large dragonfly, famous for being able to see in all directions at once and having such extreme wing-speed that it can shatter glass. Except of course I am not talking about my hunt for a Pokémon – I am talking about the day I went out with a very good friend of mine hunting in the Fens for the white-legged damselfly. We didn’t manage to capture a picture of it that day, but we did briefly see it alight on a leaf, which was more than good enough.

White-legged Damselfly, photographed by Philipp Weigell [CC BY 3.0 (http://creativecommons.org/licenses/by/3.0)], via Wikimedia Commons

Pokémon Go fever has swept up seemingly everyone in the last couple of weeks. It has been hailed as a saviour that will stop idle kids sitting in front of their computer screens, the herald of mass adoption of mainstream Augmented Reality, and the re-invigorator of interest in heritage and museum sites.

As someone who has done quite a bit of research on AR and heritage, I would like to wade into the discussion here and question a few things. To go some way toward mitigating the risk of sounding like a grumpy old man who doesn’t understand ‘kids today’ (too late!) – I would like to point out that I have three boys of my own, play all sorts of different games on both smartphone and computer, and once even bought a girlfriend a pair of Pikachu pyjamas. I understand the nostalgic call of Pokémon, the excitement of the treasure hunt and the lure of the screen over the real world.

Pokémon Go as an AR revolution

First I would like to tackle the pedantic bit. Pokémon Go is not really an AR game. The AR element of it (overlaying the Pokémon onto your video feed) is, as someone else has already said, the most basic AR possible. So much so that it can’t really be called AR at all. There is no interaction with or reading of the real surroundings; the Pokémon don’t hide behind the cooker or pop out from behind the museum gates. You could be standing in your toilet or at the edge of the Grand Canyon and the Pokémon would still be badly superimposed, via simple geo-location, onto the video feed. Even the Snapchat AR that superimposes silly things on people’s faces is more AR – at least it is doing some kind of recognition of the real world (in that case, facial recognition).

Calling Pokémon Go an AR revolution is doing a disservice to the potential and power of AR for integrating with the real world. AR has so much more potential than this. Pokémon Go is a locative game, not true AR.

Pokémon Go gets kids outdoors

What’s not to like about this? Even a cynical old git like me surely can’t complain about kids (and adults) getting away from their screens and going outside. Except, of course, they are not getting away from their screens. In fact it is probably worse – by taking the screen outside and searching for Pokémon through it they are not even really taking part in the outside world. The outside world is being entirely mediated through the screen – a small rectangular box guides your every movement. The alternate reality provided by the smartphone is so beguiling that there are people falling in canals, crashing into police cars and even plummeting off the edge of cliffs whilst playing the game. Clearly even though they are outside, they are oblivious to the world around them.

Do we really live in a world where kids can’t be bothered to get off the sofa and go outside without taking a screen or a game with them? What kind of world is this? What is it going to become if the next generation takes this as normal? Why is it that hunting for a Squirtle is seen as the height of cool – but following a tip-off about the location of a Spoonbill, or standing on the end of a train platform spotting trains, is seen as the ultimate in nerdiness?

I’m not sure I can really see the logic. I guess that Pokémon Go is the epitome of quick and easy satisfaction. Sure, you may have to travel a little to get to the place to capture the computer-generated critter – but when you arrive you don’t have to wait and watch and hope that you catch a glimpse of it. You don’t have to be silent and scan the sky with your binoculars and be PATIENT. If someone has said that the Charmander is there, it is pretty much guaranteed that if you go to those lat/long coordinates you will find it. Bird-watching is not the same. You can go back to the same hide for days and days and perhaps not spot what you are looking for. It may even be there, but you might not have done quite enough research to differentiate the colour of the wing flash. It is not quick or easy, and because of that it is surely, ultimately, more satisfying.

Pokémon Go brings all the kids to the (Archaeological) Yard

This then brings me to the final point – Pokémon Go as a way to get people more engaged with heritage sites. We have seen this before: museums and heritage sites jumping on trendy locative-game bandwagons to get more people to come to their sites (Andy Dufton and I wrote about this with Foursquare a few years ago). I think it may be a little early to say whether or not this is really going to be a big thing. We will need to see stats on increases in ticket sales to show that the Pokéhunters are not just going to the museum car park. And if they are paying the ticket price and entering the site, how much are they actually engaging with the archaeology?

Charmanders in the BM

Terry Brock is also hopeful about this:

[Embedded tweet from Terry Brock]

As Andrew Reinhard’s archaeogaming foray shows, there is the potential for providing extra contextual information at the ‘cultural’ Pokéstops. However, a quick look at his example of the Pokéstop at his local Washington memorial shows only the information that is already on the plaque of the monument itself – but then you would have to look away from your screen to read that.

Route of Washington’s March monument (taken from the Archaeogaming blog by Andrew Reinhard) – https://archaeogaming.com/2016/07/09/pokemon-go-archaeogaming/

So let us stand back a little and think about what all this means. I’ve concentrated recently on creating ways for people to use Augmented Reality to engage with, explore and understand heritage sites (take a browse around my website to see some examples). The key for me is that by visiting the site physically, someone can engage both their body AND their mind simultaneously. The AR I use is made exclusively to facilitate this: to show hidden elements of the site, to waft unexpected smells that make you THINK about the space in different ways, to play sounds that have some kind of relevance to what happened in that location in the past.

A visit to an archaeological site by a Pokéhunter is the antithesis of this. When a Pokéhunter arrives at a site (drawn by the lure of a rich Pokéstop) they are in the classic state of Cartesian disconnect. Their body may be there, but their mind is far away, thinking of the next Pokéstop or the phone notification that just came through from their mate about a rare [insert rare Pokémon name here] up the road.

You only have to look at this tweet to see the effects of this:

https://twitter.com/ohmycrayon/status/751778120647180288

This girl is at STONEHENGE, for crying out loud. Instead of taking an interest in how the stones were put up, how they fit into the surrounding landscape, what actually happened in and around them, and, crucially, how the experience of actually being there makes her feel – she is chasing an Eevee. She herself admits her attention is “so divided right now”. If this is happening at one of Britain’s most iconic and engaging monuments – what does it mean for other heritage sites? This girl’s mind is clearly not in the same place as her body. She is engaged in two separate realities, linked only by coordinates on a Google Map. Using Pokémon Go to get bums on seats and through the ticket barriers might be good for sales, but at what cost? If it really takes a Squirtle to get our youth (and adults) to go to a heritage site, then we are doing something very wrong.

What about the Real World?

I’m sorry this post has been rather despairing. I am getting increasingly sad about the state of the world, where people fall head over heels for hunting virtual creatures while the real, incredible biodiversity around them is ignored, built over and marginalised. Instead of re-wilding the world with animals, insects, plants and birds, we are enchanted by the opposite: introducing the computer and virtual creatures into our diminishing natural and cultural spaces. How can it be that I am in the minority for being bewitched by the hunt for the white-legged damselfly, a beautiful, crazy, prehistoric-looking creature – while the vast majority of people are instead happy to jump in their cars, park in the car park of the local Baptist church and stare into their phones, flicking imaginary red balls at imaginary creatures?

I haven’t even touched on the inevitable monetisation of all this: how long will it be until the big museums have to pay Niantic loads of money to host an incredibly rare Pokéstop, while the smaller sites (that are actually crying out for visitors) are priced out of the Pokémarket?

If you really can’t get your kids (or yourself) out to a heritage site without gamifying it by chasing animals, why not go and find that pair of peregrine falcons roosting in the local church steeple? Or go newt-hunting in your local historic ponds? Perhaps try to spot a red kite above the prehistoric landscape of Dartmoor? You could even use this map of rare bird sightings around the country to plan a day out birding and visiting nearby heritage sites.

But please please please – leave your smartphone behind.

CAAUK 2016 – Embodied GIS and applied Multi-Sensory Archaeology

I recently attended the CAAUK 2016 meeting in Leicester, a great couple of days with a few really interesting papers.

As usual, the rather excellent Doug Rocks-Macqueen was on hand to record the talks. His videos can be found here – he records all sorts of diverse archaeological conferences, so it is well worth clicking the subscribe button on his account.

In case anyone is interested, I have embedded the video of my talk below – where I discuss the Embodied GIS, using examples from my previous research including Voices Recognition and the Dead Man’s Nose.

Heritage Jamming

As an archaeologist, I’m used to reporting old news, and this is pretty old – however, it might be of interest.

In 2014 and 2015 I participated in the University of York’s Heritage Jam. The Heritage Jam is a really excellent initiative, bringing together an eclectic group of archaeologists, gamers, makers and heritage specialists to hack together a project within two days of intense work, locked in a small room. For those not able to travel, there is also the option to participate online.

Heritage Jam Logo

As well as the final prototype, each team is expected to produce a paradata document that outlines the motivations behind the project and also expands a little on the methods and technologies used. The intense session really pays dividends, and being locked in a room focuses the mind to get a lot of things done – without the constant distractions of the real world.

In 2014, my team won first prize with our ‘Voices Recognition’ project which explored the auralisation of a cemetery in York, and in 2015 I was awarded Highly Commended for my individual entry, the Dead Man’s Nose, a device which I developed, built and use to deliver smells in-situ while investigating archaeological sites. I used it to explore the olfactory landscape of the Moesgaard Museum Archaeological Trail (Denmark) – a link to the video and paradata is here.

Surfing the Hypegeist

This post is written as part of the Call for Papers over at ThenDig, looking at Zeitgeist in archaeological research and how to follow it, keep up with it, or create it. As will be clear from the previous posts on my blog, I am interested in using Mixed and Augmented Reality to aid in archaeological research. Augmented Reality (AR) is currently just over the ‘Peak of Inflated Expectations’ of the Gartner Hype Cycle, meaning that it has previously been hailed as the next Big Thing, but has not quite lived up to the hype and so now needs a lot of work to make it a sustainable and useful technology – I have previously written about what this means in terms of archaeology here.

As I have just been awarded my PhD on the use of AR in archaeology, I decided to write this post to give some brief reflections on what it has been like trying to surf the Hype Cycle, whilst still producing 85,000 words of scholarly research on the topic. Please check out the post on ThenDig, which has some insightful comments from the two peer reviewers – http://arf.berkeley.edu/then-dig/2014/03/zeitgeist-stuart-eve/. I have reproduced my text below:

Twitter is your enemy

Perhaps a controversial statement, but for anyone attempting to sit down and write intelligently about something that is currently the zeitgeist, Twitter is not your friend. I don’t say this because of the many wasted hours of procrastination that go into reading and obsessively checking a million and one tweets (although this is certainly true); I say it because when working on something at the bleeding edge of tech, Twitter provides hundreds of teasing snippets of the amazing research that other people are doing. This isn’t just other researchers, but also companies and hackers who seem to have all the time (and money) in the world to make cool proof-of-concept videos. While initially amazing, and a great source of early ideas and ways to give your research the ‘wow factor’, it quickly becomes disheartening – seeing what other people are achieving whilst you are stuck making sure your bibliography is formatted correctly. It provokes the need to be blogging/creating/making/hacking almost continually to keep up with everyone, and to show that you are somehow simultaneously surfing the Hype Cycle. In my experience there is always going to be someone who has done it better, so for anyone who wants to have a life outside of their research, my advice is to keep your Twitter usage limited to finding new dubstep tracks and getting irate at the state of the world today.

Remember your roots

One of the key things to remember when using new tech is that no matter how deeply you immerse yourself in the tech world, when you emerge you need to convince other archaeologists that what you have been doing is useful. Archaeologists are notoriously wary of new technology and will be your biggest critics – and this is A Good Thing. Every new digital method or gadget should only be developed to further archaeological method/theory and our knowledge of the past – not simply for wow factor or as a result of a ride on a Hypegeist bandwagon. If it won’t work outside in the rain, or you can’t convince a colleague of its usefulness without resorting to fancy videos or Prezis, then don’t bother.

Every surfer loses a wave

Be prepared to fall off the wave, and to watch other people riding. It is going to happen anyway, and by being patient, sitting back and watching other people ride the wave you can learn just as much as you can by constantly doing. It is less tiring and often much more rewarding. I have found that acknowledging you are always going to be behind the curve promotes a feeling of calm reflection that is vital for properly researching what you are doing, and gives you the knowledge to choose the right time to jump back on the crest.

Take your time

Whilst blogs are great for working through ideas, writing academically makes you consider every word and sentence and forces you to find other research that backs up or challenges your claims. As someone who researches new technology every day, I find a digital detox is almost unheard of. However, taking the time to unplug everything, sit down and write the paper or thesis makes you critically examine everything you are saying or promoting, with a clear, unhindered perspective.

I am convinced this is the reason that baking is so zeitgeist at the moment. People are craving time away from the digital world: watching your sourdough grow and savouring the time it takes for a loaf to prove and bake puts you back in the real world. Sadly, however, they are tech-ifying sourdough too.

Guest Blog on ASOR

I have just submitted a guest blog post on the American Schools of Oriental Research (ASOR) blog for their ongoing special series on Archaeology in the Digital Age. It’s an introduction to Augmented Reality for Archaeology and also includes some sneak peeks of the results of some of my own AR fieldwork on Bodmin Moor. The original post can be found at http://asorblog.org/?p=4707.

Archaeology, GIS and Smell (and Arduinos)

I have had quite a few requests for a continuation of my how-to series, for getting GIS data into an augmented reality environment and for creating an embodied GIS. I promise I will get back to the how-tos very soon, but first I wanted to share something else that I have been experimenting with.

Most augmented reality applications currently on the market concentrate on visual cues for the AR experience – overlaying things on a video feed, etc. I have not found many that create or play with smells – and yet smell is one of the most emotive senses. In the presentation of archaeology this has long been known, and the infamous and varied smells of the Jorvik Centre are a classic example of smell helping to create a scene. The main reason for this lack of experimentation with smells is presumably the delivery device. AR is quite easy to achieve now within the visual realm, mainly because every smartphone has a video screen and camera. However, not every phone has a smell chamber – never mind one that can create the raft of different smells that would be needed to augment an archaeological experience. As a first stab at rectifying this, then, I present the Dead Man’s Nose:

The Dead Man’s Nose

The Dead Man’s Nose (DMN) is a very early prototype of a smell delivery device that wafts certain smells gently into your nose based on your location. The hardware is built using an Arduino microcontroller and some cheap computer parts, along with any scent of your choice. The software is a very simple webserver that can be accessed via WiFi and that ‘fires off’ smells via its querystring. This means it can easily be triggered by Unity3D (or any other software that can request a webpage) – so it fits very nicely into my embodied GIS setup.

How does it work?

This little ‘maker hack’ takes its inspiration from projects such as ‘My TV Stinks‘, ‘The Smell of Success‘ and Mint Foundry’s ‘Olly‘. Essentially, I followed the instructions for building an Olly (without the 3D housing) and, instead of using an Ethernet shield for the Arduino, I connected it to a WiFi shield and from there joined it to an ad-hoc WiFi network created by my MacBook. With the MacBook, iPad and the DMN on the same network it is very easy to send a message to the DMN from within the Unity gaming engine. As the iPad running the Unity application knows where I am in the world (see the previous blog post), I can fire off smells according to coordinates (or areas) defined in a GIS layer. Therefore, if I have an accurate ‘smellscape’ modelled in GIS, I can deploy that smellscape into the real world and augment the smells in the same way that I can augment the visual elements of the GIS data. The code is very simple at both ends: I am just using a slightly adjusted version of the sample WiFi shield code on the Arduino side, and a small script on the Unity side that pings the webserver when the ‘player’ moves into a certain place in the landscape. When the webserver is pinged, it starts the fan, and that wafts the smell around. From a relatively simple setup, it provides the possibility of a very rich experience when using the embodied GIS.
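For anyone curious about the shape of that trigger logic, here is a minimal sketch of it in Python (the real thing is a small Unity script plus the adjusted Arduino WiFi shield sample code, so this is illustration only – the IP address and the ‘fan’ querystring parameter are assumptions, not the actual DMN interface):

import math
import urllib.request

# Hypothetical address of the Arduino WiFi shield on the ad-hoc network.
DMN_URL = "http://192.168.1.77"

def fire_smell(speed):
    # Ping the DMN webserver, passing a fan speed (0-255) in the querystring.
    urllib.request.urlopen(f"{DMN_URL}/?fan={speed}", timeout=1)

def update_smell(player, source, trigger_radius=20.0):
    # Run the fan when the player is within trigger_radius (metres) of a
    # smell source, ramping the speed up as the player gets closer.
    distance = math.hypot(player[0] - source[0], player[1] - source[1])
    if distance <= trigger_radius:
        fire_smell(int(255 * (1.0 - distance / trigger_radius)))
    else:
        fire_smell(0)

# Example: a smell source at local coordinates (5000, 5000), player 10m away.
update_smell((5010.0, 5000.0), (5000.0, 5000.0))

The same pattern works for any number of smell sources – loop over the features in the GIS layer and fire off whichever one the player is closest to.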

A Field Test

The first thing to do was to find the smells to actually augment using the Dead Man’s Nose. It turns out there are a lot of different places to buy scents, but luckily in this case archaeologists came to the rescue – an article in the excellent Summer 2012 edition of Love Archaeology e-zine pointed me to the website of Dale Air, who have over 300 aromas ranging from the mundane (Crusty Bread) to the completely weird (Dragon’s Breath). I purchased a set of samples (Barbeque, Dirty Linen, Woodsmoke and Farmyard, among others) and was ready to go. I was quite surprised, but they do actually smell pretty much as described, especially the Dirty Linen.

As I was just experimenting, the housing for the DMN was very simple (a cardboard box), and there was only one choice of smell, which was sellotaped to the outside of the box…

The Dead Man’s Nose, in a box with a BBQ scent attached

The prototype was then loaded into a bag (in this case a simple camera bag), which was slung around my neck. I popped the top of the BBQ scent open, and then whenever the fan started whirring the sweet, slightly acrid smell of barbequing meat was gently wafted to my nostrils.

The Dead Man’s Nose in a nosebag, ready to go

Using my embodied GIS of the roundhouses on Leskernick Hill, Bodmin Moor, I set the DMN to fire off a lovely smell of Barbeque whenever I got within 20m of a roundhouse. I set the fan to run slowly at first and get faster as I got closer to the ‘source’ of the smell. The DMN performed admirably: as I walked within range of the houses I heard the tell-tale whirr of the fan, and the next moment I had the lovely scent of cooking ribs. Future models will allow for more than one smell at a time (I just need a couple more computer fans) and also a better housing – a bit of 3D printing is in order!

Now I can use the iPad to view the roundhouses overlaid onto the video feed, plug in my headphones and hear 3D sounds that get louder or quieter depending on where I am in the settlement, and also augment different smells as I walk around. Not only can I walk around the modern-day Bronze Age landscape, see the augmented roundhouses and hear the Bronze Age sheep in the distance, I can also smell the fires burning and the dinner cooking as I get closer to the village…

If there is interest I can put together a how-to for creating the system, but for now I am going to carry on experimenting with it – to refine the delivery and the housing and to clean up the code a little bit.

Embodied GIS HowTo: Part 1 – Loading Archaeological Landscapes into Unity3D (via Blender)

Recently I have been attempting to move closer to what I have coined embodied GIS (see this paper) – that is, the ability to use and create conventional GIS software/data, view it in the real world in-situ, explore and move through that data, and feed those experiences back. As is clear from the subject of this blog, I am using Augmented Reality to achieve this aim, and therefore am using a combination of 3D modelling software (blender), gaming-engine software (Unity3D) and conventional GIS software (QGIS). Where possible I have been using Free and Open Source Software (FOSS) to keep costs low – but also to support the community and to show that pretty much anything is possible with a FOSS solution.

One of the main hurdles to overcome when trying to combine these approaches is figuring out the workflow between the 2D/2.5D GIS software, the 3D gaming-engine environment, and then finally overlaying all of that information onto the real world. There are many points during the process when data integrity can be lost, resolution of the original data can be affected and decisions on data loss have to be made. I hope that this blog post (and the subsequent how-tos on the next stages of the process) will help people identify those points, and also step through the process so you can do it with your own data.

The first step toward embodied GIS is to move from the GIS software into the gaming engine. There are many ways to do this, but I have used QGIS, some command line GDAL tools and then blender. Over the next few posts I will show how you import elevation data, import/place archaeological information and then view the finished data via the web and also in the landscape itself.

This first post presumes you have at least a working knowledge of GIS software/data.

First you will need a Digital Elevation Model of your landscape. I am using Leskernick Hill on Bodmin Moor as my case study. I have the Ordnance Survey’s Landform PROFILE product, which is interpolated from contours at 1:10,000 – resulting in a DTM with a horizontal resolution of 10m. To be honest this is not really a great resolution for close-up placement of data, but it works fine as a skeleton for the basic landscape form. The data comes from the OS as a 32-bit TIFF file – the import process can’t deal with the floating-point nature of the 32-bit TIFF, and therefore we need to convert it to a 16-bit TIFF using the GDAL tools. To install GDAL on my Mac I use the KyngChaos Mac OS X Frameworks. Binaries for other platforms are available here. Once you have GDAL installed, running the following command will convert the 32-bit TIFF to a 16-bit TIFF –

gdal_translate -ot UInt16 leskernick_DTM.tif  leskernick_DTM_16.tif

This is the first stage where we lose resolution from the original data. The conversion from a floating-point raster to an integer-based raster means our vertical resolution is rounded to the nearest whole number – effectively limiting us to a minimum vertical resolution of 1m. This is not too much of a problem with the PROFILE data, as the vertical values are already interpolated from contour lines at 10m and 5m intervals – however, it can lead to artificial terracing, which we will tackle a bit later. It is a bit more of a problem with higher-resolution data (such as LiDAR) as you will be losing actual recorded data values – with the PROFILE data we are just losing the already interpolated values from the contours.
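If you would rather script this step (and sanity-check how much is being rounded away), a rough equivalent using the GDAL Python bindings looks something like this – a sketch assuming a reasonably recent GDAL install, not part of the original workflow:

from osgeo import gdal

src_path = "leskernick_DTM.tif"
dst_path = "leskernick_DTM_16.tif"

# Same conversion as the gdal_translate command above: force the output
# to an unsigned 16-bit integer raster.
gdal.Translate(dst_path, src_path, outputType=gdal.GDT_UInt16)

# Compare the elevation ranges before and after, to see the effect of
# rounding the vertical values to whole metres.
for path in (src_path, dst_path):
    band = gdal.Open(path).GetRasterBand(1)
    stats = band.GetStatistics(True, True)  # [min, max, mean, stddev]
    print(path, "min: %.2f  max: %.2f" % (stats[0], stats[1]))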

Once the TIFF is converted you will need to set up a local grid within your GIS software. Unity doesn’t handle large game areas that well – and will start the gamespace at 0,0 – therefore when we import our data it makes things much easier if we import it relative to a 0,0 coordinate origin rather than to real-world coordinates. This is much easier than it sounds, and just involves using a false easting and northing for your data. In my case I made a simple shapefile of a 10km x 10km square that covered my study area; the bottom-left coordinates of the square (in the Ordnance Survey GB coordinate system (EPSG:27700)) were 212500, 75000. This means that any data I import into Unity will need to have 212500 subtracted from its eastings and 75000 subtracted from its northings. We can either do this programmatically or ‘in our heads’ when placing objects on the Unity landscape (more on this later in the how-tos). It is an advantage having a relatively small study area and also having data in a planar/projected map projection – as the conversion will not need to take account of earth curvature (as it would in a geographic projection such as lat/long).

Therefore, you can choose to reproject/spatially adjust all of your data using the false eastings and northings within your GIS software – which makes the import a little easier. Or you can do it on a per-layer basis as and when you import into Unity (which is what I do).
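If you take the programmatic route, the adjustment is just a subtraction on the way in and an addition on the way back out. A tiny sketch in Python, using the false origin of my Leskernick square:

# False origin: bottom-left of the 10km x 10km study square (EPSG:27700).
FALSE_EASTING = 212500
FALSE_NORTHING = 75000

def to_unity(easting, northing):
    # Real-world OSGB coordinates -> local Unity coordinates.
    return easting - FALSE_EASTING, northing - FALSE_NORTHING

def to_osgb(x, y):
    # Local Unity coordinates -> real-world OSGB coordinates.
    return x + FALSE_EASTING, y + FALSE_NORTHING

# The centre of the square, 217500, 80000, becomes 5000, 5000 in Unity.
print(to_unity(217500, 80000))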

Once you have sorted out the GIS side of things, you will need to import the raster into blender – and build the 3D landscape mesh. I’ll try and explain this step-by-step, but it is worth finding your way around blender a little bit first (I recommend these tutorials). Also, please bear in mind you may have a slightly different window set-up to mine, but hopefully you will be able to find your way around. If you would rather script these steps, there is a rough Python sketch after the numbered list. Please feel free to ask any questions in the comments below.

  1. Open up blender – you should see the default cube view. Delete the cube by selecting it in the panel on the right, then pressing ‘X’ and clicking Delete.
  2. Now we want to make sure our units are set to metres – do this by clicking the little scene icon in the right-hand panel, scrolling down to the Units drop-down and clicking the Metric button.

    Changing units to metric

  3. Now add a plane – using Shift+A, Add->Mesh->Plane (or use the Add menu). This will create a Plane of 2m x 2m. We want this Plane to be the size of our DEM (in world units), so change the dimensions to match; in my case I set X to ‘10km’ and Y to ‘10km’. If you don’t have the dimensions panel on the right, press the ‘N’ key to make it appear.

    Setting the Plane Dimensions

  4. You will notice that your plane has disappeared off into the distance. We need to adjust the clipping values of our viewport. Scroll down the panel with the Dimensions in it until you see the View dropdown. You will see a little section called ‘Clip:’ – change the End value from 1km to, say, 12km. Now if you zoom out (pinch to zoom out on a trackpad, or use the mouse scroll wheel) you will see your Plane in all its very flat glory.
  5. Before we start the interesting bit of giving it some elevation, we need to make sure it is in the right place. Remember that we are using false eastings and northings, so we want the bottom-left corner of our Plane to be at 0,0,0. To do this, first set the 3D cursor to 0,0,0 (in the right-hand panel, just beneath where you set the viewport clip values). Now click the ‘Origin’ button in the left-hand Object Tools panel, and click Origin to 3D Cursor (or use the shortcut Shift+Ctrl+Alt+C).
  6. You will also want to make sure the bottom left of the Plane is at 0,0,0. As the origin handle of the Plane is in the middle, for a 10x10km DEM you will need to move X by 5km and Y by 5km, by changing the location values in the right-hand properties panel. That should ensure your bottom-left corner is sitting nicely at 0,0,0.

    Setting the location

  7. Our Plane currently only has 1 face – meaning we are not going to be able to give it much depth. So now we need to subdivide the Plane to give it more faces – think of this a bit like the resolution of a raster: the more faces, the more detailed the model will be (at the cost of file size!). Enter Edit Mode (by pressing Tab). You will see the menu change in the Left Panel – and it will give you a set of Mesh Tools.
  8. Click the Subdivide button – you can choose how much you want to subdivide, but I usually make it around the same resolution as my DEM. So for a 10km square with 10m resolution we will want a subdivided plane with approx 1,000,000 faces. In Blender terms the closest we can get is 1,048,576 faces. This is a BIG mesh – so I would suggest that you do one at high resolution like this, and then also have a lower-resolution one to use as the terrain [see the terrain how-to – when written!].

    Subdividing the Plane

  9. We now want to finally give the Plane some Z dimension. This is done using the Displace Modifier. First come out of Edit Mode by pressing Tab. Now apply a material to the Plane, by pressing the Material button on the far-right panel and hitting the [+New] button.

    The Material and Texture Buttons

  10. Now add a texture to the new material by hitting the Texture button and again hitting the [New+] button. Scroll down the options and change the Type to ‘Image or Movie’. Scroll down further and change the Mapping coordinates from Generated to UV. Now click the Open icon on the panel and browse to the 16-bit TIFF you made earlier. The image will be blank in the preview – but don’t worry, blender can still read it.

    Applying the Image Texture

  11. Once you have applied the texture – click the Object Modifiers button and choose the Displace Modifier from the Add Modifiers dropdown.

    Object Modifiers Button

  12. When you have the Displace Modifier options up, choose the texture you made by clicking the little cross-hatched box in the Texture section and choosing ‘Texture’ from the dropdown. First change the Midlevel value to ‘0m’. Depending on your DEM size you may start seeing some changes in your Plane already. However, you will probably need to do some experimentation with the strength (the amount of displacement). For my DEM the strength I needed was 65000.203. This is a bit of a weird number, but you can check the dimensions of the plane as you change the strength (see screenshot): you want the Z value to be as close as possible to 255m (this basically means you will get the full range of the elevation values, as the 16-bit TIFF has 255 colour values. These should map to real-world heights on import into Unity. You may want to do some checking of this later when in Unity).

    Changing the Strength

  13. Hopefully by this stage your landscape should have appeared on your Plane and you can spin and zoom it around as much as you like…
  14. At this stage you are going to want to save your file! Unity can take a .blend file natively, but let’s export it as an FBX – so we can insert it into Unity (or any 3D modelling program of your choice). Go to File->Export->Autodesk FBX and save it somewhere convenient.
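For anyone who prefers scripting to clicking, here is a rough equivalent of the steps above using Blender’s Python API (bpy), run from Blender’s own Python console or with blender --python. Treat it as a sketch rather than a drop-in script: the file paths are placeholders, and operator and property names have shifted between Blender versions, so it will likely need tweaking for whichever version you are running.

import bpy

DEM_PATH = "/path/to/leskernick_DTM_16.tif"   # the 16-bit TIFF from earlier
DEM_SIZE = 10000.0    # 10km x 10km study area, in metres
STRENGTH = 65000.203  # displacement strength found by experimentation (step 12)

# Steps 1-2: clear the default objects and switch the scene units to metric.
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()
bpy.context.scene.unit_settings.system = 'METRIC'

# Steps 3-6: add a plane the size of the DEM, positioned so that its
# bottom-left corner sits at the local origin (0,0,0).
bpy.ops.mesh.primitive_plane_add(size=DEM_SIZE,
                                 location=(DEM_SIZE / 2, DEM_SIZE / 2, 0))
plane = bpy.context.active_object

# Steps 7-8: subdivide to roughly the raster resolution. 1023 cuts on a
# 10km plane gives ~10m faces (1024 x 1024 = 1,048,576 faces).
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=1023)
bpy.ops.object.mode_set(mode='OBJECT')

# Steps 9-10: load the DEM as an image texture.
img = bpy.data.images.load(DEM_PATH)
tex = bpy.data.textures.new("DEM", type='IMAGE')
tex.image = img

# Steps 11-12: displace the plane using the DEM texture.
mod = plane.modifiers.new("Displace", type='DISPLACE')
mod.texture = tex
mod.texture_coords = 'UV'
mod.mid_level = 0.0
mod.strength = STRENGTH

# Step 14: export as FBX for Unity.
bpy.ops.export_scene.fbx(filepath="/path/to/leskernick.fbx")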

Well done for getting this far! The final steps in this HowTo are simply inserting the FBX into Unity. This is very easy, but I will be presuming you have a bit of knowledge of Unity.

  1. Open Unity and start a new project. Import whichever packages you like, but I would suggest that you import at least the ones I have shown here – as they will be helpful in later HowTos.

    Creating a new Unity Project

  2. Now simply drag your newly created FBX into Unity. If you have a large mesh the import will probably take quite a long time – for large meshes (greater than 65,535 vertices) you will also need the latest version of Unity (>3.5.2), which will auto-split the large mesh into separate meshes for you. Otherwise you will have to pre-split it within blender.
  3. Drag the newly imported FBX into your Editor View and you will see it appear – again you can zoom and pan around, etc. Before it is in the right place, however, you will need to make sure it is the correct size and orientation. First change the scale of the import from 0.01 to 1 – by adjusting the Mesh Scale Factor. Don’t forget to scroll down a little bit and click the Apply button. After hitting Apply you will likely have to wait a bit for Unity to make the adjustments.

    The FBX in Unity

  4. Finally, once the object is in your hierarchy, you will need to rotate it by 180 on the Y axis (this is because Blender and Unity have different ideas about whether Z is up or forward).

    Set the Y rotation

  5. You should then have a 1:1 scale model of your DEM within Unity – the coordinates and heights should match your GIS coordinates (don’t forget to adjust for the false eastings and northings). In my case the centre of my DEM within real-world space is 217500, 80000. The adjustment for the false eastings and northings would be performed as follows:

actual_coord - false_coord = unity_coord
therefore 217500 - 212500 = 5000 and 80000 - 75000 = 5000
therefore the Unity Coordinates of the centre of the area = 5000,5000

To double-check it would be worth adding an empty GameObject at a prominent location in the landscape (say the top of a hill) and then checking that the Unity coordinates match the real-world coordinates after adjustment for the false values.

I hope that helps a few people. There are a couple of other tutorials on this topic using different 3D modelling software, so it is worth checking them out too here and here, and one for Blender here.

In the next HowTo I’ll be looking at the various ways of getting vector GIS data into Unity and adding different 3D models for different GIS layers – so stay tuned!

New Article in J. of Archaeological Method and Theory

I have just had an article published in the J. of Archaeological Method and Theory, which explains a bit more about my approach to using Augmented Reality within archaeology and how it might aid in a phenomenological approach to the landscape. The article is a result of a conference I attended last year, ‘In Search of Middle Ground’ (organised by Dot Graves and Kirsty Millican), and forms part of a special issue that will be coming out in paper print a bit later in the year. There are some really interesting papers in the issue (some of which are already available from the journal’s Online First section), all of which deal with the tricky area that lies between computer-based analysis of the landscape and actually getting out into the field and walking around.

It is good to get some work out there and hopefully start some debates regarding the validity of the approach, although as I read it back now I can see how far my thinking has come already and a few things that need some further development.

The article is available on the journal’s site (for people who have a subscription or institutional access) and also from UCL’s Open Access site (just a pre-print with no fancy formatting). Let me know in the comments below if you have any questions!

Future of Conference Posters?

Last month I entered a poster into the UCL Graduate School Poster Competition and was lucky enough to win first prize. I find conference posters a bit of a strange animal. The poster session always seems to take place over lunchtime or the coffee break, and more often than not the person who made the poster isn’t around to talk you through it. You are then usually left with a poster that has masses of text, with either too much detail or not enough, and the whole thing can quickly get boring.

I wanted to challenge this a little bit, and as my poster subject was my work with AR, I was provided with the perfect opportunity. The poster was a pretty simple (but hopefully striking) design: a pair of old-school binoculars looking at some rocks on Leskernick Hill, Bodmin Moor. The area within the binoculars shows some roundhouses – giving the impression that looking through the binoculars reveals the ancient landscape.

Graduate School Poster

My winning poster

I tried to keep the text to a bare minimum so that the poster was dominated by the binoculars. However, this being an AR project, there was a bit of a twist. Using the Junaio API I augmented the poster with a video that overlaid the whole thing when viewed through a smartphone or tablet. The video showed the binoculars moving around the poster, revealing more of the roundhouses.

I am increasingly finding that the best way to explain AR is to give someone an example of it. It was a bit of a gamble, as in order to see the AR content the viewer needed to have a smartphone, have an app to scan the QR code on the poster and have good enough internet access to install and run the Junaio app. The main judge of the competition wasn’t at the prize-giving, so I didn’t get any feedback or a chance to ask if they had seen the AR content, but they awarded it first prize so I hope they did!

I am of course not the first person to use AR in a poster, but I am sure that it will become a lot more popular, as it really is an excellent way of adding content to a poster without being too intrusive. I guess at the moment it could be seen as being a little gimmicky, however this isn’t all that bad when trying to attract people to your poster and your research. One of the important things to remember, though, is that the poster needs to be able to stand on its own without the AR content, as it is quite an ask at the moment to get people to download an app on their phone just to learn more about your research.

The process of adding the content via the Junaio app also wasn’t quite as easy as I had hoped, mainly because the video itself had to be made into a 3D object, be of very low quality, and be in a special .3g2 format to enable it to be delivered quickly to a mobile device. You immediately lose your audience if they have to wait two minutes for your content to download, and the .3g2 format was specifically designed to look OK on a smartphone screen and be small enough to download quickly. However, as you can see from the video above, the quality is pretty poor. I created the animation using 3D Studio Max, and then rendered it out to a number of TIFFs. I then used ffmpeg to render the TIFFs to a video and encode it into the .3g2 format. The Junaio developer website has instructions for how to do all of this, but it is not really for the faint of heart. Junaio provides a number of sample PHP scripts that can be run on your own server to deliver the content, and their troubleshooting process is really excellent. So if you have your own webserver and are happy with tweaking some scripts then you can do some really quite nice stuff. I should note that they also have a simple upload interface for creating simple AR ‘channels’, which is a great way of quickly getting things up there – but doesn’t allow you to have total control or upload videos. But if you want to pop a simple 3D model on your conference poster, then the Junaio Channel Creator is the app for you! The other thing to remember if you want to augment your own conference poster is that the channels can take up to a week to be approved by Junaio, so you can’t leave it all to the last minute!

I suspect we will be seeing many more AR-enabled conference posters, particularly as AR booklets, magazines and museum guides become more popular. One can envisage holographic-type projections of people standing beside their posters talking the viewers through them, or interactive posters where the content changes depending on what and where you touch. As I keep coming back to on this blog, it is the melding of the paper with the digital that I find so fascinating about AR: the ability to re-purpose old ideas (such as the conference poster) and breathe new life into the concept – but without losing the original purpose and feel of the thing itself. The design of the paper poster stands on its own (for better or worse!) and the AR content just gives the creator the chance to provide further information and give the viewer that extra dimension into their research.

AR and Archaeology: Opportunities, Challenges and the Trench of Disillusionment

I have just come back from giving a guest seminar to the Archaeological Computing Research Group at the University of Southampton and thought I would put up a post with the gist of it. It was really an introduction to Augmented Reality in Archaeology, but was also inspired by the recent article in Wired. In his article Clark Dever explains that AR is currently languishing in the Trough of Disillusionment.

The (Archaeological) Hype Cycle

What this means is that, according to the Gartner Hype Cycle, AR as a technology has already reached its peak of marketing, expectation and excitement, and hasn’t really delivered much. Instead of providing the world with a technology to allow the seamless integration of the real and the virtual, we are left with a few applications that provide a way to overlay virtual information onto a video screen, which are mostly used to direct us to the nearest Starbucks.

I am afraid that I have to agree with Clark Dever, and I feel the same about AR. I follow a large number of AR blogs and tweeters, and all anyone seems to report on is new apps that basically overlay info onto a screen with no relationship to the real world. A good example is Falcon Gunner, a Star Wars-based app which places you in the seat of a gunner on the Millennium Falcon. Whilst it is a really fun game [who doesn’t like shooting down TIE fighters!?] the ‘AR mode’ has absolutely no connection to the real world and basically overlays the game with a transparent background so that it looks like TIE fighters are flying over your sofa. While this is kind of interesting for about 5 minutes, what I really want is for the TIEs to interact with the real world – I want them to hide behind the sofa and fly out at me – or fly into a cupboard, hide and wait until I’m not looking and then attack me. I want to feel like I am part of the Star Wars galaxy and it is part of my front room.

Star Wars Arcade: Falcon Gunner (http://jhaepfenning.wordpress.com/2011/06/30/toilets-are-obsolete-a-falcon-gunner-review/)

Heritage applications are bread and butter for AR: one of the first things that comes to mind when talking about AR is how cool it would be to see what the world used to look like. Indeed, archaeological AR apps are actually some of the better apps that are trying to meld the virtual with the real. For instance, the Museum of London’s Streetmuseum app does a good job of pulling in virtual content (in their case pictures/paintings) and overlaying it in its ‘real’ place in the world.

MoL Streetmuseum (image from: http://www.bullseyehub.com/blog/2011/01/top-6-mobile-apps-for-culture-events/)

But, again, this app just overlays the image in (roughly) the right place – there is no way to enter into the image or interact with it, or have people walking around it, through it, behind it. Instead it is really the equivalent of using your GPS to query a database and get back a picture of where you are. Or indeed going to the local postcard kiosk, buying an old paper postcard of, say, St. Paul’s Cathedral and then holding it up as you walk around the cathedral grounds.

In my opinion, AR will continue to languish in the Trench of Disillusionment until we can address the following issues:

  1. The technology needs to be used intelligently. Adding an ‘AR view’ to an app that simply overlays the app on your video feed is not enough. In addition, simply putting GPS locations into a ‘3D’ space and giving them an icon is equally flawed – especially when those locations are far away and should be obscured [occluded] by the buildings in the way. It is much easier to navigate to these things using a map (it saves you trying to walk through buildings) – and I am not entirely sure how much the AR mode adds to it. We need to think of ways that AR is going to add information or provide a new type of information, not just be a different (and less useful) way of displaying the same old information.

    Layar's 'AR View' - note the points that are on different streets (some kilometres away), and should be occluded by the buildings.

  2. The AR algorithms need to recognise the real world. Sorry to keep banging on about this, but if the AR content is not respecting the real world (i.e. being occluded by it, wrapping round it or interacting with it in some way) then you lose the point and the feel of the augmentation. We should be using the real world as a template for the AR experience, taking as much of it as possible and then gently melding the virtual world with it – not harshly slapping virtual content on top and simply making it move with the motion of the accelerometers. Advances are currently being made toward this, via the use of depth cameras (such as the Kinect) and also computer-vision-based algorithms (such as SLAM and SfM). Metaio, the developer of the popular Junaio AR app, are clearly making big leaps in this area, as this video shows. We are a little way off this being commercially available, but it shows that the big companies are finding ways to make the meld more seamless.
  3. AR needs to be seamless (and cheap!). The current normal delivery of AR requires either a head-mounted display (HMD) or a smartphone/tablet. Whilst an AR experience will always need some kind of mediation in order to provide the experience, these devices need to be less bulky and also cheaper in order for them to become accessible to a normal person. In archaeology, the majority of AR apps are likely to involve tourism, or visits to archaeological/historic sites or museums, and therefore the delivery technology needs to be cheap and robust, and ubiquitous enough to enable the AR content to be experienced. Perhaps the fabled real-life Google glasses that have been promised by the end of the year will go some way to making this happen.
  4. We need to wrest the technology away from advertisers. Up until now, a lot of AR content has just been a way for marketeers to sell us stuff. That’s fine, and it’s the way of the world. In fact it obviously drives a lot of the technological advances, because after all who is paying for all this stuff? But we need to be careful that we are also doing good research with AR that does not just have the aim of making the killer app to sell loads of stuff. As archaeologists we are in a unique position where we can advance knowledge and use AR to show people our research in-situ, or use it as an aid to field practice, rather than just to present our results. As our discipline moves towards attempting to gain a more embodied experience of the past, AR is the perfect technology to aid in that embodiment and to let us experience the visions/sounds/smells of past events in the places that they happened. It can be used to help us think about the past as we are excavating it, and may even aid in/change our interpretations as we go along. We don’t have to be led by the nose by the technology; instead we need to bend it to our will and make use of it intelligently for our discipline. Otherwise we are simply going to end up with Matsuda’s dystopic vision of AR Advertising Hell.

While in danger of pushing the metaphor of the Archaeological Hype Cycle to breaking point, let me sum up:

AR is like one of those archaeological excavations where you are promised the world and then when you break ground it doesn’t quite deliver. You see the amazing Barrow of Inflated Expectation that promises archaeological finds and fame beyond your wildest dreams; you engage the press, start a website, hit every social media site possible and get everyone (including your funders and institution) excited beyond belief. Then you cut a slot through the barrow and realise that it isn’t filled with the grave goods of a lost Bronze Age King – instead there is very little in the Trench at all. The press get bored, your website hit-rate plummets, the previously frequent on-site blogging reduces to once a month and your institution starts worrying about your REF submission. You languish in your trench, wondering how you can rescue the project. But then you remember you have taken a whole load of environmental samples, the few scraps of wood you recovered are good enough for dendro-analysis, you analyse the complex stratigraphy very carefully and realise it is a unique sequence… Two or three years of careful post-excavation analysis by just a few team members follows, the hard graft of making the project really work begins to come to fruition, and you are left with a mature project that has real results and is pushing the field of archaeology forward. That is where we are with AR now. We need to get our heads down and do that hard graft, start thinking about what we can take from the hype of AR and build it into something that works, helps us during our field practice and dissemination, and hopefully pushes archaeological knowledge forward, rather than just being more eye-candy.

Please leave some comments if you can think of, or have examples of, applications for AR in archaeology or heritage studies that could get us out of the Trench – it would be great to get a discussion going. I have uploaded an HTML version of my Southampton seminar here. Please note it was exported from Keynote, and therefore the embedded movies only seem to work when viewed in Safari.