Posts Tagged ‘iOS’

The ARtefactKit – Heritage Jam 2017 Winner

Somehow the Heritage Jam run by the University of York has come around again and gone. As I outlined in this post, the Heritage Jam is an opportunity for people to get together and create new heritage visualisations relating to a specific theme. This year’s theme was ‘Bones of Our Past’ – as I couldn’t be there in person, I decided to put something together for the online competition.

It turns out my entry won first place! I built something that I have wanted to experiment with for quite a while – an Augmented Reality application that allows you to take a real artefact (in this case a bone) and compare it to a virtual reference collection. Using your phone, you can augment a ‘virtual lab’ onto your kitchen table and then use the app to call up a number of different bones from different animals until you find one that matches.

The AR aspect adds something beyond the ‘normal’ online virtual reference collection, by allowing you to augment the models at the correct scale in front of you and then twist and turn each one side by side.

In addition, as I am interested in multi-sensory approaches, I also added in the sounds and smells of the animals – as well as a virtual portal into a 360-degree video of a deer herd in action.

Finally, it has a link through to a set of Open Data from Open Context showing where else in the world similar types of bones have been found.


Please check out the full visualisation and explanation here: http://www.heritagejam.org/new-blog/2017/10/27/the-artefactkit-stu-eve

As with all of these ‘jam’ projects, the app is just a prototype and is quite messy in terms of overall look and feel – but I think it has potential to be quite useful. Now I just need some funding!

ARKit and Archaeology – Hougoumont Farm, Waterloo

For the last 3 years I have had the absolute privilege of being one of the archaeological directors of the current excavations of the battlefield of Waterloo. As part of the incredible project Waterloo Uncovered (http://www.waterloouncovered.com), we have been taking wounded serving and veteran soldiers, students and professional archaeologists to the battlefield to conduct the first major systematic excavation of parts of the battlefield that shaped the history of Europe in 1815.

We only have two weeks in the field each year, which means there is not a lot of time to do anything but excavate, record and backfill (see the dig diaries and videos of all we got up to here). However, this year I managed to find the final afternoon to play with the new Apple ARKit and see what potential there is for archaeological sites.

The short answer is that there is a lot of potential! I have discussed Augmented Reality and archaeology to the nth degree on this blog and in other places (see here for a round-up) – but with the beta release of ARKit as an integrated part of iOS 11, Apple may have provided the key to making AR more accessible and easier to deploy. I tried out two experiments using some of the data we have accrued over the excavations. Sadly I didn’t have any time to finesse the apps – but hopefully they should give a hint of what could be done given more time and money (ahem, any prospective post-doc funders – my contact details are on the right).

Exploring the lost gardens of Hougoumont

The first video shows a very early experiment in visualising the lost gardens of Hougoumont. The farm and gardens at Hougoumont were famously defended by the Allied forces during the battle of Waterloo (18th June 1815). Hougoumont at the time was rather fancy, with a chateau building, large farm buildings and also a formal walled garden, laid out in the Flemish style. One of the participants this year, WO2 Rachel Willis, is currently in the process of leaving the army and studying horticulture at the Royal Horticultural Society. She was very excited to look at the garden and to see if it was possible to recreate the layout – and perhaps even at some point start replanting the garden. To that end she launched herself into the written accounts and contemporary drawings of Hougoumont, and we visited a local garden that was set out in a similar fashion. Rachel is in the process of colouring and drawing a series of Charlie Dimmock-style renditions of the garden plans for us to work from – but more on that in the future.

Similar gardens at Gaasbeek Castle

Extract from Wm. Siborne’s survey of the gardens at Hougoumont

As a very first stab at seeing what we might be able to do in the future, I quickly loaded one of Rachel’s first sketches into Unity and put in a few bushes and a covered walkway. I then did some ARKit magic, mainly by following tutorials here, here, and here. Bear in mind that at the time of writing, ARKit is in beta testing, which means you need to install the Xcode beta, sign up for and install the iOS 11 beta on the iPhone, and also run the latest beta version of Unity. It is firmly at the bleeding edge and not for the faint-hearted! However, those tutorial links should get you through fine, and we should only have to wait a few months until it is publicly released. The results of the garden experiment are below:
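Under the hood, the ‘tap to place a bush’ step comes down to a hit test: ARKit detects a horizontal plane and intersects a ray from the camera with it to find where the anchor should go. Here is a minimal sketch of that maths in plain Python – the function name and the pure-Python framing are mine for illustration, not ARKit’s actual API:

```python
# Sketch of the ray/plane maths behind tap-to-place in ARKit-style AR.
# ARKit detects a horizontal plane (here y = plane_y); a tap becomes a
# ray from the camera, and the anchor goes where the ray meets the plane.

def place_on_plane(cam_pos, ray_dir, plane_y):
    """Intersect a camera ray with the horizontal plane y = plane_y.

    cam_pos, ray_dir: (x, y, z) tuples; returns the hit point, or None
    if the ray never reaches the plane.
    """
    cx, cy, cz = cam_pos
    dx, dy, dz = ray_dir
    if abs(dy) < 1e-9:          # ray parallel to the plane: no hit
        return None
    t = (plane_y - cy) / dy     # distance along the ray to the plane
    if t <= 0:                  # plane is behind the camera
        return None
    return (cx + t * dx, plane_y, cz + t * dz)

# Camera 1.5 m above a detected floor plane, looking down and forward:
anchor = place_on_plane((0.0, 1.5, 0.0), (0.0, -1.0, 1.0), 0.0)
print(anchor)  # → (0.0, 0.0, 1.5)
```

In Unity the equivalent is a single hit-test call against the detected plane, but the geometry is the same.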

As can be seen, ARKit makes it very simple to place objects directly into the landscape OUTSIDE – something that has previously only really been possible reliably using a marker-based AR plugin (such as Vuforia). Being able to reliably place AR objects outside (in bright sunshine) has been something of a holy grail for archaeologists, as unsurprisingly we often work outside. I decided to use a ‘portal’ approach to display the AR content, as I think for the time being it gives the impression of looking through into the past – and gives an understandable frame to the AR content. More practically, it also means it is harder to see the fudged edges where the AR content doesn’t quite line up with the real world! It needs a lot of work to tidy up and make prettier, but it is not bad for a first attempt – and the potential of this system for archaeological reconstructions goes without saying! Of course, as it is native in iOS and there is a Unity plugin, it will fit nicely with the smell and sound aspects of the embodied GIS – see the garden, hear the bees and smell the flowers!
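The portal trick is usually done with a stencil mask: one pass rasterises the portal quad into the stencil buffer, and the past-world scene is then drawn only where the stencil is set, so the live camera feed shows through everywhere else (and the ragged edges of the reconstruction stay hidden). A toy 2D ‘framebuffer’ sketch of that idea – not Unity shader code, just the logic:

```python
# Toy illustration of the stencil trick behind an AR 'portal': the past
# world is drawn only through pixels covered by the portal quad, which
# conveniently hides the edges where virtual content stops lining up.

W, H = 8, 4

def render_with_portal(portal_cols, past_row):
    # Pass 1: rasterise the portal quad into a stencil mask.
    stencil = [[c in portal_cols for c in range(W)] for _ in range(H)]
    # Pass 2: draw the past-world scene ('G' for garden) only where the
    # stencil is set; everywhere else the camera feed ('.') shows through.
    frame = []
    for r in range(H):
        row = ''.join(past_row[c] if stencil[r][c] else '.' for c in range(W))
        frame.append(row)
    return frame

# The portal quad covers columns 2..5 of the view:
for line in render_with_portal(range(2, 6), "GGGGGGGG"):
    print(line)  # each row → ..GGGG..
```

Stepping ‘through’ the portal then just means flipping the stencil test once the camera crosses the portal plane.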

Visualising Old Excavation Trenches

Another problem we archaeologists have is that it is very dangerous to leave big holes open all over the place, especially in places frequented by tourists and the public like Hougoumont. However, ARKit might be able to help us out there. This video shows this year’s backfilled trenches at Hougoumont (very neatly done, but you can still just see the slightly darker patches of the re-laid wood chip).

Using the same idea of the portal into the garden, I have overlaid the 3D model of one of our previous trenches in its correct geographic location and scale, allowing you to virtually re-excavate the trench and see the foundations of the buildings underneath, along with a culverted drain that we found in 2016. It lines up very well with the rest of the buildings in the courtyard and will certainly help with understanding the further foundation remains we uncovered in 2017. Again, it needs texturing, cleaning and a bit of lighting, but this has massive potential as a tool for archaeologists in the field, as we can now overlay any type of geolocated information onto the real world. This might be geophysical data, find scatter plots or, as I have shown, 3D models of the trenches themselves.
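Putting a geolocated model ‘in its correct geographic location’ means converting the lat/lon offset between the device and the recorded trench into local east/north metres for the AR scene. Over a courtyard-sized site a simple equirectangular approximation is plenty accurate – a sketch below, with made-up coordinates for illustration (not the real Hougoumont survey data):

```python
# Sketch: convert a lat/lon offset into local east/north metres so a
# geolocated model (e.g. a recorded trench) can be placed in the AR scene.
# Equirectangular approximation - fine over tens or hundreds of metres.
import math

EARTH_RADIUS = 6_371_000.0  # metres

def geo_to_local(origin_lat, origin_lon, lat, lon):
    """Return (east, north) metres of (lat, lon) from the origin point."""
    lat0 = math.radians(origin_lat)
    east = math.radians(lon - origin_lon) * EARTH_RADIUS * math.cos(lat0)
    north = math.radians(lat - origin_lat) * EARTH_RADIUS
    return east, north

# Device at one corner of the courtyard, trench centre a short walk NE
# (coordinates invented for the example):
east, north = geo_to_local(50.6796, 4.4563, 50.67978, 4.45655)
print(round(east, 1), round(north, 1))  # roughly 17.6 m east, 20.0 m north
```

The resulting (east, north) pair maps straight onto the AR scene’s ground-plane axes, with the model’s recorded elevation supplying the vertical offset.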

These are just very initial experiments, but I for one am looking forward to seeing where this all goes. Watch this space!


Augmenting a Roman Fort

The following video shows something that I have been working on as a prototype for a larger landscape AR project.

As you can see, by using the Qualcomm AR SDK and Unity3D it is possible to augment some quite complex virtual objects and information onto the model Roman fort. I really like this application: all I have done is take a book that you can buy at any heritage site (in the UK at least) and simply change the baseboard design so that the extra content can be experienced. Obviously there was quite a lot of coding and 3D modelling behind the scenes, but from a user’s point of view the AR content is very easy to see – simply print out the new baseboard, stick it on and load up the app.

For me that is one of the beautiful things about AR: you still have the real world, you still have the real fort that you have made, and you can play with it whether or not you have an iPad or Android tablet or what-have-you. All the AR does is augment that experience and allow you to play around with virtual soldiers or peasants or horses instead of static model ones. It also opens up all sorts of possibilities for adding explanations of building types, a view into the day-to-day activities in a fort, or even for telling stories and acting out historical scenarios.

The relative ease of deploying the system (now that I have the code for the app figured out!) means this type of approach could be rolled out in all sorts of different situations. Some of my favourite things in museums, for instance, are the old-school dioramas and scale models. The skill and craftsmanship of the original model would remain, but it could be augmented by the app – and made to come alive.

Housesteads Diorama

The model of Housesteads fort in the Housesteads museum

The same is true of modern-day prototyping models or architectural models. As humans we are used to looking at models of things, and we want to be able to touch them and move them around. Manipulating them on a computer screen somehow doesn’t seem quite right. But the ability to combine virtual data with the manipulation and movement of the real-life model gives us a unique and enhanced viewpoint, and can also allow us to visualise new buildings, or existing buildings in new ways.

A particularly important consideration when creating AR content is to ensure that it looks as believable or ‘real’ as possible. The human eye is very good at noticing things that seem out of the ordinary or ‘don’t feel quite right’. One of the main ways to create a believable AR experience is to ensure the real world occludes the virtual objects – that is, the virtual content can be seen to move behind real-world objects (such as the soldiers walking through the model gateway). It should also be possible to interact with the real-world objects and have that affect the virtual content (such as touching one of the buildings and making the labels appear). This will become particularly important as I move from the scale model to rolling the system out into a landscape. As I augment the real world with virtual objects, those objects have to interact with the real world as if they are part of it – otherwise too many Breaks in Presence will occur and the value of the AR content is diminished. An accurate 3D model of the real world is quite a bit harder to create than one of a paper fort, but if I can pull it off, the results promise to be quite a bit more impressive…
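The occlusion effect described above is usually achieved by giving the renderer an invisible proxy of the real model that writes depth only: a virtual pixel is drawn just when it is closer to the camera than the proxy surface, so the real gateway hides the soldier walking behind it. A toy 1D ‘scanline’ version of that depth test, purely for illustration:

```python
# Toy depth-buffer illustration of real-world occlusion in AR: an
# invisible proxy of the real model supplies per-pixel depth, and a
# virtual pixel is drawn only if it is nearer the camera than the proxy.

def composite(camera_feed, proxy_depth, virtual_pixels):
    """virtual_pixels: list of (x, depth, colour). Smaller depth = nearer."""
    out = list(camera_feed)
    for x, depth, colour in virtual_pixels:
        if depth < proxy_depth[x]:   # virtual content is in front: draw it
            out[x] = colour          # otherwise the real object hides it
    return ''.join(out)

feed  = "........"                    # live camera image
proxy = [9, 9, 2, 2, 9, 9, 9, 9]      # real gateway at x = 2..3, depth 2
# A virtual soldier spanning x = 1..4 at depth 5 walks 'behind' the gateway:
soldier = [(x, 5, 'S') for x in range(1, 5)]
print(composite(feed, proxy, soldier))  # → .S..S...
```

The soldier is visible either side of the gateway but hidden where the proxy geometry is nearer the camera – exactly the cue that keeps the AR believable.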


ARK and Augmented Reality

Recently I have been working in the Unity game engine, using it to make some Augmented Reality applications for the iPhone and iPad. It is surprisingly successful, and with at least 3 different ways of overlaying 3D content on the iOS video feed (Qualcomm, StringAR and UART) the workflow is more open than ever. I have been attempting to load 3D content at runtime, so that dynamic situations can be created as a result of user interaction – rather than having to have all of the resources (3D models, etc.) pre-loaded into the app. This not only reduces the app’s file size, it also means that the app can pull in real-time information and data that can be changed by many people at once. However, in order to do that I needed some kind of back-end database…
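The runtime-loading idea boils down to asking the back-end which models a given context needs and fetching only the ones not already cached on the device. A sketch of that step, using an imagined JSON manifest – the field names, URLs and schema here are hypothetical, invented for illustration:

```python
# Sketch of runtime content loading: rather than bundling every model in
# the app, parse a manifest from the back-end and download only what is
# missing. The manifest schema below is invented for the example.
import json

manifest_json = """
{
  "context": "trench_2016_01",
  "models": [
    {"name": "culvert_drain", "url": "https://example.org/models/culvert.obj",
     "scale": 1.0},
    {"name": "wall_foundation", "url": "https://example.org/models/wall.obj",
     "scale": 0.5}
  ]
}
"""

def models_to_load(manifest_text, already_cached):
    """Return (name, url, scale) for models not yet cached on the device."""
    manifest = json.loads(manifest_text)
    return [(m["name"], m["url"], m["scale"])
            for m in manifest["models"]
            if m["name"] not in already_cached]

todo = models_to_load(manifest_json, already_cached={"culvert_drain"})
print(todo)  # only wall_foundation still needs downloading
```

Because the manifest lives on the server, many people can change the content at once and every running app picks up the changes on its next request.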

Those of you who know me will know that, as well as doing my PhD, I work on the development of the open-source archaeological database system known as the Archaeological Recording Kit (ARK). It seemed like a logical step to combine these two projects and use ARK as the back-end database. So that is what I did, creating at the same time a rudimentary AR interface to ARK. The preliminary results can be seen in the video below:

This example uses the Qualcomm AR API and ARK v1.0. Obviously at the moment it is marker-based AR (or at least image-recognition-based); the next task is to incorporate the iDevice’s gyroscope so that the AR experience can continue even when the QR code is not visible.
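The gyroscope fallback amounts to dead reckoning: while the marker is visible, the pose comes from image tracking; when it drops out, integrating the gyro’s angular velocity keeps the orientation estimate going, and the pose snaps back to the marker when it reappears. A one-axis (yaw) toy sketch of that loop, with invented sample values:

```python
# Sketch of marker + gyro fusion: trust the image-tracked pose when the
# QR marker is visible, dead-reckon from gyro angular velocity when it
# is not, and correct the accumulated drift on reacquisition.

def track_yaw(samples, dt=0.1):
    """samples: list of (marker_yaw_or_None, gyro_rate_deg_per_s)."""
    yaw = 0.0
    history = []
    for marker_yaw, gyro_rate in samples:
        if marker_yaw is not None:
            yaw = marker_yaw            # marker visible: trust the image
        else:
            yaw += gyro_rate * dt       # marker lost: integrate the gyro
        history.append(round(yaw, 2))
    return history

samples = [
    (10.0, 0.0),    # marker seen at 10 degrees
    (None, 30.0),   # marker lost; device turning at 30 deg/s
    (None, 30.0),
    (16.5, 0.0),    # marker reacquired: correct accumulated drift
]
print(track_yaw(samples))  # → [10.0, 13.0, 16.0, 16.5]
```

A real implementation would fuse all three axes (and accelerometer data) and smooth the correction step, but the visible/lost/reacquired logic is the heart of it.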