Embodied GIS HowTo: Part 1a – Creating RTIs Using Blender (an aside)
This is a bit of an aside in the HowTo series, but nevertheless it should be a useful little tutorial, and as I was given a lot of help during the process it is only right to give something back to the community! So this HowTo shows you how to take the 3D model you created in Part 1 and create a Reflectance Transformation Imaging (RTI) image from it. Now if you don’t know what that is, here is the definition from the biggest advocates of the technique for archaeology, Cultural Heritage Imaging (CHI):
RTI is a computational photographic method that captures a subject’s surface shape and color and enables the interactive re-lighting of the subject from any direction.
In GIS terms, what this basically means is that you have a fully interactive hillshade to play with and can change the angle of the light on-the-fly. No more need to create hundreds of hillshades with the sun at different angles – this is an all-in-one approach and is far more interactive. It is a really awesome technique for analysing rock-art, artefacts and even documents, and can be used to reveal tiny details that might not be obvious just by examining the object normally. It has also been used by Tom Goskar and Paul Cripps to interactively re-light some LiDAR data that Wessex Archaeology have of Stonehenge (see their paper here). RTI images are created by surrounding the subject with a dome of lights that are turned on one by one, with a photograph taken each time. Every photograph needs a shiny ball (usually a snooker ball) in it – this enables the software to work out the angle of each light, and some complex maths is then used to merge all of the images together (for a fuller and probably more accurate explanation see Mudge et al 2006).
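To make the hillshade comparison concrete, here is a minimal sketch (plain Python, no GIS libraries; the gradient values and sun angles are invented for illustration) of the standard hillshade formula that a GIS has to recompute for every new sun position – exactly the repeated work an RTI lets you skip:

```python
import math

def hillshade(dzdx, dzdy, azimuth_deg, altitude_deg):
    """Illumination (0-1) of a surface cell with elevation gradients
    dzdx, dzdy, lit from the given sun azimuth and altitude
    (the standard GIS hillshade formula)."""
    zenith = math.radians(90.0 - altitude_deg)
    azimuth = math.radians(360.0 - azimuth_deg + 90.0)  # compass -> maths convention
    slope = math.atan(math.hypot(dzdx, dzdy))
    aspect = math.atan2(dzdy, -dzdx)
    shade = (math.cos(zenith) * math.cos(slope) +
             math.sin(zenith) * math.sin(slope) * math.cos(azimuth - aspect))
    return max(shade, 0.0)

# A flat cell is lit purely by the sun's altitude:
print(hillshade(0.0, 0.0, 315, 45))  # ≈ 0.707
# A tilted cell changes as the azimuth moves - in a GIS you would
# re-render for every new angle; an RTI relights interactively.
print(hillshade(0.3, 0.1, 315, 45))
```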
This technique can also be used virtually (as Tom and Paul have done) by recreating the dome of lights in a 3D modeling package and shining them on a virtual object (often a laser scan) or a chunk of LiDAR data. I am going to show you exactly the same technique that Tom and Paul used, except where they used Vue I’m going to be using blender to create the virtual dome. I have also supplied the .blend file and the python script used – so you should be able to do it all yourself.
Right first things first, open up blender and load the .blend file that you saved from Part 1 – if you haven’t got one then you’ll need a 3D model of some description within blender – the concepts will work the same on any 3D model, but I am presuming for this tutorial that you have a chunk of a Digital Elevation Model.
- Luckily for us blender has a method of easily creating a dome of lights – this is because during the 3D modelling process a light dome is often used to create a warmer, more realistic feel for ambient lighting (see Radiosity), so we can use this to our advantage. Press Shift+A to create a new mesh and create an ‘Icosphere’. To get enough lights we’ll need to subdivide the icosphere a little further, so change the Subdivisions setting (in the left-hand panel) to 3.
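If you are wondering how many lights this gives you: each icosphere subdivision level quadruples the face count, and the vertex count follows V(n) = 10·4^(n−1) + 2. A quick sanity check (plain Python, just the formula – not the Blender API):

```python
def icosphere_vertices(subdivisions):
    """Vertex count of a Blender icosphere at a given subdivision level.
    Level 1 is the base icosahedron (12 vertices); each extra level
    quadruples the faces, giving V(n) = 10 * 4**(n - 1) + 2."""
    return 10 * 4 ** (subdivisions - 1) + 2

for n in range(1, 5):
    print(n, icosphere_vertices(n))  # 1→12, 2→42, 3→162, 4→642
```

At 3 subdivisions the sphere has 162 vertices, so once the bottom is sliced off you are left with roughly half of them as lamp positions – which is why around 90 Suns appear later in the tutorial.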
- For the purposes of this tutorial I am presuming that you have a chunk of 10kmx10km DEM – therefore in order to light it properly we need to create a dome that will cover the whole thing. Change the dimensions of your dome to be X:15km, Y:15km and Z:10km – you can change this to be as spherical as you want – these settings worked for me. You will also want to move it to the middle of your DEM – so change the location to be X:5km, Y:5km and Z:0.
- Now that we have a sphere (albeit a squashed one), we want to cut off the bottom of it to give us our dome. To do this enter Edit Mode (by pressing TAB). Now change your view so that you are viewing from the front – either press 1 on your numeric keypad or use the menu View->Front. Press the ‘A’ key to clear your selection and then press the ‘B’ key to begin selecting by a border. Draw a box around the bottom of the icosphere and it should select those faces and vertices. Once selected, press the ‘X’ key and delete the vertices. Depending on the size of your sphere you may have to zoom forward a little to select and delete the faces on the far side of the icosphere. Keep doing this until you are left with a tidy dome sitting above your DEM.
- Once you have your dome we are ready to start adding lights to it. First off, if you have any other Lamps in the scene, delete those so we don’t get confused at a later stage. Once deleted, come out of Edit Mode (TAB) and use Shift+A to add a new Lamp. I use a Sun lamp [helps with my year-round tan... ahem.. sorry] – you could experiment with other types of lamps too, but the Sun seems to work well. Move the Sun to the centre of your DEM (X:5km, Y:5km, Z:0). Rotate the Sun so that its Y axis is at 180 degrees.
- In the little panel on the right you want to select the new Sun by clicking on it, then, holding down Shift, click the icosphere, so you should now have both selected (you can tell because their little icons light up) – now hover your mouse in the centre of the viewport, press Ctrl+P and parent the Sun to the icosphere. The Sun should now become a child of the icosphere in your objects panel (if you expand the icosphere in the panel you will see the Sun as part of its hierarchy).
- The Sun is now the son of the parent [sorry], so next we want to multiply them and place one on each vertex – blender has a great function for this (DupliVerts). Click on the Object properties of the icosphere, scroll down to Duplication, click Verts and tick the Rotation checkbox. You should see a whole host of Suns appear. They should be in the right place on each vertex, but if not you can move the Sun to the centre of the DEM (by expanding the icosphere in the hierarchy panel and then clicking on the Sun – see Step 4).
- As we are using Suns, the direction that they are pointing doesn’t really matter – however, if you are using other types of lamp – spots for instance – you will need to make sure they are pointing in the right direction. [NOTE: if you are using Suns, disregard this step. Otherwise: select the icosphere, enter Edit Mode (TAB) and then choose Normals: Flip Direction from the Mesh Tools panel on the left. That will ensure the lamps are pointing inside the dome. Go back into Object Mode (TAB).]
- Now we have a lovely dome of Suns, we need to detach them from the icosphere, so we can manipulate them individually. This is pretty easy – select the icosphere and then Press Ctrl+Shift+A and you should see the Suns all detach themselves into individual objects (you will see about 90 or so Suns in the hierarchy panel on the right). At this stage you are free to delete or turn off the icosphere as we won’t be needing it anymore.
- Next we need to set up our camera. Images for RTIs are normally taken by a camera set at the top of the dome, pointing directly downwards. Select your camera (there should be one by default in your scene – if not then you can add one using Shift+A). Change the camera’s location to be directly above the centre of your DEM at the apex of the dome (in my case X:5km, Y:5km, Z:10km). Blender cameras automatically point downwards, so there should be no need to add any rotation (if you have any rotation already set, change all the values to 0). Before we render out a test image, we’ll need to adjust our camera viewport and clipping range. Press 0 on the numeric keypad or use the menu View->Camera to take a look and see what the camera is seeing. You will likely just get a grey box – this is because the camera is clipping the distance it can see. Select the camera and go to the settings in the right panel – set the end Clipping range to 10km and you should see your DEM appear.
- Now you are going to want to adjust the Sensor size, to make sure your whole DEM is in the shot – for my 10km DEM the sensor had to be set to 70.
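If you would rather calculate the sensor size than find it by trial and error, the simple pinhole relationship coverage ≈ height × sensor width ÷ focal length gives a rough figure. A sketch of that arithmetic (plain Python; it assumes Blender’s default 35 mm focal length, so plug in your own scene’s values):

```python
def ground_coverage_km(height_km, sensor_mm, focal_mm):
    """Approximate ground footprint of a pinhole camera looking straight
    down: coverage grows with sensor size and shrinks with focal length."""
    return height_km * sensor_mm / focal_mm

# Camera 10 km above the DEM, 70 mm sensor, default 35 mm lens:
print(ground_coverage_km(10, 70, 35))  # → 20.0 km across, plenty for a 10 km DEM
```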
- Try a test render (press F12 or go via the menu Render->Render Image). You should be presented with a lovely render of your DEM, currently lit from all the angles.
- Press F11 to hide the render view – at this stage you might want to increase the energy setting on your Suns, to get a bit more light on the DEM. Our Suns are all still linked together, so you can change the energy setting by clicking on the top Sun in your hierarchy, clicking the Sun object properties (the little sun icon in the object properties panel) and changing the Energy as required (I recommend energy level 5). This should change the energy of all the Suns.
- Once you are happy with the energy levels we can render out a test sequence, by using a small python script that turns each sun on individually and renders out an image. Change the bottom panel to be a Text Editor panel (see image).
- Click the [New+] button in the Text Editor panel and cut and paste the following code into the window
```python
import bpy
sceneKey = bpy.data.scenes.keys()[0]
filepath = "PUT YOUR ABSOLUTE FILEPATH HERE"

# Loop over all objects and find the Lamps
print('Looping Lamps')
l = 0
# First run through all of the lamps, turning them off
for obj in bpy.data.objects:
    if obj.type == 'LAMP':
        obj.hide_render = True
        l = l + 1
print('You have hidden ' + str(l) + ' lamps')

# Now go through the lamps, turn each one on
# individually and render out a picture
for obj in bpy.data.objects:
    if obj.type == 'LAMP':
        print(obj.name)
        obj.hide_render = False
        bpy.data.scenes[sceneKey].render.image_settings.file_format = 'JPEG'
        bpy.data.scenes[sceneKey].render.filepath = filepath + '/lamp_' + str(obj.name)
        # Render the scene and write the image to disk
        bpy.ops.render.render(write_still=True)
        obj.hide_render = True
```
- Adjust line 3 – so that you have a filepath that fits your system. This is where it will save out the images – but beware if the folder doesn’t exist it will go ahead and create it – so make sure you type carefully. When you are ready to go – click the Run Script button and it should happily go away and render your images for you. If you have problems when running the code the errors should appear in the console. [NOTE FOR MAC USERS: If you are on a Mac to get the console requires you to start blender from a Terminal window. Save your .blend, close blender. Open Terminal.app then change directory to the blender application by running "cd /Applications/blender.app/Contents/MacOS/" (change the path to fit where you installed blender), then run "./blender". Any console messages will now appear in the Terminal window]
- This will give us a nice set of images (one for each Sun) that we can use later to create our RTIs.
- You may recall from the beginning of this HowTo that in order to create an RTI image we also need to use a shiny snooker ball. Luckily we can create one of these with blender as well. Use Shift+A to create a Metaball -> Ball. Make the Ball dimensions 1km x 1km x 1km and move it to the centre of your view (say X:5km, Y:5km, Z:2.5km).
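For the curious, this is why the shiny ball works: the RTI software locates the specular highlight in each image, converts its offset from the ball’s centre into a surface normal, and reflects the viewing direction about that normal to recover the light direction. A minimal sketch of that geometry (plain Python; it assumes an orthographic view straight down, which matches our virtual camera – real photographic setups are slightly more involved):

```python
import math

def light_direction(hx, hy, radius):
    """Recover the light direction from a specular highlight at offset
    (hx, hy) from the centre of a shiny ball of the given radius, viewed
    orthographically from straight above (view vector V = (0, 0, 1))."""
    nz = math.sqrt(1.0 - (hx * hx + hy * hy) / (radius * radius))
    n = (hx / radius, hy / radius, nz)   # surface normal at the highlight
    # Reflect V about n:  L = 2 (n . V) n - V
    d = n[2]                             # n . (0, 0, 1)
    return (2 * d * n[0], 2 * d * n[1], 2 * d * n[2] - 1.0)

# Highlight dead centre of the ball means the light is directly overhead:
print(light_direction(0.0, 0.0, 1.0))  # → (0.0, 0.0, 1.0)
```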
- Now we want to make the ball really shiny and black – so apply a material to the ball (using the Material button in the object properties). Set the Diffuse intensity to 0.0, the Specular intensity to 1.0, the Hardness to its top value (511) and tick the little Mirror checkbox. That should give us a nice hard shiny black ball for the RTI software to deal with.
- Now we want to render out a set of images with only the ball in it so that we can ‘train’ the RTI software. You will want to turn the render off on your DEM Plane (press the little camera button next to it in your hierarchy view), so that when you output the images you will only be rendering the ball.
- Change the filepath in the script in your Text Editor panel so that you will be saving the ball images to a different folder (otherwise you will just overwrite your DEM images). Then hit Run Script and you should get a set of rendered images of the ball ready for importing into the RTI software.
- You now have the 2 sets of images ready to create your final RTI image!
- I am not going to go through the minute detail of the steps to create the RTI image, as Cultural Heritage Imaging have already written a detailed how-to. So the next step is to download the RTI Builder software and the reference guide from this page and go through the steps outlined within their reference manual.
- A couple of notes on the process, you are going to want to run the first RTI build using the ball images as the input images (put them in a folder called jpeg-exports/ within your RTI project directory). This will create an RTI of the ball – and will produce a .lp file in the assembly-files/ folder of your RTI project directory.
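If you want to peek inside the .lp file, it is plain text: in the format used by the PTM fitter, the first line gives the image count and each following line pairs an image filename with its recovered light direction (x y z). A quick sketch of reading one (the filenames and numbers below are invented for illustration):

```python
def read_lp(text):
    """Parse a PTM-fitter style .lp file: first line is the image count,
    each following line is 'filename lx ly lz'."""
    lines = text.strip().splitlines()
    count = int(lines[0])
    entries = []
    for line in lines[1:count + 1]:
        name, lx, ly, lz = line.split()
        entries.append((name, float(lx), float(ly), float(lz)))
    return entries

# An invented two-light example in the same shape:
sample = """2
lamp_Sun.001.jpg 0.0 0.0 1.0
lamp_Sun.002.jpg 0.5 0.0 0.866
"""
for entry in read_lp(sample):
    print(entry)
```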
- Once you have produced the .lp file from your ball images, you can then use this .lp file to create a RTI image of your DEM itself. Start a new RTI project and choose Dome LP File (PTM Fitter) on the first page – this will direct you through and allow you to specify the .lp file from your ball project, and the images of the DEM that you rendered from blender. As we have already trained the program using the ball images, it should now just happily go through and create the RTI image from your DEM renders.
- That’s it – here is how mine turned out (a little dark, so it probably needs more energetic Suns)…
You can download my lightdome.blend file that has a 15km x 15km light dome in it – if you don’t want to make your own. If you used this tutorial, post some screenshots of your own RTI images in the comments – I’m interested to see what people get up to! If you have any questions or need further help, don’t hesitate to ask below. Thanks go to Tom Goskar, Paul Cripps and Grant Cox for help and advice in setting up the virtual RTI dome.