More than just a pretty picture: A review of the use of 3D printing, touch tables and virtual environments to engage the public with Lidar and the archaeology of the New Forest

This source preferred by Harry Manley and David John

Authors: Shaw, L., John, D., Manley, H. and Underwood, G.

Start date: 29 March 2016

Over the last five years, the New Forest Higher Level Stewardship (HLS) scheme has utilised remotely sensed data, including Lidar, to identify and record lost and forgotten archaeological monuments. Traditional techniques for processing these data, such as hillshade and slope analysis, allow archaeologists to identify 'lumps and bumps' in the landscape created by human activity over thousands of years. Whilst valuable for prospection when analysed by professionals, these processed rasters reduce the original three-dimensional data to a flat image. Consequently, when viewed by the general public, these interesting and engaging images are often dismissed as just 'pretty pictures', with little thought given to how they were produced or what they actually show.
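The hillshade processing mentioned above can be sketched briefly. A minimal illustration, assuming NumPy and a toy elevation grid standing in for a Lidar-derived DEM (the function name, parameters, and sample terrain are illustrative, not the project's actual pipeline):

```python
import numpy as np

def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Render a DEM (2-D array of elevations) as a greyscale hillshade.

    Illustrative sketch of the standard hillshade formula: simulate a
    light source at the given compass azimuth and altitude, and shade
    each cell by the angle between its surface normal and the light.
    """
    # Convert compass azimuth (clockwise from north) to mathematical
    # convention (counter-clockwise from east), and altitude to zenith.
    az = np.radians(360.0 - azimuth_deg + 90.0)
    zenith = np.radians(90.0 - altitude_deg)

    # Elevation gradients; np.gradient returns per-row, per-column derivatives.
    dzdy, dzdx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdy, -dzdx)

    shaded = (np.cos(zenith) * np.cos(slope)
              + np.sin(zenith) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0, 1) * 255  # scale to 0-255 greyscale

# Toy example: a sinusoidal ridge standing in for earthwork 'lumps and bumps'
x = np.linspace(0, 2 * np.pi, 50)
dem = np.tile(np.sin(x), (50, 1)) * 10
hs = hillshade(dem, cellsize=1.0)
```

The result is exactly the kind of flat greyscale raster the paragraph describes: effective for spotting earthworks, but with the underlying 3D information no longer directly visible to a lay viewer.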

In September 2015, to mark the mid-point of the New Forest HLS, a temporary exhibition was produced to share what had been discovered through this 3D recording technique. As part of this exhibition, researchers based at the New Forest National Park Authority and Bournemouth University explored different ways of representing the Lidar data to help the public understand how it was recorded and what it represents. Using developing technologies such as 3D printing, gaming engines and interactive touch tables, the researchers produced a range of outputs that allowed visitors to engage with and interpret the 3D Lidar data in multi-sensory ways.

This paper documents the different techniques used to produce multi-sensory outputs to teach the public about Lidar and assesses how effective these approaches are in helping people understand and interpret what they see.
