3D Visualization for Archaeology and Open Educational Resources (OER)

By Chris Givan (JPPM Digital Education Coordinator) and Noah Boone (JPPM Digital Education Content Developer)

Photogrammetry is a technique for creating 3D models that is increasingly common in cultural and research contexts. At Jefferson Patterson Park and Museum, we’ve been using photogrammetry to create models of archaeological sites and artifacts that may not be accessible to visitors, or that may interest people for whom the park is out of reach because of its location. Thanks to a project funded by an IMLS CARES Act Grant for Museums and Libraries, we’ve begun to provide photogrammetric models as Open Educational Resources (OER) and are exploring how to replicate, at home or in the classroom, the experience of visiting archaeological sites or interacting with artifacts.

Photogrammetry software creates 3-dimensional data by analyzing photographs taken from multiple angles around a subject. This can be done with a variety of programs, both proprietary (Agisoft Metashape, RealityCapture, etc.) and free and open-source (Meshroom, MicMac, etc.). The process requires a high degree of overlap between photos, achieved by moving the camera or subject in small increments, such as small rotations on a turntable. The software identifies matching points across these photographs and constructs a 3-dimensional point cloud (below, left). The point cloud can then be further processed into a 3-dimensional model that can be viewed and distributed for a variety of purposes (below, right).

Side-by-side comparison of a point cloud, left, and a mesh, right, in Agisoft Metashape. The rectangles surrounding and overlapping the image are Agisoft’s estimates of where the camera was when each photograph was taken.
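At the heart of the point-cloud step described above is triangulation: once the software knows where each camera was, a matched point seen in two photos pins down a single location in 3D space. As a minimal sketch of that idea (using invented camera parameters, not values from any particular program), the following Python reconstructs a 3D point from its projections in two simulated photographs:

```python
import numpy as np

# Shared pinhole-camera intrinsics (focal length and image center, invented values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])

def camera_matrix(K, R, t):
    """Projection matrix P = K [R | t], mapping homogeneous 3D to 2D points."""
    return K @ np.hstack([R, t.reshape(3, 1)])

# Camera 1 at the origin; camera 2 shifted sideways, like one step around a subject.
P1 = camera_matrix(K, np.eye(3), np.zeros(3))
P2 = camera_matrix(K, np.eye(3), np.array([-1.0, 0.0, 0.0]))

# A point on the subject, 5 units in front of the cameras (homogeneous coordinates).
X_true = np.array([0.5, 0.2, 5.0, 1.0])

def project(P, X):
    """Project a homogeneous 3D point into pixel coordinates."""
    x = P @ X
    return x[:2] / x[2]

# The "matched point": the same feature located in each photograph.
x1, x2 = project(P1, X_true), project(P2, X_true)

def triangulate(P1, x1, P2, x2):
    """Linear (DLT) triangulation: solve A X = 0 for the 3D point."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X / X[3]  # normalize homogeneous coordinate

X_est = triangulate(P1, x1, P2, x2)
print(np.round(X_est[:3], 3))  # → [0.5 0.2 5. ]
```

Real photogrammetry pipelines repeat this for many thousands of matched points across dozens or hundreds of photos, while also estimating the camera positions themselves, which is why dense overlap between photographs matters so much.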

Photogrammetry is remarkably scalable, and results depend primarily on camera equipment. Drone photography can be used to create models of landscapes and buildings, while macro photography can even be used to model insects. Photogrammetry offers exciting possibilities for seeing subjects in a different light, at angles and scales that would otherwise not be possible.

We’re using our photogrammetric models in a number of ways. First, we’re making models available on JPPM’s SketchFab page as resources for anyone with a use for them. SketchFab is a website for hosting and sharing 3D models, with contributions from cultural institutions around the world. We particularly like SketchFab because museum accounts allow us to restrict downloads, which matters for artifacts or sites where we have permission to make models viewable but not redistributable.

Below are two objects on SketchFab that we have scanned with photogrammetry. The model of the site known as Sukeek’s Cabin includes annotations, an additional benefit of using SketchFab that allows us to add educational content directly to models.

However, publishing on SketchFab does limit interactivity, and we want to replicate some of the physicality of visiting sites or seeing artifacts up close. There are also practical limits to what can be included in annotations. To achieve more interactivity, we’re using the service Genial.ly. SketchFab models can be embedded directly into Genial.ly “microsites” alongside rich media or additional interactivity. Below, we used photogrammetry to model an “alphabet plate” found at Sukeek’s Cabin, then used Genial.ly to simulate another dimension of “handling” the object by encouraging viewers to reassemble 2D views of its fragments. Even though this additional interaction is 2-dimensional, it derives from photogrammetry of the plate. Interestingly, we were able to do this by photographing the plate while it remained in its display at JPPM’s Visitor’s Center; the interaction we’ve simulated is not actually possible in person, given preservation needs.

To enable even more interactivity, we’re using Unity, a game engine for creating both 3D and 2D content. Unity is commonly used for indie games, but its streamlined experience and its support for computers, mobile devices, and web browsers make it excellent for education, as does a large community of users and assets that help speed development. Where SketchFab and Genial.ly limit us to either visualizing a model in 3 dimensions or simulating additional interactions from 2-dimensional perspectives, Unity enables interaction with archaeological sites and artifacts from a first-person perspective, or with controllers that better approximate the feel of an object.

In the video below, you can see a very early experience of “walking” around the Sukeek’s Cabin site here on park property. Despite the ghostly reconstruction (parts of it are hypothetical or not known with confidence*), there is still a sense of hominess inside, and the stairs in the corner invite further exploration. In the distance, we have added a representation of the Peterson house. Newly emancipated, Sukeek and family were still living within sight of their former captor’s home. From the first-person perspective, the house feels watchful–a feeling difficult to replicate in SketchFab or Genial.ly, missing from the site today, but true to the limits newly-freed families often found on their freedom.

A user explores the virtual environment around the Sukeek’s Cabin site. The photogrammetric model is visible on the ground as are interactive hotspots. A “ghost” of the home can be toggled on and off to get a sense of what it would have looked like.

We use these results in Open Educational Resources (OER): free and openly-licensed resources that encourage reuse and remixing. (For more information, see this explainer from the University of Maryland or visit our Provider Set on OER Commons for examples.) For OER, photogrammetry offers a way to pack a great deal of information into each resource. Photos and videos preserve how an artifact or archaeological site looks from a limited set of views, but digital models preserve how a subject looks from any point of view, even ones that may not be practically accessible. Where photogrammetry excels as an educational tool, though, is in approximating tangible interaction with an artifact or site. While most interactions still rely on 2D screens, the opportunity to move and manipulate 3D models within those 2D interfaces helps replicate some of the sense of holding an object. As AR/VR and 3D printing improve, having a 3D model of an artifact or site will only grow in educational value.

*In addition to the current staff at JPPM, we are indebted to conversations with Kirsti Uunila and Ed Chaney for guidance on how the cabin would have looked.