The Linear B tablets from the Palace of Nestor at Pylos (Greece) are administrative documents from the Bronze Age Greek world and are considered the earliest known form of written Greek. These clay tablets were accidentally fired in a destructive blaze at the end of the Late Bronze Age, which preserved them in the archaeological record until their discovery in the early 1900s. However, this manner of preservation left many of the tablets broken, if not shattered, and they remain in a fragile state today.
Objective and measured object
The objective of this project is to publish the entire corpus of Linear B tablets from the Palace of Nestor. The final published compilation will combine traditional and innovative methods such as reflectance transformation imaging (RTI), illustrations, transcriptions, and structured light scanning. A 3D model will allow scholars to access the tablets without being in their physical presence, allowing for a far greater degree of preservation. In addition, Linear B scholars, who have so far relied upon the illustrations and transliterations of previous scholars, will be able to use the three-dimensional images to draw their own conclusions.
This 3D scanning project is a combined effort between Clemson University’s Warren Lasch Conservation Center and The College of Charleston. The project is headed by Kevin Pluta (University of Texas at Austin) and Dimitri Nakassis (University of Toronto). This application report has been kindly supported by Jami Baxley (College of Charleston) and Ben Rennison (Clemson University’s Warren Lasch Conservation Center).
Measuring system and setup
The scanning portion of the project is carried out with an AICON SmartScan fitted with two colour cameras (2 × 5 megapixels) at the National Museum in Athens (Greece). The fast digitization process of the scanning system is combined with a turntable that presents the tablets in multiple positions without the need for human contact, thus preventing deterioration due to handling. In addition, data collection is sped up by scanning in preview mode while the AICON scan software OptoCat calculates the data in full resolution in the background. This choice allows for a much quicker workflow.
Structured light scanning was chosen for its ability to collect high-resolution surface data and colour images simultaneously. The system delivers high data quality and, unlike many laser-scanning methods, does not require lengthy post-processing before the finalised data can be viewed. One of the most direct benefits of the technology is that no physical contact is made with the object during scanning. Measurements are also more accurate than with laser scanning, as less surface error occurs with this method, so the time needed for data cleaning and registration in the post-processing phase is greatly reduced.
Because the tablets are so delicate and so numerous, it was deemed appropriate to scan multiple tablets at once; the average scan consists of one to four tablets. The AICON SmartScan makes these multi-tablet sessions possible with optimal data capture. Since the tablet source is a significant distance from the post-processing lab, it is essential that data be captured accurately the first time and in an efficient time frame. Each tablet is scanned from six to eight directions, front and back. The OptoCat software automatically combines the front and back faces with a root mean square error (RMSE) of less than 1 µm.
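Alignment quality of this kind is conventionally quantified as the root mean square error over corresponding point pairs in the overlap between two scans. A minimal pure-Python sketch of the metric (the point data below are invented for illustration and are not from the Pylos scans):

```python
import math

def rmse(pairs):
    """Root mean square error over corresponding 3D point pairs.

    pairs: list of ((x, y, z), (x, y, z)) tuples, one point from each
    of two aligned scans.
    """
    sq = [
        (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2
        for a, b in pairs
    ]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical front/back overlap points, coordinates in millimetres.
front = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
back = [(0.0, 0.0, 0.0005), (1.0, 0.0005, 0.0), (0.0005, 1.0, 0.0)]
error = rmse(list(zip(front, back)))
print(error < 0.001)  # each pair is 0.5 µm apart, so RMSE < 1 µm: True
```

A sub-micron RMSE means the residual misalignment between the front and back captures is far smaller than the incised strokes of the Linear B signs themselves.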
OptoCat is also used for data post-processing. The software merges the scan data, aligns them, evaluates overlapping data derived from the multiple scans, and deletes duplicated data based on quality criteria. This process results in a single, unified model. OptoCat allows for fast but accurate post-processing that includes reduction and compression, hole filling, and texture mapping. All data can be saved as STL, PLY, OBJ, and other formats.
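The deduplication step described above can be sketched with a simple voxel-grid approach: where two scans contribute points to the same small cell, only the point with the better quality score is kept. The cell size and quality scoring below are illustrative assumptions, not OptoCat's actual criteria:

```python
def deduplicate(points, cell=0.1):
    """Keep one point per voxel cell, preferring higher quality.

    points: list of (x, y, z, quality) tuples from overlapping scans.
    cell:   voxel edge length in the same units as the coordinates.
    """
    best = {}  # voxel index -> winning (x, y, z, quality)
    for x, y, z, q in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        if key not in best or q > best[key][3]:
            best[key] = (x, y, z, q)
    return list(best.values())

# Two scans overlap near the origin: the higher-quality point (0.9)
# survives and the duplicate (0.4) is dropped; the distant point is kept.
merged = deduplicate([
    (0.02, 0.03, 0.01, 0.9),  # scan A
    (0.03, 0.02, 0.01, 0.4),  # scan B, same cell -> duplicate
    (0.95, 0.10, 0.00, 0.7),  # distinct cell -> kept
])
print(len(merged))  # 2
```

Grid-based deduplication runs in linear time over the input points, which matters when merging six to eight full-resolution scans per tablet.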
Texture mapping can be carried out using either the internal imagery of the scanning sensor or images taken with any external camera. Thanks to the automatic texture mapping process in the OptoCat software, the scanner's internal colour images are transferred directly onto the 3D object. Sensor data from previous OptoCat projects can also be used for high-resolution texturing with this software module. When using external imagery, the user first manually specifies tie points before the automatic optimisation and alignment process is started. The texture resolution is independent of the resolution of the object's 3D data, allowing the creation of a reduced point cloud while maintaining a high-resolution texture.
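This independence of texture resolution from mesh resolution is the standard consequence of UV mapping: each vertex stores a 2D coordinate into the texture image, so even a heavily decimated mesh can still sample a full-resolution texture. A minimal sketch of bilinear texture lookup (the tiny grayscale texture here is invented for illustration):

```python
def sample_texture(texture, u, v):
    """Bilinearly sample a grayscale texture at UV coordinates in [0, 1].

    texture: 2D list of brightness values (rows of columns).
    """
    rows, cols = len(texture), len(texture[0])
    # Map UV space onto pixel indices.
    x = u * (cols - 1)
    y = v * (rows - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, cols - 1), min(y0 + 1, rows - 1)
    fx, fy = x - x0, y - y0
    # Blend the four surrounding texels.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

# A 2x2 texture; a mesh vertex with UV (0.5, 0.5) samples smoothly
# between all four texels, regardless of how coarse the mesh is.
tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(sample_texture(tex, 0.5, 0.5))  # 0.5
```

Because lookup happens in UV space, reducing the point cloud changes only how many lookups are made, not the detail available in each one.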
Capturing these archaeological data also preserves our links to past peoples and cultures that would otherwise be lost. The AICON SmartScan offers non-contact capture and produces three-dimensional models that can be studied in place of the artefacts themselves. The post-processing is separated into distinct, easily managed phases that yield accurate surface features and texture for the tablets.
The project shows that the AICON SmartScan can operate even in difficult scanning conditions and collect excellent, reliable surface data. The accuracy of the scanner allows scholars to examine, interpret, and draw conclusions about the Linear B tablets without compromising the integrity of the objects. Many of the tablets were damaged when fired, and over the past decades specialised archaeologists have worked to compile the corpus and join pieces. The scanner captures the surface features of the tablets with such resolution and accuracy that scholars can see where the tablets broke and fit the fragments back together. The software's option to show or conceal the colour texture and to simulate different illuminations in the data viewer enables scholars to study the Linear B syllabary in depth.
We would like to thank Jami Baxley (College of Charleston) and Ben Rennison (Clemson University's Warren Lasch Conservation Center) for contributing the valuable texts and images used to compile this report.