Part One
Introduction. The Roman roads of the territory and the city – Patrizia Basso, Brunella Bruno, Piergiovanna Grossi
The milestones of the Veronese countryside: hypotheses and points for reflection towards a topographic framework – Piergiovanna Grossi
Methodological reflection on communication in archaeology has developed greatly over the past fifteen years. It is now widely accepted that the video-narrative medium has greater potential than the other media commonly used so far. The archaeological video can be divided into several categories - documentary, video update, docudrama - each potentially addressed to a variety of audiences when the movie is inserted into a narrative framework. By its nature, the archaeological site of Vignale, where the relative poverty of the remains on the ground sharply contrasts with the richness of the 'stories' the site itself can narrate, is an ideal place to test the docudrama-model video. Initially intended merely as an instrument for communicating with and involving the local population in the archaeological project as a whole, the video-narrative proved to be a powerful tool in stimulating the research group itself towards a more thoughtful and 'multivocal' recording of the fieldwork. The output of the project was a brief 'series' of videos, under the general title 'The Excavation and its Stories'. They were initially used as educational support for younger students in archaeology, but later reached a wider audience through the web.
In the first part, the paper introduces the section that collects historical syntheses of some of the most relevant issues related to technological applications in archaeology. Databases, GIS, multimedia applications, cataloguing of archaeological heritage, museums, and the Internet are the fields chosen to illustrate more than 25 years of research, projects, and implementations. The paper stresses common criticisms and recurrent difficulties in these sectors of research, but also important results and achievements for archaeology as a whole. In the second part, the paper briefly discusses the relationship between the Internet and archaeology. Web applications in archaeology started in the early 1990s. Initially, archaeologists were very suspicious of the web's reliability: the Internet was a useful tool for popularization, not for scientific research. The paper discusses the reasons for the failure of some archaeological applications - for example, electronic publishing and limited-area search engines - and the success of others: museum websites above all, with their effective use of visual and interactive web technologies. Nowadays the Internet is an almost unavoidable tool for every type of archaeological research and seems to have become the comprehensive frame in which all other technological applications are expressed. Internet technologies could introduce a new communication structure in archaeological research through interactivity and hypermedia. The latest challenges in ICT are the so-called Web 2.0, social computing and a radically innovative vision of hypertext structure: these research fields could change the way archaeological culture is communicated and knowledge is transmitted.
This paper aims to highlight the importance of 3D printing in support of Cultural Heritage and related activities. We demonstrate the advantages that a conscious employment of techniques and methods, together with the right expertise, can offer to an exhibition. We detail the steps we took to produce a 1:1 copy of a medieval sphinx for the exhibition Echoes of Egypt: Conjuring the Land of the Pharaohs, which took place at the Yale Peabody Museum of Natural History (USA). The paper presents the project’s workflow, from 3D scanning through data processing and 3D printing to the artistic finishing that prepared the object for display.
The significant technological advances in restoration methods and the potential of three-dimensional graphical representation of monuments have brought huge benefits to the understanding and communication of the archaeological heritage. At the same time, they have raised, and continue to raise, critical issues that have forced the scientific community to question and discuss the universally recognized and shared methods and principles applied in virtual archaeological reconstruction. The impetus given by the evolution of 3D graphics software, including open source tools, has prompted a massive production of images of hypothetical virtual reconstructions which adhere more or less closely to the historical data. This modus operandi has come up against the fundamental principles of scientific transparency and critical interpretation of the work. To deal with this problem, computer-based visualization applied to archaeology must evolve so that the three-dimensional models produced can be analyzed, questioned and modified on the basis of new acquisitions and new interpretations. A model that seems appropriate and feasible for achieving this is Building Information Modelling (BIM), i.e. a model of the data and information that constitute a building, based on the open standard IFC format.
As part of the FIRB 2001 Project, a computer-based research project on Roman merchants who worked in the Mediterranean area from the 3rd century BC to the 3rd century AD was begun. Data about individual businessmen were gathered from inscriptions (first of all instrumentum domesticum) and literary sources (both Greek and Latin); the data were processed in a relational database, which is briefly described here. This paper, by way of example, considers merchants who lived during the Republic (over 250 people) and focuses on the economic and social aspects of their activity. Painted inscriptions (tituli picti) or graffiti on Roman amphorae, stamps on amphora stoppers and marks on anchor stocks inform us of the names of many traders (most of all ingenui or freedmen) involved in the transport and sale of goods such as wine and olive oil. Other inscriptions (mainly epitaphs or religious dedications) refer to many tabernarii who worked in Rome, in other towns of Roman Italy or in the Provinces during the late 2nd and 1st centuries BC. Some data from literary sources are also available, mainly concerning important businessmen who operated across the whole Mediterranean basin.
Although computer-oriented archaeologists seem to have become somewhat disillusioned with computer simulation as a tool, other social sciences are witnessing a significant wave of enthusiasm for it, particularly in the form of agent-based modelling. My aim in this article is to reach some understanding of just why this paradoxical situation has arisen, and to consider what will and should happen next as regards agent-based modelling in archaeology.
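The agent-based modelling paradigm discussed above can be illustrated with a minimal sketch (not any specific model from the article): agents follow simple local rules, and the analyst reads off the emergent spatial pattern. All names and parameters below are illustrative.

```python
import random

def run_abm(n_agents=10, steps=20, size=10, seed=42):
    """Minimal agent-based model: agents random-walk on a toroidal grid,
    and each cell accumulates a count of agent visits."""
    rng = random.Random(seed)
    agents = [(rng.randrange(size), rng.randrange(size)) for _ in range(n_agents)]
    visits = {}
    for _ in range(steps):
        moved = []
        for x, y in agents:
            dx, dy = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            x, y = (x + dx) % size, (y + dy) % size  # wrap at the edges
            visits[(x, y)] = visits.get((x, y), 0) + 1
            moved.append((x, y))
        agents = moved
    return visits

visits = run_abm()
# Each agent records one visit per step, so totals are n_agents * steps;
# the spatial distribution of visits is the emergent quantity of interest.
print(sum(visits.values()))  # 200
```

Even a toy like this shows the appeal for archaeology: the researcher specifies individual-level rules and observes population-level patterns, the reverse of fitting an aggregate model.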
This article proposes a hypertext-based system for the management of archaeological documentation. The system is conceived to give archaeologists all information about an archaeological context on-line, so that it is available from more than one workstation. In this way the archaeologist can operate in real time, extracting and manipulating at different levels the information on which the research is based.
Classification is a central topic in archaeological research. Indeed, archaeologists seem to spend a great deal of their time describing and sorting materials, from surveys and excavations, into groups that should serve various ends. In the history of archaeological classification, briefly outlined in the first part of the paper, there has been an endless debate between researchers following the traditional/qualitative/subjective approach and the proponents of a “new” (now forty years old) paradigm, founded on the formal/quantitative/objective idiom. With the benefit of hindsight, we now know that neither is entirely satisfactory. The traditional approach, despite its empirical validity, has proven very difficult to formalize; the quantitative approach, on the other hand, though based on sound scientific principles, has presented serious difficulties in its practical applications. The article describes an attempt to implement a software tool able to produce formal analyses based on both qualitative and quantitative variables: an intelligent Object-Oriented system with classificatory purposes. The system, called Mosaico, is thoroughly illustrated in the second part of the article. The description covers all the components of Mosaico, a language for conceptual modelling called TQL++ (Type and Query Language), and a brief explanation of some terms useful for a better understanding of the matter. A working example on the fibulae from the Quattro Fontanili cemetery concludes the paper.
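The combination of qualitative and quantitative variables that Mosaico aims at can be illustrated with a Gower-style mixed dissimilarity, a standard technique sketched here for orientation only; it is not the TQL++ formalism itself, and the fibula attributes and range are invented.

```python
def gower(a, b, spec):
    """Mixed-variable dissimilarity in the spirit of Gower's coefficient:
    qualitative variables contribute 0 (match) or 1 (mismatch); quantitative
    variables contribute a range-normalised absolute difference.
    spec pairs each variable with ('qual', None) or ('quant', value_range)."""
    total = 0.0
    for x, y, (kind, rng) in zip(a, b, spec):
        if kind == "qual":
            total += 0.0 if x == y else 1.0
        else:
            total += abs(x - y) / rng
    return total / len(spec)

# Two fibulae described by type (qualitative) and length in mm (quantitative);
# the 50 mm observed range is an invented normalising constant.
spec = [("qual", None), ("quant", 50.0)]
print(gower(("arc", 42.0), ("arc", 32.0), spec))  # (0 + 10/50) / 2 = 0.1
```

A matrix of such dissimilarities can then feed any clustering routine, which is one way formal analysis can accommodate both idioms at once.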
In this work we use an archaeological information system to record and manage data coming from an excavation. The system covers the excavation methodology, the geographical referencing of archaeological elements, the naming of settlements and elements, the directory and file structure and the computer organisation. We developed software based on Paradox to record the archaeological information, including textual documentation, CAD maps and images.
Open access and free reuse of cultural data is one of the most topical challenges for the Digital Humanities. Great opportunities could be opened up by the adoption of free licenses by museums, archives, and libraries, allowing free commercial reuse of digitized materials, as well as by the ‘freedom of panorama’ still denied in Italy today.
Investments in data management infrastructure often seek to catalyze new research outcomes based on the reuse of research data. To achieve the goals of these investments, we need to better understand how data creation and data quality concerns shape the potential reuse of data. The primary audience for this paper is scientific domain specialists who create and (re)use datasets documenting archaeological materials. This paper discusses practices that promote data quality in support of more open-ended reuse of data beyond the immediate needs of the creators. We argue that identifier practices play a key, but poorly recognized, role in promoting data quality and reusability. We use specific archaeological examples to demonstrate how the use of globally unique and persistent identifiers can communicate aspects of context, avoid errors and misinterpretations, and facilitate integration and reuse. We then discuss the responsibility of data creators and data reusers to employ identifiers to better maintain the contextual integrity of data, including its professional, social, and ethical dimensions.
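As an illustration of the identifier practices the paper advocates, the sketch below mints globally unique, deterministic identifiers (UUIDv5) under a project namespace, so that later datasets can reference a find unambiguously instead of re-keying on ambiguous local labels. The namespace URL and record labels are hypothetical.

```python
import uuid

def mint_identifier(namespace_url, local_label):
    """Mint a globally unique, persistent identifier for a record.
    UUIDv5 is deterministic: the same namespace and label always yield
    the same identifier, so re-minting never forks a record's identity."""
    return str(uuid.uuid5(uuid.NAMESPACE_URL, namespace_url + "/" + local_label))

# A record minted once at creation time (namespace and labels are invented):
record = {
    "id": mint_identifier("https://example.org/my-excavation", "US-1042-find-7"),
    "label": "US-1042-find-7",
    "material": "ceramic",
}

# A later reuser cites the stable id, not the context-dependent local label:
reanalysis = {"source_record": record["id"], "note": "fabric re-examined"}
```

The key property is that the identifier travels with the record across datasets, preserving the link back to its documentary context.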
Perhaps the greatest barrier to effective management of underwater cultural heritage is the lack of data on the nature and location of offshore archaeological resources. This problem is shared with terrestrial archaeology, but it is particularly acute underwater because of the limitations of survey techniques in that environment. In Scotland, fewer than 15% of known ship losses from the last 200 years have been located, and the record is far less comprehensive for earlier periods, verging on a near-total data gap. Most known archaeological sites in Scottish waters have been discovered through large-scale sonar survey of relatively low resolution, and a considerable bias has been introduced into the archaeological record; this has favored the discovery and documentation of larger and more recent, often upstanding, metal shipwrecks. This article presents the methods and results of a three-year project designed to reduce this bias by demonstrating large-scale prospecting for maritime archaeology through a community-based crowd-sourcing approach. Project SAMPHIRE (the Scottish Atlantic Maritime Past: Heritage, Investigation, Research and Education Project) was geographically focused on the west coast of the Scottish mainland and was undertaken between 2013 and 2015, resulting in a large number of new archaeological discoveries, including shipwrecks, aircraft, and other material of a much more varied nature than is typically found through large-scale hydrographic surveys.
Between 2011 and 2013, a project to develop the archaeological information system of Verona (called SITAVR) was started by the University of Verona and the Soprintendenza per i Beni Archeologici of Veneto, with the financial support of the Regional Agency and the bank Banca Popolare di Verona. The first step was a collaboration with the Soprintendenza Speciale per i Beni Archeologici di Roma (SSBAR), which since 2007 has been developing an Information System for the Italian capital. Thanks to the support of colleagues and to the agreements between the public administrations involved, it was possible to start the project using the data model and databases created for Rome as a basis. The second step was to study and adapt these artefacts to a smaller town like Verona, taking into consideration its different cataloguing needs. During this phase, a new methodology (based on the GeoUML model) and its tools were used to analyze the database of Rome and to create the conceptual schema through a reverse engineering process. The GeoUML tools made it possible to generate automatically the physical schema and the documentation for the new Verona database. All the data collected will be made available to the general public, both for a better comprehension of the Information System content and for possible reuse in other similar projects.
The “Castellum Vervassium” project concerns a series of archaeological investigations of the landscape around an ancient settlement now known as Vervò (Val di Non, Trentino, Italy). Among the different analyses (excavation, survey, remote sensing, etc.), in 2010 a sub-project was started to reconstruct a hypothetical ancient road network within the target landscape. In order to optimize the scientific process, the research was divided into three steps: a topographic study conducted with classical methodology, the determination of least-cost paths from LiDAR data, and the development of a WebGIS to improve the scientific publication of the final result. Every phase of the workflow was supported by specific Free/Libre and Open Source software. During the classical topographic study, the simple and lightweight GIS OpenJUMP was used to improve precision and avoid time-consuming cartographic operations (without compromising user control in the qualitative analyses). For the more complex quantitative analyses, GRASS ensured high quality, mainly thanks to its modular structure. This program met our needs in determining the least-cost paths between the main nodes of the road network, and managed huge amounts of data while analysing a LiDAR DTM with 1 m resolution. A WebGIS based on GeoServer and OpenLayers made it possible to share the basic topographic and archaeological information of the project with the community. This flexible medium was the best choice for offering broad access to the data, thanks to filters and pre-built queries that simplify browsing within the system.
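The least-cost-path step can be sketched as a Dijkstra search over a cost raster, conceptually the same computation that GRASS's cost-surface modules perform at landscape scale. The tiny grid below is an invented stand-in for a friction surface derived from the LiDAR DTM.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a cost raster: stepping into a cell pays that cell's
    cost (a crude stand-in for slope-derived friction). Returns the
    cheapest path of (row, col) cells and its accumulated cost."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# High-friction band (9s) forces the route around, not through:
cost = [[1, 1, 1],
        [9, 9, 1],
        [1, 1, 1]]
path, total = least_cost_path(cost, (0, 0), (2, 0))
print(total)  # 6: the detour beats crossing the 9-cost cells
```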
In the process of creating an archaeological information system for the excavations at Cerveteri, the decision was made not only to use a traditional database, but also to develop a recording methodology that connects the text of the excavation diaries, encoded with a mark-up language (SGML), to the cartographic data. In order to query all the excavation diaries, an information retrieval application was required, with the aim of retrieving not only words but also specific meanings and contexts. In this paper the author describes the creation of an in-house software application for information retrieval from SGML texts and its subsequent implementation on a Web server. The paper is divided into two parts: the first describes the application itself and the concepts on which it is based, and the second discusses the technology that has been applied and the results achieved. In order to construct a querying system for the content of the excavation diaries, both ASP and VBScript technologies have been used, as they are particularly useful for constructing client-server applications for an intranet. Through these technologies, it has been possible to connect the textual sources with the digital cartography through specific hypertext links, allowing the visualisation of the search results in a browser such as Internet Explorer or Netscape Navigator. The application has also been designed to allow data diffusion through the Internet.
The development of the “Caere Project”, conducted by the Istituto per l’Archeologia Etrusco-Italica of the Italian National Research Council as part of the “Cultural Heritage” Special Project, has made it possible to establish a unique and comprehensive model for the digitalization of excavation data within a GIS platform. This model has been developed to record, process and publish data coming from the excavations conducted by the Institute in the central area of the urban plateau of the ancient Etruscan town of Cerveteri. From the outset of the project, much attention has been placed upon the discussion of methodological and technical issues, in order to form a framework for data acquisition and processing. The methodologies adopted and processes adhered to are described, with particular reference to the problems of: data representation and encoding, standardisation of the descriptive language, application of Spatial Analysis techniques, creation of a multimedia software for data diffusion and publication.
As part of the Caere Project, the author describes the stages that characterised the acquisition and encoding in digital format of the excavation diaries through the application of SGML. This encoding language for electronic documents focuses mostly on describing the internal structure of the data and the information contained in the text. SGML syntax is in some respects complex, and this has inevitably been an obstacle to the diffusion of the language. The transcription and encoding of the diaries have been completed and a flexible querying system for the SGML documents has been created. The decision to use the Internet to distribute information has also prompted a study of the viability of converting the SGML documents into XML, which in the last few years has been replacing SGML, from which it derives. However, the completion of the encoding of the excavation diaries does not represent the final stage; rather, what matters is the new phase it has initiated: further DTDs will be created to allow the acquisition and encoding of the descriptions of every find. The user will be able to navigate and explore the textual data and, where a more detailed study is required, analyse the objects together with the topographical information.
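The kind of structure-aware retrieval this encoding enables can be sketched on an XML version of a diary entry (SGML converted to XML, as the project envisages). The element and attribute names below are illustrative, not the project's actual DTD.

```python
import xml.etree.ElementTree as ET

# A toy diary entry in an XML vocabulary loosely modelled on an excavation
# diary; the markup makes stratigraphic context machine-queryable.
diary = """
<entry date="1912-05-03">
  <context n="US 12">
    <find type="ceramic">bucchero fragment</find>
    <find type="bronze">fibula</find>
  </context>
</entry>
"""

root = ET.fromstring(diary)
# Structure-aware retrieval: not just a word, but a find of a given type
# together with the stratigraphic unit it was recorded in.
hits = [(ctx.get("n"), f.text)
        for ctx in root.iter("context")
        for f in ctx.findall("find")
        if f.get("type") == "ceramic"]
print(hits)  # [('US 12', 'bucchero fragment')]
```

A plain-text search for "bucchero" could not return the containing unit "US 12"; the encoded structure is what turns a diary into a queryable archive.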
The author describes experimentation with the Text Encoding Initiative (TEI) Lite for the encoding of published archaeological documents, part of the research programme of the Caere Project. The experimentation with SGML as a tool for documenting, querying and subsequently interpreting the yearly diaries of the Vigna Parrocchiale excavations suggested extending this encoding procedure to published archaeological reports as well, particularly those associated with other monumental features on the urban plateau of Cerveteri. As a case study, the TEI Lite encoding scheme, integrated with the DTD already defined, has been applied to the publication of the excavations conducted in 1912-13 by Raniero Mengarelli in the same area of the Vigna Parrocchiale and published in «Studi Etruschi» in 1936. In order to verify the flexibility of this encoding method across different types of archaeological publication, the same procedure has been tested on another text, written in 1937 by Raniero Mengarelli and extracted from «Notizie degli Scavi di Antichità».
The aim of this paper is to describe the principles on which the Caere GIS has been created and to offer an overview of the spatial analyses conducted and the theoretical principles on which they are based. In order to satisfy the ultimate goal of the project, a solution is described for the dissemination of the results across the Internet through GIS technology. Indeed, at the outset of the project, the decision had been taken to create both a GIS application for internal use and a separate dynamic GIS multimedia application for data diffusion across the Internet. Through the GIS platform, thematic maps of the site have been created, exploiting the ability of topological analysis to explore the mutual relationships between structures. The use of the GIS was not restricted to this application alone: its full potential was exploited through its analytical engine. Several spatial analysis techniques were used (in particular viewshed analysis), both for the study of the distribution of finds at site level and for the wider analysis of the territory surrounding Cerveteri. Finally, the on-line publication of the GIS will offer the chance to create a living document, continually reviewed and updated by the author. It will also constitute a first step towards the standardisation of a metalanguage that will permit effective multimedia communication and the exchange of different data formats and sources.
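The viewshed analyses mentioned above rest on a line-of-sight test between an observer and each target cell of a DEM. A minimal one-dimensional version, over a single DEM transect, is sketched below; the elevations and observer height are invented.

```python
def visible(profile, i_obs, i_tgt, obs_height=1.6):
    """Line-of-sight along a DEM transect: the target cell is visible when
    no intermediate elevation rises above the straight sight line from the
    observer's eye to the target. A 2D viewshed repeats this test along
    rays to every cell."""
    z_eye = profile[i_obs] + obs_height
    z_tgt = profile[i_tgt]
    step = 1 if i_tgt > i_obs else -1
    for i in range(i_obs + step, i_tgt, step):
        t = (i - i_obs) / (i_tgt - i_obs)   # fraction along the sight line
        if profile[i] > z_eye + t * (z_tgt - z_eye):
            return False                    # terrain breaks the sight line
    return True

profile = [10.0, 10.0, 15.0, 10.0, 10.0]    # elevations; a ridge at index 2
print(visible(profile, 0, 4))  # False: the ridge blocks the view
print(visible(profile, 0, 1))  # True: nothing intervenes
```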
As part of the project “Nora and the sea”, aimed at the reconstruction of the ancient coastal landscape of Nora, a new phase of multidisciplinary collaboration devoted to the environmental component began in 2021. The new investigations, concentrated in the lagoon area behind the Peschiera, aim to apply to the context of Nora the environmental, biological and geomorphological research already carried out at other Mediterranean sites.
The geometric inventory and documentation of rock art present great challenges due to the high number of petroglyphs present in a territory, the distance between them, the state of abandonment of forest areas in many cases, limitations on access, and the geometric characteristics of the art itself. Structure from motion (SfM) photogrammetry has emerged as an ideal technique for its documentation, but the technique varies greatly in methodology and cost. In this study, an extremely simple and effective method based on SfM photogrammetry with low-cost cloud computing software is compared with a terrestrial laser scanner (TLS) and a professional SfM workflow for generating 3D models to document petroglyphs. The comparison is made on the 3D documentation of the Castriño de Conxo petroglyph in Santiago de Compostela, Spain, and extrapolated to two practical experiments on other petroglyphs. The meshes are compared by analysing visual, geometrical and operational criteria and how these influence radiance scaling shading. The results show that with an SfM photogrammetry methodology that is extremely simple and accessible to everyone, it is possible to obtain better geometric and visual results than with TLS, and these are valid for a detailed analysis of this type of rock art in a massive, community-based approach to documentation that is not possible with other methods.
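Mesh and cloud comparisons of this kind typically reduce to per-point nearest-neighbour distances between the SfM and TLS models. A brute-force sketch of that core computation follows, with invented coordinates in metres (real comparisons use spatial indexing over millions of points).

```python
import math

def cloud_to_cloud(a, b):
    """Per-point nearest-neighbour distances from cloud a to cloud b:
    the brute-force core of a cloud-to-cloud deviation analysis."""
    return [min(math.dist(p, q) for q in b) for p in a]

# Three TLS points and their SfM counterparts (coordinates are invented):
tls = [(0.0, 0.0, 0.00), (1.0, 0.0, 0.00), (2.0, 0.0, 0.00)]
sfm = [(0.0, 0.0, 0.01), (1.0, 0.0, -0.02), (2.0, 0.0, 0.01)]
deviations = cloud_to_cloud(sfm, tls)
mean_dev = sum(deviations) / len(deviations)
print(round(mean_dev, 4))  # mean deviation in metres
```

Summary statistics of such deviations (mean, RMS, percentiles) are what make the geometric criterion in a TLS-versus-SfM comparison quantitative rather than visual.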
Procrustean methods are used to transform one set of data to represent another set of data as closely as possible. The name derives from the Greek myth in which Procrustes invited passers-by in for a pleasant meal and a night's rest on a magical bed that would exactly fit any guest. He then either stretched the guest on the rack or cut off their legs to make them fit perfectly into the bed. Theseus turned the tables on Procrustes, fatally adjusting him to fit his own bed. This text, the first monograph on Procrustes methods, unifies several strands in the literature and contains much new material. It focuses on matching two or more configurations by using orthogonal, projection and oblique axes transformations. Group-average summaries play an important part, and links with other group-average methods are discussed.
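For orientation, the basic orthogonal Procrustes fit, translation plus rotation of one configuration onto another, can be written in closed form for 2D point sets; the square configurations below are invented test data.

```python
import math

def procrustes_2d(source, target):
    """Fit source to target by translation plus rotation (the orthogonal
    part of a 2D Procrustes match); returns the transformed source points
    and the residual sum of squares."""
    n = len(source)
    sx = sum(x for x, _ in source) / n
    sy = sum(y for _, y in source) / n
    tx = sum(x for x, _ in target) / n
    ty = sum(y for _, y in target) / n
    A = [(x - sx, y - sy) for x, y in source]   # centred source
    B = [(x - tx, y - ty) for x, y in target]   # centred target
    # Closed-form optimal rotation angle for the 2D case:
    num = sum(ax * by - ay * bx for (ax, ay), (bx, by) in zip(A, B))
    den = sum(ax * bx + ay * by for (ax, ay), (bx, by) in zip(A, B))
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    fitted = [(c * ax - s * ay + tx, s * ax + c * ay + ty) for ax, ay in A]
    rss = sum((fx - (bx + tx)) ** 2 + (fy - (by + ty)) ** 2
              for (fx, fy), (bx, by) in zip(fitted, B))
    return fitted, rss

# A square rotated by 90 degrees should fit its original exactly:
target = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
source = [(0.0, 1.0), (-1.0, 0.0), (0.0, -1.0), (1.0, 0.0)]
fitted, rss = procrustes_2d(source, target)
print(rss)  # essentially zero: an exact rigid match exists
```

The general case adds scaling and, in more than two dimensions, computes the rotation from a singular value decomposition instead of a single angle.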
A contribution to archaeological resource management. Ghent University has organised an archaeological aerial survey of the provinces of East and West Flanders since the beginning of the 1980s. As a result of these activities, some 50,000 photographs have been captured. They reveal thousands of archaeological structures, from the Neolithic through to the most recent periods. Since 1997, financial support has been received from the Flemish Community for the realisation of a GIS-based database (an Access 97 relational database linked to ArcView 3.1) and the digitization of some 50% of the photographs. As a result, it was possible to locate all 50,000 images and connect them with the geographical information offered by the support centre GIS Flanders. It is expected that in the near future this information will be available for SMR purposes and archaeological heritage management. There are also several scientific outputs, one of which is the study of Bronze Age barrows.
Maintenance and restoration activities on archaeological structures are often recorded only on paper and not according to standardised procedures. For this reason, a large amount of the information produced daily can be neither referenced nor processed. The geographical location of building materials and deterioration patterns, the relations between decay and environmental data, and quantitative information on restoration work, products and techniques are some of the types of information normally generated in the conservation sector which at present are not being used to improve the quality of restoration activity or to accomplish the institutional task of strategic programming. The main purpose of this research project was to devise a qualitative and quantitative method to evaluate the behaviour over time of products used for the protection and restoration of architectural surfaces, and to establish a single criterion for certifying their performance. Only after the procedures used to record the different kinds of data (geometrical survey, building materials, deterioration patterns, etc.) have been standardised will it be possible to correlate and process them, exchange information through local and remote networks, and produce synthesis outlines. The proposal has been tested on the monumental complex of Khor Rori in the Sultanate of Oman. The fortified city, built around the end of the first century BC to protect a natural harbour, was located on the main maritime route that crossed the Indian Ocean, the Red Sea and the Nile, connecting Rome and the Mediterranean Basin to India (the Frankincense Road). The study and restoration of Khor Rori, chosen as a “pilot project” by the Italian Ministry of Foreign Affairs and financed by public organisations in Italy and Oman, will help to clarify the historical, commercial, and cultural relations between the Mediterranean and the southern Arabian peninsula.
This study has been conducted with the support of Siatel S.n.C. of Perugia and the Studio Menci of Arezzo.
Interest among scholars in the Roman and pre-Roman past of Montoro is nothing new in Andalusian historical research; it goes back several centuries. Although in recent years a number of scientific studies have appeared that offer a renewed and well-documented view of the earliest history of Montoro, it is only fair to open this synthesis on ancient 'Epora' by recalling, however briefly, some of the scholars who in times past devoted their attention and curiosity to collecting and considering many of the material testimonies of distant epochs that came to light, in a scattered and circumstantial way, on the site that will occupy our attention in these pages. As early as the sixteenth century, Juan Fernández Franco, a champion of Cordoban archaeological research, directed some of his efforts towards the ancient history of Montoro (1), inaugurating a tradition that was effectively continued in the eighteenth century by Fernando José López de Cárdenas, known as the Cura de Montoro (2), who studied the classical sources, published some Roman inscriptions and corrected the reading of others already known. He drew on earlier materials, especially those provided by Fernández Franco, of whom he wrote a brief biographical sketch. He also carried out some excavations, which allowed him to assemble a private collection of antiquities. Among his works devoted to Roman Baetica is the one entitled 'Memorias de la antigua Epora, hoy la Villa de Montoro'.
Social media are being rapidly adopted by older adults for extending their social relationships. However, this adoption has been accompanied by concerns about privacy leakage and the hazards of information sharing. Such risks are partly due to the fact that seniors (knowingly or unknowingly) share private information that may be misused by others. In this paper we explore the privacy-preserving actions regarding information sharing for this demographic on one social media platform, Facebook. Facebook is the largest social networking platform today and many of its privacy-related practices have been in the news recently. More specifically, we study the information sharing behavior of the elderly by observing the extent to which they opt out of sharing information publicly about themselves on their profile pages. In addition, we observe how much overlap exists between these older Facebook users and their friends in terms of their public information sharing habits, and explore differences across gender. Finally, for comparative purposes, we also collect data on a sample of younger Facebook users and conduct a parallel analysis.
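The overlap measurement described above can be sketched as a Jaccard similarity between the sets of publicly shared profile fields of a user and a friend; this is an illustrative formulation, not necessarily the paper's exact metric, and the field names are invented.

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of publicly shared profile
    fields: |intersection| / |union|, or 0 for two empty sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

senior = {"hometown", "employer"}
friend = {"hometown", "employer", "birthday", "relationship"}
print(jaccard(senior, friend))  # 2 shared of 4 total fields -> 0.5
```

Averaging this score over each user's friend list gives one number per user, which can then be compared across age groups and genders.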
In a virtual archaeology project, full transparency in methods, techniques and documentation is necessary in order to define the quality standards that are crucial for a discipline that promises to inform, amaze and fascinate with increasing effectiveness and accuracy. However, documentation alone is often insufficient to guarantee reliability: comparisons, deductions and methods that allow experts to retrace the reconstructive process in all its parts are always needed. Based on the results of a case study carried out at the Monte Sannace site, several methods are described for evaluating the level of reliability of a 3D reconstruction. This process depends on qualitative factors that are not always easy to weigh up, but that are highly important for compliance with Principle no. 7 of the Seville Charter: transparency of information and specification of the methods applied. From a theoretical point of view, analogies and differences with modern restoration methods are analysed, and the results are described in relation to the communicative and emotional objectives of the project. The reconstruction of the Monte Sannace site represents a significant step towards the full appreciation of a little-known area with important archaeological and naturalistic features.
The growth of 3D acquisition and modeling techniques applied to archaeology is due principally to (i) their capacity to survey archaeological artifacts with high precision using a non-contact approach and (ii) the possibility of creating 3D digital models useful for data analysis, simulation and preservation. These benefits in terms of knowledge oblige the contemporary archaeologist to acquire a better understanding of 3D acquisition and modeling principles and practice. This need arises from the necessity of a common language between experts in 3D data management and archaeologists, so that each can understand the other's requirements and share the purposes of the project. In this article the authors propose a concise but thorough explanation of the working principles of active and passive 3D acquisition techniques. For each, instruments and methodologies are described, pointing out the pros and cons of every technique. In conclusion, a sensor fusion approach is presented as an interesting solution for increasing instrument performance while at the same time improving the quality of the 3D acquisition and modeling results. A final multi-resolution application, the 3D modeling of the Pompeii Forum, closes the article.
Principal component analysis (PCA) is a widely used multivariate method in archaeology, and is particularly prevalent in archaeometric applications. The paper reviews the use of the methodology in archaeometry, including the choice of data transformation and standardisation. The related methods of factor analysis and correspondence analysis are also briefly considered. Two detailed examples illustrate some of the methods discussed, including uncommon approaches such as the use of ranked data.
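As an illustration of the basic computation reviewed in that paper, a minimal PCA on standardised data can be written with NumPy. The element concentrations below are invented, not drawn from the review; this is a sketch of the standard SVD route, not any specific archaeometric workflow.

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD on column-standardised data."""
    # Standardise: centre each variable and scale to unit variance,
    # a common choice for archaeometric compositional data.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    # Rows of Vt are the principal axes (loadings).
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = Xs @ Vt[:n_components].T       # object scores
    explained = (S ** 2) / np.sum(S ** 2)   # proportion of variance
    return scores, explained[:n_components]

# Toy example: 6 "sherds" x 3 element concentrations (hypothetical),
# forming two compositional groups.
X = np.array([[10.2, 3.1, 0.5],
              [ 9.8, 3.0, 0.6],
              [10.1, 2.9, 0.4],
              [15.0, 5.2, 1.2],
              [14.8, 5.0, 1.1],
              [15.3, 5.1, 1.3]])
scores, expl = pca(X)
```

Because the three variables are strongly correlated, the first component captures almost all of the variance and separates the two groups along PC1.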
Principal component analysis is central to the study of multivariate data. Although one of the earliest multivariate techniques, it continues to be the subject of much research, ranging from new model-based approaches to algorithmic ideas from neural networks. It is extremely versatile, with applications in many disciplines. The first edition of this book was the first comprehensive text written solely on principal component analysis. The second edition updates and substantially expands the original version, and is once again the definitive text on the subject. It includes core material, current research and a wide range of applications. Its length is nearly double that of the first edition. Researchers in statistics, or in other fields that use principal component analysis, will find that the book gives an authoritative yet accessible account of the subject. It is also a valuable resource for graduate courses in multivariate analysis. The book requires some knowledge of matrix algebra. Ian Jolliffe is Professor of Statistics at the University of Aberdeen. He is author or co-author of over 60 research papers and three other books. His research interests are broad, but aspects of principal component analysis have fascinated him and kept him busy for over 30 years.
Objectives: Peer review is a powerful tool that steers the education and practice of medical researchers but may allow biased critique by anonymous reviewers. We explored factors unrelated to research quality that may influence peer review reports, and assessed the possibility that sub-types of reviewers exist. Our findings could potentially improve the peer review process. Methods: We evaluated the harshness, constructiveness and positiveness of 596 reviews from journals with open peer review, plus 46 reviews of colleagues’ anonymously reviewed manuscripts. We considered possible influencing factors, such as the number of authors and seasonal trends, on the content of the review. Finally, using machine learning, we identified latent types of reviewer with differing characteristics. Results: Reviews provided during a northern-hemisphere winter were significantly harsher, suggesting a seasonal effect on language. Reviews of articles in journals with an open peer review policy were significantly less harsh than those with an anonymous review process. Further, we identified three types of reviewers: nurturing, begrudged, and blasé. Conclusion: Nurturing reviews were in a minority, and our findings suggest that more widespread open peer reviewing could improve the educational value of peer review, increase the constructive criticism that encourages researchers, and reduce pride and prejudice in editorial processes.
Copper mineralized plant fibre cordage (c. 1500) found at an archaeological site was used to study fibre microstructural degradation in response to a specific burial environment and the preservation of textiles through mineralization. The process of cellulose fibre mineralization was simulated in the laboratory in an effort to prepare mineralized plant fibres under known conditions. A model for dyeing cellulosic fibres was adopted to explain the process of fibre mineralization. The characteristics of the microstructures of archaeological and laboratory mineralized fibres were examined and compared with those of modern Indian hemp fibres using scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS) and X-ray diffraction (XRD). SEM and EDS results reveal some similarities between the archaeological and the laboratory mineralized fibres. Infilling and replacement with copper minerals resulting from the corrosion of associated metals was found on both the outer and inner (lumen) surfaces of the fibres. Possible types of fibre degradation were inferred from the observed physical and chemical microstructures of the mineralized fibres. The simulation of fibre mineralization in the laboratory sheds some light on the mechanisms by which archaeological textiles are preserved through replacement with inorganic minerals.
Here we present the Compendium Isotoporum Medii Aevi (CIMA), an open-access database gathering more than 50,000 isotopic measurements for bioarchaeological samples located within Europe and its margins, and dating between 500 and 1500 CE. This multi-isotope (δ13C, δ15N, δ34S, δ18O, and 87Sr/86Sr) archive of measurements on human, animal, and plant archaeological remains also includes a variety of supporting information that offers, for instance, a taxonomic characterization of the samples, their location, and their chronology, in addition to data on social, religious, and political contexts. Such a dataset can be used to identify data gaps for future research and to address multiple research questions, including those related to medieval human lifeways (i.e. subsistence and spatial mobility), the characterization of palaeo-environmental and palaeo-climatic conditions, and plant and animal agricultural management practices. Brief examples of such applications are given here, and we also discuss how the integration of large volumes of isotopic data with other types of archaeological and historical data can improve our knowledge of medieval Europe.
The aim of this study is to estimate the volumetric capacity of a selection of ceramic vessels from the Neolithic site of Lugo di Grezzana (Verona, Italy). The method involved the use of Blender, a free and open-source 3D computer graphics program, which can calculate volume from a graphic elaboration of the archaeological drawing of each artifact. Through the calculation of volume it was possible to estimate the total capacity of the vessels, proposing two possible types of content. Volumetric estimates were then compared with the diameter and height of each vessel in order to define size classes. The research shows that the internal variability of some ceramic shapes could be the consequence of different functional and/or cultural choices. The methodology tested in this paper could be applied in future research projects.
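Blender can report the volume enclosed by a watertight triangle mesh; the standard computation behind such tools is a sum of signed tetrahedra (the divergence theorem). The stand-alone sketch below is an illustration of that principle, not Blender's actual code, verified here on a unit cube.

```python
def mesh_volume(vertices, triangles):
    """Volume enclosed by a closed, consistently oriented triangle mesh,
    computed as the sum of signed tetrahedra formed with the origin."""
    vol = 0.0
    for i, j, k in triangles:
        ax, ay, az = vertices[i]
        bx, by, bz = vertices[j]
        cx, cy, cz = vertices[k]
        # Signed volume of the tetrahedron (origin, a, b, c).
        vol += (ax * (by * cz - bz * cy)
                - ay * (bx * cz - bz * cx)
                + az * (bx * cy - by * cx)) / 6.0
    return abs(vol)

# Unit cube with outward-facing triangles: enclosed volume is 1.
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
         (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
tris = [(0, 2, 1), (0, 3, 2),   # bottom (z = 0)
        (4, 5, 6), (4, 6, 7),   # top (z = 1)
        (0, 1, 5), (0, 5, 4),   # front (y = 0)
        (1, 2, 6), (1, 6, 5),   # right (x = 1)
        (2, 3, 7), (2, 7, 6),   # back (y = 1)
        (3, 0, 4), (3, 4, 7)]   # left (x = 0)
```

For a vessel interior, the same computation applied to a mesh of the internal surface yields the capacity estimate directly.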
The remains of the medieval town of Castelmonardo (Calabria, Italy) are located on a hill, a few kilometers SE of the modern town of Filadelfia (Vibo Valentia). Since the 1970s, archaeological excavations have been carried out in selected areas of the hill, but a systematic investigation of the whole site had never been conducted. The paper presents the preliminary results of the first archaeological prospection carried out at Castelmonardo by means of advanced remote sensing techniques, with the goal of achieving a first GIS-based digital map of the archaeological site. Recently developed UAV LiDAR technology, consisting of high-precision laser scanners mounted on Unmanned Aerial Vehicles (UAVs), commonly known as drones, was applied to produce a high-resolution digital terrain model (DTM) of the site. By integrating the LiDAR data with web-GIS-based aerial imagery, a preliminary archaeological interpretation of the whole site was produced, offering a suitable basis for further analysis and virtual reconstruction. The work presented here was conducted as part of a recently initiated research project focused on Castelmonardo, led by the Department of Art History and Performing Arts - Sapienza University of Rome, and conducted in cooperation with DigiLab Research and Services - Sapienza University of Rome, Istituzione Comunale Castelmonardo - Filadelfia (Italy), the Ludwig Boltzmann Institute for Archaeological Prospection and Virtual Archaeology - Vienna (Austria), Virtutim srls and the Italian company OBEN s.r.l.
In recent years, the institutional goal of studying and mapping archaeological potential in SITAR has been to create an efficient tool to support urban planning and cultural heritage management: the Archaeological Potential Map of Rome. The Soprintendenza of Rome plays a key role in this effort, being responsible for safeguarding and promoting the city’s archaeological heritage. By developing a robust model of archaeological potential, the Soprintendenza can better anticipate and mitigate the impact of construction and development projects on archaeological sites. This proactive approach ensures that significant archaeological resources are identified and preserved before they are damaged or destroyed. The tool will facilitate informed decision-making in urban planning, helping to balance the needs of modern development with the preservation of historical sites. Moreover, it will support the regulatory framework that mandates archaeological assessments in high-potential areas, rationalise administrative processes, and improve compliance with heritage protection regulations. Overall, the creation of an efficient archaeological potential model by the Soprintendenza of Rome underlines its commitment to preserving the city’s cultural heritage while accommodating continuous urban evolution.
Tell Rifa‘at, in the ‘Azaz Casa, is situated in the midst of the village of the same name, thirty-five kilometres to the north of Aleppo in the Syrian Region of the United Arab Republic. The modern village, with a population of 5–6,000, is the seat of a Mudir Nahir; it lies five kilometres east of the Aleppo-‘Azaz road, close to the Turkish border, and is on the railway line from Aleppo to Istanbul. Trial excavations were made on this site by the Czech philologist Hrozny in 1924, but after three months’ work he abandoned his investigations; no detailed report on this work has so far been published, and apart from a few fragments in the Aleppo Museum no material from the excavation appears to be extant. My attention was drawn to Tell Rifa‘at in 1953 in the course of a survey of sites on either side of the Syro-Turkish frontier; but it was not until 1956 that, under the auspices of the Institute of Archaeology, University of London, and with the assistance of a grant from the Wenner-Gren Foundation for Anthropological Research and from the Australian Institute of Archaeology, Melbourne, a preliminary investigation of the site was undertaken. A further season had been planned for 1957, but owing to the political situation it was not possible to resume work until the summer of 1960, when a two-month season was carried out with the support of funds from the Museum of Archaeology and Ethnology and the C. H. W. Johns Memorial Fund of Cambridge University, the Ashmolean Museum, the Australian Institute of Archaeology, Melbourne, the Russell Trust, and the City Museum and Art Gallery, Birmingham.
This study is concerned with the exploitation of resources by human groups in the Carmel area over a period of about 50,000 years. To this end an attempt is made to evaluate the changing economic potential of the ‘catchments’ of individual sites, for this enables us to make a comparative analysis of hunting-gathering and agricultural economies. Although the study of prehistoric sites is now usually complemented by some treatment of their setting, the situation in their immediate vicinity—the principal concern of the inhabitants—tends to be neglected or at any rate overshadowed by generalized statements regarding the physiographic, vegetational, climatic or kindred zones of which they form a (not necessarily typical) part. The difficulty that is encountered by attempts to harmonize the findings of the various specialists who nowadays contribute to archaeological studies may, in fact, be due to the limitations of a zonal approach, for the populations or phenomena with which each of them is concerned do not necessarily refer to a single ‘catchment’ area and may have little connection with the picture of the exploited territory that is imprinted on the site record.
This book finds a worthy place within the panorama of literature and teaching thanks to the words and expressions that give it a distinctive character.
Predictive modelling is a set of techniques, used since the 1970s, to predict the location of archaeological sites in uninvestigated areas as an aid to spatial planning, for example, in Cultural Resource Management. Predictive modelling is also used to develop and test scientific models of human locational behaviour, as it is based on either statistical extrapolation from known archaeological data, or on explanatory models of site location preference. In practice, a number of methods can be used in predictive modelling, and the resulting maps of predicted site locations or density can vary in accuracy. The main difficulties in producing accurate and precise predictive models are coupled to the resolution and representativeness of the archaeological and non-archaeological datasets used, the theoretical frameworks underlying the models, and the nature, or lack, of model testing. Nonetheless, predictive models are often found useful to provide basic protection to areas of high sensitivity, and can save costs for development projects or archaeological investigations.
This paper discusses the results of an inductive predictive modelling experiment on Roman settlement data from the middle Tiber valley, Italy. The study forms part of the British School at Rome’s Tiber Valley Project, which since its inception in 1997 has been assessing the changing landscapes of the Tiber Valley from protohistory through to the medieval period. The aim of this present study is to broaden understanding of settlement patterns via predictive modelling, and in particular to evaluate unevenness in field survey coverage, survey bias and past settlement location preferences. The predictive modelling method chosen was an application of the statistical Weights of Evidence extension for ESRI ArcView. The results highlight associations between Roman settlement and environmental themes that provide moderate predictive potential and suggest that further experimentation might prove valuable.
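For a single binary evidential theme, the Weights of Evidence method used in that study reduces to comparing how often known sites fall inside the theme against the background area. A minimal sketch in Python follows; all counts are hypothetical and are not taken from the Tiber Valley data.

```python
import math

def weights_of_evidence(n_sites_in, n_sites_total, n_cells_in, n_cells_total):
    """Weights of evidence for one binary theme (e.g. a soil class).
    n_sites_in: known sites falling inside the theme;
    n_cells_in: map cells covered by the theme."""
    p_b_given_s = n_sites_in / n_sites_total
    # Cells without known sites approximate the "non-site" background.
    p_b_given_ns = (n_cells_in - n_sites_in) / (n_cells_total - n_sites_total)
    w_plus = math.log(p_b_given_s / p_b_given_ns)
    w_minus = math.log((1 - p_b_given_s) / (1 - p_b_given_ns))
    return w_plus, w_minus, w_plus - w_minus  # contrast C = W+ - W-

# Hypothetical figures: 30 of 50 known sites lie on a theme
# covering 2,000 of 10,000 map cells.
wp, wm, contrast = weights_of_evidence(30, 50, 2000, 10000)
```

A positive contrast indicates the theme is positively associated with site presence; summing the weights of several (conditionally independent) themes per cell yields the predictive surface.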
The use of GIS and spatial analysis for predictive modelling is an important topic in preventive archaeology. Both tools play an important role in Decision Support Systems (DSS) for archaeological research and in providing information useful for reducing archaeological risk. Over the years, a number of predictive models in the GIS environment have been developed and proposed. The existing models differ substantially from each other in the methodological approaches and parameters used for the analysis. Until now, only a few works have considered spatial autocorrelation, which can provide more effective results. This paper provides a brief review of the existing predictive models and then proposes a new methodological approach, applied to the Neolithic sites of the Apulian Tavoliere (Southern Italy), that combines traditional techniques with spatial autocorrelation analysis in order to take into account the spatial relationships among the sites.
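Spatial autocorrelation of the kind invoked above is commonly measured with global Moran's I. A self-contained sketch follows; the contiguity matrix and values are invented for illustration and do not reproduce the paper's method.

```python
def morans_i(values, weights):
    """Global Moran's I for a list of values and a symmetric spatial
    weights matrix (list of lists); weights[i][j] > 0 links i and j."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)

# Four cells on a line with rook contiguity; values are clustered
# low-low-high-high, so I should be positive.
vals = [1.0, 1.0, 5.0, 5.0]
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
```

Values of I near +1 indicate clustering of similar values among neighbours, near 0 spatial randomness, and negative values a checkerboard-like dispersion; here the clustered example yields I = 1/3.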
Predictive modeling is a technique to predict the location of archaeological sites in uninvestigated areas that has been used since the 1970s to aid spatial planning, for example, in cultural resource management. Predictive modeling is also used to develop and test scientific models of human locational behavior, as it is based on either statistical extrapolation of known archaeological data or explanatory models of site location preference. In practice, a number of methods can be used in predictive modeling, and the resulting maps of predicted site density can vary in accuracy. The main difficulties in producing accurate predictive models are coupled with the resolution and representativeness of the archaeological and nonarchaeological datasets used, the theoretical frameworks underlying the models, and the lack of model testing. Nonetheless, predictive models are very useful to provide basic protection to areas of high sensitivity, and can save costs for archaeological investigations.
In archaeological research on the foodways of past societies, different interests and methodologies have been developed. In their search for knowledge about Mayan foods and cooking methods, scholars such as Herrera Flores and Götz [2014. “La alimentación de los antiguos mayas de la Península de Yucatán: Consideraciones sobre la identidad y la cuisine en la época prehispánica.” Estudios de Cultura Maya 43 (43): 69–98. doi:10.1016/S0185-2574(14)70325-9] have paid attention to available resources, diet, and cuisine. Food is more than food intake: it also relates to aspects such as health, identity, gender roles, worldview, memory, and emotions. For our case study, the Classic Maya site of Sihó, Yucatan, the research is oriented towards the gastronomy of the elites. Through the study of chemical residues and the identification of starch granules in ceramic fragments from five types of containers (dishes, bowls, jars, vases, and basins), this study aimed to identify the related ingredients, preparation processes and serving practices, suggesting particular ways of cooking and patterns of consumption. The test results were compared with and supplemented by zooarchaeological evidence, iconography, and historical and ethnographic records.
The paper presents the new relational database Pre-Biblio on the Prehistory and Quaternary geology of Italy. It will mainly be composed of two correlated databases, «sites» and «bibliography», together with others covering the biographies of the most important scholars, the taxa of fossil remains contained in the sites, and the palaeobasins. Each bibliographic reference will be linked to sites, which will be georeferenced on the 1:25,000 topographic maps of the «SIGEC» GIS system of the Ministry of Culture. An accurate survey of Italian and foreign literature regarding archaeology, vertebrate palaeontology, geology and related sciences, from the Villafranchian to the Early Iron Age, has been conducted in the most important libraries of Italian institutions specialized in Quaternary studies and Prehistory. An estimated total of about 25,000 sites, 60,000 references and 800,000 links represents the core of the project, which could be concluded in five years with a team of eight specialists. The paper also provides a preliminary appraisal of the chronological distribution of published sites across the whole Italian territory, according to which protohistory (the Bronze and Early Iron Ages) accounts for half of the estimated body of data.
In this paper, we introduce Linked Open Data (LOD) in the archaeological domain as a means to connect dispersed data sources and enable cross-querying. The design principles behind the technology, and how LOD can be created and published, are described so that less-familiar researchers can understand the benefits and drawbacks of LOD presented here. Wikidata is introduced as an open knowledge hub for the creation and dissemination of LOD. Different actors within archaeology have implemented LOD, and we present the challenges that have been and are being addressed. A selection of projects showcases how Wikidata is being used by archaeologists to enrich their databases and open them to the general public. With this paper, we aim to encourage the creation and re-use of LOD in archaeology, as we believe it offers an improvement on current data publishing practices.
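To make the cross-querying idea concrete, the following toy sketch shows how two independently maintained datasets become jointly queryable once both point at a shared identifier. It is pure Python with invented triples and identifiers (including the Wikidata-style QID), not a real triplestore or a call to the live Wikidata endpoint.

```python
# A minimal in-memory triple store: two archaeological datasets,
# each linked to a shared Wikidata-style identifier (all invented).
triples = {
    # Dataset A: a site gazetteer
    ("ex:site/1", "rdfs:label", "Monte Sannace"),
    ("ex:site/1", "owl:sameAs", "wd:Q999999"),
    # Dataset B: a finds database, independently linked to the same QID
    ("ex:find/42", "ex:foundAt", "wd:Q999999"),
    ("ex:find/42", "rdfs:label", "red-figure krater"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard,
    mimicking a basic SPARQL triple pattern."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

def finds_at_site(site_uri):
    """Cross-query both datasets via the shared identifier."""
    labels = []
    for _, _, qid in query(site_uri, "owl:sameAs", None):
        for find, _, _ in query(None, "ex:foundAt", qid):
            labels += [o for _, _, o in query(find, "rdfs:label", None)]
    return labels
```

Neither dataset knows about the other; the join happens entirely through the shared identifier, which is the core mechanism LOD publishing relies on.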
The integration of geodata and building models is one of the current challenges in the AECOO (architecture, engineering, construction, owner, operation) domain. Data from Building Information Models (BIM) and Geographical Information Systems (GIS) cannot simply be mapped 1:1 to each other because of their different domains. One possible approach is to convert all data into a domain-independent format and link them together in a semantic database. To demonstrate how this data integration can be done in a federated database architecture, we utilize concepts of the semantic web, ontologies and the Resource Description Framework (RDF). It turns out, however, that traditional object-relational approaches provide more efficient access to geometrical representations than triplestores. We therefore developed a hybrid approach combining files, geodatabases and triplestores. This work-in-progress paper (extended abstract) demonstrates our intermediate research results with practical examples and identifies opportunities and limitations of the hybrid approach.
Here, we present a new method to scan a large number of lithic artefacts using three-dimensional scanning technology. Despite the rising use of high-resolution 3D surface scanners in the archaeological sciences, no virtual studies have focused on the 3D digitization and analysis of small lithic implements such as bladelets, microblades, and microflakes. This is mostly due to difficulties in creating reliable 3D meshes of these artefacts resulting from several inherent features (i.e., size, translucency, and acute edge angles) that compromise the efficiency of structured light scanners, laser scanners, and photogrammetry. Our new protocol, StyroStone, addresses this problem by proposing a step-by-step procedure relying on micro-computed tomography, which is able to capture the 3D shape of small lithic implements in high detail. We tested a system that enabled us to scan hundreds of artefacts at once within a single scanning session lasting a few hours. As larger lithic artefacts (i.e., blades) are also present in our sample, the protocol is complemented by a short guide on how to effectively scan such artefacts using a structured light scanner (Artec Space Spider). Furthermore, we estimate the accuracy of our scanning protocol using principal component analysis of 3D Procrustes shape coordinates on a sample of meshes of bladelets obtained with both micro-computed tomography and another scanning device (i.e., Artec Micro). A comprehensive review of the use of 3D geometric morphometrics in lithic analysis and other computer-based approaches is provided in the introductory chapter to show the advantages of improving 3D scanning protocols and increasing the digitization of our prehistoric human heritage.
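The Procrustes shape coordinates used for such accuracy estimates come from Procrustes superimposition, which removes position, scale and orientation from landmark configurations before PCA. A minimal NumPy sketch of ordinary (two-configuration) Procrustes alignment follows; the square landmarks are invented for illustration, and generalised Procrustes analysis over many specimens iterates this idea against a mean shape.

```python
import numpy as np

def procrustes_align(ref, target):
    """Ordinary Procrustes superimposition: translate, scale and rotate
    `target` landmarks onto `ref` (both k x dim float arrays)."""
    A = ref - ref.mean(axis=0)
    B = target - target.mean(axis=0)
    A = A / np.linalg.norm(A)       # remove scale (unit centroid size)
    B = B / np.linalg.norm(B)
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(A.T @ B)
    R = U @ Vt
    return B @ R.T

# A square, and the same square rotated 90 degrees and doubled in size:
# after superimposition the two configurations should coincide.
sq = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
theta = np.pi / 2
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
sq2 = 2.0 * (sq @ rot.T)
aligned = procrustes_align(sq, sq2)
```

After alignment, the residual coordinates are pure shape variables, which is what makes a subsequent PCA across specimens interpretable as shape variation.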