Reassembling the Social is a fundamental challenge from one of the world's leading social theorists to how we understand society and the "social". Bruno Latour's contention is that the word "social", as used by social scientists, has become so laden with assumptions that it has become a misnomer. When the adjective is applied to a phenomenon, it is used to indicate a stabilized state of affairs, a bundle of ties that in due course may be used to account for another phenomenon. Latour also finds the word used as if it described a type of material, in a comparable way to an adjective such as "wooden" or "steely". Rather than simply indicating what is already assembled together, it is now used in a way that makes assumptions about the nature of what is assembled. It has become a word that designates two distinct things: a process of assembling, and a type of material distinct from others. Latour shows why "the social" cannot be thought of as a kind of material or domain, and disputes attempts to provide a "social explanation" of other states of affairs. While these attempts have been productive (and probably necessary) in the past, the very success of the social sciences means that they are largely no longer so. At the present stage it is no longer possible to inspect the precise constituents entering the social domain. Latour returns to the original meaning of "the social" to redefine the notion and allow it to trace connections again. It will then be possible to resume the traditional goal of the social sciences, but using more refined tools. Drawing on his extensive work examining the "assemblages" of nature, Latour finds it necessary to scrutinize thoroughly the exact content of what is assembled under the umbrella of Society. This approach, a "sociology of associations", has become known as Actor-Network-Theory, and this book is an essential introduction both for those seeking to understand Actor-Network-Theory and for those interested in the ideas of one of its most influential proponents.
The digital technology revolution has so far focused attention mainly on technical power rather than on the semantic level of informative and communicational aspects. In the field of virtual heritage, the risk was, and still is, to privilege amazing aesthetic features at the expense of informative/narrative feedback and cognition within the virtual worlds. How much information can I get from a virtual system? How does it communicate? How can we process this kind of interactive information? The importance of virtual reality systems in cultural heritage applications should be oriented towards their capacity to change ways of and approaches to learning. The virtual communicates, the user learns and creates new information. We typically define as linear learning tools and actions such as books, audio guides, catalogues and so on (in this case the communication is a linear sequence), and as reticular learning VR systems where the user is immersed within reticules of information and visual data. In this paper we analyse the relations between virtual reality, cultural heritage and cybernetics according to an ecological approach.
The Studiolo of Federico da Montefeltro still allows us to admire the sumptuous taste of the court of Urbino. The wooden inlays, the portraits of the Illustrious Men and the gilded coffered ceiling adorn this space with a rich symbolism celebrating the Duke in his dual nature as a man of war and a man of peace. This contribution proposes a virtual fruition of the Studiolo, an interactive narration that reveals the figure of Federico through this favourite place of his and that serves as a means of keeping it accessible even when a forced closure prevents physical access.
The importance of cultural and natural heritage documentation is well recognized at the international level, and there is increasing pressure to document and preserve heritage digitally as well. The continuous development of new sensors, data capture methodologies, and multi-resolution 3D representations, and the improvement of existing ones, can contribute significantly to the 3D documentation, conservation, and digital presentation of heritage and to the growth of research in this field. The article reviews some important documentation requirements and specifications, the current 3D surveying and modeling techniques and methodologies with their limitations and potentialities, as well as some visualization issues involved in the heritage field. Some examples of 3D documentation of world heritage sites are reported and discussed.
Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of them restricted to non-real-time rendering, so the problem remains open, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step is the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique is introduced to integrate sky colours and shadows through their effects on virtual objects in the AR system. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering.
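To make the first phase concrete, here is a minimal Python sketch of computing an approximate solar position for a given location, date and time, using standard low-precision formulas (Cooper's declination approximation). It illustrates the inputs the sky-colour phase depends on; it is not the paper's Z-GaF implementation, and the example coordinates are invented.

```python
import math

def sun_position(lat_deg, lon_deg, day_of_year, hour_utc):
    """Return approximate solar elevation and azimuth in degrees."""
    # Solar declination (Cooper's approximation)
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Hour angle: 15 degrees per hour from solar noon, longitude-corrected
    solar_time = hour_utc + lon_deg / 15.0
    hour_angle = 15.0 * (solar_time - 12.0)
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    # Elevation from the standard solar altitude formula
    elev = math.asin(math.sin(lat) * math.sin(d) +
                     math.cos(lat) * math.cos(d) * math.cos(h))
    # Azimuth measured from north, clockwise
    az = math.atan2(-math.sin(h) * math.cos(d),
                    math.cos(lat) * math.sin(d) -
                    math.sin(lat) * math.cos(d) * math.cos(h))
    return math.degrees(elev), (math.degrees(az) + 360.0) % 360.0

print(sun_position(3.14, 101.69, 172, 4.0))  # example: Kuala Lumpur, ~noon local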
The current Web of Data, including linked datasets, RDFa content, and GRDDL-enabled microformats is a read-only Web. Although this read-only Web of Data enables data integration, faceted browsing and structured queries over large datasets, we lack a general concept for a read-write Web of Data. That is, we need to understand how to create, update and delete RDF data in a secure, reliable, trustworthy and scalable way. Attempting to change this situation, this paper reviews available components, presents our vision of a uniform architecture for a read-write Web of Data as well as a proof of concept. The paper exposes issues and challenges of the proposed architecture and discusses the next necessary steps.
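As one concrete example of the write operations such an architecture must support, the sketch below sends a SPARQL 1.1 Update request over HTTP from Python. The endpoint URL, graph name and triple are placeholders; the paper's own proof of concept may use different components.

```python
import requests

ENDPOINT = "https://example.org/sparql"  # hypothetical update endpoint

update = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
INSERT DATA {
  GRAPH <http://example.org/people> {
    <http://example.org/people#alice> foaf:name "Alice" .
  }
}
"""

# SPARQL 1.1 Protocol: POST the update body with the sparql-update media type
resp = requests.post(
    ENDPOINT,
    data=update.encode("utf-8"),
    headers={"Content-Type": "application/sparql-update"},
)
resp.raise_for_status()
```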
Although data reusers request information about how research data was created and curated, this information is often non-existent or only briefly covered in data descriptions. The need for such contextual information is particularly critical in fields like archaeology, where old legacy data created during different time periods and through varying methodological framings and fieldwork documentation practices retains its value as an important information source. This article explores the presence of contextual information in archaeological data with a specific focus on data provenance and processing information, i.e., paradata. The purpose of the article is to identify and explicate types of paradata in field observation documentation. The method used is an explorative close reading of field data from an archaeological excavation enriched with geographical metadata. The analysis covers technical and epistemological challenges and opportunities in paradata identification, and discusses the possibility of using identified paradata in data descriptions and for data reliability assessments. Results show that it is possible to identify both knowledge organisation paradata (KOP) relating to data structuring and knowledge-making paradata (KMP) relating to fieldwork methods and interpretative processes. However, while the data contains many traces of the research process, there is an uneven and, in some categories, low level of structure and systematicity that complicates automated metadata and paradata identification and extraction. The results show a need to broaden the understanding of how structure and systematicity are used and how they impact research data in archaeology and in comparable field sciences. The insights into how a dataset's KOP and KMP can be read are also a methodological contribution to data literacy research and practice development. On a repository level, the results underline the need to include paradata about dataset creation, purpose, terminology, dataset internal and external relations, and any data colloquialisms that require explanation for reusers.
This new critical edition of all the Greek inscriptions found in the territory of Ravenna and Classe, or preserved in the Museums of Ravenna, collects forty inscriptions dated between the 2nd and the 8th century AD, accompanied by a genetic lemma, apparatus, Italian translation, commentary and images. With a view to recovering Ravenna's 'Greekness', a renewed assessment of the Greek inscriptions of Ravenna can form the basis for reconstructing the epigraphic language of the patrons of Greek and Eastern origin present in the city in the various periods of the ancient world.
RAPTOR is a project, still under development, designed to build a simple and versatile tool to computerize the administrative procedures of the Italian Superintendency for Archaeological Heritage. Its purpose is to ensure a faster response to all kinds of external requests and to align, as much as possible, the Superintendency offices with the new Code of Digital Administration (CAD). The RAPTOR geodatabase is based on the open source software PostgreSQL and PostGIS, while the web-interface management is provided by PHP, JavaScript, GeoServer and OpenLayers. In this way all vector data can be entered into the system through specific compilation forms and displayed on a map, where they can also be queried. In short, RAPTOR will provide users with a complete and accurate mapping module, able to show in real time a thematic cartography covering both known archaeological evidence and negative areas.
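Below is a hypothetical sketch of the kind of PostGIS query such a map module rests on: retrieving known archaeological evidence that intersects a project area. The table and column names are invented for illustration and are not RAPTOR's actual schema.

```python
import psycopg2

conn = psycopg2.connect("dbname=raptor user=postgres")  # placeholder credentials
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT s.id, s.name
        FROM archaeological_sites AS s      -- hypothetical sites table
        JOIN project_areas AS p ON ST_Intersects(s.geom, p.geom)
        WHERE p.id = %s;
    """, (42,))
    for site_id, name in cur.fetchall():
        print(site_id, name)
conn.close()
```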
RAPTOR is a geo-database built for the management of the archaeological administrative procedures of the Superintendencies. The system allows the recording of any kind of work carried out in any geographical context and the mapping of the archaeological outcomes, so that the archaeological maps can be constantly updated. A subset of the archaeological data recorded in the system can now be freely accessed on the map by external users; archaeologists in particular can see the full information on the archaeological sites. In order to support preventive archaeology, a new section of the system now makes it possible to map preventive archaeology investigations and to obtain automatically the vector data of the archaeological sites within the project areas. Moreover, RAPTOR now makes it possible to record and show the plans of the archaeological phases of each single site or urban context; in addition, a new specific section is dedicated to the drawing of areas of archaeological potential.
RAPTOR (Ricerca Archivi e Pratiche per la Tutela Operativa Regionale) is a geo-database developed in order to supply officials of the Italian Superintendency for Archaeological Heritage with a user-friendly instrument to handle those daily administrative practices that have an impact on the territory. The system, two years after it was presented for the first time during the 2012 ArcheoFOSS, has been tested and developed in order to refine the computer-supported procedure that now enables us to manage the whole variety of work carried out in every kind of geographical context, including urban and marine sites. The mapping of the archaeological results is also envisaged. Part of the computer procedure consists of a quick recording system, which allows the official archaeologist to register the basic data including geographic features of an archaeological site or of areas with no archaeological evidence. At the same time, a more detailed analysis is also possible. Geometries can be linked to the site information sources and the whole available scientific record can be uploaded. In this way, it is also possible to manage the most complex sites. Archaeological firms can log on to the system to upload the excavation reports drawn up in line with the standards outlined by the Superintendencies.
Landscape features are the result of interrelated actions of man-and-nature and can provide ecosystem services that need to be protected. Since urban planning policies can impact negatively on the conservation of cultural ecosystem services, urban plans must map them and make provision for their protection. For the Plan of Franciacorta (22 municipalities in Lombardy), we chose QGIS to set up a geo-database and map cultural heritage information. QGIS can provide more flexibility than a typical map, thanks to its graphics tools. To plan the development of actions to protect the landscape and suggest a range of planning opportunities for municipalities, an integrated representation of the landscape and protected ecological elements can highlight some critical issues: municipal borders can prove an obstacle in the implementation of supra-municipal projects and protected areas can include enclaves potentially vulnerable to urban pressures. Such maps have proved useful in guiding the planning choices in the development of the landscape protection schemes. The geo-location of critical aspects has brought out a range of inter-municipal planning opportunities.
This critical dictionary, edited by Philippe Sénéchal and Claire Barbillon, presents a representative selection of the art historians, men and women, active in France from the Revolution to the First World War. It is currently being republished and will be accessible on OpenEdition Books in the « Traverses » collection. This volume will be supplemented by …
This paper reviews the evidence for long-term trends in anthropogenic activity and population dynamics across the Holocene in the central Mediterranean and the chronology of cultural events. The evidence has been assembled in a database of 4608 radiocarbon dates (of which 4515 were retained for analysis following initial screening) from 1195 archaeological sites in southern France, Italy and Malta, spanning the Mesolithic to Early Iron Age periods, c. 8000 to 500 BC. We provide an overview of the settlement record for central Mediterranean prehistory and add to this an assessment of the available archaeological radiocarbon evidence in order to review the traditional narratives on the prehistory of the region. This new chronology has enabled us to identify the most significant points in time where activity levels, population dynamics and cultural change have together caused strong temporal patterning in the archaeological record. Some of these episodes were localized to one region, whereas others were part of pan-regional trends and cultural trajectories that took many centuries to play out fully, revealing prehistoric societies subject to collapse, recovery, and continuing instability over the long term. Using the radiocarbon evidence, we model growth rates in the various regions so that the tempo of change at certain points in space and time can be identified, compared, and discussed in the context of demographic change. Using other published databases of radiocarbon data, we have drawn comparisons between the central Mediterranean, wider prehistoric Europe, and northern Africa. Finally, we include a brief response to the synchronously published but independently developed paper (Palmisano et al. in J World Prehist 34(3), 2021). While there are differences in our respective approaches, we share the general conclusion that large-scale trends can be identified through meta-analyses of the archaeological record, and that these offer new perspectives on how society functioned.
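For readers unfamiliar with the method, the toy sketch below illustrates the core idea behind a summed probability distribution and a local growth-rate estimate. Real analyses calibrate each date against a curve such as IntCal20 and use thousands of dates; here each date is crudely approximated by a normal density and all values are invented.

```python
import numpy as np

dates = [(4500, 30), (4450, 40), (3900, 35)]  # example (14C age BP, error) pairs
years = np.arange(3000, 6000)                 # coarse cal BP grid stand-in

spd = np.zeros_like(years, dtype=float)
for mean, sd in dates:
    density = np.exp(-0.5 * ((years - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    spd += density / density.sum()            # normalise each date to mass 1

# A local exponential rate can then be estimated from the SPD as the slope
# of log(spd) over a window of interest (sign convention depends on BP axis).
window = (years >= 4300) & (years <= 4700)
r = np.polyfit(years[window], np.log(spd[window] + 1e-12), 1)[0]
print(f"fitted rate over window: {r:.5f} per year")
```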
Two new series of radiocarbon dates on human bones from passage graves in the Falbygden area in south-western Sweden are presented. It has long been nearly axiomatic that dolmens appeared earlier than passage graves, but the new dates indicate that both types of megalithic grave were introduced at the same time in the later part of the early Neolithic. This also means that the oldest type varies between different regions: in Denmark, the oldest type of megalithic grave is the dolmen, while in Falbygden passage graves were built from the beginning.
The present technique of digital image processing follows the concept of analytical rectification, allowing for the elimination of geometric distortions from the original image and the retrieval of correct dimensional information. The image can be produced in various ways: most often, sampling is done with a scanner, but recently a new method has become more frequently used, namely the acquisition of digital images directly in the field by means of digital cameras with a CCD image sensor. Processing software is now offered by various producers of photogrammetric equipment, allowing us to carry out surveys of flat manufactured items, starting from a single image, and to create vector graphics that can be superimposed in a CAD environment.
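A minimal sketch of the rectification step using a projective transform in OpenCV, which here stands in for the commercial photogrammetric packages the text mentions; the file name, control points and scale are placeholders.

```python
import cv2
import numpy as np

img = cv2.imread("facade.jpg")  # hypothetical single input image

# Four image points marking the corners of a flat feature (pixels)...
src = np.float32([[105, 220], [880, 190], [910, 700], [95, 740]])
# ...and their true metric positions, scaled to pixels (e.g. 1 px = 1 cm)
dst = np.float32([[0, 0], [800, 0], [800, 500], [0, 500]])

H = cv2.getPerspectiveTransform(src, dst)           # 3x3 homography
rectified = cv2.warpPerspective(img, H, (800, 500)) # distortion-free view
cv2.imwrite("facade_rectified.jpg", rectified)
```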
The word 'storytelling' has by now become fashionable and risks losing the contours of its meaning. Yet storytelling, and narration in general, are proving to be, fashions aside, communication tools of exceptional power, so much so as to raise fears that their use could turn into a kind of mass manipulation. Ranging from anthropology to semiotics, from sociology to neuroscience, this book shows how narrative activity is innate to human beings and how our social organization is also founded on storytelling. Starting from this premise, Raccontare offers an overview of the techniques for narrating reality and their fields of application: from organizations to the media, from theatre to the narration of territories. We cannot live without stories. But in a world where narratives have become ever more pervasive and sophisticated, we need to learn to find our bearings. This book is a precious map for moving among the many fields in which storytelling is employed.
R is a scientific programming language that is widely used by archaeologists. This entry briefly describes the history and distinctive characteristics of the language, and how archaeologists have used it. The importance of R for reproducible research in archaeology is outlined, and future directions for the language in archaeology are indicated.
The overall digital reconstruction of the necropolis in Numana was carried out following a methodology targeted at quick survey at different scales: from the single ceramic vessel or artifact to the whole archaeological landscape. By fostering the application of common computer graphics techniques, an easily replicable process was set up in order to produce 3D models, mainly adopted for archaeological analysis and for the collection of data that could be acquired at different times and with different approaches.
The use of quantitative graphs in Italian archaeology began between the late 1950s and the early 1960s, thanks to the work of Renato Peroni (Bronze and Iron Age) and Alberto Broglio (Palaeolithic). In 1976-1977 Amilcare Bietti and Alberto Cazzella published the first important article on the subject in the journal Dialoghi di Archeologia. The Eighties opened with Amilcare Bietti's publication of the first monograph on the use of mathematical and statistical methods in archaeology, methods that were to become very popular in many works inspired by processual archaeology. In 1987 the monograph Archeologia e Calcolatori, by Paola Moscati, was published; three years later the first issue of the homonymous journal appeared. The last 'chapter' of this history is the introduction of new methods (functional analysis of objects and GIS) between the end of the Nineties and the beginning of the twenty-first century. From that period onwards, the use of quantitative methods became daily routine practice in Italian archaeology.
Paleoclimate reconstructions have enhanced our understanding of how past climates have shaped present-day biodiversity. We hypothesize that the geographic extent of Pleistocene forest refugia and suitable habitat fluctuated significantly in time during the late Quaternary for chimpanzees (Pan troglodytes). Using bioclimatic variables representing monthly temperature and precipitation estimates, past human population density data, and an extensive database of georeferenced presence points, we built a model of changing habitat suitability for chimpanzees at fine spatio-temporal scales dating back to the Last Interglacial (120,000 BP). Our models cover a spatial resolution of 0.0467° (approximately 5.19 km2 grid cells) and a temporal resolution of between 1000 and 4000 years. Using our model, we mapped habitat stability over time using three approaches, comparing our modeled stability estimates to existing knowledge of Afrotropical refugia, as well as contemporary patterns of major keystone tropical food resources used by chimpanzees, figs (Moraceae) and palms (Arecaceae). Results show habitat stability congruent with known glacial refugia across Africa, suggesting their extents may have been underestimated for chimpanzees, with potentially up to approximately 60,000 km2 of previously unrecognized glacial refugia. The refugia we highlight coincide with higher species richness for figs and palms. Our results provide spatio-temporally explicit insights into the role of refugia across the chimpanzee range, forming the empirical foundation for developing and testing hypotheses about behavioral, ecological, and genetic diversity with additional data. This methodology can be applied to other species and geographic areas when sufficient data are available.
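The following schematic sketch shows presence/background habitat suitability modelling on climatic predictors with scikit-learn. The authors' actual model, predictors and data are far richer, so this is purely an illustration of the approach with synthetic values.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical predictor matrices: e.g. monthly temperature/precipitation values
X_presence = rng.normal(loc=1.0, size=(200, 4))     # cells with presence records
X_background = rng.normal(loc=0.0, size=(1000, 4))  # random background cells

X = np.vstack([X_presence, X_background])
y = np.r_[np.ones(len(X_presence)), np.zeros(len(X_background))]

# Fit a simple presence/background classifier on the bioclimatic predictors
model = LogisticRegression(max_iter=1000).fit(X, y)
suitability = model.predict_proba(X_background)[:, 1]  # per-cell suitability
print(suitability[:5])
```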
The present paper aims at studying Roman sanctuaries from the late 4th century BC to the early 4th century AD in southern coastal Latium, a region of crucial importance for the development of Roman religion. Quantitative GIS-based research was undertaken to study sacred spaces in their natural and cultural landscape context. A first research question concerned the role of good accessibility of sanctuaries as a factor that could have influenced the choice of construction sites for villas. Further research focused on the visibility of sanctuaries with respect to other elements of the cultural landscape such as villas and roads. Cost-distance and viewshed analyses were undertaken to answer these questions. As the analyses are based on published and archived site data, several issues related to the use of legacy survey data had to be faced. Results show that the role of sanctuaries as factors of attraction might not have been extremely high. While a few major sanctuaries with extraordinary visibility conditions are situated in the study area, the overall trend does not confirm the choice of particularly visible spots as a general rule.
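As an illustration of the cost-distance component, the sketch below computes a least-cost path over a synthetic cost surface with scikit-image. In a real study the raster would be derived from slope and the start and end cells from site coordinates; none of these values are taken from the paper.

```python
import numpy as np
from skimage.graph import route_through_array

rng = np.random.default_rng(1)
cost_surface = rng.uniform(1.0, 10.0, size=(100, 100))  # stand-in cost raster

# Least-cost path from a hypothetical villa cell to a sanctuary cell
path, total_cost = route_through_array(
    cost_surface, start=(5, 5), end=(90, 85),
    fully_connected=True, geometric=True)
print(f"accumulated cost: {total_cost:.1f} over {len(path)} cells")
```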
Quantitative Analysis in Archaeology introduces the application of quantitative methods in archaeology. It outlines conceptual and statistical principles, illustrates their application, and provides problem sets for practice. The book discusses both methodological frameworks and quantitative methods of archaeological analysis; presents statistical material in a clear and straightforward manner, ideal for students and professionals in the field; and includes illustrative problem sets and practice exercises in each chapter that reinforce the practical application of quantitative analysis.
A shared commitment to standardising the process of hypothetically reconstructing lost buildings from the past has characterised academic research in recent years and can manifest at various stages of the reconstructive process and with different perspectives. This research specifically aims to establish a user-independent and traceable procedure that can be applied at the end of the reconstructive process to quantify the average level of uncertainty of 3D digital architectural models. The procedure consists of applying a set of mathematical formulas based on numerical values retrievable from a given scale of uncertainty and developed to simplify reuse and improve transparency in reconstructive 3D models. This effort to assess uncertainty in the most user-independent way possible will contribute to producing 3D models that are more comparable to each other and more transparent for academic researchers, professionals, and laypersons who wish to reuse them. Being able to calculate a univocal numerical value that gives information on the global average uncertainty of a certain reconstructive model is an additional synthetic way, together with the more visual false-colour scale of uncertainty, to help disseminate the work in a clear and transmissible way. However, since the hypothetical reconstructive process is a process based on personal interpretation, which inevitably requires a certain level of subjectivity, it is crucial to define a methodology to assess and communicate this subjectivity in a user-independent and reproducible way.
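Below is a minimal sketch, under assumed inputs, of the kind of calculation such a procedure formalises: a volume-weighted average of per-element uncertainty scores taken from a reference scale. The article's actual formulas and scale values may differ.

```python
def average_uncertainty(elements):
    """elements: list of (uncertainty_score, volume) per model element."""
    total_volume = sum(v for _, v in elements)
    return sum(u * v for u, v in elements) / total_volume

# e.g. three elements scored on an assumed 1-5 uncertainty scale, weighted by volume
model = [(1, 120.0), (3, 45.5), (5, 10.2)]
print(f"global average uncertainty: {average_uncertainty(model):.2f}")
```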
Since 1950, in the history of Quantitative Archaeology, the data approach has been the essence of mathematical and statistical applications in Archaeology. In the present paper, we propose to focus on the process approach and to point out new fields of mathematical applications in Archaeology. Several archaeological processes are presented, for example: the archaeological business process, the stratigraphy process, post-depositional processes, taphonomic processes, technological (manufacturing) processes, building processes, intersite spatial processes (landscape archaeology), exchange processes, and cultural change processes. The list is not exhaustive and serves only to illustrate the interest of such an approach. Several examples of applications are given, which show the differences between the data approach and the process approach. The mathematical techniques used are mainly the description and quantification of the processes, elementary statistics, data analysis, stochastic models and simulation by multi-agent systems.
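To illustrate the process approach, here is a toy stochastic (Markov) model of a taphonomic process in which a deposited artefact is preserved, displaced or destroyed at each time step. The states and transition probabilities are invented for demonstration only.

```python
import random

TRANSITIONS = {
    "in_situ":   [("in_situ", 0.90), ("displaced", 0.08), ("destroyed", 0.02)],
    "displaced": [("displaced", 0.85), ("destroyed", 0.15)],
    "destroyed": [("destroyed", 1.0)],
}

def simulate(steps=100):
    state = "in_situ"
    for _ in range(steps):
        states, probs = zip(*TRANSITIONS[state])
        state = random.choices(states, weights=probs)[0]
    return state

# Estimate long-run outcome frequencies by Monte Carlo simulation
outcomes = [simulate() for _ in range(10_000)]
for s in ("in_situ", "displaced", "destroyed"):
    print(s, outcomes.count(s) / len(outcomes))
```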
An epigraph is a complex historical document, whose significance is fully acknowledged only if its textual features (script, language, content, etc.) are studied in combination with the contextual information (on the textual support and its provenance). This is the reason why digital epigraphy lies at the crossroads of different disciplines applying ITs to textual and material sources, such as digital philology, computational linguistics, and computational archaeology. The specific interests and methods of those disciplines have exerted an influence on digital epigraphy, which is apparent in the documentary vs statistical approaches applied over time to the electronic treatment of the (re)source ‘inscription’. The aim of the paper is to trace those trends in the application of qualitative vs quantitative methods in the history of studies of digital epigraphy, highlighting the main moments of change, until the most recent developments.
This article discusses the design of a quick response (QR) coded 3D model of a Babylonian mathematical clay tablet for 3D printing purposes, in an attempt to make better use of advanced 3D visualizations, encourage public engagement and question the influence of tagging and 3D printing on the way hu...
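A minimal sketch of the tagging step with the Python qrcode library: generating a QR code that links a printed replica back to its online record. The URL is a placeholder, not the project's actual record.

```python
import qrcode  # pip install qrcode[pil]

img = qrcode.make("https://example.org/tablet/record")  # hypothetical record URL
img.save("tablet_qr.png")  # raster to emboss on or attach to the printed model
```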
The goal of this article is to provide several practical procedures for working within the GIS environment in the archaeological sector, with specific reference to the excavation site, through open source methodologies and software such as QGIS and pyArchInit. It also demonstrates how to use the data derived from the survey, processed and managed through QGIS and pyArchInit, for enhancement projects such as 3D modeling and 3D mapping through the Blender software.
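As a small example of the kind of step such a workflow builds on, the sketch below, runnable in the QGIS Python console, loads a vector layer and iterates over its features before data are handed on to pyArchInit or Blender. The path and layer name are placeholders.

```python
from qgis.core import QgsProject, QgsVectorLayer

# Load a hypothetical shapefile of stratigraphic-unit polygons
layer = QgsVectorLayer("/data/su_polygons.shp", "stratigraphic_units", "ogr")
if not layer.isValid():
    raise RuntimeError("layer failed to load")
QgsProject.instance().addMapLayer(layer)

for feature in layer.getFeatures():
    print(feature.id(), feature.geometry().area())
```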
The study of epigraphical and archaeological evidence related to a specific topic, in this case infant mortality, has led to the creation of a geo-referencing project in order to collect, store and analyze information about the young deceased and their families. This paper presents a geo-referenced storage and management system that combines open source software such as QuantumGIS and PgAdmin. An RDBMS has been implemented and purposively structured, taking into account the content and the form of the inscriptions studied.
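Below is a hypothetical sketch of the kind of purposively structured schema the paper describes: one record per inscription combining textual fields with a PostGIS point for geo-referencing. All names are invented for illustration.

```python
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS inscriptions (   -- hypothetical table
    id          serial PRIMARY KEY,
    text        text NOT NULL,              -- transcribed epitaph
    age_months  integer,                    -- age at death, if stated
    dedicator   text,                       -- family member named
    findspot    geometry(Point, 4326)       -- requires the PostGIS extension
);
"""

conn = psycopg2.connect("dbname=epigraphy user=postgres")  # placeholder
with conn, conn.cursor() as cur:
    cur.execute(DDL)
conn.close()
```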
Horizontal wells make use of the principle of the qanat developed in Persia about 2,500 years ago and still widely used there and in other arid regions of the world. The driven horizontal well offers several important advantages over the hand dug qanat especially for livestock watering places.
For rational management of groundwater, a holistic approach linked to the sustainable management of the ecosystem must be developed. It is demonstrated that ancient methods of groundwater management, such as the qanat system, can provide a good example of human ingenuity in coping with water scarcity in a sustainable manner. The catastrophic earthquake of Bam has drawn the attention of researchers and professionals to a great human heritage related to the sustainable management of groundwater in arid zones and the development of a sophisticated culture of rational resource allocation.
The lack of surface water and a high potential for evapotranspiration are the climatic and hydrological characteristics of the arid and semi-arid Middle East regions. In foothill regions the groundwater, often supplied by partially buried alluvial fan systems, has been intercepted and conveyed through the creation of a highly efficient supply system known as 'qanat', 'karez' or 'foggara'. This ancient hydraulic technology spread on a large scale from the 6th century BCE, during the rise and development of the Persian Empire. It consists of the excavation of a series of vertical shafts, like large wells, connected by a gently sloping underground channel that brings water by gravity. Through the centuries, the qanats have been not only a sustainable system for the exploitation of groundwater resources but also an important factor in the socio-economic and cultural development of local communities. In this paper the authors illustrate the main aspects of qanats from a historical, cultural and socio-economic perspective, and the recent decline of the qanat system. An original and continuously updated database of these hydraulic works, implemented through the collection and analysis of documentary sources, cartographic data, and on-site measurements, is then presented. The database, including qanats recorded at regional to local scales, may be regarded as valuable support for the recovery of these structures and a more efficient governance of water resources.
This book offers a ready solution for those who wish to learn more about this fascinating part of our water history and makes accessible to the wider world the traditional knowledge gained from building and maintaining qanats for more than 2,500 years. There is much more here than a summary of the nature and distribution of qanats, and a more extensive journey through the philosophy, methods, tools, and terminology of qanat design and digging than previously assembled. Where does one begin to dig to ensure that the qanat tunnel will flow with water? How are practical considerations of landscape factored into the design? How are water quality and discharge measured? How does excavation proceed through bedrock and unconsolidated soil and how is this knowledge of geology and pedology acquired? How are vertical wells and tunnels excavated to maintain proper air supply, light, and water flow? How does one deal with special problems like tunnel collapse, the accumulation of gasses and vapors, and the pooling of water during construction? How are tools and gauges designed, maintained, and used? How have qanats been incorporated into other structures like watermills, reservoirs, ice houses, and irrigation networks? And how are qanats cleaned, extended, maintained through the ages, and incorporated into modern water supplies? The great contribution of this work is the story it tells of the ingenuity and practical skills of the qanat masters who for centuries and generations have cut an uncountable number of tunnels through bedrock and alluvium using hand tools and homespun solutions to problems that would vex the most experienced university-trained engineers.
Seyyed Mansur Seyyed Sajjadi, Qanat / Kariz. Storia, tecnica costruttiva ed evoluzione, Istituto Italiano di Cultura - Sezione Archeologica, Tehran, 1982. Paperback; 172 pp. with black-and-white illustrations outside the text; ordinary browning, signs of use and of …
For many researchers, Python is a first-class tool mainly because of its libraries for storing, manipulating, and gaining insight from data. Several resources exist for individual pieces of this data science stack, but only with the Python Data Science Handbook do you get them all: IPython, NumPy, Pandas, Matplotlib, Scikit-Learn, and other related tools. Working scientists and data crunchers familiar with reading and writing Python code will find this comprehensive desk reference ideal for tackling day-to-day issues: manipulating, transforming, and cleaning data; visualizing different types of data; and using data to build statistical or machine learning models. Quite simply, this is the must-have reference for scientific computing in Python. With this handbook, you'll learn how to use: IPython and Jupyter, which provide computational environments for data scientists using Python; NumPy, which includes the ndarray for efficient storage and manipulation of dense data arrays in Python; Pandas, which features the DataFrame for efficient storage and manipulation of labeled/columnar data in Python; Matplotlib, which includes capabilities for a flexible range of data visualizations in Python; and Scikit-Learn, for efficient and clean Python implementations of the most important and established machine learning algorithms.
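The handbook's stack in one toy round trip, as a quick taste: Pandas for labelled data, Scikit-Learn for a model, Matplotlib for the plot (the values are invented).

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# A small labelled dataset in a DataFrame
df = pd.DataFrame({"x": [1, 2, 3, 4, 5], "y": [2.1, 3.9, 6.2, 8.1, 9.8]})

# Fit a linear model and plot data plus fitted line
model = LinearRegression().fit(df[["x"]], df["y"])
plt.scatter(df["x"], df["y"])
plt.plot(df["x"], model.predict(df[["x"]]))
plt.savefig("fit.png")
```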
The study of the urban alignment of the settlement of Pyrgi and of the arrangement of the sacred areas was favoured by its abandonment after the phase of Romanization and by the possibility of performing large-scale research over its territory. The harbour and the sanctuary of Pyrgi were a fundamental pole of attraction for foreign visitors as the outpost of the metropolis of Caere. Their development was strictly linked to Pyrgi's favourable geographical position along the Tyrrhenian maritime routes and to the presence of a water spring. The settlement was founded at the end of the 7th century BC and was connected to Caere by means of a large road. The excavations conducted since 1957 by the Sapienza University of Rome next to the terminal section of the Caere-Pyrgi road brought to light a large sacred district. The new excavation area (2009-2016) is located in the district between the sanctuary and the settlement. It includes different buildings datable from 600 BC to the 4th century BC, erected along a pebbled road that departs from the Caere-Pyrgi road and leads towards the harbour. The buildings, together with votive deposits and a fire-altar, outline a residential quarter that was perhaps frequented by a priesthood, where ceremonial practices were also performed. The new evidence can be related to the sanctuary itself and sheds light on its overall organisation. The results of recent fieldwork have also contributed to a better understanding of Pyrgi's urban alignment, possible defensive systems (suggested by the Greek name Pyrgoi) and the topographic relationship with the later Roman maritime colony. Thanks to the involvement of scholars from different disciplinary fields, wide-ranging research is being carried out to reconstruct the original landscape and the evolution of the coastline, with the aim of determining the morphology of the coast and the harbour in the Etruscan period.
The pyArchInit-Python for archaeology project began in 2005 with the aim of developing a Python plug-in for the open source software Quantum GIS. pyArchInit stems mainly from the need, increasingly felt in the archaeological community, to computerize archaeological records using software that handles alphanumeric, multimedia and topographical data in a single solution. This package aims to meet these requirements with a unique solution that guarantees stability, development, easy installation and updating over time. The final goal is the creation of a GIS platform with high interoperability between different operating systems, in which alphanumerical tables, GIS geometries and multimedia data sit within a single system. This allows us to maintain the integrity of the raw data as much as possible, providing the archaeologist with an approach that is both very fast and powerful and, at the same time, offering a system open to changes and customizations by other developers. The database management system for archaeological data is automatically installed both in PostgreSQL and in Spatialite. Different user interfaces, created to support data entry, manage the database. It is structured in seven management user interfaces: Stratigraphic Units, Site, Chronology, Infrastructures, Taphonomy Record, Archaeological, Multimedia. The first part of the package includes the management of stratigraphic units (pyarchinit_US module) because of the need to manage on site the documentation of excavations in progress. With pyArchInit we will try to bridge the gap between skills and knowledge acquired at an academic level and daily life in archaeological fieldwork; moreover, it will help the interaction with engineers, urban planners, government, Cultural Office administrators and all the agencies that gravitate around the archaeological world.
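A hedged sketch of reading stratigraphic-unit records from a pyArchInit Spatialite back end with Python's standard library; the database file, table and column names here are assumptions for illustration, not documented pyArchInit internals.

```python
import sqlite3

conn = sqlite3.connect("pyarchinit_db.sqlite")  # hypothetical database file
cur = conn.execute(
    "SELECT sito, us, d_stratigrafica FROM us_table ORDER BY us"  # assumed names
)
for site, us, description in cur:
    print(site, us, description)
conn.close()
```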
In the framework of the project ‘Castelseprio, centre of power’, the authors began excavating the structure known as Casa Piccoli in 2021. The area, already investigated by Piccoli in the 1970s, presents itself as an interesting case study for the application of an open and integrated solution for the management of stratigraphic data, specifically pyArchInit. Being an academic excavation project and, therefore, characterized by both research and training issues, it was decided to progressively and incrementally include the use of pyArchInit within the documentation protocols on site and post-excavation, over the three years of the permit granted by the Ministry of Culture for the excavations. Master’s degree students who participated in the excavation, at the end of the planned period, will have the basic skills to use the plugin also in a professional environment. At the end of the first two years of implementation, a SWOT analysis will show the results obtained within the site for both training and research purposes.
The obsidian sources of central Anatolia, the Aegean and central Europe have been studied in detail over the past 50 years. Various analytical techniques have been employed to discriminate artefacts from each of these and to reconstruct their zones of distribution. This paper presents a pXRF method that allows mass sampling of artefacts, focusing on three neighbouring regions, particularly where these zones overlap. Successful discrimination of the obsidian source for products could be achieved using three-dimensional scatter plots of the trace elements Rb–Sr–Zr. pXRF can thus be appreciated as a powerful tool in the region, enabling non-destructive on-site analyses in contexts where the export of artefacts is often difficult if not impossible. The ability to rapidly process large assemblages also has major implications for generating datasets of sufficient resolution to transform archaeological interpretation.
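A minimal sketch of the discrimination plot described, placing pXRF trace-element readings on a three-dimensional Rb–Sr–Zr scatter; the concentration values are invented stand-ins for real measurements.

```python
import matplotlib.pyplot as plt

# Hypothetical Rb/Sr/Zr readings (ppm) grouped by obsidian source
sources = {
    "Source A": ([190, 195, 188], [110, 115, 108], [150, 148, 155]),
    "Source B": ([120, 118, 125], [60, 64, 58], [290, 300, 285]),
}

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for name, (rb, sr, zr) in sources.items():
    ax.scatter(rb, sr, zr, label=name)
ax.set_xlabel("Rb (ppm)")
ax.set_ylabel("Sr (ppm)")
ax.set_zlabel("Zr (ppm)")
ax.legend()
plt.savefig("rb_sr_zr.png")
```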
PURE3D Technical Report: this report is meant to provide a high-level state-of-the-art summary on 3D scholarly web infrastructures. Within the report you will find a brief environment scan of 3D issues, limitations, challenges, and recommendations based on a review of the scholarly literature, 3D file format capabilities, 3D …
The case of the city of Ancona highlighted the lack of specific software for the management and digitization of the archaeological data stored in the archives of the Superintendence. The archives contain many heterogeneous data that can help in understanding the history of the archaeological sites, from their discovery up to the information archived from the numerous research or rescue excavations that have taken place over time. The normalization of all the archival data within a single relational database, associated with their specific geographical nature, shed new light, thanks to an overall view and an in-depth review of the data, both on published contexts and on archaeological evidence that had not yet received adequate study and had not been entered into a system that considers nearby data. This software does not replace existing cataloguing systems, such as SIGECweb, but aims to support cataloguing activity using the same standards while at the same time allowing the Superintendence to use the data for protection activities and for research.
Around the mid 1980s, the Italian sector of the archaeological sciences interested in geo-topographical problems, at the time very limited, responded eagerly to the practical and theoretical solutions offered by computer science and by advanced technologies, and became one of the most developed sectors in the European panorama on this particular subject. Twenty years later, we can observe, on the one hand, the notable success of this type of application, which has, among other things, helped drive many sectors of Italian research that had not previously been interested in territorial studies towards them; and, on the other hand, the extreme fragmentation of initiatives, which remains an unsolved problem for future developments. Within a single decade, in fact, we lost those guidelines that could have transformed some high but still distant peaks of quality into a systematically coordinated approach and, especially, into a common cognitive base, perhaps primitive but for that very reason "basic", not only for the development of research but also for a widespread and shared means of safeguarding our archaeological heritage.
We present a case study of data integration and reuse involving 12 researchers who published datasets in Open Context, an online data publishing platform, as part of collaborative archaeological research on early domesticated animals in Anatolia. Our discussion reports on how different editorial and collaborative review processes improved data documentation and quality, and created ontology annotations needed for comparative analyses by domain specialists. To prepare data for shared analysis, this project adapted editor-supervised review and revision processes familiar to conventional publishing, as well as more novel models of revision adapted from open source software development of public version control. Preparing the datasets for publication and analysis required significant investment of effort and expertise, including archaeological domain knowledge and familiarity with key ontologies. To organize this work effectively, we emphasized these different models of collaboration at various stages of this data publication and analysis project. Collaboration first centered on data editors working with data contributors, then widened to include other researchers who provided additional peer-review feedback, and finally the widest research community, whose collaboration is facilitated by GitHub’s version control system. We demonstrate that the “publish” and “push” models of data dissemination need not be mutually exclusive; on the contrary, they can play complementary roles in sharing high quality data in support of research. This work highlights the value of combining multiple models in different stages of data dissemination.
Studying the operational motivation of a retailer to publicly announce his forecast information, this paper shows that by making forecast information publicly available to both his manufacturer and to the competitor, a retailer is able to credibly share his forecast information—an outcome that cannot be achieved by merely exchanging information within the supply chain. We model a market comprised of an incumbent supply chain facing the possible entry of a competing supply chain. In each supply chain, a retailer sources the product from a manufacturer, and the manufacturers must secure capacity prior to the beginning of the selling season. Due to the superior knowledge of the incumbent retailer about the consumer market, he privately observes a signal about the consumer’s demand, which may be high or low. We first confirm that the retailer cannot credibly share this forecast information only with his manufacturer within the supply chain, since, regardless of the observed signal, the retailer has an incentive to inflate to induce the manufacturer to secure a high capacity level. However, when the information is also shared with the competitor, the incumbent retailer faces the trade-off between the desire to secure an ample capacity level and the fear of intense competition. By making information publicly available, it is possible to achieve truthful information sharing; an incumbent retailer observing a high forecast benefits from the increased capacity level to such an extent that he is willing to engage in intense competition to prove his accountability for the shared information. On the other hand, an incumbent retailer with a low forecast is not willing to engage in intense competition in exchange for the high level of capacity; thus, he truthfully reveals his low forecast to weaken competition. Moreover, we demonstrate that this public information sharing can benefit all the firms in the market as well as consumers. In addition, we show that compared to the advance purchase contract, all the firms except the incumbent manufacturer can be better off using public information sharing under a simple wholesale price contract. This paper was accepted by Yossi Aviv, operations management.