Session 6 — Participation
Friday 09:30 - 11:00
Chair: Michael Pidd
The 'Courtauld Bag', a brass bag inlaid with silver and gold and manufactured in Mosul in the early 14th century, is a unique object recognised by specialists as one of the most important examples of Islamic metalwork in the world. A major exhibition, 'Court and Craft: a masterpiece from Northern Iraq', was created around this beautiful object and ran at the Courtauld Institute from February to May of this year.
As part of the exhibition, UCL's 3D imaging group were commissioned to create an animation to be displayed in the gallery alongside the object. The bag was scanned with an Arius Foundation laser scanner, imaged under a PTM1 dome and finally photographed for photogrammetric reconstruction. Despite the shiny, metallic surface of the object, a detailed 3D model was produced using structure-from-motion2, while a novel technique was used for specular reconstruction from the PTM images, creating stunning photo-realistic renderings of small details of the bag. These renderings were combined into a two-and-a-half-minute video which was shown in the exhibition. (The video can be viewed at http://www.courtauld.ac.uk/gallery/exhibitions/2014/Court-and-Craft/model.shtml)
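At the heart of the PTM approach is a simple per-pixel model: each pixel's luminance is fitted as a biquadratic polynomial of the light direction, which can then be re-evaluated under any virtual light. A minimal sketch of that fitting step, assuming the dome supplies known light directions; the function names and data layout are illustrative, not the project's actual code:

```python
import numpy as np

def fit_ptm(light_dirs, luminances):
    """Fit per-pixel biquadratic PTM coefficients by least squares.

    light_dirs: (N, 2) array of (lu, lv) light-direction components,
                one row per dome photograph.
    luminances: (N, P) array of observed pixel luminances.
    Returns a (6, P) array of polynomial coefficients per pixel.
    """
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    # One row per photograph: [lu^2, lv^2, lu*lv, lu, lv, 1]
    A = np.stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, luminances, rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Evaluate the fitted polynomial under a new virtual light direction."""
    basis = np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])
    return basis @ coeffs
```

Once the coefficients are stored, relighting is a single dot product per pixel, which is what makes PTM datasets practical for interactive gallery displays.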
Research is ongoing: the juxtaposition of rendered 'CGI' video and the real object affords a unique opportunity to examine and evaluate the use of modern technology and imaging techniques in a traditional exhibition environment. A prominent artist and senior research fellow at the University of the Arts London, Jananne Al-Ani, observed the imaging, opening a further opportunity to explore the intersection of three disparate disciplines: art, technology and cultural heritage.
Our research now focuses on the use of the model as we investigate the potential of these techniques within the cultural heritage sector. This paper will present both the building and the user testing of the model, highlighting best-practice and public-engagement aspects of using 3D within museums and galleries.
Illustration 1: Detail from the specular reconstruction
1 Hammer, Øyvind, et al. "Imaging fossils using reflectance transformation and interactive manipulation of virtual light sources." Palaeontologia Electronica 5.4 (2002): 9.
2 Kersten, Thomas P., and Maren Lindstaedt. "Image-based low-cost systems for automatic 3D recording and modelling of archaeological finds and objects." Progress in Cultural Heritage Preservation. Springer Berlin Heidelberg, 2012. 1-10.
Illustration 2: Image of the 3D reconstruction of the bag
University of Oxford
This paper will trace the development of text-based humanities projects built and hosted by the academic crowdsourcing organization Zooniverse.org. Zooniverse began with ‘citizen science’ projects in astrophysics in 2007 and has since developed thirty projects across the sciences and the humanities. The first Zooniverse humanities project, ‘Ancient Lives’ (http://ancientlives.org/), was launched in July 2011. It is a character-by-character transcription project that has recorded over 1.5 million transcriptions of ancient Greek papyri, the work of over 250,000 unique online volunteers. In January 2014 Zooniverse, in partnership with the Imperial War Museum and the National Archives (Kew), launched ‘Operation War Diary’ (http://www.operationwardiary.org/), a partial text transcription and tagging project devoted to uncovering the detail of what life was like on the Front in WWI. To date (August 2014), nearly 10,000 unique OWD volunteers have contributed 32 months’ worth of full-time-equivalent work to the project, amounting to nearly 500,000 classifications (tags and transcriptions) that enable new understandings of battles, the spread of illness, and soldiers’ daily lives during the war. Our OWD development team has also developed a ‘data digger’ which aggregates the responses of N crowd users (7 for OWD) and reveals variation and agreement in user tagging and transcription.
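The aggregation behind such a 'data digger' can be illustrated with a minimal sketch: given the tags applied by the N volunteers who saw the same page, count how many volunteers applied each tag and keep those above an agreement threshold. This is a hypothetical illustration of the general approach, not Zooniverse's actual code:

```python
from collections import Counter

def aggregate_tags(responses, min_agreement=0.5):
    """Aggregate one page's tags from N volunteers.

    responses: one collection of tags per volunteer.
    Returns {tag: agreement fraction} for tags that at least
    `min_agreement` of the volunteers applied.
    """
    n = len(responses)
    # Count each tag once per volunteer, however often they applied it
    counts = Counter(tag for tags in responses for tag in set(tags))
    return {tag: count / n
            for tag, count in counts.items()
            if count / n >= min_agreement}

# e.g. seven volunteers tagging the same diary page
pages = [
    {"date", "casualty"}, {"date"}, {"date", "casualty"},
    {"date"}, {"date", "orders"}, {"date", "casualty"}, {"date"},
]
print(aggregate_tags(pages))  # → {'date': 1.0}
```

Lowering the threshold surfaces the disagreement rather than hiding it, e.g. `aggregate_tags(pages, min_agreement=0.4)` also reports the 3/7 agreement on "casualty".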
Zooniverse will soon be embarking on full text transcription projects with leading institutions in the USA and UK, including Tate Britain. The Tate project will enable full text transcription and rich indexing of artists’ archival materials, data that will be integrated with the museum’s art catalogues.
This talk will explore the possibilities and potential pitfalls of full-text transcription and present the early stages of our development work. Creating platforms driven by granular tasks is key to the Zooniverse approach, and marks a significant departure from the MediaWiki platform.
Tap with Jazzy Swing and Romantic Rubato: An Interactive Demonstration of Expressive Timing in Music
University of Sheffield
With years of experience and practice, musicians gain implicit knowledge about where to speed up or slow down in music, when to lengthen or shorten notes, and when to place accents for special effect. As listeners, we are accustomed to these micro-scale variations in music, and are not necessarily aware of the extent to which performers vary their timing and tempo for expressive effect. Although the exact characteristics of a performance depend on a musician's individual interpretation of the music, there are certain regularities or rules that are commonly observed across different performances within a particular genre. The rise of digital tools in music analysis has allowed a (semi-)automatic investigation of such performance characteristics, and consequently has deepened our understanding of performance processes.
Our presentation gives an overview of an interactive music performance analysis demonstration that allows participants to explore performance expression in two domains: jazzy swing and romantic rubato. The goal is for participants to increase their sensitivity to what expressive music performance entails by interactively engaging with different performances of the same music. In the first demonstration, participants can listen to, and subsequently tap along with, a jazzy drum pattern played with swing by different performers at a range of tempi. The resulting time-sequence of taps is captured and compared with note-onset information derived from the recordings of expert performers. Feedback to the participant is derived from a histogram of inter-onset intervals, in particular considering the ratio between the median durations of the two shorter notes in the swing pattern. In the second demonstration, participants listen to and tap along with different performances of an excerpt of a piano prelude by Chopin. As before, the resulting tap-sequence is compared with note-onset information from expert performances. Feedback to the participant is now based on the tempo trajectory across the music, in particular the degree of quickening or slackening and the position of duration peaks and troughs.
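The swing feedback in the first demonstration can be sketched as follows: compute inter-onset intervals (IOIs) from the tap times, separate the alternating long and short positions of the swing pattern, and take the ratio of their medians. A minimal, hypothetical sketch; the demonstration's actual analysis may differ:

```python
import statistics

def swing_ratio(onsets):
    """Estimate a swing ratio from a list of tap onset times in seconds.

    Assumes alternating long-short pairs starting on the beat:
    a ratio near 1.0 is 'straight', near 2.0 is triplet swing.
    """
    iois = [b - a for a, b in zip(onsets, onsets[1:])]
    longs = iois[0::2]   # IOIs starting on the beat
    shorts = iois[1::2]  # off-beat IOIs
    return statistics.median(longs) / statistics.median(shorts)

# Triplet swing at 120 BPM: each 0.5 s beat splits 2:1
taps = [0.0, 1/3, 1/2, 5/6, 1.0, 4/3, 3/2]
print(round(swing_ratio(taps), 2))  # → 2.0
```

Comparing a participant's ratio with the ratios measured from the expert recordings then yields a simple, interpretable feedback value.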
Trialled with prospective music students at University Open Day sessions, this work also has wider relevance to the development of digital tools for automated feedback on timing processes in music (and possibly beyond).