Light Echoes | 2023 | Sydney Observatory
Light Echoes is an installation developed by Anna Raupach and Diana Chester for Sydney Observatory for the 2023 Winter Solstice. The work transcribes signatures and sonifies specific star locations found in hundreds of logbooks held in the MAAS collection, at Sydney Observatory, and in the NSW Archives, rendering them as an augmented reality sky space and spatial sound piece.
The soundscape is a spatial sonification of the stars and planets in the night sky, based on data decoded by the “human computers” from astrographic plates created at Sydney Observatory between 1890 and 1927. The human computers were a workforce of women employed to work on the Astrographic Catalogue at the Sydney, Melbourne and Perth observatories between 1890 and 1964. Their work measuring the positions of stars was a major contribution to this international scientific endeavour and remains under-acknowledged in published documentation. This work was born out of a Powerhouse Research Fellowship at the Museum of Applied Arts and Sciences, where Anna and I gained access to the Powerhouse’s materials. The Virtual Reality animation and spatial sonification are based on the human computers’ data logs.
The process of data sonification takes, as its data points, the number of stars and planets at a particular declination (the angular distance of a point north or south of the celestial equator) in the night sky in a particular year, and develops a soundscape for each declination over a particular part of the earth. These data points become the source material for the sonifications, created with custom software built in Max and deployed in Ableton Live, which converts data parsed from the logbooks into MIDI information. The MIDI information is then fed into an expressive MIDI controller, where aesthetic decisions are made about the sounds that will represent each individual data set. That MIDI data is routed through the expressive controller back into the software and into Ableton Live, where it is recorded. The recordings from each declination are then woven together into a soundscape, and the sounds, now representing stars, are spatialised across the celestial sphere in a 360-degree field, as in a planetarium, using SPAT Revolution. Certain stars are placed closer and others further away; some move around the listener, while others remain static but become louder or softer over time.
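By way of illustration, the sketch below shows one way such a mapping could work: declination drives pitch, star count drives loudness, and declination also informs the virtual elevation used for spatial placement. This is a simplified Python sketch with hypothetical field names, mappings and placeholder values; the installation itself uses the custom Max software, Ableton Live and SPAT Revolution described above, not this code.

```python
# Hypothetical sketch: mapping logbook-style counts of stars at a given
# declination to MIDI-style note data and a spatial position.
# All field names, mapping ranges and sample values are illustrative
# assumptions, not the artists' actual software or data.

from dataclasses import dataclass


@dataclass
class DeclinationBand:
    declination_deg: float  # angular distance from the celestial equator (-90..+90)
    star_count: int         # number of stars logged at this declination in a given year
    year: int               # year the astrographic plate was measured


def declination_to_note(dec_deg: float, low_note: int = 36, high_note: int = 84) -> int:
    """Map declination (-90..+90 degrees) linearly onto a MIDI note range."""
    span = high_note - low_note
    return round(low_note + (dec_deg + 90.0) / 180.0 * span)


def count_to_velocity(count: int, max_count: int = 500) -> int:
    """Denser star fields play louder: scale the star count to MIDI velocity 1-127."""
    return max(1, min(127, round(count / max_count * 127)))


def declination_to_elevation(dec_deg: float) -> float:
    """Place higher declinations higher on the virtual celestial sphere (degrees)."""
    return dec_deg  # a simple 1:1 mapping is assumed here


if __name__ == "__main__":
    # Placeholder rows standing in for parsed logbook entries.
    bands = [
        DeclinationBand(-52.0, 310, 1901),
        DeclinationBand(-60.0, 475, 1913),
        DeclinationBand(-65.0, 120, 1925),
    ]
    for band in bands:
        note = declination_to_note(band.declination_deg)
        velocity = count_to_velocity(band.star_count)
        elevation = declination_to_elevation(band.declination_deg)
        print(f"{band.year}: dec {band.declination_deg:+.1f}° -> "
              f"note {note}, velocity {velocity}, elevation {elevation:+.1f}°")
```

In the sketch, the resulting note, velocity and elevation values stand in for the MIDI and spatial parameters that, in the installation, are shaped further through the expressive controller and SPAT Revolution.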