This portfolio was developed as part of my doctoral research. The majority of these compositions require Silverlight. Many also incorporate sound; please use headphones or speakers with a good frequency response. Some of these works may no longer function owing to continuing changes in the APIs of Twitter, Flickr and other services.
Recent versions of the Chrome browser no longer support Silverlight. As time passes, the technology used to code and create many of these pieces is becoming obsolete.
CCTV
An on-screen CCTV monitoring system that reveals London past and present
Memories of Time Past
An autobiographical work, containing an array of images from the researcher’s past. It visualises the chance interaction of these images-as-memories, forming and breaking random memory clusters
TimeRadio
An on-screen vintage (and somewhat temporally unstable) radio capable of tuning into any sound ever made. It is inspired by Charles Babbage’s observation: “The air itself is one vast library, on whose pages are for ever written all that man has ever said or woman whispered”
This work uses original field recordings made in London’s Piccadilly Circus and Trafalgar Square. Choose one of the locations using the radio button – Piccadilly Circus is the default. For the chosen location, the original sound recording can be toggled on/off using the check button towards the lower centre of the screen. Each of the twenty circles contains a processed version of the original sound which will itself vary with each invocation. Passing the cursor over a circle will start the sound; passing it over again will stop it. Multiple sounds can be stopped or started at the same time. Sounds currently playing are indicated by an animated circle. NOTE: this will take some time to load given the size of the sound files.
An on-screen vintage TV capable of tuning into a variety of images of old London
A constantly updating display of captured layers of contemporary London, retrieving and displaying in real time images tagged with London interest from Flickr, tweets from Twitter and sounds from Freesound
An on-screen book, the pages of which can be turned. Each double-spread contains an old memory/image related to the researcher on the left-hand page and some text related to the theme of “I remember ….” on the right-hand page. The work is accompanied by an original soundscape
An exploration of what the old Guildhall School of Music and Drama once meant to the researcher … The piece is non-interactive and presented as an authored work
An alternative exploration of what the old Guildhall School of Music and Drama once meant to the researcher. Unlike the first piece, this work utilises the random selection and layering of sounds, so that it differs each time it is launched
This explores another way of enabling a user to understand what the old Guildhall School of Music and Drama once meant to the researcher. Unlike the first two versions, this work adopts an interactive technique whereby users can start and stop sounds, choosing which to layer, via on-screen globes
An exploration of memory; layers of environmental sounds (children, birds, the sea) and half-recollected ambiences and music from the past interweave with more ambiguous sounds (digital clocks or medical machines?) and meanings (what sort of ‘stroke’ is being spoken about?). And are the sounds we hear actuality recordings, or simulations …? Designed for large screen projection and amplified audio, the premiere of this work took place in Leicester in February 2009
A layered environment in which various works and techniques can be explored. The environment itself is designed as a series of layers and intended to create a context in which the compositions can best be experienced. It draws upon various earlier experiments into ways of presenting time and content, including rotational, cyclic and linear
An HTML5/JavaScript re-coding of the earlier Silverlight lens technique. Does not require Silverlight, but does require a browser supporting recent open standards.
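A minimal sketch of how such a lens reveal can be coded in plain HTML5/JavaScript using the canvas element; the element id, image filenames and lens radius below are illustrative assumptions, not taken from the original implementation:

// Draw the contemporary photograph full-frame, then use a circular clip at the
// cursor position to expose the older photograph "through" the lens.
const canvas = document.getElementById('scene');       // assumed <canvas> element
const ctx = canvas.getContext('2d');
const nowImg = new Image();                            // contemporary view
const thenImg = new Image();                           // older view of the same scene
nowImg.src = 'now.jpg';
thenImg.src = 'then.jpg';
const LENS_RADIUS = 80;

function draw(x, y) {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.drawImage(nowImg, 0, 0, canvas.width, canvas.height);
  ctx.save();
  ctx.beginPath();
  ctx.arc(x, y, LENS_RADIUS, 0, Math.PI * 2);          // circular "lens" at the cursor
  ctx.clip();
  ctx.drawImage(thenImg, 0, 0, canvas.width, canvas.height);
  ctx.restore();
}

canvas.addEventListener('mousemove', (e) => {
  const r = canvas.getBoundingClientRect();
  draw(e.clientX - r.left, e.clientY - r.top);
});
nowImg.onload = () => draw(canvas.width / 2, canvas.height / 2);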
An HTML5/JavaScript re-coding of the earlier Silverlight slider technique. Does not require Silverlight, but does require a browser supporting recent open standards.
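A minimal sketch of the slider technique in HTML5/JavaScript, assuming the older image is absolutely positioned over the contemporary one and a range input acts as the slider; the element ids are illustrative:

// An <input type="range" min="0" max="100"> drives the opacity of the older
// image layered on top of the contemporary one, blending "now" into "then".
const thenLayer = document.getElementById('then');   // historic image on top
const slider = document.getElementById('slider');    // assumed range input, 0-100

slider.addEventListener('input', () => {
  // 0 = present day only, 100 = historic image only; intermediate values blend
  thenLayer.style.opacity = (slider.value / 100).toString();
});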
Using mouse movements (left/right), this blends a contemporary London street scene with selected image content from the same scene in c. 1904
Using mouse movements (left/right), this blends a contemporary London street scene with image content from the same scene in c. 1904
Using mouse movements to control a lens able to “see through time”, this overlays a contemporary London street scene with image content from the same scene in c. 1904
Using mouse movements around the screen, hidden sounds in the landscape can be heard. Sounds are not just those heard at the time of the image, but those from any point of the past (and future) of the location
A view of Trafalgar Square today and in 1920, with the 1920s view modified by user interaction with a pixel shader
An alternative technique for revealing palimpsests of the same place over time. A mouse click initiates an alpha filtration technique that progressively reveals the older image beneath the newer (and vice versa when the mouse is clicked again). This work uses a contemporary London street scene with selected image content from the same scene in c. 1904
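A sketch of how such a click-driven alpha filtration can be approximated in HTML5/JavaScript, assuming the older image sits beneath the newer one and the newer image's alpha is animated down on one click and back up on the next; the element id and timing are illustrative:

// A slow opacity transition on the top (contemporary) layer progressively
// reveals the older image beneath; clicking again reverses the reveal.
const topLayer = document.getElementById('nowLayer'); // contemporary image on top
let revealed = false;

topLayer.style.transition = 'opacity 4s linear';      // progressive, not instant

document.addEventListener('click', () => {
  revealed = !revealed;
  topLayer.style.opacity = revealed ? '0' : '1';      // fade out to reveal, fade in to hide
});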
Explores a technique for navigating between maps of the same place, revealing its palimpsests over time using a mouse scrollwheel. This example uses three maps of the same area
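A sketch of scrollwheel navigation through three stacked map layers, assuming each map is a positioned image element whose opacity falls off with its distance from a continuous "time position"; the element ids and step size are illustrative:

// Each wheel event nudges a position between 0 (newest map) and 2 (oldest map);
// a layer is fully opaque when the position sits on it and invisible one layer away.
const maps = [
  document.getElementById('map2008'),
  document.getElementById('map1920'),
  document.getElementById('map1805'),
];
let pos = 0;

function render() {
  maps.forEach((layer, i) => {
    layer.style.opacity = Math.max(0, 1 - Math.abs(pos - i)).toString();
  });
}

window.addEventListener('wheel', (e) => {
  e.preventDefault();
  pos = Math.min(2, Math.max(0, pos + e.deltaY * 0.002)); // scroll "back in time"
  render();
}, { passive: false });

render();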
Explores a technique for navigating between maps of the same place, revealing its palimpsests over time using left/right movements of the mouse as a slider control. This example uses three maps of the same area
Explores a “3D” element using digitally created anaglyphic techniques, requiring the use of red/cyan glasses. It aims to produce a heightened effect as the user engages in revealing the palimpsest past of a place beneath the present.
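A sketch of the digital anaglyph construction, assuming two slightly offset views of the same scene loaded as same-size images: the red channel is taken from the left-eye view and the green/blue channels from the right-eye view, for viewing through red/cyan glasses. The function name is illustrative:

// Composite a red/cyan anaglyph on a canvas from two views of the same scene.
function makeAnaglyph(leftImg, rightImg, canvas) {
  const ctx = canvas.getContext('2d');
  const w = canvas.width, h = canvas.height;

  ctx.drawImage(leftImg, 0, 0, w, h);
  const left = ctx.getImageData(0, 0, w, h);
  ctx.drawImage(rightImg, 0, 0, w, h);
  const right = ctx.getImageData(0, 0, w, h);

  const out = ctx.createImageData(w, h);
  for (let i = 0; i < out.data.length; i += 4) {
    out.data[i]     = left.data[i];       // red from the left-eye view
    out.data[i + 1] = right.data[i + 1];  // green from the right-eye view
    out.data[i + 2] = right.data[i + 2];  // blue from the right-eye view
    out.data[i + 3] = 255;                // fully opaque
  }
  ctx.putImageData(out, 0, 0);
}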
An extended experimental composition bringing together synthetic and authentic images of London and synthetic and authentic sounds into a work with integral user interaction
An audio-visual composition intended to represent the lowest physical level internal flow of memories within the brain, where reality is created, stored and recalled
An audio-visual composition operating at the next layer above “plasma” (described above). Represents individual low-level memories floating around and their chance bonding and interaction with others, via spontaneous and often ephemeral synaptic connections
An audio-visual composition operating at the next layer above “bionodes” (described above). Represents individual memories (images) of the researcher floating around and their chance bonding and interaction with others, via spontaneous and often ephemeral synaptic connections
Experimentation using computer visualisation techniques to create unusual effects – in this example, a somewhat Magritte-esque effect of multiple clocktowers appearing ….
An alternative experiment with computer visualisation techniques, producing multiple layers of dynamically sized images
An updated technique for 3D-like rotational cubes, in this example including a counter-rotating object within the cube operating within its own rotational space
This takes the 3D cube example above and applies experimental pixel shader techniques, modifying the rotational images in real time
An audio-visual composition that progressively reveals variable size slivers of past memories (images) which layer on top of each other, building up a rich visual tapestry of snatched insights into the researcher’s life
An audio-visual composition that progressively reveals variable size visual slivers of a specific place (the Chiswick Empire) using both contemporary and older images
An audio-visual composition that progressively reveals variable size visual slivers of the Chiswick Empire as it once was, the current image (where the building no longer stands) slowly fading to be replaced by the Empire as it was, as a local resident recalls his memories of the place during its heyday
This work is a revised version of the original Tyburn Tree (see further down)
Part of a creative visualisation exploring the nature of internal memory and recollection, developed for incorporation into an extended work. See Personal fMRI.
A creative exploration of techniques developed to support a later extended work built around autobiographical memories of people and places from the researcher’s life. See Memories of Times Past.
This work is an iterative enhancement to “3D homes”. The images are animated: they start small and grow in size as they move towards the front of the screen. Clicking on an image produces a random soundscape and causes the selected image to be pinned to the middle of the screen until the user clicks on another image, or the soundscape completes, at which point the screen returns to its previous formation
HTML5 is a draft specification which, when complete, may be able to do much of what is currently achieved through proprietary browser plug-ins such as Microsoft Silverlight and Adobe Flash. This activity evaluated the extent to which some of the original techniques developed in this research may be implementable in HTML5 and the extent to which such techniques will work interoperably between differing browsers
This is an experimental website landing page that provides an overview of selected compositions in the form of a rotating visual carousel. It enables users to click on any of the rotating thumbnails, which freezes the carousel and enables them to read a brief narrative description of the work in a larger pane view of the item chosen. They can then choose whether to launch the selected composition (which will open in a separate browser window), or close the enlarged pane and continue exploring other available compositions. The carousel is accompanied by sound that creates an atmospheric context for the work. The sound stops when one of the compositions is launched and only resumes after the composition has completed and the child browser window terminated
User movement of an on-screen horizontal slider transitions between a contemporary image of where the Fleet River used to run in London and the same scene with an overlaid, moving illustration of the river appearing. The Fleet today is buried below the ground – but prior to it being piped and landscaped over, this was a navigable tributary of the river Thames. It aims to make users more aware of one of the hidden rivers of London – still present, but hidden below the current surface landscape
This work is similar to the one above, but uses a still image rather than the moving river image
This applies a different technique – instead of sideways movements or the lens, it uses the deepzoom technique: as users click into the picture they move ‘inside’ it, finding more layers hidden within that progressively take them ‘back in time’. The reverse zoom technique brings users progressively out of the layers and back to the present day
User movement of an on-screen horizontal slider transitions between a contemporary image of a Chiswick (west London) street at the far left extreme of slider movement and one from the nineteenth century (sourced from an original post card) at the far right extreme of slider movement. At intermediate stages, the on-screen visualisation provides a merged view of the two images, the balance determined by the user’s movement and positioning of the slider
The user can select an on-screen “lens” by clicking on it with the mouse. The potential for this interaction is emphasised by a sound playing when the mouse cursor passes over the lens, as well as the cursor changing from a pointer to a hand. Whilst selected, movement of the lens around the displayed “now” image reveals an earlier view of the same street scene hidden below the contemporary scene (the “then” image)
Three West London maps – from 2008, 1920 and c.1805 – are presented in layers. Initially only the 2008 map is visible. As the user moves the on-screen slider it blends the displayed artefacts backwards and forwards in time, gradually revealing the earlier maps the further back in time the slider is moved. As the user passes through the era when the Chiswick Empire existed (1912-1959), it flickers into life on its site by Turnham Green. If the user stops the slider in this era, the image grows so that it can be viewed. Clicking on the Empire switches audio on/off. The red dot is Hogarth’s House, which was present throughout the time period of all three maps. Clicking on the dot brings up a photo of the house; clicking on the red dot again removes the photo. The blue dot is Duke’s Avenue. Clicking on the dot brings up two photos, from the twentieth and nineteenth centuries, that morph in and out of each other, accompanied by sounds of children playing from the nineteenth century. Clicking on the blue dot again removes the experience
The user can select the on-screen “lens” by clicking with the mouse. The potential for this interaction is emphasised by the cursor changing from pointer to hand. Whilst selected in this way, movement of the lens around the displayed image reveals an earlier view of the same street scene hidden below the contemporary scene. The contemporary scene is video rather than a photograph. The older image is a photograph. The lens design is modelled on that of a vintage magnifying glass. Contemporary traffic noises from the video soundtrack are intentionally audible. Both audio and moving images of the present are looped
The user can select the on-screen “lens” by clicking with the mouse. The potential for this interaction is emphasised by a sound playing when the cursor passes over the lens, as well as the cursor changing from pointer to hand. Whilst selected in this way, movement of the lens around the displayed image reveals a map of the same area from an earlier era
User movement of the on-screen slider transitions between a contemporary moving image of Trafalgar Square at the far left extreme of slider movement and moving images from the 1920s at the far right extreme of slider movement. At intermediate stages, the on-screen visualisation provides a merged view of the two images, the balance determined by the user’s movement of the slider
The user can select the on-screen “lens” by clicking with the mouse. The potential for this interaction is emphasised by a sound playing when the cursor passes over the lens, as well as the cursor changing from pointer to hand. Whilst selected in this way, movement of the lens around the displayed image reveals an image of the area adjacent to Cleopatra’s Needle from an earlier era
The user can select the on-screen “lens” by clicking with the mouse. The potential for this interaction is emphasised by a sound playing when the cursor passes over the lens, as well as the cursor changing from pointer to hand. Whilst selected in this way, movement of the lens around the displayed image reveals an image of the area around the Cenotaph from an earlier era
This is a non-interactive piece. The screen displays a contemporary image of the junction by Marble Arch where the Tyburn Tree reputedly stood. Ambient sound recorded at the plaque that marks the location of the tree plays in the background. Intermittently an image of the “tree” during a public execution flickers into place, accompanied by an “other-worldly” sound
Provides an interactive navigable landing (or launch) page for content. Each of the menu options is visual and floats gently around a central point. The options are laid out in a linear form. Some of the menu options are animated, including video footage
Provides an interactive navigable landing (or launch) page for content. Each of the menu options is visual and rotates around a central point. The options are laid out in a cyclic form
Provides a rotational landing (or launch) page for content. In the centre of the screen two pieces of text (“palimpsests” and “of time and place”) rotate around a central point. The rotation and its relative position is accompanied by atmospheric sound, both musical and voices
Provides a “3D-like” perspective, containing images of the homes in which the researcher has lived. The images are animated: they start small and grow in size as they move towards the front of the screen. Clicking on an image produces an ethereal sound and causes the selected image to grow in size until it fills the whole screen, after which the screen returns to its previous formation
Provides a static historic image of part of the City of London. Embedded within the landscape are 14 soundscapes, all of which play together by default. They can be located and listened to with a “spotlighted” increase in their individual volume by moving the cursor around the screen until they are each found
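A sketch of how such cursor-driven “spotlighting” might be implemented with the Web Audio API, assuming each soundscape loops quietly through its own gain node and its volume rises as the cursor nears its hotspot; the hotspot coordinates, radius and gain levels are illustrative:

const audioCtx = new AudioContext();

// Wire one looping soundscape through its own gain node at a hotspot position.
function addHotspot(buffer, x, y) {
  const src = audioCtx.createBufferSource();
  src.buffer = buffer;
  src.loop = true;
  const gain = audioCtx.createGain();
  gain.gain.value = 0.1;                                // quiet background level
  src.connect(gain).connect(audioCtx.destination);
  src.start();
  return { x, y, gain };
}

const hotspots = [];  // filled with addHotspot(...) for each embedded soundscape
const SPOTLIGHT_RADIUS = 150;

document.addEventListener('mousemove', (e) => {
  hotspots.forEach((h) => {
    const d = Math.hypot(e.clientX - h.x, e.clientY - h.y);
    const boost = Math.max(0, 1 - d / SPOTLIGHT_RADIUS); // 1 at the hotspot, 0 beyond the radius
    h.gain.gain.value = 0.1 + 0.9 * boost;               // "spotlight" the nearby sound
  });
});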
This work was itself a refined version of an earlier online gallery of images. It added sounds, selected randomly from an underlying collection, that play as users click on and examine images in more detail
A set of three rotating cubes of a window frame in an old house, overlaid with a sound from the past.
A framework for a work examining surveillance cameras in London. Each box is a placeholder for an image of a surveillance camera
A set of early prototypes and developments to explore the idea of a radio able to tune into any sound ever made
An aural-only technique requiring the use of headphones. It experiments with the head-related transfer function (HRTF) to simulate surround-sound, with the whispers that can be heard moving around (behind, beside, in front of and inside the listener’s head). Online here.
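A sketch of HRTF-style binaural movement using the Web Audio API's PannerNode, which offers an 'HRTF' panning model; here a looping whisper orbits the listener's head. The buffer, orbit radius and speed are illustrative, and headphones are assumed:

const audioCtx = new AudioContext();
const panner = audioCtx.createPanner();
panner.panningModel = 'HRTF';                    // binaural rendering for headphones
panner.connect(audioCtx.destination);

function startWhisper(buffer) {
  const src = audioCtx.createBufferSource();
  src.buffer = buffer;                           // the whispered voice recording
  src.loop = true;
  src.connect(panner);
  src.start();
}

// Move the source in a slow circle around the head (roughly every 10 seconds).
function orbit(time) {
  const angle = (time / 10000) * 2 * Math.PI;
  panner.positionX.value = Math.cos(angle) * 2;  // metres, left/right
  panner.positionZ.value = Math.sin(angle) * 2;  // metres, front/back
  requestAnimationFrame(orbit);
}
requestAnimationFrame(orbit);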
An aural-only technique comparing a dry recording of a flute with a subsequent “wet” version utilising an original impulse response obtained in one of the historic rooms at London’s Geffrye Museum. The reference flute is online here. The flute with the impulse response from the 1630 hall applied is here.
An aural-only technique comparing a dry recording of a clarinet with a series of “wet” recordings utilising “found” and synthetic impulse responses. The clarinet (original sound) is online here. The clarinet (18th century room) is online here. The clarinet (school hall) is online here. The clarinet (labyrinth) is online here. The clarinet (small prehistoric cave) is online here. The clarinet (kitchen) is online here.
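The same convolution idea behind these dry/wet comparisons can be sketched with the Web Audio API's ConvolverNode, which applies an impulse response to a dry recording in the browser; the file names below are illustrative:

const audioCtx = new AudioContext();

// Fetch and decode an audio file into an AudioBuffer.
async function loadBuffer(url) {
  const response = await fetch(url);
  const data = await response.arrayBuffer();
  return audioCtx.decodeAudioData(data);
}

// Convolve a dry recording with a room's impulse response and play the result.
async function playWet(dryUrl, impulseUrl) {
  const [dry, impulse] = await Promise.all([loadBuffer(dryUrl), loadBuffer(impulseUrl)]);
  const src = audioCtx.createBufferSource();
  src.buffer = dry;
  const convolver = audioCtx.createConvolver();
  convolver.buffer = impulse;                       // the "found" or synthetic impulse response
  src.connect(convolver).connect(audioCtx.destination);
  src.start();                                      // the dry instrument heard "inside" the room
}

// e.g. playWet('clarinet-dry.wav', 'impulse-response.wav');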
An aural-only technique aiming to enable a pair of stereo speakers to create sounds that appear to, in part, come from beyond the normal stereophonic space. This is part of a wider exploration of the potential applications of more spatialised sound. This work is online here.
This work is aural only. The composition merges two differing impulse responses – the first dry and far away, the second closer and fuller. The impulse response is here.