Shown @ Optional Pause: Longplay Dec 2017

Video: Graceful Degradation

Graceful degradation is an engineering term for a system designed to retain its core functions even when degraded or partially destroyed. Here, video and audio are transcoded 1000 times, creating a slow shift to a new visual and aural identity.
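The generation loss driving the piece can be sketched in miniature. The toy simulation below is an assumption about the process, not the actual transcoding pipeline (which would run a real codec): each "transcode" lightly smooths the signal and re-quantises it, and after 1000 passes the original detail has drifted into an artefact of the codec itself.

```python
import random

def lossy_pass(samples, levels=32):
    """One toy 'transcode': a light smoothing blur followed by
    re-quantisation of the [-1, 1] range into a coarse set of levels."""
    n = len(samples)
    smoothed = [(samples[i - 1] + samples[i] + samples[(i + 1) % n]) / 3
                for i in range(n)]
    step = 2.0 / levels
    return [round(s / step) * step for s in smoothed]

random.seed(0)
signal = [random.uniform(-1, 1) for _ in range(256)]  # stand-in for audio

generations = [signal]
for _ in range(1000):
    generations.append(lossy_pass(generations[-1]))

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)
```

Each generation loses a little high-frequency detail, so the variance of the final generation is far below that of the source — the "identity" of the material has shifted.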


Installed @ Kings ARI 09.02.18

Start reading this out loud. Shift your focus from the words to where your voice is coming from. It resonates inside your skull, your chest and the cavities of your body. This resonance is called auditory feedback, and it is one of the most essential tools we use in spoken communication. Auditory feedback helps us regulate tone, improvise sentence structure, and maintain rhythm in our voice. Our speech output is directly connected to our sensory input.

When we delay auditory feedback, or create a gap in time between input and output, we erode this connection. This causes fragmentation in speech and a delay in communication.
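A delay of this kind is simple to build in software. Below is a minimal sketch of a ring-buffer delay line — an assumption about the approach, not the patch used in the gallery: every incoming sample is stored, and the sample played back is the one captured `delay_samples` ago.

```python
class DelayLine:
    """A ring-buffer delay: output now = input from `delay_samples` ago."""

    def __init__(self, delay_samples):
        self.buffer = [0.0] * delay_samples
        self.pos = 0

    def process(self, sample):
        delayed = self.buffer[self.pos]   # what the speaker hears now
        self.buffer[self.pos] = sample    # store current input for later
        self.pos = (self.pos + 1) % len(self.buffer)
        return delayed

# At 44.1 kHz, a 200 ms delay (a typical DAF setting) is 8820 samples.
daf = DelayLine(delay_samples=8820)

# Tiny demo with a 3-sample delay: the impulse comes out three samples late.
echo = DelayLine(delay_samples=3)
out = [echo.process(s) for s in [1.0, 0.0, 0.0, 0.0, 0.0]]
```

Run per-sample against a live microphone input, this gap between speaking and hearing is exactly what destabilises the feedback loop described above.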

Delayed Communication is a work which uses a delay between speech and hearing to fragment and break down speech patterns. A microphone and headphones were set up in the gallery with a sign prompting patrons to speak and listen to their voice break. Click play to hear a short segment of a voice under the stress caused by the delay.

DAF Preview

Delayed Communication: Installation Views


19.8.17 Exhibited @ New Music Day, Yongin Poeun Art Gallery, South Korea.

Down Under is a composition made in response to field material from Melbourne Zoo. It uses FFT processing to manipulate the recordings, destroying and creating senses of space.
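One common FFT technique for dissolving a sense of space is phase scrambling: keep each frequency bin's magnitude but randomise its phase, so the spectral colour of a recording survives while its temporal and spatial cues do not. The sketch below is illustrative only — a naive DFT on a toy signal standing in for whichever transforms the piece actually uses; a real patch would run an optimised FFT over overlapping windows.

```python
import cmath
import random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n) for t in range(n))
            for f in range(n)]

def idft(X):
    n = len(X)
    # Discarding the imaginary part is a shortcut; a careful version would
    # keep the spectrum conjugate-symmetric so the inverse is exactly real.
    return [sum(X[f] * cmath.exp(2j * cmath.pi * f * t / n) for f in range(n)).real / n
            for t in range(n)]

def scramble_phases(x, rng):
    """Keep each bin's magnitude, randomise its phase."""
    X = dft(x)
    return idft([abs(c) * cmath.exp(1j * rng.uniform(0, 2 * cmath.pi)) for c in X])

rng = random.Random(1)
field_recording = [rng.uniform(-1, 1) for _ in range(64)]  # stand-in for zoo audio
smeared = scramble_phases(field_recording, rng)
```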


Poster for New Music Day - 2017


Installed @ Killing Time: Exchanging Utopia and the Pleasure Principle, Capitol Theater Melbourne 01.11.19

Extraction (2019)

Extraction (2019) uses grid-based mediation of video to analyse and restructure networks of data mediation inherent in digital life. The increasing fragmentation of social communication blurs the distinction between real and virtual.

The content is subordinated by the gesture in a mediated reality.
Pixel values represent behaviour on an eight-bit scale.

It’s a shell, a mindless exhibition of wants
You’re a clump of flesh exhibiting desires on a scale of one to ten
Where ten is the most profitable
And one is the least
And there is nothing in between that interests you
Life dominated by a dopamine rush
Flooding neurons and computing networks
Echoes resonate in your subconscious
#throwbackthursday

FLUID (2018)

Performance/Installation @ Meta: Royal Parade 29.19.17

Fluid (2018) is a set of performances and installations using auscultation as a source of sound.

Sounds are harvested through hacked medical tools. The heart, lungs, blood and intestinal tract are captured. These are then reamplified, reinterpreted and deconstructed to create a sonic simulation of being inside a body. However, this body is not organic; it is a hybrid of “virtual” and “real” sounds.

After initial experiments at Meta: Royal Parade in 2017, I took this project on stage at the Tote, where I live-mixed and improvised with the harvested sounds in the space.

Find the release on Bandcamp below.


Installed @ Franklin St. Melbourne CBD, 09.10.16

Long Distance Relationships was a public art piece installed in a phone booth on Franklin St, Melbourne. Users were invited to call a hotline and were presented with a series of choices.

Synthetic natural compositions and different sonic textures were played to represent different situations. The intent was to take people out of the busy surrounds of the CBD and augment their experience. Below is a recording testing the user interface.

Audio: Long Distance Relationships


Installed online && ongoing research


State Machine: Terminal

State Machine is a multidisciplinary exploration into artificial simulation, memory and perception. It contrasts the passivity of the machine with the sentimentality of the human experience.

At the beginning of 2018 I downloaded all of my Facebook data. A decade's worth of messages, check-ins, statuses, photos, videos and assorted ephemera was bundled and emailed to me as a huge 5 GB document.

Raking through this digital archive of my online self, I began wondering what could be pieced together by the analysis and re-purposing of this data. Could I construct a believable simulation of myself, or at the very least a machine which could take this input, process it in a basic simulation of a cognitive system, and output it to create something new?

Following this prompt I have refined a number of outcomes which reflect on the contrasts and similarities between the simulated and “real” brain.

Taking the form of online content, interactive installation and sound composition, State Machine is an ongoing project in which I attempt to see what can be recreated from all of this data.


Screenshot: State Machine Website

The website, found here, takes ten years of Facebook Messenger logs and runs them through a series of Markov chains. These chains record how often each word follows each other word, producing a set of weighted probabilities for which word will come next.

For instance, “I” is more likely to follow “If” than “Banana”, so “I” has a higher probability than “Banana” of occurring after the word “If”. By chaining together a large number of these probabilities, somewhat meaningful sentences can be constructed.
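The word-pair counting described above can be reduced to a few lines. This is a minimal sketch of the idea, not the site's actual code; the toy corpus stands in for the Messenger logs.

```python
import random
from collections import defaultdict

def train(corpus):
    """Record every word that follows each word; repeats act as weights."""
    follows = defaultdict(list)
    words = corpus.split()
    for cur, nxt in zip(words, words[1:]):
        follows[cur].append(nxt)
    return follows

def generate(follows, start, length, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))  # frequent pairs are picked more often
    return " ".join(out)

# Toy corpus standing in for ten years of chat logs.
logs = "if i go i will call you and if i stay i will text you"
model = train(logs)
print(generate(model, "if", 8))
```

Because every adjacent pair in the output was seen somewhere in the corpus, the generated text is locally plausible while globally meandering — the characteristic Markov-chain voice.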

This project continues with the creation of a neural network which uses an LSTM text-generation algorithm to generate new text from a data set. In this case, the network was trained on my recently completed Honours Thesis, in an attempt to learn to synthesize institutionalized art jargon at record speeds. My research continues on this subject as newly acquired coding skills open new avenues for exploration.
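For contrast with the Markov chain, the core of an LSTM can also be sketched. Below is one forward step of a single cell in plain Python, with random untrained weights — the actual project would train such cells on the thesis text with a deep-learning framework. Four learned gates decide what the cell forgets, writes, and exposes at each step.

```python
import math
import random

def lstm_step(x, h, c, W):
    """One LSTM step over input x, hidden state h, cell state c."""
    def gate(name, squash):
        Wx, Wh, b = W[name]
        pre = [sum(wx * xi for wx, xi in zip(row_x, x)) +
               sum(wh * hi for wh, hi in zip(row_h, h)) + bi
               for row_x, row_h, bi in zip(Wx, Wh, b)]
        return [squash(v) for v in pre]

    sigmoid = lambda v: 1 / (1 + math.exp(-v))
    f = gate("forget", sigmoid)      # what to erase from the cell state
    i = gate("input", sigmoid)       # how much new content to admit
    g = gate("candidate", math.tanh) # the new content itself
    o = gate("output", sigmoid)      # how much of the state to expose
    c_new = [fj * cj + ij * gj for fj, ij, gj, cj in zip(f, i, g, c)]
    h_new = [oj * math.tanh(cj) for oj, cj in zip(o, c_new)]
    return h_new, c_new

def random_weights(n_in, n_hidden, rng):
    mat = lambda r, k: [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(r)]
    return {name: (mat(n_hidden, n_in), mat(n_hidden, n_hidden), [0.0] * n_hidden)
            for name in ("forget", "input", "candidate", "output")}

rng = random.Random(0)
W = random_weights(n_in=4, n_hidden=3, rng=rng)
h, c = [0.0] * 3, [0.0] * 3
for x in ([1, 0, 0, 0], [0, 1, 0, 0]):  # two one-hot "characters"
    h, c = lstm_step(x, h, c, W)
```

The persistent cell state `c` is what lets an LSTM carry context across a whole sentence, where the Markov chain above only ever sees the previous word.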


Ongoing research project

Synthetic Perception

Synthetic Perception: installation view

Synthetic Perception is an ongoing research project into the behavioural effects of digital surveillance culture on everyday social interactions.

As a year-long Honours project, I experimented with using computer vision algorithms to fragment and de-identify video data.

That data was then recomposed and output onto a display medium. These included computer screens, televisions and smartphones.


Synthetic Perception: installation view

I attempted to make an algorithm that was content-agnostic and extendable. This means it would not (contrary to the implicit hierarchies of human perception) be biased toward any particular kind of content in the data, treating everything it processes the same way.


Synthetic Perception: Extraction Network App

I settled on recomposing the imagery on a grid, fracturing the dataset by removing any spatial context from the elements the algorithm picked out. Essentially, the code would not care if it could not pick out any comprehensible details within the data it processed.
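The grid recomposition can be illustrated with a toy frame. This sketch is an assumption about the approach rather than the project's actual code: it cuts a frame into fixed-size tiles and reassembles them in shuffled order, so every pixel survives but its spatial context does not. It assumes the frame dimensions divide evenly by the tile size.

```python
import random

def grid_recompose(frame, tile, rng):
    """Cut a frame (2-D list of pixel values) into tile x tile blocks
    and reassemble the blocks in shuffled order."""
    h, w = len(frame), len(frame[0])
    tiles = [[[frame[y + dy][x + dx] for dx in range(tile)] for dy in range(tile)]
             for y in range(0, h, tile) for x in range(0, w, tile)]
    rng.shuffle(tiles)
    cols = w // tile
    out = [[0] * w for _ in range(h)]
    for n, block in enumerate(tiles):
        oy, ox = (n // cols) * tile, (n % cols) * tile
        for dy in range(tile):
            for dx in range(tile):
                out[oy + dy][ox + dx] = block[dy][dx]
    return out

rng = random.Random(42)
frame = [[y * 8 + x for x in range(8)] for y in range(8)]  # toy 8x8 "video frame"
shuffled = grid_recompose(frame, tile=2, rng=rng)
```

Every value from the original frame is still present in the output; only its position — its context — has been discarded, which is exactly the content-agnostic treatment described above.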

The video data processed has varied throughout the course of this project: I have used both self-generated footage and online videos chosen by web-scraping and data-extraction algorithms.

For a more comprehensive look at the research and practice which surrounds this project, feel free to read my Honours Thesis accessible here.


Video still: Synthetic Perception

Video: Networked Mediation (2019)

Archived Livestream: 2018


Online @


generated image: seed 00001

This mask does not exist is an online work which displays images generated by a Generative Adversarial Network.

The network was trained to create images of people wearing face masks, after mask wearing was made mandatory in Melbourne following huge spikes in COVID-19 transmission.


generated image: seed 02243

In this work I trained an algorithm on a large dataset of mask images. After training, the GAN can begin to make inferences about their style and features. Holding the sum of these images in its memory, the network can map these features to synthesize new content.
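The "seed" captions reflect how GAN sampling works: a seed deterministically generates a latent vector, and the trained generator maps that vector to an image, so each seed names exactly one output. The sketch below shows only that mapping — `toy_generator` is a fixed nonlinear projection standing in for a trained network, not the GAN used in this work.

```python
import math
import random

def toy_generator(latent, size=8):
    """Stand-in for a trained GAN generator: deterministically maps a
    latent vector to a tiny 8-bit greyscale 'image'."""
    img = []
    for y in range(size):
        row = []
        for x in range(size):
            v = sum(z * math.sin((i + 1) * (x + y * size + 1) / 7.0)
                    for i, z in enumerate(latent)) / 4  # scale avoids tanh saturation
            row.append(int(127.5 * (math.tanh(v) + 1)))  # map to 0..255
        img.append(row)
    return img

def image_for_seed(seed, latent_dim=16):
    rng = random.Random(seed)
    latent = [rng.gauss(0, 1) for _ in range(latent_dim)]  # seed -> latent vector
    return toy_generator(latent)

img_a = image_for_seed(1)
img_b = image_for_seed(2243)  # cf. the "seed 02243" caption above
```

Because the latent vector is derived from the seed and the generator is deterministic, the same seed always reproduces the same image — which is why a seed number is enough to identify each generated mask.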


generated image: seed 00842

AI and machine learning are today mainly used in corporate and government applications. Advertisers use AI to figure out what you might buy next. Facebook and Google use AI to try to pre-empt and automate human behaviour towards the goals of their platforms. My use of these algorithms attempts to demystify them and uncover the flaws in these approaches. In this work I use these systems in a scientifically unsound way, seeing what happens when the algorithm is fed dirty data or left undertrained.

Here you can find an archive of selected works. Please use the links above to navigate.