Visual & Auditory Adventures in Algorithms

by Joseph Wilk

Work | Artist Statement

[21/06/2019] How To Be Seen

Project:

As a remote student at the School For Poetic Computation (SfPC), Joseph experiences the people and space of the school through cameras, screens and software. How To Be Seen attempts to bring the artist into a physical space he has never experienced in person, through an interactive, streamed live coding performance.

The performance consists of a tablet which displays a live stream of the artist. Throughout the exhibition, he writes programming code live to control a second screen — a digitally transformed version of the video stream. The produced graphics use depth and dimension to suggest that the artist’s image is being projected out of the screen and into the gallery space itself.

Initially, viewers of the piece might not understand that the artist is present through the live stream. But this is revealed when viewers pick up the headphones and engage with Joseph directly, interrupting and becoming a part of the performance. Visitors are invited to take “Joseph” in their hands and show him around the gallery, becoming his voice and avatar. Through this interaction, How To Be Seen explores themes of visibility, control and dependency.

Materials

Webcams, computer, screen, tablet, OBS & OpenFrameworks.

Press: https://www.creativeapplications.net/maxmsp/sfpc-spring-2019-student-showcase/

How To Be Seen 01
How To Be Seen 02
How To Be Seen 03

[01/06/2019] Dark Matters Zine: The Flow Of My Voice

Project:

Part of the Dark Matters Zine created for the School For Poetic Computation showcase.

Invisible audio waves with no colour spectrum flow into a computer interface: through microphones, hardware, undersea cabling & server rooms, passing through layer upon layer that transmits, stores and records them. All of this is invisible to the user; the value is delivered, but what happens inside the black boxes is unknown. What rights or uses of the data are allowed is lost in 100-page licence agreements. Power and control are lost. In the final stages those words are outsourced to automatic machine translation, which does not just record them but alters them in its attempts to turn sound into the written word: translation using neural networks trained on datasets of millions of human audio recordings. The speech which started invisible has now been turned into words, a factory of production that hides its human labour, environmental cost and history.

Materials

Paper & print
The Flow of My Voice
Dark Matters Zine

[07/05/2019] Visibility

Exploring ways of transforming the way I'm seen. Rooted in performance as a way to control the visibility of disability.

Visibility 01

[28/04/2019] Emoji Averages

Comparing all the emojis of different companies to identify patterns and characteristics. Excludes all flags.
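
A minimal sketch of one way such an average could be computed, assuming Pillow and NumPy and a hypothetical per-vendor folder layout: take the per-pixel mean of the same emoji as drawn by each company.

    # Per-pixel mean of one emoji across vendors. Folder layout and the example
    # codepoint (1f600) are illustrative assumptions, not the actual pipeline.
    import glob
    import numpy as np
    from PIL import Image

    paths = glob.glob("emoji/*/1f600.png")   # e.g. emoji/<vendor>/<codepoint>.png
    images = [np.asarray(Image.open(p).convert("RGBA").resize((72, 72)), dtype=float)
              for p in paths]
    average = np.mean(images, axis=0).astype("uint8")
    Image.fromarray(average).save("average-1f600.png")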

Emoji Averages

[28/04/2019] Tessellating Source code

Inspired by Salavon’s "Local Index project", exploring new ways of seeing the structure of source code.

[25/04/2019] 40k sounds sorted

Sorting 40,000 sounds using t-SNE over features from a model trained on a general image set. Each sound is represented as a waveform (shape) and spectrogram (color). Sounds and images come from https://freesound.org/
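
As a rough illustration of the sorting step only (not the original pipeline, whose features came from an image-trained model), here is a minimal sketch assuming librosa and scikit-learn: each sound becomes a fixed-size spectrogram vector, and t-SNE lays those vectors out in 2D.

    # Sketch: embed sounds in 2D with t-SNE from spectrogram features.
    import glob
    import numpy as np
    import librosa
    from sklearn.manifold import TSNE

    features = []
    for path in glob.glob("sounds/*.wav"):                   # hypothetical folder
        y, sr = librosa.load(path, sr=22050, duration=2.0)
        mel = librosa.power_to_db(librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64))
        features.append(np.resize(mel, (64, 64)).flatten())  # fixed-size vector per sound

    coords = TSNE(n_components=2, perplexity=30).fit_transform(np.array(features))
    # coords[i] is the (x, y) position used to place sound i on the canvas.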

Sorting Sound
Sorting Sound: close up

[09/04/2019] Depth without depth

Inspired by Vera Molnár's work, exploring ways of turning 2D images into 3D.

[07/06/2019] Creatures Collaborative Songbook

Part of the Bath Fringe Festival 2019. Created by Mob-Handed Press, a series of collaborative songbooks made by selected musicians around a central theme. Artists contribute one song each, which are then anthologised into a zine. The publication is launched with a night of performance and storytelling.

It is a communication between artist and audience, and an exploration of how people visualise the music that they write, perform, and consume.

My submission was created through live coding.

Creatures

[06/03/2019] Piezoelectric Resonator Live coded performance

A performance I've been working on using SonicPi, Unity3D and Emacs. Trailer:

Full performance: https://vimeo.com/321104184

[22/01/2019] Light and water

I've been working on a new live coded musical and visual performance with a heavy emphasis on light and water effects. I've extracted some 8K screenshots from my performance work and made them available as Giclée prints (https://shop.josephwilk.net/). Made with Unity3D, SonicPi and Ruby.

[27/10/2018] Accents as One Voice

I was interested in what sort of output we would get from training a wavenet encoder on many voices while treating them as a single voice. Most uses of this dataset segment it by person, gender and country. I used the GMU global accents database for training.
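
The data preparation amounts to refusing that segmentation. A tiny sketch of the idea with hypothetical paths (the wavenet training itself is not shown):

    # Gather every recording into one training list, deliberately dropping the
    # per-speaker/gender/country split. Paths are illustrative.
    import glob

    clips = sorted(glob.glob("gmu_accents/**/*.wav", recursive=True))
    manifest = [{"audio": path, "speaker": "everyone"} for path in clips]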

[07/09/2018] Accents From Around the World

I have been working on a project to use a sample dataset from the GMU Accent project to create audio and music. The project has over 2,600 samples exploring accents from across the world. Everyone reads the same text, which highlights all the sounds of English. There is a huge variety of speakers and native languages, with rich metadata for querying. I've been exploring how this dataset could be used to explore communication and language, interacting with human dancers who control sample parameterisation through sensor feedback.

In exploring the data I did some simple DSP onset detection to extract all voices saying "train station". While very simple, it is deeply beautiful hearing such a wide palette of voices.
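
A minimal sketch of that onset-splitting step, assuming librosa; picking out the chunk that actually contains "train station" was a separate, manual step. Filenames are illustrative.

    # Split a recording at detected onsets.
    import librosa

    y, sr = librosa.load("gmu_accent_sample.wav", sr=None)
    onsets = librosa.onset.onset_detect(y=y, sr=sr, units="samples", backtrack=True)

    # Chunks between consecutive onsets, each a candidate word or syllable.
    chunks = [y[start:end] for start, end in zip(onsets[:-1], onsets[1:])]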

[18/04/2018] Painting with Functions

I created a library (http://blog.josephwilk.net/clojure/functions-explained-through-patterns.html) to help explain functions through patterns in color. As a side effect, this offered a way of producing interesting patterns simply by composing functions. Instead of focusing on explaining functions, I explored the patterns I could make.

(map (fn [e] (if (sequential? e) (partition-all 1 e) e)) (interleave (color/hues 21 25) (partition-all 2 (color/hues 12))))
(interleave (color/hues 21 25) (partition-all 1 (color/hues 6)))

[01/04/2018] Deforming Meshes in Unity3D

I've been exploring ways to deform meshes in 3D. The best way in the end was to use a terrain map to alter the mesh positions of a sphere. The terrain map was a black-and-white PNG where darkness maps to height, effectively rendering a mountain range into a 3D mesh. Controlled by Emacs and SonicPi, with a focus blur added for effect.
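
Outside Unity, the displacement idea looks roughly like this minimal NumPy sketch (the real implementation ran on a Unity mesh; the file name and scaling factor are illustrative): each vertex of a unit sphere is pushed outwards by the height read from the terrain image, with darker pixels meaning greater height.

    # Displace sphere vertices by a black-and-white terrain image.
    import numpy as np
    from PIL import Image

    height = 1.0 - np.asarray(Image.open("terrain.png").convert("L"), dtype=float) / 255.0
    rows, cols = height.shape

    def displace(vertex, amount=0.3):
        vertex = np.asarray(vertex, dtype=float)
        x, y, z = vertex / np.linalg.norm(vertex)                 # unit sphere direction
        u = (np.arctan2(z, x) / (2 * np.pi) + 0.5) * (cols - 1)   # spherical UV lookup
        v = (np.arccos(np.clip(y, -1.0, 1.0)) / np.pi) * (rows - 1)
        return vertex * (1.0 + amount * height[int(v), int(u)])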

[01/04/2018] ASCII Art in 3D graphics

Exploring bringing the text of live coding into the generated visuals of a performance. Created using Unity.

d[-_-]b
d[-_-]b

[24/03/2018] Origami Eyeball in Unity3D

An attempt to create an eye-like shape with the edges/tentacles feeling like origami paper folding. Achieved through increasing light intensity, a cartoon edge-like shader applied as a post-FX, and repeatedly resetting the laws of physics in the edge/tentacle simulation.

[06/03/2018] Controlling Synths live with Emacs

Sending MIDI control data to a live running synth through Emacs. Emacs has an OSC port open which sends messages to the synth.

[24/02/2018] Scratching samples with Emacs

Using the cursor position in Emacs to control the playback position in a sample. A little like scratching a record.

[06/03/2018] Performance Generated Graphics

Snapshots of live generated visuals used in a performance. All the graphics are programmed in GLSL and the code is executed through OpenFrameworks and ShaderView.

Lava.glsl

[16/02/2018] Brains seeking meaning in human voices

For some unknown reason I'm slightly obsessed with Soprano samples. I've been playing around with the idea of the human ear seeking meaning in a repeated phrase that still has an element of a human voice. I'm using wavetable playback manipulation, playhead speed and some added FM synthesis on a single 4 second Soprano sample.

[17/11/2017] Organic sample based instrument

I've been thinking about some advice given by Deru: "Turn organic sounds into electronic and electronic into organic." Based on this I've been creating an instrument from two classical instrument sounds: a single 1-second sample of a soprano singing a Latin phrase and a single 1-second sample of a violin, added together with a lot of granular synthesis. Part of a new piece of music I'm working on.

[23/10/2017] Performance Generated Graphics

Snapshots of live generated visuals used in a performance. All the graphics are programmed in GLSL and the code is executed through OpenFrameworks and ShaderView.

Shapes.glsl
Phasing Flower.glsl
Code.glsl

[31/12/2016] Music for Machines

Looking into how machines hear music when trying to identify it. Evolving procedurally generated audio based on the mis-identification of the music: the more popular the misidentified artist, the higher the weight of the score. (Full technical write-up: http://blog.josephwilk.net/art/audio-fingerprint-smudges.html)
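
A toy sketch of just that weighting idea, with made-up match data standing in for the output of the fingerprinting service:

    # Each (mis)identified match contributes its artist's popularity, scaled by
    # the match confidence. Artists and numbers here are invented examples.
    matches = [("Artist A", 0.91), ("Artist B", 0.40)]   # (artist, confidence)
    popularity = {"Artist A": 0.8, "Artist B": 0.1}      # 0..1 popularity weight

    score = sum(popularity[artist] * confidence for artist, confidence in matches)
    print(score)   # higher-scoring clips survive into the next generation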

[20/03/2017] Watching someone code people watching code

Created as part of a live coded performance. I'm writing code as part of the performance, overlaid over the visuals, and the visuals I'm controlling show someone watching someone live code.

Watching Code 1
Watching Code 2

[20/03/2017] Post effects on GLSL shaders

I recently added post-processing filters to ShaderView (https://github.com/josephwilk/shaderview): effects like bloom, fisheye, mirroring, noise and grain. Two of my favourite examples, generated live:

Mirror, Bloom and Grain 1
Mirror, Bloom and Grain 2

[23/10/2017] Method Missing Code Poem

Method missing, the method that does not exist, is a magical and confusing concept. I wanted to draw attention to how counterintuitive this method is, and to the pain of the many programmers trying to understand the behaviour of code that uses it.

MethodMissing

[21/10/2016] Painting with Regular Expressions in Emacs

When searching text in Emacs, matching terms are highlighted. I wanted to explore creating visual patterns by writing regular expressions. I compressed a section of performance code and then created different searches in Emacs.

Patterns in text

[30/09/2016] Live Coding Patterns with Emojis

In live coding percussive patterns it's common to mark a hit with '1' or 'x' and a 'rest'/'miss' with '0' or '_'.

          kick_drum_pattern = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0]
          kick_drum_pattern.each do |hit|
            sample KICK_DRUM_SAMPLE if hit == 1   # KICK_DRUM_SAMPLE stands in for a real sample name
            sleep 0.25                            # step to the next position in the pattern
          end
I wanted to expand the expressiveness of patterns by using emojis rather than text: an extra level of meaning for the audience, and a mnemonic of my own for mapping different types of percussive hit, with parameters like attack, release, velocity and sustain represented by different emojis.
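
A plain-Python sketch of the mapping idea (the performances themselves ran in SonicPi; the emojis and parameter values below are illustrative):

    # Each emoji stands for a hit with its own parameters; "_" is a rest.
    EMOJI_HITS = {
        "🌸": {"amp": 0.6, "attack": 0.01, "release": 0.3},
        "🌻": {"amp": 1.0, "attack": 0.0, "release": 0.8},
    }

    pattern = "🌸 _ _ 🌻 🌸 _ 🌻 _".split()
    for step in pattern:
        params = EMOJI_HITS.get(step)   # None means a rest
        if params:
            print("hit", params)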

Flower patterns
Horse and Unicorn patterns
Animal patterns

[26/08/2016] Ascii Art in Code

Clojure supports a very flexible naming convention for namespaces and functions. I used this in a live performance to explore using ASCII art to give a visual meaning to the containers of functions.

[02/10/2015] Animation with Emacs

Emacs is an editor used primarily by programmers that is heavily optimised for writing and manipulating text. Here I take its text manipulation power and use it to create animations.

[02/10/2015] Viewing Sound data as ASCII animations in Emacs

Using the sound engine SuperCollider, I'm sending messages to Emacs containing the frequency. These are then rendered as a scrolling ASCII graph.

[11/08/2015] Production Metrics in 3D

Converting live metric data from SoundCloud's production systems into a mesh in OpenFrameworks. The connection of mesh vertices is slightly random, which creates interesting shapes between the graph's peaks and troughs.

Requests per second
Requests per second with noise

[11/08/2015] Beautiful Production Metrics

Telemetry generated by the data pipeline at SoundCloud. Graphs generated with https://prometheus.io.

40 machines singing data 1
40 machines singing data 2
40 machines singing data 3
40 machines singing data 4

[31/08/2015] Distorting Text

In most performances where I live code visuals and music, the controlling code is overlaid on top of the visuals. I've been experimenting with how to bring the text into the visuals rather than just having it sit on top. To do this I used OpenFrameworks to create meshes from black-and-white screenshots of text, then added noise to the text mesh, creating interesting distortions and making the text feel more organic and alive while still being readable.

Live Code
raise exception
Computational behaviour

[17/04/2015] Live Coding Cinema4D

Simple experiment controlling a Cinema4D scene from Overtone using OSC messages.

Flower without light rendering

[1/03/2015] Live coding Minecraft

Controlling the Minecraft world from a live REPL in Clojure. Full technical details: http://blog.josephwilk.net/clojure/overtone-driving-minecraft.html

Pig Algorave

[26/04/2014] OpenGL Silhouette

An experiment using a texture with transparent cutout text overlaid on top of a GLSL shader. Slight distortion on the edge of the text to better connect with the background texture.

RE cutout

[01/03/2014] Overtone Synths

The DarkSeaHorn and Fallout Synths I implemented in Overtone.

[01/01/2014] Piano generated by Recombinance using Bach Chorale 140

Exploring the book Computer Models of Musical Creativity through creating generated music in Overtone and Clojure. Uses Bach Chorale 140 as a training source. Full details: https://github.com/josephwilk/musical-creativity

[02/12/2013] Brainwave Powered music

Controlling a synth using Brainwave signals and mapping brainwave activity to piano pitches. Full technical details: http://blog.josephwilk.net/clojure/sounds-of-the-human-brain.html
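
A minimal sketch of just the mapping step, with an invented reading standing in for the EEG signal (the synth itself ran in Overtone):

    # Scale a brainwave reading onto the piano's MIDI range (21..108).
    PIANO_LOW, PIANO_HIGH = 21, 108

    def brainwave_to_pitch(level, lo=0.0, hi=1.0):
        t = min(max((level - lo) / (hi - lo), 0.0), 1.0)   # clamp to 0..1
        return int(round(PIANO_LOW + t * (PIANO_HIGH - PIANO_LOW)))

    print(brainwave_to_pitch(0.42))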

[13/11/2013] Sonification of a live servers health status

Monitoring a live SoundCloud production service by listening to the health status mapped to piano pitches. Low notes are good, high notes are bad.
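
A sketch of that mapping with invented status levels and note choices (the real data came from a SoundCloud production service; playback happened elsewhere):

    # Healthier statuses map to lower piano notes (MIDI numbers).
    STATUS_TO_MIDI = {"healthy": 36, "degraded": 60, "critical": 96}

    def note_for(status):
        return STATUS_TO_MIDI.get(status, 72)   # unknown states sit in the middle

    print(note_for("degraded"))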