
Project Showcase

On 22 Apr 2019 | By Kyle

We Live in an Ocean of Air

We Live in an Ocean of Air is a multi-sensory immersive installation illuminating the fundamental connection between animal and plant. Step through the canvas and share a breath with the giants of the plant kingdom.

This multi-user VR experience premiered at the Saatchi Gallery, London, where it ran from 7th December 2018 until 5th May 2019.

‘We Live in an Ocean of Air’ has been created by London-based immersive art collective Marshmallow Laser Feast in collaboration with Natan Sinigaglia and Mileece I’Anson.

Everyone Is Happy were tasked with developing a VR-enabled avatar system that would allow guests to see each other, themselves, and their relationship with the environment. As well as following the design and art direction, this had to integrate a very special blend of sensing technologies, including body tracking, heart-rate monitors and breath sensors.
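The installation's actual code isn't public, but the general problem of taming a noisy breath or heart-rate stream before it drives an avatar is easy to sketch. Everything below (the function names, the smoothing constant) is illustrative, not the production code:

```python
def smooth_stream(samples, alpha=0.2):
    """Exponential moving average over a raw sensor stream.

    Damps sample-to-sample jitter so a driven avatar parameter
    (e.g. chest expansion) moves smoothly rather than twitching.
    alpha in (0, 1]: higher tracks the sensor faster, lower is calmer.
    """
    smoothed = []
    value = samples[0]
    for s in samples:
        value = alpha * s + (1 - alpha) * value
        smoothed.append(value)
    return smoothed

# A made-up breath-sensor reading: a sharp inhale then release.
raw = [0.0, 1.0, 1.0, 1.0, 0.0]
eased = smooth_stream(raw)
```

The same filter, run per-frame, works for heart-rate-driven pulsing; a real system would likely also clamp and calibrate per guest.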

Read more…

On 17 Mar 2019 | By Kyle

Waifu Synthesis – real-time generative anime

A playful project investigating real-time generation of singing anime characters, a neural mashup if you will.

All of the animation is made in real-time using a StyleGan neural network trained on the Danbooru2018 dataset, a large scale anime image database with 3.33m+ images annotated with 99.7m+ tags.
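The continuous motion in this kind of animation typically comes from interpolating through the network's latent space and decoding each in-between vector to a frame. A self-contained sketch of the interpolation step, using the spherical variant commonly paired with StyleGan (the decoder itself is the trained network and is omitted; all values here are toy):

```python
import math
import random

def slerp(a, b, t):
    """Spherical interpolation between two latent vectors a and b.

    Walks along the arc between them rather than the straight chord,
    which keeps interpolated latents at a 'typical' distance from the
    origin and tends to produce cleaner in-between frames.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    omega = math.acos(max(-1.0, min(1.0, dot / norm)))
    if omega < 1e-6:  # vectors nearly parallel: plain lerp is fine
        return [x + t * (y - x) for x, y in zip(a, b)]
    s = math.sin(omega)
    return [(math.sin((1 - t) * omega) / s) * x + (math.sin(t * omega) / s) * y
            for x, y in zip(a, b)]

random.seed(0)
z0 = [random.gauss(0, 1) for _ in range(8)]   # toy 8-dim latents;
z1 = [random.gauss(0, 1) for _ in range(8)]   # StyleGan's are 512-dim
frame_latents = [slerp(z0, z1, i / 9) for i in range(10)]
```

Each entry of `frame_latents` would be fed to the generator to render one frame of the animation.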

Lyrics were produced with GPT-2, a large-scale language model trained on 40GB of internet text. I used the recently released 345-million-parameter version; the full model has 1.5 billion parameters and has not yet been released due to concerns about malicious use (think fake news).
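The model itself is far too large to show here, but the sampling step that turns its output scores into the next word is compact: scale the logits by a temperature, softmax, draw. A toy illustration with a made-up four-word vocabulary and invented scores (not the actual lyric pipeline):

```python
import math
import random

def sample_next(logits, temperature=0.8, rng=random):
    """Temperature sampling over raw model scores (logits).

    Lower temperature sharpens the distribution (safer, more
    repetitive text); higher temperature flattens it (weirder text).
    Returns the index of the sampled token.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

vocab = ["neon", "heart", "static", "rain"]   # made-up vocabulary
logits = [2.0, 1.0, 0.5, 0.1]                 # made-up model scores
random.seed(42)
word = vocab[sample_next(logits, temperature=0.7)]
```

Real GPT-2 decoding repeats this step token by token, feeding each choice back into the model.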

Music was made in part using models from Magenta, a research project exploring the role of machine learning in the process of creating art and music.

The setup uses vvvv, Python and Ableton Live.
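The post doesn't say how these three talk to each other; one common bridge between Python, vvvv and Ableton Live is OSC over UDP. As a sketch of what that plumbing looks like, here is a hand-rolled encoder for a minimal OSC 1.0 float message (the address and value are made up):

```python
import struct

def osc_message(address, *floats):
    """Pack a minimal OSC 1.0 message with float32 arguments.

    Per the spec: the address and the ','-prefixed typetag string are
    null-terminated and padded to 4-byte boundaries, and each float
    argument is big-endian float32.
    """
    def pad(b):
        # OSC strings always get at least one null, padded to 4 bytes.
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad(address.encode()) + pad(("," + "f" * len(floats)).encode())
    for x in floats:
        msg += struct.pack(">f", x)
    return msg

# Hypothetical control message: mouth openness for the singing avatar.
packet = osc_message("/avatar/mouth", 0.75)
# packet would be sent over UDP to vvvv or a Max for Live OSC receiver.
```

In practice a library such as python-osc does this packing for you; the point is just how small the wire format is.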

StyleGan, GPT-2 and Magenta were developed by Nvidia, OpenAI and Google respectively.

Click below to see snapshots with some of the generated lyrics.

Read more…

On 28 Nov 2018 | By Kyle

CONSTELLATIONS – Joanie Lemercier

It was a pleasure to do some programming and interaction design for Joanie Lemercier, who just happens to be one of our heroes!
Using a unique floating, screen-less projection as a canvas, “Constellations” is an audiovisual and interactive installation, part of Lemercier’s ongoing series of artworks on geometry and the cosmos.

Conception and visuals: Joanie Lemercier
Music: Paul Jebanasam
Sound design: Echoic Audio
Production: Juliette Bibasse
Additional code: Kyle McLean

The Layered Realities 5G showcase is produced by Watershed on behalf of the Smart Internet Lab, University of Bristol. The Smart Internet Lab has secured funding from the UK Government’s Department for Digital, Culture, Media and Sport (DCMS) to establish ‘5GUK Test Networks’, a national asset.
The Layered Realities weekend 5G showcase brings together the University of Bristol’s Smart Internet Lab and Watershed, We The Curious, BT, Nokia, Zeetta, Cambridge Communications Systems, PureLiFi and BiO.

On 04 May 2018 | By Kyle

Deep neural network paintings of the Land of the Thunder Dragon

In our first real foray into programming with machine learning, we left a Convolutional Neural Network (CNN) to ‘meditate’ on an image of Saraswati, Goddess of knowledge, music, arts, and learning. Once trained in this fashion, the CNN can be fed other images, and it will ‘paint’ them in the style of the image it has learnt, using a process known as Neural Style Transfer. The source images are photos we took whilst living in Bhutan, a Buddhist kingdom on the Himalayas’ eastern edge.
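In Neural Style Transfer, ‘style’ is usually captured as Gram matrices of the CNN’s feature maps, i.e. the correlations between channels; optimisation then pushes the output image’s Gram matrices toward the style image’s. A pure-Python sketch of that statistic on toy features (the real thing is computed over deep VGG-style activations, not hand-written lists):

```python
def gram_matrix(features):
    """Gram matrix of a set of CNN feature maps.

    features: list of C channels, each a flat list of H*W activations.
    Returns the C x C matrix of mean channel co-activations, which
    encodes which 'textures' fire together while discarding where
    they fire, i.e. the style without the layout.
    """
    n = len(features[0])
    return [[sum(a * b for a, b in zip(fi, fj)) / n for fj in features]
            for fi in features]

# Two tiny toy 'channels' that always fire together, so their
# off-diagonal Gram entry matches the diagonal ones.
feats = [[1.0, 0.0, 1.0, 0.0],
         [1.0, 0.0, 1.0, 0.0]]
G = gram_matrix(feats)
```

The style loss is then just the squared difference between the output's and the style image's Gram matrices, summed over a few network layers.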

We’ve also done some pretty cool R&D into using the same methods with real-time video.

Click below to see the full gallery.

Read more…

Engram: Data Sculpture for Melting Memories

From February 7 through March 17, 2018, Pilevneli Gallery presented Refik Anadol’s latest project on the materiality of remembering. Melting Memories offered new insights into the representational possibilities emerging from the intersection of advanced technology and contemporary art. By showcasing several interdisciplinary projects that translate the elusive process of memory retrieval into data collections, the exhibition immersed visitors in Anadol’s creative vision of “recollection.”

We wrote custom software transposing EEG data into procedural noise forms, a really engaging challenge both technically and conceptually.
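The actual pipeline is the studio's own, but the broad shape of EEG-driven visuals, estimating band power from a window of samples and squashing it into a noise parameter, can be sketched. The band edges, sample rate and mapping below are illustrative assumptions, not the installation's values:

```python
import cmath
import math

def band_power(signal, sample_rate, lo_hz, hi_hz):
    """Naive DFT power in the frequency band [lo_hz, hi_hz).

    Fine for a sketch; a real system would use a windowed FFT
    over a sliding buffer of EEG samples.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * sample_rate / n
        if lo_hz <= freq < hi_hz:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2 / n
    return power

def turbulence_amount(signal, sample_rate=128):
    """Map alpha-band (8-13 Hz) power to a noise amplitude in [0, 1).

    The squashing function is an arbitrary illustrative choice; the
    result would drive something like a noise field's displacement.
    """
    alpha = band_power(signal, sample_rate, 8, 13)
    return alpha / (alpha + 1.0)
```

Per frame, a value like this could modulate the amplitude or frequency of the procedural noise deforming the sculpture's geometry.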

Read more…


Made in vvvv & HLSL, this program runs in full HD at 120fps on a GTX 1080 graphics card.

This capture was featured as a short film at the 2017 Prix Ars Electronica in Linz, Austria.  Music & Coding by Kyle McLean.


On 08 Dec 2016 | By Kyle

Siemens- Art of Intelligence

Data consists of valuable information that enhances people’s lives, and it has so many stories to tell. Refik Anadol was commissioned by Europe’s largest engineering company, Siemens, to create a data-driven public art installation.

You can see some great examples of data sculpture in his other works, and we were very happy to help with this delivery. Our brief: create specialist software enabling large amounts of real mobility data to be translated into a stunning, tangible piece of art that we can all enjoy.

We really enjoyed collaborating with his studio, and the result was very well received by both the client and audiences.

For more information & video:
