Researchers

Nicola Plant

Nicola Plant is a new media artist, researcher and developer, currently working on a project at Goldsmiths, University of London developing machine learning tools for movement interaction design in immersive media. She holds a PhD in Computer Science from Queen Mary University of London, focusing on embodiment, non-verbal communication and expression in human interaction. Her artistic practice specialises in movement-based interactivity and motion capture, creating interactive artworks exploring expressive movement within VR.

Clarice Hilton

Clarice Hilton is a creative technologist and researcher specialising in Unity and immersive artwork. She is a researcher at the University of London, developing a movement-based tool for intuitively designing interaction in Unity using machine learning. In her interdisciplinary practice she collaborates with filmmakers, dance practitioners, theatre makers and other artists to explore participatory and embodied experiences.

She developed If Not Here… Where?, an interactive puppetry and AR touring show, with The Little Angel and Great Ormond Street Hospital. She was the creative technologist on SOMA, a virtual reality experience by Lisa May Thomas exploring the somatic experience between the physical and the virtual. She worked as a developer on The Collider by Anagram, which has toured internationally at Tribeca, Venice Film Festival, IDFA Doc Lab and Sandbox Immersive Festival (where it won best immersive artwork) and was named one of 2019’s top immersive experiences by Forbes. She previously taught Interactive Storytelling and Unity at UCL on the Immersive Factual Storytelling course.

Carlos Gonzalez Diaz

Carlos Gonzalez Diaz is a PhD candidate in the Intelligent Games and Games Intelligence (IGGI) Centre for Doctoral Training, run jointly by the University of York, Goldsmiths and Queen Mary University of London. His research focuses on how the use of interactive machine learning in the design of movement interactions for virtual reality games can affect the player experience.

During his PhD, Carlos has collaborated with Sony Interactive Entertainment R&D, researching the PSVR game system, and participated in a Google-funded project to develop an interactive machine learning framework for the Unity3D game engine. He has also recently co-organised academic events, serving as one of the chairs for ACM CHI Play 2019 and IEEE CoG 2019.

Ruth Gibson

Ruth Gibson (PhD RMIT) is Co-Investigator on the 4i Project and the project lead at Coventry University. She is an Associate Professor in Dance and Reader in Somatic Sensing in Virtual Space at the Centre for Dance Research. She has over 20 years of experience in technology and trans-disciplinary research within higher education and industry sectors at a national and international level.

A recipient of awards from institutions and arts organisations including NESTA, AHRC, EPSRC, the Henry Moore Foundation, the Museum of Modern Art Lisbon, the Gulbenkian Foundation, the Barbican, ACE and the BBC, and supported by specialist industry partners including Christie Digital, Vicon, World Viz and Digital Catapult, her movement research examines the application of tacit knowledge as a means of advancing emerging computer graphic imaging (CGI) in performative XR contexts. She has progressed a unique field of practice in motion capture and live performance in virtual settings with her long-term collaborator Bruno Martelli, as Gibson/Martelli.

Bruno Martelli

Bruno Martelli’s practice examines figure and landscape, transposing sites to create ambiguous topographies that explore the relationship between the natural and the artificial. He works with live simulation, performance capture, installation and video to create immersive virtual realities. He holds a doctorate in Immersive Environments from RMIT.

He has been commissioned by Wallpaper, Selfridges, the Henry Moore Foundation, the Barbican and NESTA, and his AHRC projects include ‘Error Network’, ‘Capturing Stillness – visualisations of dance through motion capture technologies’ and ‘Reality Remix’. He led serious gaming projects to create permanent installations at James Cook University Hospital, Middlesbrough for the ‘Healing Arts Project’, and at Ashfield School in Leicester as part of the ‘Building Schools for the Future’ programme. He directed motion capture for an award-winning UNICEF animation, and his artworks have been commissioned by Great Ormond Street Hospital Trust.

Based in London, Bruno collaborates with the artist Ruth Gibson as Gibson/Martelli. Their first work together was BAFTA-nominated, and their ground-breaking ‘MAN A’ project recently won the Lumen Gold Prize.

Phoenix Perry

Phoenix Perry creates physical games and embodied experiences. Her work looks for opportunities to bring people together to raise awareness of our collective interconnectivity. Her current research at Goldsmiths, University of London looks at leveraging our other senses, with a particular focus on sound and skin-based feedback to trigger affective responses. A consummate advocate for women in game development, she founded the Code Liberation Foundation, an organisation that teaches women to program games for free. Since starting in 2012, the project has reached over 3,000 women between the ages of 16 and 60 in the New York and London areas. Fostering professional growth and mentoring new leaders in the field, she strives to infuse the industry with new voices. Presently, she leads an MSc and a BSc in Creative Computing at UAL’s Creative Computing Institute.

Her speaking engagements include A MAZE, GDC, Games for Change, The Open Hardware Summit, IndieCade, Comic Con, Internet Week, Create Tech, IBM Dev Pulse, Montreal International Games Summit and NYU Game Center, among others. Perry’s creative work spans a wide range of disciplines, including drawing, generative art, video, games, interfaces and sound. Her projects have been shown worldwide at venues and festivals including GDC, E3, Come Out and Play, Maker Faire at the New York Hall of Science, Lincoln Center, Transmediale, Yerba Buena Center for the Arts, LAMCA, Harvestworks, Babycastles, European Media Arts Festival, GenArt and Seoul Film Festival.

Rebecca Fiebrink

Rebecca Fiebrink is a Reader at the Creative Computing Institute at University of the Arts London (primary affiliation) and in Computing at Goldsmiths, University of London. She is the developer of the Wekinator, open-source software for real-time machine learning, and the creator of a MOOC titled “Machine Learning for Musicians and Artists.”

Much of her work is driven by a belief in the importance of inclusion, participation, and accessibility: she works frequently with human-centred and participatory design processes. Current and recent projects include creating new accessible technologies with people with disabilities, designing inclusive machine learning curricula and tools, and applying participatory design methodologies in the digital humanities. Dr Fiebrink has worked with companies including Microsoft Research, Sun Microsystems Research Labs, Imagine Research, and Smule. She has performed with numerous musical ensembles as both an electronic and acoustic musician. She holds a PhD in Computer Science from Princeton University.

Michael Zbyszyński

Michael Zbyszyński is a lecturer in the Department of Computing, where he teaches perception & multimedia computing, live electroacoustic music, and real-time interaction. His research involves applications of interactive machine learning to musical instrument design and performance.

As a musician, his work spans from brass bands to symphony orchestras, including composition and improvisation with woodwinds and electronics. He has been a software developer at Avid, SoundHound, Cycling ’74 and Keith McMillen Instruments, and was Assistant Director of Pedagogy at UC Berkeley’s Center for New Music and Audio Technologies (CNMAT). He holds a PhD from UC Berkeley and studied at the Academy of Music in Kraków on a Fulbright Grant. His work has been featured in Make Magazine and the Rhizome Artbase, and released on the ARTSHIP recording label.

Marco Gillies

Marco Gillies is Principal Investigator on the 4i Project. Marco’s research centres on how we can create technologies that work with embodied, tacit human knowledge. He has many years’ experience of research into how to generate non-verbal communication for animated virtual characters, particularly for social interaction in virtual reality.

His approach focuses on the role actors and performers can play in creating autonomous characters. He has also worked on other forms of immersive experience and embodied interaction, particularly applied to immersive theatre and performance. His recent research has been on human-centred machine learning, in which humans guide machine learning algorithms interactively as a way of making use of tacit human knowledge in artificial intelligence systems.