This hackathon is presented as a Studio at TEI 2021, the 15th ACM International Conference on Tangible, Embedded and Embodied Interaction.
10 – 14 February 2021
You are invited to a 5-day virtual hackathon bringing together artists, dancers and designers to explore movement interaction design and create prototypes using InteractML, a new interactive machine learning tool.
As immersive technologies are increasingly adopted by artists, dancers and developers in their creative work, there is a demand for tools and methods for designing compelling embodied interaction within virtual environments. Interactive Machine Learning allows creators to quickly and easily implement movement interaction in their applications by performing examples of movement to train a machine learning model.
A key aspect of this training is providing appropriate movement data features that allow a machine learning model to accurately characterise a movement and then recognise it in incoming data. We explore methodologies that aim to support creators’ understanding of movement feature data in relation to machine learning models, and we ask how these models might inform creators’ understanding of their own movement.
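To make the idea of a movement feature concrete, the short Python sketch below turns raw tracker positions into a compact feature vector that a model could learn from. It is an illustrative sketch only: InteractML itself is a C# toolkit for Unity3D, and the specific features shown (head-relative hand position and hand velocity) are assumptions chosen for the example, not a description of InteractML’s feature set.

```python
import numpy as np

def movement_features(head_pos, hand_pos, prev_hand_pos, dt):
    """Convert raw tracker positions into a small feature vector.

    Expressing the hand relative to the head (rather than in world
    coordinates) makes the feature independent of where the performer is
    standing; the velocity term captures how quickly the movement is made.
    """
    head = np.asarray(head_pos, dtype=float)
    hand = np.asarray(hand_pos, dtype=float)
    prev = np.asarray(prev_hand_pos, dtype=float)
    relative_position = hand - head   # where the hand is, body-centred
    velocity = (hand - prev) / dt     # how fast it is moving
    return np.concatenate([relative_position, velocity])

# One frame of made-up tracking data: head and right hand, sampled at 90 Hz.
features = movement_features(
    head_pos=[0.0, 1.6, 0.0],
    hand_pos=[0.3, 1.2, 0.4],
    prev_hand_pos=[0.28, 1.18, 0.38],
    dt=1 / 90,
)
print(features)  # six numbers describing this frame of movement
```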
This workshop is being delivered as part of a project funded by the UK Engineering and Physical Sciences Research Council, 4i: Immersive Interaction Design for Indie developers using Interactive machine learning.
Activities
Introduction:
This workshop presents a project developing a new immersive tool that uses interactive machine learning to recognise and implement complex movement interaction designs. We will introduce our project activities, team, ethos and research so far.
Embodied Sketching for Movement Interaction Exercise:
We will run an interactive session using our embodied ideation technique based on embodied sketching: designing movement interaction through movement (Márquez Segura et al. 2016). Rather than designing on paper, we will collaborate using our own body movements to explore new ways of interacting, developing them through physical “bodystorming” and “sketching” them with our bodies by acting them out.
Interactive Machine Learning Presentation and InteractML Demo:
Interactive Machine Learning uses machine learning technologies within an iterative design process that allows for rapid prototyping and refinement. This makes it possible to design and implement movement interaction by performing body movements. We will present InteractML, the software toolkit we are developing to support interactive machine learning and movement interaction design in Unity3D (Gonzalez Diaz et al. 2019).
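As a rough, toolkit-agnostic illustration of that iterative loop, the Python sketch below uses a nearest-neighbour classifier in place of InteractML’s own machinery: the creator performs labelled examples, trains, tests the model on new movement, and simply performs more examples whenever something is misrecognised. The feature vectors and labels are made up for the example.

```python
# Toy version of the interactive machine learning loop: perform examples,
# train, test, and refine by performing more examples rather than editing code.
# The 2-D vectors stand in for real movement features.
from sklearn.neighbors import KNeighborsClassifier

def train(examples):
    model = KNeighborsClassifier(n_neighbors=1)
    model.fit([f for f, _ in examples], [label for _, label in examples])
    return model

# Round 1: a couple of performed examples per movement.
examples = [([0.1, 0.9], "wave"), ([0.2, 0.8], "wave"),
            ([0.9, 0.1], "reach"), ([0.8, 0.2], "reach")]
model = train(examples)

# Test on an unusual "wave" that the first model gets wrong.
tricky_wave = [0.6, 0.5]
print(model.predict([tricky_wave])[0])  # prints "reach" -- misrecognised

# Round 2: perform more examples of the problem movement, retrain, test again.
examples += [([0.55, 0.55], "wave"), ([0.6, 0.6], "wave")]
model = train(examples)
print(model.predict([tricky_wave])[0])  # now prints "wave"
```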
Prototype building with support from the team on our Discord server:
Days 3 and 4 of the workshop will take the form of a hands-on prototype-building session. With full support from the team on our dedicated Discord server, you will have the opportunity to implement your movement interaction designs using our InteractML tool.
Reflection and group discussion:
The workshop will end with an opportunity to reflect on the approach and discuss its application to creative immersive applications. Participants who would like to continue with this approach will then be given access to our tools and long-term support.
Schedule
10 Feb, Day 1
2-hour interactive session on Microsoft Teams and in VR
- Welcome and introductions
- Bodystorming activity
- Interactive Machine Learning presentation and InteractML demo
Join the session that suits your time zone. Morning session: 10am CET (9am GMT) or Afternoon session: 7pm CET (6pm GMT)
11 Feb, Day 2
Offline activities with communication and materials distributed on our InteractML Discord Server
- Simple training exercises in a learning scene in a special Unity project
- Facilitated group formation and idea development
- 5-minute presentations for feedback on Microsoft Teams, to be organised directly between organisers and participants
12–13 Feb, Days 3–4
Asynchronous prototype development, with ongoing support from us on our Discord Server
14 Feb, Day 5
3-hour interactive session on Microsoft Teams
- Prototype presentations and feedback (2 hours)
- Discussion session (1 hour)
- Reflection on the interactive machine learning approach and designing embodied interaction.
- Issues relating to movement features and the shared understanding of movement between machine learning models and humans.
Join the session that suits your time zone. Morning session: 10am CET (9am GMT) or Afternoon session: 5pm CET (4pm GMT)
Participation Pre-requisites
- Access to a cabled VR HMD and a computer capable of supporting VR interaction (an Oculus Quest can be used if a Link cable is set up and tested beforehand).
- Unity 2019.4.11f1 installed, and at least beginner-level experience with Unity.
Registration
Please register your interest using this form and we will send you an access code and further instructions to sign-up via the conference website. The registration fee for this studio is €30.
Questions? Email Nicola Plant.
Organisers
Nicola Plant is a new media artist, researcher and developer, currently working as a qualitative researcher on the 4i project at Goldsmiths, University of London. Nicola holds a PhD in Computer Science from Queen Mary University of London, focusing on embodiment, non-verbal communication and expression in human interaction. Her artistic practice specialises in movement-based interactivity and motion capture, and she creates interactive artworks exploring expressive movement within VR.
Clarice Hilton is the lead developer on the 4i project at Goldsmiths, University of London. She has previously worked as a Creative Technologist, Artist and Researcher in immersive technology. In her interdisciplinary practice she collaborates with filmmakers, dance practitioners, theatre makers and other artists to explore participatory and embodied experiences.
References
Gillies, M. (2019) ‘Understanding the Role of Interactive Machine Learning in Movement Interaction Design’, ACM Transactions on Computer-Human Interaction, 26(1), pp. 1–34. doi: 10.1145/3287307.
Gonzalez Diaz, C., Perry, P. and Fiebrink, R. (2019) ‘Interactive Machine Learning for More Expressive Game Interactions’, Proceedings of the IEEE Conference on Games. London, UK.
Höök, K. (2018) Designing with the Body – Somaesthetic Interaction Design. MIT Press. doi: 10.5753/ihc.2018.4168.
Márquez Segura, E. et al. (2016) ‘Embodied Sketching’, in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 6014–6027. doi: 10.1145/2858036.2858486.