Interactive experiences for a neural-interface armband that uses electromyography (EMG) to measure the electrical activity produced by our skeletal muscles.
Note: Device form factor and Client require anonymity due to NDA
Platform: EMG armband / Unity / Kinect
Team Size: 6
Role: Programmer
Development Period: 15 Weeks
Client: NDA
Introduction
NeuroACT is a series of interactive experiences designed for a cutting-edge neural-interface device. Operating on electromyography (EMG) technology that measures the electrical activity produced by our skeletal muscles, this wearable device lets us interact with computer applications through thin air. We used machine learning to develop prototypes in three specific areas that showcase the device's strengths: gaming, training, and location-based entertainment. We believe our designs shine a light on what was once considered science fiction and enable guests to experience the device in the most fun and meaningful way.
Design Guidelines
Tell users what to do.
Experiencing a new piece of technology can be daunting to new users. Use direct text, visual, and audio cues for instructions.
Provide affordances.
Use props to create natural, comfortable, and consistent hand interactions.
Have a resting surface.
Extending the arms for continuous in-air interactions creates muscle tension after two minutes. Sorry, Minority Report.
Where are my virtual hands?
Visually show the users' virtual hands to build a strong correlation between what's happening in the physical world and what's happening in the digital experience. For example, use a 2D/3D skeletal rendering or a contextual rendering of an embodied character's hand.
Delight users with data.
Showing users a live signal board of the EMG data can be an effective way to communicate how the device works.
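As a rough illustration of what such a signal board can look like, here is a minimal Python sketch that scrolls one trace per channel. The read_emg_frame() function is a hypothetical stand-in for the armband SDK's streaming call, and the eight-channel count is an assumption, not a detail from the device.

```python
# Minimal sketch of a live EMG "signal board": one scrolling trace per channel.
# read_emg_frame() is a hypothetical stand-in for the armband SDK; here it
# returns random values so the sketch runs on its own.
from collections import deque
import random

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

NUM_CHANNELS = 8      # assumed channel count
WINDOW = 200          # samples kept on screen per channel

def read_emg_frame():
    # Placeholder: replace with the device SDK's streaming call.
    return [random.gauss(0.0, 1.0) for _ in range(NUM_CHANNELS)]

buffers = [deque([0.0] * WINDOW, maxlen=WINDOW) for _ in range(NUM_CHANNELS)]

fig, axes = plt.subplots(NUM_CHANNELS, 1, sharex=True, figsize=(6, 8))
lines = []
for ch, ax in enumerate(axes):
    (line,) = ax.plot(range(WINDOW), list(buffers[ch]))
    ax.set_ylim(-4, 4)
    ax.set_ylabel(f"ch {ch}")
    lines.append(line)

def update(_frame):
    # Append the newest sample to each channel buffer and redraw the traces.
    sample = read_emg_frame()
    for ch, value in enumerate(sample):
        buffers[ch].append(value)
        lines[ch].set_ydata(list(buffers[ch]))
    return lines

anim = FuncAnimation(fig, update, interval=20, blit=False)
plt.show()
```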
Recommend proven gestures.
Recommend gestures that are proven to work most effectively. Additionally, give experienced users the flexibility to explore their own gestures once they are familiar with the technology.
Know who your users are.
Suggested gestures might have an unexpected connotation in certain cultures, or might be difficult for some users to perform.
Less is more.
A few gestures done well can go a long way in creating a fun and meaningful experience.
Make training fun.
The training phase may or may not be part of the actual gameplay. If the raw training data is accessible, give experienced users the ability to skip training and go straight into the actual experience. Unfortunately, with our current hardware and SDK, we have to retrain the device every time the application is restarted.
Responsive feedback is key.
Use paper prototypes to learn how long it takes users to follow the instructions. Keep users updated on the status of their tasks.
Celebrate a good performance.
Enhance the gameplay experience by rewarding users with exaggerated audio and visual feedback.
Design around high latency issues.
Avoid time-based experiences if the device's technical constraints prevent it from delivering responsive feedback.
My Contributions
I was a programmer and prototyper on this project. NeuroACT was an exploratory project in which the team built several prototypes to understand the armband and find use cases for it. Working with the design team, my responsibilities included:
Prototyping various experiences that use different hand gestures and building learning models for them, which allowed us to narrow down the gestures the device worked with best (see the classification sketch after this list).
Providing feedback on the documentation and the device API.
Troubleshooting network and data transfer issues with the device.
Creating interfaces for the device API to work with other technologies such as the Kinect and the web (see the bridge sketch after this list).
Assisting in designing experiences for the armband that take into account the training and calibration time for every user.
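The device SDK's own models were a black box to us, but the kind of offline gesture narrowing mentioned above can be sketched roughly as below. It assumes labelled (window, gesture) pairs of raw EMG data are available; the per-channel RMS features, the random-forest classifier, and the synthetic data are all illustrative choices, not the device's actual pipeline.

```python
# Rough sketch of gesture classification from windowed EMG data, assuming the
# raw training data is available as (window, label) pairs. Feature choice
# (per-channel RMS) and the classifier are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def rms_features(window):
    """window: (samples, channels) array -> one RMS value per channel."""
    return np.sqrt(np.mean(np.square(window), axis=0))

def train_gesture_model(windows, labels):
    # Build the feature matrix, hold out a test split, and report accuracy.
    X = np.stack([rms_features(w) for w in windows])
    y = np.array(labels)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
    return model

if __name__ == "__main__":
    # Synthetic example: 8-channel windows of 50 samples, three candidate
    # gestures. Real data would come from the armband's training recordings.
    rng = np.random.default_rng(0)
    gestures = ["fist", "spread", "pinch"]
    windows = [rng.normal(scale=1.0 + i % 3, size=(50, 8)) for i in range(300)]
    labels = [gestures[i % 3] for i in range(300)]
    train_gesture_model(windows, labels)
```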
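The Kinect and web bridges mentioned above roughly followed a pattern like the one sketched below: recognized gestures are re-broadcast as JSON over UDP so other runtimes can consume them without linking the SDK directly. The on_gesture callback, port numbers, and event fields are hypothetical stand-ins for the real integration.

```python
# Minimal sketch of a bridge between the armband SDK and other runtimes
# (the Unity experience, the Kinect host, a small web relay). Gesture events
# are re-broadcast as JSON over UDP; all names and ports are illustrative.
import json
import socket
import time

CLIENTS = [("127.0.0.1", 9001),   # e.g. the Unity experience
           ("127.0.0.1", 9002)]   # e.g. a web relay

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def broadcast_gesture(name, confidence):
    """Forward one recognized gesture to every registered client."""
    event = {"gesture": name, "confidence": confidence, "t": time.time()}
    payload = json.dumps(event).encode("utf-8")
    for address in CLIENTS:
        sock.sendto(payload, address)

# The real callback would come from the device SDK; this stands in for it.
def on_gesture(name, confidence):
    broadcast_gesture(name, confidence)

if __name__ == "__main__":
    on_gesture("fist", 0.92)  # smoke test
```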
Final Prototypes
Sound Mosaic - An Interactive Music Synthesizer
Sound Mosaic is an experimental interactive application that uses the EMG armband to control sound parameters and the Kinect to track the user's position on the CAVE floor, adding new layers to the music track as they move. The CAVE is a three-sided projection screen with a walkable space at its center.
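To make the control scheme concrete, here is an illustrative Python sketch (not the shipped Unity code) of how floor position and muscle activity could be mapped to the mix. The 3x3 layer grid, the contraction-level input, and the filter-cutoff mapping are assumptions made for illustration.

```python
# Illustrative sketch of Sound Mosaic's control mapping: the Kinect's floor
# position picks which music layers are active, and a hypothetical EMG
# "contraction level" drives a continuous sound parameter.
from dataclasses import dataclass

@dataclass
class MixState:
    active_layers: list
    filter_cutoff_hz: float

# The CAVE floor is treated as a 3x3 grid; each cell unlocks one extra layer.
LAYER_GRID = [
    ["drums", "bass", "pads"],
    ["arp", "lead", "choir"],
    ["fx", "perc", "drone"],
]

def layers_for_position(x, z, floor_size=3.0):
    """Map an (x, z) position in metres on the CAVE floor to music layers."""
    col = min(int(x / floor_size * 3), 2)
    row = min(int(z / floor_size * 3), 2)
    return ["base", LAYER_GRID[row][col]]

def cutoff_for_contraction(level):
    """Map an EMG contraction level in [0, 1] to a low-pass cutoff frequency."""
    level = max(0.0, min(1.0, level))
    return 200.0 + level * 7800.0  # 200 Hz (relaxed) .. 8 kHz (full squeeze)

def update_mix(kinect_x, kinect_z, emg_level):
    return MixState(layers_for_position(kinect_x, kinect_z),
                    cutoff_for_contraction(emg_level))

print(update_mix(kinect_x=1.2, kinect_z=2.6, emg_level=0.7))
```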