Sit back, put your earphones on and breathe!
A while ago I was exploring the concept of “mindful interactions”, and I thought I would share it here.
This concept aims to help people build awareness by training the ability to voluntarily pay attention. At the simplest level, the user only needs to breathe along with the animation, following the timing the animation itself suggests.
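As a rough sketch of that first step, the pulsing animation could be driven by a fixed breathing rhythm. The 4-second inhale / 4-second exhale timing and the scale range here are my own illustrative assumptions, not the values from the original prototype:

```javascript
const INHALE_MS = 4000; // assumed inhale duration, not the prototype's value
const EXHALE_MS = 4000; // assumed exhale duration
const CYCLE_MS = INHALE_MS + EXHALE_MS;

// Map a timestamp (ms) to a scale factor between 1 (fully exhaled)
// and 1.5 (fully inhaled), rising then falling over one cycle.
function breathScale(t) {
  const phase = t % CYCLE_MS;
  const progress = phase < INHALE_MS
    ? phase / INHALE_MS                    // growing during the inhale
    : 1 - (phase - INHALE_MS) / EXHALE_MS; // shrinking during the exhale
  return 1 + 0.5 * progress;
}

// In the browser, this would drive a circle via requestAnimationFrame:
// (function tick(t) {
//   circle.style.transform = `scale(${breathScale(t)})`;
//   requestAnimationFrame(tick);
// })(0);
```

The pure `breathScale` helper keeps the rhythm separate from the rendering, which also makes it reusable for the later steps.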
As a second step, the app recognises the user’s breathing pattern and helps them stay aware while slowly matching it to the initial animation. Since access to getUserMedia (for microphone input) is still very limited for prototyping voice interfaces, especially on iPhone, I had to simulate it. By switching to 3D Touch, I could at least dynamically communicate how it would sound and feel.
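Once getUserMedia is reliably available, the microphone step could be approximated by sampling the mic through an AnalyserNode and treating loudness (RMS) as a rough proxy for breath intensity. This is a hedged sketch, not the prototype’s implementation; the FFT size is an arbitrary choice:

```javascript
// Pure helper: root-mean-square level of a time-domain audio buffer.
function rms(samples) {
  let sum = 0;
  for (const s of samples) sum += s * s;
  return Math.sqrt(sum / samples.length);
}

// Browser-only wiring (needs a secure context and user permission):
// async function listenToBreath(onLevel) {
//   const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
//   const ctx = new AudioContext();
//   const analyser = ctx.createAnalyser();
//   analyser.fftSize = 2048; // assumed buffer size
//   ctx.createMediaStreamSource(stream).connect(analyser);
//   const buf = new Float32Array(analyser.fftSize);
//   (function poll() {
//     analyser.getFloatTimeDomainData(buf);
//     onLevel(rms(buf)); // e.g. compare against the animation's phase
//     requestAnimationFrame(poll);
//   })();
// }
```

Comparing the RMS curve against the animation’s phase would be one way to tell whether the user’s breathing is drifting toward or away from the suggested rhythm.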
In the third and last step, the user can finally close their eyes and breathe along with a vibration pattern they feel. Here I had to fake it once again with sound feedback (a sine wave via the Web Audio API), since there was no way to access the vibration motors. It’s vibrating, though!
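The “fake vibration” could look something like this: short low-frequency sine bursts from the Web Audio API standing in for haptic pulses. The 50 Hz frequency and the pulse/gap timings are illustrative assumptions on my part:

```javascript
// Pure helper: turn an inhale phase into [startMs, durationMs] pulse pairs,
// pulsing during the inhale and staying silent during the exhale.
function vibrationPattern(inhaleMs, pulseMs, gapMs) {
  const pulses = [];
  for (let t = 0; t < inhaleMs; t += pulseMs + gapMs) {
    pulses.push([t, pulseMs]);
  }
  return pulses;
}

// Browser-only playback of a single pulse with an OscillatorNode:
// function playPulse(ctx, startSec, durSec) {
//   const osc = ctx.createOscillator();
//   osc.type = "sine";
//   osc.frequency.value = 50; // a low tone feels closest to a buzz
//   osc.connect(ctx.destination);
//   osc.start(startSec);
//   osc.stop(startSec + durSec);
// }
```

Keeping the pattern as plain data means the same schedule could later drive real motors, if an API for them ever becomes accessible.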
This is no code-wizardry. Happy to hear thoughts and tips.