This is a read-only archive of the Framer Community on Facebook.

Chris McDonnell
Posted Aug 08

(Sound on for the video)

Hey Framerers! The Tesla Model 3's UI creates a lot of exciting new possibilities for interaction design. I haven't yet seen a video of how each interaction works, but seeing the volume bit of Andrew Goodlad's (https://twitter.com/Ichorus) mockups made me think that interaction could be improved.

When Autopilot is on, I imagine you can't use the steering wheel's pair of scroll wheels to adjust volume, since the wheel is in motion. You may not want to use voice if a gesture is quicker, if you're having a conversation or trying to hear the audio, or if you need to be precise with the volume level. Most of all, you want to be able to do this without taking your attention off the road.

What if you could reach over and make a quick rotation gesture (as if you're spinning a dial with 2+ fingers) anywhere on the screen? This prototype simulates that.
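
Roughly, the idea in Framer terms: listen for the two-finger rotate gesture and map its angle onto a volume level. This is only a sketch, not the actual prototype code; the readout layer, the 90-degrees-per-full-range mapping, and the event details (Events.Rotate, event.rotation) are my assumptions about Framer's gesture API, so double-check them against the docs.

# Sketch: map a two-finger rotate gesture to a volume level.
# Framer Classic CoffeeScript; event.rotation is assumed to be the
# gesture's accumulated rotation in degrees.

volume = 50          # starting volume, 0-100
startVolume = volume

bg = new BackgroundLayer
    backgroundColor: "#111"

readout = new Layer
    width: 320
    height: 80
    backgroundColor: "transparent"
    html: "Volume: #{volume}"
readout.style =
    color: "white"
    fontSize: "40px"
    textAlign: "center"
    lineHeight: "80px"
readout.center()

bg.on Events.RotateStart, ->
    startVolume = volume

bg.on Events.Rotate, (event) ->
    # 90 degrees of twist spans the full 0-100 range (tune to taste)
    volume = Utils.clamp(startVolume + (event.rotation / 90) * 100, 0, 100)
    readout.html = "Volume: #{Math.round(volume)}"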

See attached video (sound on), or try it yourself!
🚘 Prototype: https://framer.cloud/YOXSP/
🚘 Note: Haven't tried this on an iPad or other touch device yet. The video is a recording of interacting with it in a desktop Chrome browser, holding down the Option key to create two touch points.
🚘 I must confess the "anywhere on the screen" part is smoke and mirrors: to build the prototype faster, I defined three points on the screen where you can make this interaction. But if you know how to make the dial follow your touch, I would love to learn how to do that! (My best guess so far is sketched after this list.)
🚘 Would love your feedback and critique
🚘 Does anyone have a Dribbble invite?
🚘 Add me on LinkedIn (https://www.linkedin.com/in/christophermcd/)
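
On the "dial follows your touch" idea from the list above: the closest I've sketched out is to position the dial wherever the rotate gesture starts, instead of at a few fixed points. Again, this is just a rough sketch and not what's in the prototype; the dial layer is made up, and I'm assuming the gesture event exposes a screen-space point (event.point), so the property names may need checking.

# Sketch: let the dial appear under the fingers instead of at fixed spots.
# Assumes the rotate gesture event carries a screen-space point (event.point).

bg = new BackgroundLayer
    backgroundColor: "#111"

dial = new Layer
    width: 240
    height: 240
    borderRadius: 120
    backgroundColor: "rgba(255,255,255,0.15)"
    visible: false

startRotation = 0

bg.on Events.RotateStart, (event) ->
    # Center the dial wherever the two-finger gesture begins
    dial.visible = true
    dial.midX = event.point.x
    dial.midY = event.point.y
    startRotation = dial.rotation

bg.on Events.Rotate, (event) ->
    # Spin the dial with the gesture; the same angle can drive the volume
    dial.rotation = startRotation + event.rotation

bg.on Events.RotateEnd, ->
    dial.visible = false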

7 Comments

Melisa Masso Navarro

Vanessa Osorio Monica Osorno

Holly Jade Chan

Awesome, Chris!

Looks beautiful.

Agreed about the smoke and mirrors for discovering the interaction, but it could be learned, while still having a small UI element that expands into a volume control on the screen. (So, two entry points for adjusting volume.)

One thought: the spinning motion of the gesture seems to require more degrees of rotation than a wrist can ergonomically manage (i.e. if you try to mimic the gesture, your wrist can't quite spin that far, unless I'm misinterpreting the gesture?)

Again, nice work! Fun to see.

Yujin Gu

Kinsey Wang

Marc Krenn

Strongly reminds me of Matthaeus Krenn's car interface:

https://www.youtube.com/watch?v=XVbuk3jizGM

Wouldn't be surprised if this interaction paradigm ends up in Project Titan in some form or another.

Chris McDonnell

Thanks for sharing Marc, that's an interesting concept. It definitely addresses a problem, although it might create a new one if the cognitive load of learning and remembering how many fingers each action needs is too much for the driver. Interesting and well-executed exploration though. Any idea how I could update my prototype to have the dial follow my touch position? The way I have it set up now, I'm sort of simulating it with a few dials distributed around the screen, but having one dial that follows the touch, kind of like in this prototype, would be more realistic.

Dezideriu Sorin Raita

Not so bad

Dezideriu Sorin Raita

But how are those made?
