WatchKit is imminent and I can’t wait to dig into the details and think about what’s possible.
In the meantime, I’ve been playing around with Form—a prototyping tool—and an Xcode project to start getting a better idea of what it’ll be like to make apps for the wrist. Although there are already a range of smartwatches out there and it might seem a bit ridiculous to strap an iPhone to your wrist, I wanted to use the tools I’m comfortable with to experiment and prototype. In short, I want to run Swift on my wrist.
Here are some things I’ve noticed:
Twist & tap: Twisting your wrist around for extended periods can get uncomfortable. On a normal watch, you simply glance at the time and move your arm back to where you want it. On my prototype, I found myself messing around with the information on the screen for a long time and it made my wrist hurt. I think it’s interesting that when people adjust the time on their watches, they often remove them first. Tapping and swiping and reading things on your wrist at that angle doesn’t feel natural and can be a strain. My guess is that successful apps will not be dependent on long sessions—and by long I mean more than 30 seconds—and will allow people to return their wrist to a relaxed state as soon as possible. I think this could also be why WatchKit’s staged rollout will be a benefit to developers.
Inside or outside: As a result of the twisting strain, I wonder if this might lead to more people wearing it on the inside of their wrist. If so, accessory manufacturers might take this into account when they start releasing a range of straps for the device.
Glance angles: In contrast to the iPhone, I often look at my prototype’s screen from an oblique angle instead of straight at it. The effect is that the screen often forms a diamond or a heavily skewed rectangle, depending on where I’m holding my arm.
Right or left: Designing for right- and left-handed people could become important. I’m guessing that WatchKit will have an API for which wrist the device is being worn on; if not, it’ll be possible to work it out from the gyroscope. Having worn my prototyping tool for a few hours on each wrist and played with some of the toy apps I’ve created, I found myself wanting to tailor the interface slightly for the different finger and thumb orientations. People who wear it on the left wrist will have their right hand’s thumb closer to the left side of the screen, and vice versa.
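Here’s a minimal sketch of the kind of layout tweak I mean. To be clear, the `WristSide` type and the idea of setting it manually (or inferring it from gyroscope data) are my assumptions; nothing here is a real WatchKit API:

```swift
import UIKit

enum WristSide {
    case left   // watch worn on the left wrist, operated by the right hand
    case right  // watch worn on the right wrist, operated by the left hand
}

// Hypothetical: WatchKit may expose this directly; if not, it could be
// inferred from the gyroscope or asked for during setup.
let wristSide: WristSide = .left

func layoutPrimaryButton(_ button: UIButton, in container: UIView) {
    // Keep the primary control on the edge nearest the operating thumb:
    // the right hand's thumb falls toward the left edge, and vice versa.
    let margin: CGFloat = 8
    let x: CGFloat
    switch wristSide {
    case .left:
        x = margin
    case .right:
        x = container.bounds.width - button.bounds.width - margin
    }
    button.frame.origin = CGPoint(
        x: x,
        y: container.bounds.height - button.bounds.height - margin
    )
}
```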
Forces: In the introductory video, Jony Ive explained that “tiny electrodes around the display recognise the difference between a tap and a press”. Designing for these different forces is an Apple Watch-specific consideration. To experiment with this, I measure how long my finger is held on the screen; if it’s over a second, I pop up some new menu items. I think it won’t take long for people to get used to this new input method. (Random idea: perhaps this technology will be added to the iPhone 7? It could be used to reveal extra controls and make single-handed use easier.)
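In the iPhone prototype I approximated the tap/press distinction with a plain long-press gesture. This is just my stand-in for whatever the Watch actually does under the hood; `showContextMenu(at:)` is a placeholder name of mine:

```swift
import UIKit

class PressViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Treat a press held for more than a second as a "force" press,
        // standing in for the Watch's tap-vs-press distinction.
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handlePress(_:)))
        press.minimumPressDuration = 1.0
        view.addGestureRecognizer(press)
    }

    @objc func handlePress(_ recognizer: UILongPressGestureRecognizer) {
        guard recognizer.state == .began else { return }
        // Placeholder for whatever extra controls the prototype reveals.
        showContextMenu(at: recognizer.location(in: view))
    }

    func showContextMenu(at point: CGPoint) {
        print("Reveal extra menu items at \(point)")
    }
}
```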
Rotate to zoom: The primary Watch-specific input method presented in the keynote is the digital crown. Apps on the iPhone fill the screen and the home button brings you back out; the crown seems to have a similar high-level function, letting you zoom into an app from the center of the app honeycomb. The Maps demo in the video showed a user zooming into the app and then zooming down into the map itself. If there’s wiggle room, I think there’s going to be lots of experimentation here. What I’m curious to find out is whether zooming out must always close an app.
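There’s obviously no crown on an iPhone, so in my prototype I faked it with a two-finger rotation gesture driving a zoom transform. The clamping range, the rotation-to-scale mapping, and the “map” image asset are all arbitrary choices of mine, not anything from Apple:

```swift
import UIKit

class CrownZoomViewController: UIViewController {
    // The zoomed content; in the Maps demo this would be the map itself.
    let content = UIImageView(image: UIImage(named: "map")) // illustrative asset
    var zoomScale: CGFloat = 1.0

    override func viewDidLoad() {
        super.viewDidLoad()
        content.frame = view.bounds
        view.addSubview(content)

        // A two-finger rotation stands in for the digital crown:
        // clockwise zooms in, anticlockwise zooms out.
        let rotate = UIRotationGestureRecognizer(target: self,
                                                 action: #selector(handleRotation(_:)))
        view.addGestureRecognizer(rotate)
    }

    @objc func handleRotation(_ recognizer: UIRotationGestureRecognizer) {
        switch recognizer.state {
        case .changed:
            // Map the rotation (radians) onto a zoom factor around 1.0,
            // clamped so you can't zoom out past half size or in past 4x.
            let scale = max(0.5, min(4.0, zoomScale * (1 + recognizer.rotation / .pi)))
            content.transform = CGAffineTransform(scaleX: scale, y: scale)
        case .ended:
            zoomScale = content.transform.a // persist the accumulated scale
        default:
            break
        }
    }
}
```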
I’m really excited for the Watch.