mobile has moved well beyond a simple tap or drag.  call it the tinder-ization of mobile, but users now routinely swipe, flick, pull, drag, long press, and hard press.  the more i delve into building mobile apps, the more i notice the finer points of how these interactions are -- and can be -- implemented.

there’s nothing quite as telling as handing someone a new build of an app i just completed and seeing what they expect to interact with and how.  from these interactions, plus being an avid downloader of apps, the one constant i’ve noticed is that there is no ‘rule of thumb’ for gestures.  some email clients archive by swiping left.  others by swiping right.

apps routinely go so far as to build in tutorials on how and why you should swipe.  i’ll walk through more examples and ideas throughout this post, but i’ll begin with this -- the basis for how i’ve come to think about mobile gestures.

where web navigation is directional,

mobile gestures are relational.

if you want to go to the bottom of a web page from a computer, you scroll your mouse downward or peck away at the down key.  but if you’re on your phone and want to get to the bottom of an article, you swipe up -- moving away the content you’ve read and pulling the new content into view.

if you want to view the next photo in your scroll, you swipe the current photo out of the way to reveal the next one.

if you want to view a menu or trigger some sub-action, you move an existing view out of the way or drag in a new one.

this primary difference is nearly the opposite of web-era user experience design, but it’s also more natural.  watch a baby play with a phone, and they get it.  how your fingers interact with glass on a screen is almost innate.  so as mobile developers and product people, how do we design for the best (defined as logical + magical) experience?

short answer: follow tinder (which is also how apple defines gestures), don’t buck convention, and pay close attention to context.  

a quick definition before we continue: swiping right means moving your finger from left to right.  if the focus of the interaction is visible, the object should move and/or dismiss to the right.  think swiping away a card on google now, or dismissing a swarm message after your last check-in.

but if the focus of the interaction is not visible, new content should appear from the left.  think scrolling through photos, or moving to the next day or week in your calendar.
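to make that convention concrete, here’s a minimal swift/uikit sketch of the ‘focus is visible’ case -- a card that moves and dismisses to the right when you swipe right.  the view controller and the `cardView` name are mine, purely for illustration, not pulled from any of the apps mentioned.

```swift
import UIKit

// a minimal sketch of the "visible focus" convention: swiping right moves
// the object itself off to the right. names here are hypothetical.
class CardViewController: UIViewController {
    let cardView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        cardView.frame = CGRect(x: 40, y: 120, width: 300, height: 180)
        cardView.backgroundColor = .systemGray5
        view.addSubview(cardView)

        let swipeRight = UISwipeGestureRecognizer(target: self, action: #selector(dismissCard))
        swipeRight.direction = .right
        cardView.addGestureRecognizer(swipeRight)
    }

    @objc func dismissCard() {
        // the focus of the interaction is visible, so it travels in the same
        // direction as the finger and then goes away. if the focus were NOT
        // visible (photos, calendar days), the same swipe right would instead
        // pull the previous item in from the left.
        UIView.animate(withDuration: 0.25, animations: {
            self.cardView.center.x += self.view.bounds.width
        }, completion: { _ in
            self.cardView.removeFromSuperview()
        })
    }
}
```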

getting the direction of a gesture right was something i struggled with during the latest update to airbear.  i wanted to make it easy to move between different days, especially since this is where monetization kicks in.

the airbear feeding home screen

i had different implementations on my phone, showed them to people to see which way they would expect to swipe, and got a range of responses.  most said it didn’t matter, that they’d learn which way to swipe.  throw my natural inclination towards backwardness as a lefty on top, and i was stumped.


so after shipping what i now consider the wrong implementation, i started studying gestures.  i want to build more of them into my apps, but i don’t want users to have to learn something new or, worse, be confused.  as intercom recently wrote about their user testing, they were seeing users swipe left to dismiss (largely because of tinder habits).  even google gets things mixed up, as it tries to buck this swipe-left-to-dismiss convention.

really? swipe right?!?

best practices for winning at mobile gestures

1) context -- does the user have a natural sense that an element is swipable?

without a nudge (or an ugly tutorial), a user won’t know that some nuanced element of your ui will register a gesture.  table views do this by default (think text messages or email).  movement that presents the element can also accomplish this, as can giving a glimpse of further content (such as a new screen or subview blurred by the layer on top).  arguably the only time a tutorial is acceptable is when the ui is so simple that an app needs to orient its users to all of the features that are invisible and discoverable only with swipes, such as what clima -- my favorite weather app -- does.
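as a rough sketch of how little work the table-view case takes, here’s what a trailing swipe action looks like in uikit.  the inbox, the placeholder rows, and the ‘archive’ action are all hypothetical -- the point is that the system gives you the reveal animation and the partial-swipe glimpse for free, so users need no priming.

```swift
import UIKit

// a minimal sketch, not from the post: a plain table view where a trailing
// swipe exposes an archive action. the "inbox"/"archive" naming is mine.
class InboxViewController: UIViewController, UITableViewDataSource, UITableViewDelegate {
    let tableView = UITableView()
    var messages = ["welcome", "receipt", "newsletter"]   // placeholder data

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.frame = view.bounds
        tableView.dataSource = self
        tableView.delegate = self
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "cell")
        view.addSubview(tableView)
    }

    func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        messages.count
    }

    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "cell", for: indexPath)
        cell.textLabel?.text = messages[indexPath.row]
        return cell
    }

    // users already expect table rows to swipe, so no tutorial is needed --
    // the system handles the reveal animation and the partial-swipe glimpse
    func tableView(_ tableView: UITableView,
                   trailingSwipeActionsConfigurationForRowAt indexPath: IndexPath)
        -> UISwipeActionsConfiguration? {
        let archive = UIContextualAction(style: .normal, title: "archive") { _, _, done in
            self.messages.remove(at: indexPath.row)
            tableView.deleteRows(at: [indexPath], with: .left)
            done(true)
        }
        return UISwipeActionsConfiguration(actions: [archive])
    }
}
```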

2) direction -- which way(s) should a gesture be received?

follow convention (tinder and relational, as discussed above), but don’t limit the ways in which a gesture can work.  for example, twitter only lets you dismiss an image vertically, whereas slack recognizes any gesture direction and then pleasantly spins the image out of view.
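here’s a sketch of the ‘any direction works’ idea, using a pan gesture recognizer on a hypothetical full-screen image view.  the 120-point threshold and the fling animation are my own guesses at the behavior, not slack’s actual implementation.

```swift
import UIKit

// a minimal sketch: the image follows the finger in any direction, and a
// long-enough drag flings it off-screen along the same vector.
class ImageViewerController: UIViewController {
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        imageView.isUserInteractionEnabled = true
        view.addSubview(imageView)

        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        imageView.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        let translation = pan.translation(in: view)
        let restingCenter = CGPoint(x: view.bounds.midX, y: view.bounds.midY)
        switch pan.state {
        case .changed:
            // let the image follow the finger, whatever direction it moves
            imageView.center = CGPoint(x: restingCenter.x + translation.x,
                                       y: restingCenter.y + translation.y)
        case .ended:
            if hypot(translation.x, translation.y) > 120 {
                // far enough: fling the image off along the same vector, then dismiss
                UIView.animate(withDuration: 0.25, animations: {
                    self.imageView.center = CGPoint(x: restingCenter.x + translation.x * 4,
                                                    y: restingCenter.y + translation.y * 4)
                }, completion: { _ in self.dismiss(animated: false) })
            } else {
                // not far enough: snap back to where it was
                UIView.animate(withDuration: 0.2) { self.imageView.center = restingCenter }
            }
        default:
            break
        }
    }
}
```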

3) multi-function -- are there options other than gestures?

while users acclimate to expecting more gestures, build in other ways for them to interact with your product.  sure, a gesture might be the best way to move between days in airbear, but the user can still tap on the individual days to move directly.
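a sketch of that dual path -- a swipe recognizer for stepping between days plus plain buttons for jumping directly.  the seven-day strip, `selectedDay`, and `show(day:)` are stand-ins i made up, not airbear’s real code.

```swift
import UIKit

// a minimal sketch: the same day navigation is reachable by gesture or by tap.
class DayFeedViewController: UIViewController {
    var selectedDay = 0
    let dayButtons: [UIButton] = (0..<7).map { _ in UIButton(type: .system) }

    override func viewDidLoad() {
        super.viewDidLoad()

        // gesture path: swipe left/right to step between days
        let next = UISwipeGestureRecognizer(target: self, action: #selector(showNextDay))
        next.direction = .left
        let previous = UISwipeGestureRecognizer(target: self, action: #selector(showPreviousDay))
        previous.direction = .right
        view.addGestureRecognizer(next)
        view.addGestureRecognizer(previous)

        // tap path: the same navigation works without any gesture
        // (layout of the day strip omitted for brevity)
        for (index, button) in dayButtons.enumerated() {
            button.tag = index
            button.addTarget(self, action: #selector(dayTapped(_:)), for: .touchUpInside)
        }
    }

    @objc func showNextDay() { show(day: min(selectedDay + 1, dayButtons.count - 1)) }
    @objc func showPreviousDay() { show(day: max(selectedDay - 1, 0)) }
    @objc func dayTapped(_ sender: UIButton) { show(day: sender.tag) }

    func show(day: Int) {
        selectedDay = day
        // reload the feed for the selected day here (omitted)
    }
}
```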

4) bespoke magic -- can you implement a common gesture in a new way that speaks to your product?  can you move gestures beyond navigation?

especially as long and hard presses become more expected, delight your users with some magic that speaks to your product.  robinhood did this by having users confirm a trade by swiping up.  facebook messenger did this by having the size of your ‘like’ determined by how long you hold the like button.
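as one way to wire up that kind of press-duration magic, here’s a long-press sketch where holding longer produces a bigger like.  the button, the 3x cap, and `sendLike(size:)` are my own assumptions, not messenger’s actual code.

```swift
import UIKit

// a minimal sketch: the size of the like scales with how long the press is held.
class LikeViewController: UIViewController {
    let likeButton = UIButton(type: .system)
    var pressStart: Date?

    override func viewDidLoad() {
        super.viewDidLoad()
        let press = UILongPressGestureRecognizer(target: self, action: #selector(handlePress(_:)))
        press.minimumPressDuration = 0.05   // fire almost immediately so quick taps count too
        likeButton.addGestureRecognizer(press)
        view.addSubview(likeButton)
    }

    @objc func handlePress(_ press: UILongPressGestureRecognizer) {
        switch press.state {
        case .began:
            pressStart = Date()
        case .ended:
            guard let start = pressStart else { return }
            // the longer the hold, the bigger the like, capped at 3x
            let held = Date().timeIntervalSince(start)
            let scale = min(1.0 + held, 3.0)
            sendLike(size: scale)
        default:
            break
        }
    }

    func sendLike(size: Double) {
        // post the like at the chosen size (omitted)
    }
}
```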

 

in the end, we’re still in the very early days of mobile ux and gestures.  my guess is most people don’t use them to their potential or realize how ubiquitous they really are.  as a side experiment to this post, i’m running a poll to see how many people use gestures on their notifications.

