Cyril Mottier

β€œIt’s the little details that are vital. Little things make big things happen.” – John Wooden

The Making of Prixing #2: Swiping the Fly-in App Menu

The previous article in the “The Making of Prixing” series was a brief introduction to the fly-in app menu. I gave a lot of Android UI development tips and tricks that may help you create such a widget in your application. Obviously, the fly-in app menu is a little bit more advanced than what I described. It requires more than a single post to do a complete review of the techniques you may need to create your own implementation of a sliding menu. Let’s continue the in-depth study, focusing on gesture interaction.

In the first article we ended up creating a very basic fly-in app menu. Indeed, the widget we created could be closed/opened only programmatically or using a simple click gesture. The latter obviously works like a charm but it cruelly lacks naturalness and intuitiveness. Having a closing/opening animation of the host makes users want to interact with the object as they would in the real world, using a well-known gesture: the swipe gesture.

Having a drawer the user can move precisely is the best way to make your UI feel like an extension of the real world. Have you ever played video games in which it is impossible to shoot a tree or move a basic object like a book? Pretty frustrating, isn’t it? Users hate being artificially blocked or prevented from doing something they want to do. UI development in general follows the exact same rule: do not block your users; let them do whatever they want. The figure below shows some of the available gestures:

The purpose of this second article is to give you everything you need to add the swipe gesture capability to your fly-in app menu. To be honest, the swipe gesture is far from new. It is pretty natural to mobile (iOS, Android, Windows Mobile, etc.) users and has become so widespread it is now available in lots of mobile applications. In Prixing, the main problem we had was to nest horizontally scrollable components (ViewPager, MapView, etc.) in another swipable component (the RootView). This obviously leads to a simple question: which component has the right to intercept the gesture? The Android documentation gives a partial answer to this fundamental question: “You should never use a ScrollView with a ListView, since ListView takes care of its own scrolling.” This is actually true, but several ways to overcome the issue exist:

  • No swiping: The easiest way to implement tricky things is not to implement them! As I like to say: if you don’t want to have trouble, stay at home and do nothing. The problem with this method is that it makes your life boring. I am not that kind of guy.

  • Smart swiping: This technique consists of first looking at the top-most (in the drawing order) View that wants to deal with the gesture. If that component can scroll horizontally in the direction of the swipe gesture, let it scroll. If it has already reached its maximum position, let the scrollable container intercept the gesture. This pattern is used in the latest version of GMail: when browsing your emails, the ViewPager won’t scroll to the next page until the WebView has scrolled to its edge. The main problem with this technique is that it requires the canScrollHorizontally(int) method, which is only available starting from API 14. Moreover, you need to be able to determine the View that consumed the gesture (aka the TouchTarget in the Android source code). I couldn’t find a public method allowing us to determine this View1 and Prixing had to run on devices with Android 2.1+, so we decided to use another technique.

  • Bezel swiping: This technique consists of intercepting a swipe gesture if and only if it starts near the edge of the View. Bezel swiping is usually implemented by looking at the position of the gesture’s first MotionEvent: the ACTION_DOWN. This is how Chrome for Android lets you navigate between your tabs. I don’t consider it perfect because you may end up with some gesture mismatches, but it makes your menu always accessible. In Prixing, we decided to go for it!
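To make the bezel-swiping idea concrete, here is a minimal sketch of the decision made on the ACTION_DOWN event. The class name, the helper method and the 1.0f/2.0f densities below are illustrative assumptions; only the 30dp bezel width comes from the article (on Android, the density would come from DisplayMetrics):

```java
/**
 * Sketch of the bezel test: a swipe-to-open gesture may only start from a
 * thin strip along the left edge of the View. BEZEL_WIDTH_DP (30dp) is the
 * value the article settles on; everything else is a hypothetical helper.
 */
public final class BezelSwipe {

    static final float BEZEL_WIDTH_DP = 30f;

    /** Returns true if an ACTION_DOWN at downX (in px) lands inside the bezel area. */
    public static boolean isBezelDown(float downX, float density) {
        // Scale the dp constant to the current screen density (rounding to nearest px)
        final float bezelWidthPx = BEZEL_WIDTH_DP * density + 0.5f;
        return downX >= 0 && downX <= bezelWidthPx;
    }
}
```

A down event outside that strip is simply handed to the children as usual; only events inside it are tracked as potential swipe-to-open starts.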

Handling touch events

The Android framework provides everything you need to handle touch events. The most important method is onTouchEvent(MotionEvent). It is called by the system when a MotionEvent (the class representing a touch event) occurs in your View. This is where you put the code that makes the host follow the user’s finger. Unfortunately, implementing onTouchEvent(MotionEvent) will not be enough, as it may never be called by the system if another View - deeper in the View hierarchy - consumes the touch event first. In order to be able to intercept the touch, you simply have to override onInterceptTouchEvent(MotionEvent) in your RootView. I strongly suggest you read the documentation of this method as it has a fairly complicated interaction with onTouchEvent(MotionEvent). If you are not familiar with the event dispatching system in Android, have a look at the great “Mastering the Android Touch System” presentation Dave Smith - @devunwired - gave at AnDevCon III. I didn’t attend the talk, but looking at the slides and the source code clearly suggested to me that it was great.
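As a rough illustration (not the actual Prixing code), the interception could look like this inside the RootView. The fields mBezelWidth, mTouchSlop, mLastDownX, mLastDownY and mSwipeCandidate are hypothetical and assumed to be initialized elsewhere (e.g. in the constructor from ViewConfiguration):

```java
// Sketch only: returning true from onInterceptTouchEvent() steals the rest of
// the gesture from the children and routes it to our own onTouchEvent().
@Override
public boolean onInterceptTouchEvent(MotionEvent ev) {
    switch (ev.getAction()) {
        case MotionEvent.ACTION_DOWN:
            mLastDownX = ev.getX();
            mLastDownY = ev.getY();
            // Only gestures starting in the bezel area are swipe candidates
            mSwipeCandidate = mLastDownX <= mBezelWidth;
            return false; // let the children see the event for now
        case MotionEvent.ACTION_MOVE:
            if (mSwipeCandidate
                    && Math.hypot(ev.getX() - mLastDownX, ev.getY() - mLastDownY) > mTouchSlop) {
                return true; // from now on, events go to our onTouchEvent()
            }
            return false;
    }
    return false;
}
```

Note that once you return true, the children receive an ACTION_CANCEL and every subsequent MotionEvent of the gesture is delivered to the RootView only.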

Using platform-provided constants

If you look closely at the source code of some UI widgets included in the framework, you will see they use various constants. The vast majority of these constants can be easily retrieved thanks to the ViewConfiguration class. You should always use this class whenever you create your own UI widgets: the provided values have been widely tested by the Android team. For instance, we will later use the “touch slop” constant, which describes the distance a touch can wander before we consider the user is scrolling. This value is expressed in pixels and can be accessed via a call to getScaledTouchSlop(). Please note the static unscaled counterpart, getTouchSlop(), is deprecated and should no longer be used as it doesn’t scale the constant to match the screen density.

If you really want to use your own dimensions, always remember to scale them appropriately to the current device. This will ensure your widgets will always behave the same regardless of the device they are running on. Dimension scaling can be done pretty easily relying on the resource system or manually using the code provided below:

final float density = getResources().getDisplayMetrics().density;
mScaledValue = (int) (UNSCALED_VALUE * density + 0.5f);

Note: The information given in this article never mentions multitouch capabilities. This may seem irrelevant when dealing with a one-finger-only gesture but this is not the case. Try sliding the host drawer and touching the screen with another finger and you will see it “jumps” from one finger to another. This strange behavior happens because the current implementation of the RootView in the Prixing application doesn’t take into account actions such as MotionEvent#ACTION_POINTER_[DOWN|UP]. That’s definitely something we have on our ToDo list. It just has a low priority compared to the thousands of other tasks.

When can a gesture be considered as a swipe?

The Prixing application uses two separate algorithms to determine whether a swipe gesture has started. Indeed, the swipe-to-open algorithm is a little bit more complicated than the swipe-to-close one. In order to get a complete understanding of how to implement such a gesture, we will mainly focus on the more technical one: the swipe-to-open gesture. The implementation involves several constants and techniques that may be difficult to explain in text alone. As a consequence, I created a figure (it also gave me a chance to play around with the amazing new Photoshop CS6!) laying everything out. Please note the figure is available in high definition by simply clicking on it:

A gesture always starts with a MotionEvent whose action is ACTION_DOWN (the arrow surrounded in red). In order to consider this touch event as a valid start for a swipe gesture, it must happen on a thin area on the left of the View. In the preceding figure, this area is represented by the translucent green rectangle on the left of the screen. The width of this area is very important. Having a narrow bezel area may make your sliding menu difficult to open. On the contrary, an overly large bezel area results in gesture mismatches. Android has no support for bezel swiping by default; as a result, we spent a lot of time at Prixing looking for the perfect dimension. We finally ended up with 30dp.

Having an ACTION_DOWN next to the left edge of the View is far from enough to consider it as the starting point of a swipe gesture. Indeed, some noisy ACTION_MOVEs may be generated by the system while the user’s finger is on screen. This is perfectly normal and you have to deal with it. Fortunately, Android includes a method to determine the distance beyond which a gesture is considered a drag: ViewConfiguration#getScaledTouchSlop(). On every Android build I have seen, the returned value is equal to 8dp scaled to the current display density. As a consequence, the fly-in app menu considers a gesture as intentional and intercepts it when the travelled distance is greater than the touch slop threshold. Technically, it means:

Math.hypot((mCurrX - mStartX), (mCurrY - mStartY)) > mTouchSlop

In order to minimize gesture mismatches, we also added a few other conditions to our RootView. All of them are only applied to the swipe-to-open gesture:

  • The direction of the gesture must be left-to-right. From a technical point of view, it requires all MotionEvents to have increasing values on the X-axis.

  • The gesture is valid only when the distance traveled on the X-axis is greater than ViewConfiguration#getScaledPagingTouchSlop() (16dp). This method was introduced in API 8 so you may need to use the ViewConfigurationCompat class or copy its implementation into your code (getScaledPagingTouchSlop() is twice the value returned by getScaledTouchSlop()).

The algorithm described previously can be summed up pretty simply using the graphic above. In order to consider a gesture as a bezel swiping gesture, we need to ensure all MotionEvents occur in the striped area, respect the left-to-right direction, and leave the striped area via its right edge.
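The conditions above can be sketched as a tiny state holder. Everything here is an illustrative assumption except the rules themselves: values are in pixels, and the 30px/16px figures used in the example stand in for ViewConfiguration’s scaled constants at a given density:

```java
/**
 * Sketch combining the article's swipe-to-open conditions: the gesture must
 * start in the bezel area, move strictly left-to-right, and travel more than
 * the paging touch slop on the X-axis before it is intercepted.
 */
public final class SwipeToOpenDetector {

    private final float mBezelWidth;
    private final float mPagingTouchSlop;
    private float mStartX;
    private float mLastX;
    private boolean mCandidate;

    public SwipeToOpenDetector(float bezelWidthPx, float pagingTouchSlopPx) {
        mBezelWidth = bezelWidthPx;
        mPagingTouchSlop = pagingTouchSlopPx;
    }

    /** Call on ACTION_DOWN: the gesture is a candidate only if it starts in the bezel. */
    public void onDown(float x) {
        mStartX = mLastX = x;
        mCandidate = x <= mBezelWidth;
    }

    /** Call on each ACTION_MOVE; returns true once the gesture qualifies as a swipe. */
    public boolean onMove(float x) {
        if (!mCandidate) return false;
        if (x < mLastX) {
            // Direction reversed: no longer a left-to-right gesture
            mCandidate = false;
            return false;
        }
        mLastX = x;
        return x - mStartX > mPagingTouchSlop;
    }
}
```

In a real RootView, onDown/onMove would be fed from onInterceptTouchEvent(MotionEvent), and a true return value would trigger the interception.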

Dragging the host

Now it’s action time! The swipe gesture has been intercepted and we have to drag the host according to the user’s finger. The trick here is to reuse what we already discovered in the first article of this series: offsetLeftAndRight(int). As long as onTouchEvent(MotionEvent) receives MotionEvent#ACTION_MOVEs, the RootView reads the position of the MotionEvent on the X-axis and translates the host accordingly. Remember that after being intercepted, the rest of the gesture’s MotionEvents are passed to our onTouchEvent(MotionEvent) method.
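The per-move arithmetic is simple but easy to get wrong at the edges: the host must never be dragged past its closed or fully open positions. Here is a minimal sketch of that clamping step; the class and method names are hypothetical, and on Android the returned delta would be passed to offsetLeftAndRight(int):

```java
/**
 * Sketch of the drag step: on each ACTION_MOVE the host is shifted by the
 * finger's X delta, clamped so its left edge stays between 0 (closed) and
 * menuWidth (fully open). All values in px.
 */
public final class HostDragger {

    /** Returns the offset to apply given the host's current left edge and the finger delta. */
    public static int clampedOffset(int hostLeft, int fingerDx, int menuWidth) {
        int target = hostLeft + fingerDx;
        if (target < 0) target = 0;               // don't close past the resting position
        if (target > menuWidth) target = menuWidth; // don't open past the menu width
        return target - hostLeft;
    }
}
```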

Ending a swipe gesture?

This is where the magic happens! As a UI developer, managing the end of a gesture is awesome. This is where you have to study the gesture as precisely as possible to graphically reflect what the user wants. So we are at the exact moment the user removed his/her finger from the screen. We need to make sure the RootView animates to one of its stable states (opened or closed). This is done in a few steps described in the diagram below. As usual, clicking on it opens the high definition version.

Velocity is very important

This diagram explains almost everything except how to handle velocity. Indeed, the RootView actually looks at the velocity of the gesture using the VelocityTracker helper class. This may be used in some cases to determine whether the host has been thrown rapidly enough to close or open the RootView. The best example of this is when you do a pretty “short” (let’s say 32dp) swipe gesture. Tracking the velocity lets the sliding menu be closed/opened whenever the velocity goes beyond a certain threshold (ViewConfiguration#getScaledMinimumFlingVelocity()). One may also need to normalize the velocity using the ViewConfiguration#getScaledMaximumFlingVelocity() method. When crafting your own UI widgets, always take velocity into account. Velocity is a well-known notion and managing it will make your component even more realistic and usable.
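One plausible way to combine velocity and position at finger-up time is sketched below. This is an assumption about the decision rule, not the actual Prixing code: a fast enough fling wins, otherwise the host settles to the nearest stable state. On Android, velocityX would come from VelocityTracker#getXVelocity() after a call to computeCurrentVelocity(1000), and minFlingVelocity from ViewConfiguration#getScaledMinimumFlingVelocity():

```java
/**
 * Sketch of the end-of-gesture decision: should the menu animate open or
 * closed once the finger lifts? Velocities in px/s, positions in px.
 */
public final class FlingHelper {

    public static boolean shouldOpen(float velocityX, float minFlingVelocity,
                                     int hostLeft, int menuWidth) {
        if (Math.abs(velocityX) > minFlingVelocity) {
            // A genuine fling: its direction decides, regardless of position
            return velocityX > 0; // rightward fling opens, leftward closes
        }
        // No fling: settle to the nearest stable state (past half => open)
        return hostLeft > menuWidth / 2;
    }
}
```

This is what makes the 32dp “short swipe” case work: the position rule alone would snap the menu closed, but the fling check opens it anyway.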

Conclusion

That’s it for now. I think we have been through a lot in this second part. It gave you all the elements to create stunning actionable UI widgets. In the third part of this series I will concentrate on some other details such as parallax translation, menu fading, selection arrows and menu hints. We will also cover a point lots of developers have been asking me to reveal: Activity transitions. Indeed, Prixing is made up of tens of Activities. Having animated transitions like we do is “normally” impossible on Android … Shake up your brain cells about it and feel free to leave a comment below if you have an idea of how I managed to do it in Prixing. Stay tuned for more!

Thanks to @franklinharper for reading drafts of this.


  1. Some of you may ask how GMail manages to do it. I have not reverse-engineered the application but I am pretty sure they did it very easily. Each page of the ViewPager being a WebView, the touch target is always the WebView. Knowing that, you simply have to call canScrollHorizontally(int) on the WebView of the currently selected page.