Android 15: Gestures Kirk Scott
15.1 Introduction • 15.2 Design Information on Gestures • 15.3 Training Information on Gestures
15.1 Introduction • This unit belongs to the third third of the course • It covers topics that might be of some interest when developing your final project • It’s a little late in the game, but not too late to consider changing the user interface from widgets to gestures
The initial sets of overheads were taken from the “training” part of the Android developers’ Web site • Then the bulk of the following sets were taken from the API guides • This set of overheads consists of two parts
The first part is taken from the developers’ Web site section on design • It addresses ideas about desirable GUI features • The second part is taken from the developers’ Web site section on training • This is the hands-on “tutorial” part of the Web site
15.2 Design Information on Gestures • The first part of this section is taken from: • Design • Patterns • Gestures
Gestures • Gestures allow users to interact with your app by manipulating the screen objects you provide. • The following table shows the core gesture set that is supported in Android.
Touch • Triggers the default functionality for a given item. • Action • Press, lift
Long press • Enters data selection mode. Allows you to select one or more items in a view and act upon the data using a contextual action bar. Avoid using long press for showing contextual menus. • Action • Press, wait, lift
Swipe or drag • Scrolls overflowing content, or navigates between views in the same hierarchy. Swipes are quick and affect the screen even after the finger is picked up. Drags are slower and more precise, and the screen stops responding when the finger is picked up. • Action • Press, move, lift
Long press drag • Rearranges data within a view, or moves data into a container (e.g. folders on Home Screen). • Action • Long press, move, lift
Double touch • Scales up the smallest targetable view, if available, or scales a standard amount around the gesture. Also used as a secondary gesture for text selection. • Action • Two touches in quick succession
Double touch drag • Scales content by pushing away or pulling closer, centered around gesture. • Action • A single touch followed in quick succession by a drag up or down: • Dragging up decreases content scale • Dragging down increases content scale • Reversing drag direction reverses scaling.
Pinch open • Zooms into content. • Action • 2-finger press, move outwards, lift
Pinch close • Zooms out of content. • Action • 2-finger press, move inwards, lift
The second part of this section is taken from: • Design • Style • Touch Feedback
Touch Feedback • Use illumination and dimming to respond to touches, reinforce the resulting behaviors of gestures, and indicate what actions are enabled and disabled.
Whenever a user touches an actionable area in your app, provide a subtle visual response. • This lets the user know which object was touched and that your app is "listening".
Be responsive to touches in a gentle way. • Whenever a user touches an actionable area in your app, let them know the app is "listening" by providing a visual response. • Make it subtle: just slightly lighter or darker than the untouched color. • This provides two benefits:
Sprinkles of encouragement are more pleasant than jolts. • Incorporating your branding is much easier because the default touch feedback works with whatever hue you choose.
Most of Android's UI elements have touch feedback built in, including states that indicate whether touching the element will have any effect.
Communication • When your objects react to more complex gestures, help users understand what the outcome will be. • In Recents, when a user starts swiping a thumbnail left or right, it begins to dim. • This helps the user understand that swiping will cause the item to be removed.
Boundaries • When users try to scroll past the beginning or end of a scrollable area, communicate the boundary with a visual cue. • Many of Android's scrollable UI widgets, like lists and grid lists, have support for boundary feedback built in. • If you’re building custom widgets, keep boundary feedback in mind and provide it from within your app.
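For a custom widget, boundary feedback can be provided with the framework's EdgeEffect class, which draws the standard "glow" at an edge. The following is only a sketch, not a complete scrolling view; the class name MyScrollView and the method pullTopEdge() are hypothetical placeholders.

```java
import android.content.Context;
import android.graphics.Canvas;
import android.view.MotionEvent;
import android.view.View;
import android.widget.EdgeEffect;

// Hypothetical custom view illustrating boundary ("glow") feedback.
public class MyScrollView extends View {
    private final EdgeEffect mEdgeEffectTop;

    public MyScrollView(Context context) {
        super(context);
        mEdgeEffectTop = new EdgeEffect(context);
    }

    // Call this from your scroll handling when the user drags past the top edge.
    private void pullTopEdge(float deltaDistance) {
        mEdgeEffectTop.onPull(deltaDistance);  // deltaDistance: fraction of the view height
        invalidate();                          // redraw so the glow appears
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (!mEdgeEffectTop.isFinished()) {
            mEdgeEffectTop.setSize(getWidth(), getHeight());
            if (mEdgeEffectTop.draw(canvas)) {
                invalidate();  // glow still animating; keep redrawing
            }
        }
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_UP) {
            mEdgeEffectTop.onRelease();  // let the glow fade once the finger lifts
            invalidate();
        }
        return super.onTouchEvent(event);
    }
}
```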
If a user attempts to scroll past the last home screen panel, the screen content tilts to the right to indicate that further navigation in this direction isn’t possible.
This section is taken from: • Develop • Training • Best Practices for User Input • Using Touch Gestures
Using Touch Gestures • This class describes how to write apps that allow users to interact with an app via touch gestures. • Android provides a variety of APIs to help you create and detect gestures.
Although your app should not depend on touch gestures for basic behaviors (since the gestures may not be available to all users in all contexts), adding touch-based interaction to your app can greatly increase its usefulness and appeal.
To provide users with a consistent, intuitive experience, your app should follow the accepted Android conventions for touch gestures. • The Gestures design guide shows you how to use common gestures in Android apps. • Also see the Design Guide for Touch Feedback.
Lessons • Detecting Common Gestures • Learn how to detect basic touch gestures such as scrolling, flinging, and double-tapping, using GestureDetector. • Tracking Movement • Learn how to track movement.
Animating a Scroll Gesture • Learn how to use scrollers (Scroller or OverScroller) to produce a scrolling animation in response to a touch event. • Handling Multi-Touch Gestures • Learn how to detect multi-pointer (finger) gestures.
Dragging and Scaling • Learn how to implement touch-based dragging and scaling. • Managing Touch Events in a ViewGroup • Learn how to manage touch events in a ViewGroup to ensure that touch events are correctly dispatched to their target views.
Detecting Common Gestures • A "touch gesture" occurs when a user places one or more fingers on the touch screen, and your application interprets that pattern of touches as a particular gesture. • Correspondingly, there are two phases to gesture detection:
1. Gathering data about touch events. • 2. Interpreting the data to see if it meets the criteria for any of the gestures your app supports.
Support Library Classes • The examples in this lesson use the GestureDetectorCompat and MotionEventCompat classes. • These classes are in the Support Library. • You should use Support Library classes where possible to provide compatibility with devices running Android 1.6 and higher.
Note that MotionEventCompat is not a replacement for the MotionEvent class. • Rather, it provides static utility methods to which you pass your MotionEvent object in order to receive the desired action associated with that event.
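The way these classes fit together can be sketched as follows. An Activity creates a GestureDetectorCompat with a SimpleOnGestureListener (which lets you override only the callbacks you care about) and forwards touch events to it; MotionEventCompat.getActionMasked() extracts the action from an event in a backward-compatible way. This is a sketch, not a complete app.

```java
import android.app.Activity;
import android.os.Bundle;
import android.support.v4.view.GestureDetectorCompat;
import android.support.v4.view.MotionEventCompat;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class MainActivity extends Activity {
    private GestureDetectorCompat mDetector;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // SimpleOnGestureListener: override only the gestures you care about.
        mDetector = new GestureDetectorCompat(this,
                new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2,
                                   float velocityX, float velocityY) {
                // React to a fling here.
                return true;
            }
        });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Static utility method: pass in the MotionEvent, get back the action.
        int action = MotionEventCompat.getActionMasked(event);
        mDetector.onTouchEvent(event);  // let the detector analyze the event
        return super.onTouchEvent(event);
    }
}
```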
Gather Data • When a user places one or more fingers on the screen, this triggers the callback onTouchEvent() on the View that received the touch events. • For each sequence of touch events (position, pressure, size, addition of another finger, etc.) that is ultimately identified as a gesture, onTouchEvent() is fired several times.
The gesture starts when the user first touches the screen, continues as the system tracks the position of the user's finger(s), and ends by capturing the final event of the user's fingers leaving the screen. • Throughout this interaction, the MotionEvent delivered to onTouchEvent() provides the details of every interaction. • Your app can use the data provided by the MotionEvent to determine if a gesture it cares about happened.
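As an illustration of the data a MotionEvent carries, an onTouchEvent() override might read out fields like the following. The log tag is a placeholder.

```java
// Inside a View or Activity subclass: a sketch of per-event MotionEvent data.
@Override
public boolean onTouchEvent(MotionEvent event) {
    float x = event.getX();                  // position of the primary pointer
    float y = event.getY();
    float pressure = event.getPressure();    // hardware-dependent touch pressure
    int pointers = event.getPointerCount();  // number of fingers currently down
    Log.d("TouchDemo", "x=" + x + " y=" + y
            + " pressure=" + pressure + " pointers=" + pointers);
    return super.onTouchEvent(event);
}
```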
Capturing touch events for an Activity or View • To intercept touch events in an Activity or View, override the onTouchEvent() callback. • The following snippet uses getActionMasked() to extract the action the user performed from the event parameter. • This gives you the raw data you need to determine if a gesture you care about occurred:
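The snippet itself is not reproduced in these overheads. A sketch along the lines of the training example would be the following, where DEBUG_TAG is an assumed String constant:

```java
public class MainActivity extends Activity {
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Extract the masked action from the event.
        int action = MotionEventCompat.getActionMasked(event);
        switch (action) {
            case MotionEvent.ACTION_DOWN:
                Log.d(DEBUG_TAG, "Action was DOWN");
                return true;
            case MotionEvent.ACTION_MOVE:
                Log.d(DEBUG_TAG, "Action was MOVE");
                return true;
            case MotionEvent.ACTION_UP:
                Log.d(DEBUG_TAG, "Action was UP");
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
```

Returning true from a case tells the system the event was consumed; returning the superclass result lets default handling proceed.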