
Other Input Methods


Presentation Transcript


  1. Other Input Methods Pre-Lab Lecture 4

  2. Revisit
  • Pre-Lab 3 – Animation
    • Boundary Information
    • Layer Concept
    • Animation algorithm
    • Next Position Calculation
    • Data and Image object
    • Moving Out of Boundary
  • Lab 3
    • Shooting out a list of Bullets
    • Shrinking the Time Left Bar
    • Generating the target balloon and moving it down

  3. Overview of Lab 4
  • (Diagram: an INPUT → PROCESS → OUTPUT flow for Lab 4, labelled with Part 1, Part 2, Part 3, Touch on Screen, Tilt Device and Tutorial 4)

  4. Input Method
  • So far, we can control
    • The movement of the user image through the Left or Right Button
    • The bullet shooting through the Shoot Button
  • In fact, the iPhone is well known for the following two kinds of input method, which will be discussed in this Pre-Lab Lecture:
    • Touches and Gestures Detection
    • Device Tilting

  5. Touch Event vs Action on UI Component
  • Similarity
    • Both of them involve finger touches on the screen
  • Differences
    • If the finger touches a part of the screen that has a UI component, and this UI component can handle an action, then this becomes an action on the UI component
      • e.g., pressing a button activates the “Touch Up Inside” action on the left button in Tutorial 1 Part 3
    • Otherwise, this becomes a touch event on the screen view
      • i.e., we can only know some basic information, such as
        • The point touched by the finger on the screen view
        • How many points are currently touched on the screen view
    • By interpreting this basic information, some sophisticated touch actions and gestures can be detected

  6. Touch Actions and Gestures
  • Tapping
    • Single tap – touch a single point of the screen once
    • Double tap – touch a single point of the screen twice
  • Multi-touch
    • Touch several points on the screen simultaneously
  • Dragging
    • Touch a certain UI component and move the center of the UI component
  • Swiping
    • Move on the screen to the right or left to represent next/previous page
  • Zooming in or out
    • Move two fingers apart or towards each other to represent the zoom-in or zoom-out gesture

  7. Tapping – Single Tap
  • Touch a single point on the screen once
  • (Diagram: a single touch point on the screen view)

  8. Tapping – Double Taps and Multi Taps
  • Double taps: touch a single point on the screen twice within a short period of time
  • Multiple taps: taps of more than twice can also be detected
  • (Diagram: first and second touch points at the same location on the screen view)

  9. Multi-Touches
  • Multi-touches: it is possible to detect more than one touch on the screen simultaneously
  • (Diagram: touch point 1 and touch point 2 on the screen view)

  10. Dragging
  • Dragging: touch a certain UI component (one that cannot handle an action itself) and move it, together with the finger, to another place on the screen
  • (Diagram: a UI component being dragged across the screen view)

  11. Swiping
  • Swiping: touch a certain point on the screen and move to another place
  • Usually, moving right is used to represent the gesture of next page, and moving left is used to represent the gesture of previous page (a detection sketch follows this slide)
  • (Diagram: a finger moving across the screen view)
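
  The deck only describes the swipe gesture here; below is a minimal sketch of how left/right swipes could be distinguished with the basic touch events introduced on the later slides. The instance variable swipeStartPoint and the 50-point threshold are illustrative assumptions, not part of the lab code.

  // Assumed instance variable in the view controller: CGPoint swipeStartPoint;
  - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
      UITouch *touch = [[[event allTouches] allObjects] objectAtIndex:0];
      swipeStartPoint = [touch locationInView:[self view]];      // remember where the finger lands
  }

  - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
      UITouch *touch = [[[event allTouches] allObjects] objectAtIndex:0];
      CGPoint endPoint = [touch locationInView:[self view]];
      CGFloat dx = endPoint.x - swipeStartPoint.x;               // horizontal distance moved
      if (dx > 50.0f) {
          NSLog(@"Swipe right (next page)");
      } else if (dx < -50.0f) {
          NSLog(@"Swipe left (previous page)");
      }
  }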

  12. Zooming In/Out
  • Zooming in: two fingers touch the screen simultaneously and move apart from each other. This represents the gesture of zooming in to look at the image in more detail
  • Zooming out: two fingers touch the screen simultaneously and move towards each other (a detection sketch follows this slide)
  • (Diagram: two fingers pinching on the screen view)
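
  As with swiping, pinch detection can be built on the basic touch events described on the following slides. The sketch below only compares the distance between two touch points across touchesMoved calls; previousPinchDistance is an assumed CGFloat instance variable, and the code is illustrative rather than part of the lab.

  #import <math.h>   // for sqrtf

  - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
      NSSet *allTouches = [event allTouches];
      if ([allTouches count] == 2) {
          CGPoint p1 = [[[allTouches allObjects] objectAtIndex:0] locationInView:[self view]];
          CGPoint p2 = [[[allTouches allObjects] objectAtIndex:1] locationInView:[self view]];
          CGFloat distance = sqrtf((p1.x - p2.x) * (p1.x - p2.x) +
                                   (p1.y - p2.y) * (p1.y - p2.y));
          if (previousPinchDistance > 0.0f) {
              if (distance > previousPinchDistance) {
                  NSLog(@"Fingers moving apart (zoom in)");
              } else if (distance < previousPinchDistance) {
                  NSLog(@"Fingers moving together (zoom out)");
              }
          }
          previousPinchDistance = distance;                      // remember for the next callback
      }
  }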

  13. Basic Touches Event Handlers
  • Traditionally, the iPhone SDK does not provide any method for interpreting touch actions and gestures
  • It only provides three basic methods for handling three different stages of a touch event:
    • Touches Began
      • This method is invoked ONCE, every time a finger first touches the screen
    • Touches Moved
      • This method is invoked CONTINUOUSLY while the finger moves on the screen
    • Touches Ended
      • This method is invoked ONCE when the finger leaves the screen

  14. Part 1A - Touch Events Test
  • Basically, the iPhone app can handle the three basic touch events when you implement the following methods inside the view controller:
    • touchesBegan method
    • touchesMoved method
    • touchesEnded method
  • We will then discuss how to interpret several common touch actions from these touch events:
    • Single tap on the screen
    • Moving on the screen
    • Double taps on the screen
    • Two touches on the screen

  15. Basic Touches Method

  // Touches Began Method
  -(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
      NSLog(@"Touches Began");
  }

  // Touches Moved Method
  -(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
      NSLog(@"Touches Moved");
  }

  // Touches Ended Method
  -(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
      NSLog(@"Touches Ended");
  }

  16. Example Situation 1 – Single Tap on Screen
  • Finger touches the screen → touchesBegan is invoked
  • Finger leaves the screen → touchesEnded is invoked

  17. Example Situation 2 – Moving on the Screen
  • Finger touches the screen → touchesBegan is invoked
  • Finger moves on the screen → touchesMoved is invoked (continuously)
  • Finger leaves the screen → touchesEnded is invoked

  18. Information Provided by a Touch Event
  • Within each touch event method, you can request some more detailed information:
    • How many touches are currently identified on the screen
    • The touch point of each touch on the screen
    • How many times a certain point has been touched consecutively

  19. Example: Touches Began Event

  -(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
      // By invoking the allTouches method of event, which is passed in to the touch
      // event method (the touchesBegan method in this case), a set of touches is returned.
      // Note that the return type is NSSet; using allTouches as the variable name is just
      // a convention, and you can use your own variable name.
      NSSet *allTouches = [event allTouches];

      // By invoking the allObjects method of allTouches, we get an NSArray holding the
      // UITouch objects.
      // A UITouch object holds the information of one specific touch on the screen,
      // e.g., the point at which the touch occurs. In the following code we are
      // interested in the first object of the array.
      UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
      // (continued on the next slide)

  20. Information that we may be interested in

  // Touch point location information
  // By invoking the locationInView method of the UITouch, we can get the actual touch
  // point location of this touch on our current screen view, passing the current
  // screen view as the input parameter, i.e., [self view]
  CGPoint touchPoint = [touch locationInView:[self view]];

  // Number of touches simultaneously on screen
  // Recall that allTouches is the variable holding the set of touches on the screen.
  // We can ask how many touches occur on the screen by asking for its size directly.
  [allTouches count];

  // Number of consecutive taps on a single point
  // Recall that touch is a variable referring to one specific touch on the screen.
  // We can invoke its tapCount method to get this information.
  [touch tapCount];

  21. Example Situation 3 – Double Taps or Multi Taps

  NSSet *allTouches = [event allTouches];
  UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
  switch ([touch tapCount]) {
      case 2: {
          NSLog(@"Touched a point 2 times on screen");
          CGPoint touchPoint = [touch locationInView:[self view]];
          NSLog(@"x: %f, y: %f", touchPoint.x, touchPoint.y);
          break;
      }
  }

  22. Example Situation 4 – Two or More Touches on Screen

  NSSet *allTouches = [event allTouches];
  switch ([allTouches count]) {
      case 2: {
          NSLog(@"Touch 2 points on screen");
          UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
          CGPoint touchPoint1 = [touch1 locationInView:[self view]];
          NSLog(@"x: %f, y: %f", touchPoint1.x, touchPoint1.y);

          UITouch *touch2 = [[allTouches allObjects] objectAtIndex:1];
          CGPoint touchPoint2 = [touch2 locationInView:[self view]];
          NSLog(@"x: %f, y: %f", touchPoint2.x, touchPoint2.y);
          break;
      }
  }

  23. Discussion on the Multi-Touches Case
  • Please note that the order of the touch points is not fixed
  • Referring to the previous example,
    • (a) the first touch point is “x: 204, y: 109”
    • (b) the second touch point is “x: 276, y: 211”
  • However, even if you touch the same two points again, it is possible for (a) to be the second touch point and (b) to be the first touch point
    • i.e., the order is reversed (see the sketch after this slide)
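
  One possible way to cope with this, sketched below under the assumption that exactly two touches are present, is to order the two points yourself (for example by x coordinate) before using them; this is an illustration, not part of the lab skeleton.

  NSSet *allTouches = [event allTouches];
  if ([allTouches count] == 2) {
      CGPoint pA = [[[allTouches allObjects] objectAtIndex:0] locationInView:[self view]];
      CGPoint pB = [[[allTouches allObjects] objectAtIndex:1] locationInView:[self view]];
      // Treat the left-most point as the "first" touch regardless of delivery order
      CGPoint leftPoint  = (pA.x <= pB.x) ? pA : pB;
      CGPoint rightPoint = (pA.x <= pB.x) ? pB : pA;
      NSLog(@"left: (%f, %f)  right: (%f, %f)",
            leftPoint.x, leftPoint.y, rightPoint.x, rightPoint.y);
  }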

  24. Simulating Two Touch Points in the iPhone Simulator
  • In the iPhone Simulator, a single touch point is used by default
  • To simulate two points, hold down the OPTION key while you move the mouse pointer over the screen view

  25. Part 1B - Double Taps to Shoot
  • Currently, the bullet is shot when you press the Shoot Button
  • Hints (see the sketch after this slide):
    • To support the new function, add a method fireBullet to handle all the bullet-shooting situations
    • Invoke the method when
      • The Shoot Button is pressed
      • A double tap occurs on the screen
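
  One possible arrangement is sketched below; apart from fireBullet, which the hint names, the action and method bodies are assumptions, and the body of fireBullet is the existing shooting code from Lab 3.

  - (void)fireBullet {
      // ... the existing bullet-shooting code from Lab 3 goes here ...
  }

  - (IBAction)shootButtonPressed:(id)sender {
      [self fireBullet];              // the Shoot Button keeps working as before
  }

  - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
      UITouch *touch = [[[event allTouches] allObjects] objectAtIndex:0];
      if ([touch tapCount] == 2) {
          [self fireBullet];          // a double tap anywhere on the screen also shoots
      }
  }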

  26. Part 2 – Touch Moved and Dragging Practice
  • Objective:
    • Allow the shooter to be dragged horizontally when the user touches the image
    • The orientation of the image can be changed by swiping outside the image
  • Algorithm (a sketch follows this slide):
    • Detect the touch-move position on the screen
    • Check whether the touch falls on the image or not
      • If true: handle the userImage move situation
      • If false: handle the angle change situation
  • Problems to think about:
    • Why implement the function in touchesMoved?
    • How to distinguish whether your finger falls on the image or not?
    • How to calculate the angle moved?
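
  The algorithm above might be laid out in touchesMoved roughly as follows; userImage is assumed to be the IBOutlet of the shooter image, and the two branches are filled in on the following slides.

  - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
      NSSet *allTouches = [event allTouches];
      UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
      CGPoint touchPT = [touch locationInView:[self view]];

      if (CGRectContainsPoint([userImage frame], touchPT)) {
          // Touch falls on the image: drag the shooter horizontally only
          userImage.center = CGPointMake(touchPT.x, userImage.center.y);
      } else {
          // Touch falls outside the image: handle the angle change situation
          // (see the angle calculation and rotation slides below)
      }
  }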

  27. Image Touch Detection Technique
  • Recall that to detect the touch location on the screen view:
      UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
      CGPoint touchPT = [touch locationInView:[self view]];
  • To check whether your finger touch falls on the image, test
      touchPT.x > x1 and touchPT.x < x2
      touchPT.y > y1 and touchPT.y < y2
    where (x1, y1) is the top-left corner of the image frame, (x2, y1) the top-right, (x1, y2) the bottom-left and (x2, y2) the bottom-right
  • Note that x2 = x1 + width and y2 = y1 + height
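
  The check can be written directly against the image's frame, as sketched below; userImage is the assumed IBOutlet name, and CGRectContainsPoint from Core Graphics performs essentially the same comparison in a single call.

  CGRect frame = [userImage frame];
  float x1 = frame.origin.x;
  float y1 = frame.origin.y;
  float x2 = x1 + frame.size.width;
  float y2 = y1 + frame.size.height;
  BOOL onImage = (touchPT.x > x1 && touchPT.x < x2 &&
                  touchPT.y > y1 && touchPT.y < y2);

  // Near-equivalent shortcut provided by Core Graphics:
  BOOL onImageShortcut = CGRectContainsPoint(frame, touchPT);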

  28. Angle Calculation Technique
  • Points used (see the diagram): P1 (x', y') is the finger touch point, P2 (x, y) is the user image center point, and P3 (x, y - 10) is an artificial point directly above the center; the angle at P2 can then be calculated
  • For simplicity, we implemented AngleCalculator.h and AngleCalculator.m for you
  • The method find_angle takes CGPoints p1, p2, p3 as input parameters and returns the angle at p2 of the triangle formed by the three points
  • To make a CGPoint, you can use CGPointMake(float x, float y)
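
  A sketch of how the helper might be called is shown below. The deck only states that find_angle takes three CGPoints and returns the angle at p2; whether it is a plain C function or a class method, and whether the result is in degrees or radians, is not shown here, so both are assumptions.

  #import "AngleCalculator.h"

  CGPoint p2 = userImage.center;                         // P2: user image center point
  CGPoint p1 = touchPT;                                  // P1: finger touch point
  CGPoint p3 = CGPointMake(p2.x, p2.y - 10.0f);          // P3: artificial point above the center
  float angle = find_angle(p1, p2, p3);                  // angle at P2 (units as defined by the helper)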

  29. Image Angle Rotation
  • To change the angle of the userImage, we can rotate the userImage to the appropriate direction:
      // Rotate the view
      CGAffineTransform transform = CGAffineTransformMakeRotation(<angle of rotation>);
      <IBOutlet of the image>.transform = transform;
  • Note that the angle of rotation is always measured from 0 (the original, unrotated orientation), no matter what the current angle of rotation is
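
  One detail worth noting in a sketch: CGAffineTransformMakeRotation expects the angle in radians, so if the angle obtained on the previous slide is in degrees it has to be converted first; userImage is again the assumed outlet name.

  float radians = angle * M_PI / 180.0f;                 // assumption: angle was computed in degrees
  CGAffineTransform transform = CGAffineTransformMakeRotation(radians);
  userImage.transform = transform;                       // measured from 0, not added to the current rotation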

  30. Input Method – Device Tilting
  • The iPhone allows the application to detect device tilting by taking accelerometer readings periodically
  • It provides accelerometer readings in three directions: X, Y and Z
    • To simplify the case, we will only explain X and Y in this Pre-Lab
  • Each accelerometer reading is bounded between -1 and 1, depending on how far you tilt the device
    • i.e., the larger the tilt, the larger the absolute value of the reading; the maximum absolute value is 1
  • We can use this as another kind of input method for our application

  31. Horizontal Position
  • This is the position in which the device is laid flat on the desk
  • In this case, X = 0, Y = 0
  • Suppose we now rotate the device to the right

  32. Right Rotation Position
  • This is the position in which the device is rotated to the right on the desk
  • In this case, X = 1, Y = 0
  • Suppose we rotate the device to the left this time

  33. Left Rotation Position
  • This is the position in which the device is rotated to the left on the desk
  • In this case, X = -1, Y = 0
  • Suppose we rotate the device to the vertical up position

  34. Vertical Up Position
  • This is the position in which the device is rotated to the vertical up position
  • In this case, X = 0, Y = -1
  • Suppose we now rotate the device to the vertical upside-down position

  35. Vertical Upside-Down Position
  • This is the position in which the device is rotated to the vertical upside-down position
  • In this case, X = 0, Y = 1

  36. To Conclude
  • (Diagram: summary of the accelerometer reading directions -y, +x, -x and +y around the device in portrait orientation, as described on the previous slides)

  37. Recall: Screen View Orientation and Coordinate Representation
  • The coordinate system changes when the screen changes from portrait mode to landscape mode
  • Portrait mode: (0, 0) at the top-left, (320, 0) at the top-right, (0, 480) at the bottom-left, (320, 480) at the bottom-right
  • Landscape mode: (0, 0) at the top-left, (480, 0) at the top-right, (0, 320) at the bottom-left, (480, 320) at the bottom-right

  38. Accelerometer Reading
  • However, the accelerometer reading representation does not change even if the orientation of the screen changes: the axes stay fixed relative to the device and its Home Button
  • (Diagram: the +x/-x and +y/-y reading directions drawn on the device in both portrait mode and landscape mode, relative to the Home Button)

  39. Part 3A - Detecting Device Tilting Event (UIAccelerometer Event)
  • Implement the UIAccelerometerDelegate protocol by changing the header of BallShootingViewController to:
      // This is to show that this view controller has implemented a method to handle the accelerometer reading
      @interface BallShootingViewController : UIViewController <UIAccelerometerDelegate>
  • Initialize the accelerometer and its update interval by writing the method initializationAccelerometer. Make sure you call this method in viewDidLoad (see the sketch after this slide):
      // Reset the accelerometer
      [[UIAccelerometer sharedAccelerometer] setDelegate:nil];
      // Set the accelerometer to take a reading every DEFAULT_TIMER_RATE seconds
      [[UIAccelerometer sharedAccelerometer] setUpdateInterval:DEFAULT_TIMER_RATE];
      // This tells the accelerometer that the current view controller has implemented a method to handle the reading when it is ready
      [[UIAccelerometer sharedAccelerometer] setDelegate:self];
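
  The three statements above might be wrapped as follows; the value given to DEFAULT_TIMER_RATE here is only an illustrative assumption, since the actual constant comes from the lab skeleton.

  #define DEFAULT_TIMER_RATE (1.0 / 30.0)                // assumed value; use the constant from the lab skeleton

  - (void)initializationAccelerometer {
      [[UIAccelerometer sharedAccelerometer] setDelegate:nil];                       // reset the accelerometer
      [[UIAccelerometer sharedAccelerometer] setUpdateInterval:DEFAULT_TIMER_RATE];  // take a reading every DEFAULT_TIMER_RATE seconds
      [[UIAccelerometer sharedAccelerometer] setDelegate:self];                      // deliver readings to this view controller
  }

  - (void)viewDidLoad {
      [super viewDidLoad];
      [self initializationAccelerometer];
  }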

  40. Part 3A - Detecting Device Tilting Event (UIAccelerometer Event) II
  • Implement the didAccelerate method in the view controller; it will be invoked whenever an accelerometer reading is ready:
      - (void) accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
          // You can get the corresponding X, Y, Z readings from acceleration.x, acceleration.y and acceleration.z inside the didAccelerate method
          NSLog(@"Accel x: %f, Accel y: %f, Accel z: %f", acceleration.x, acceleration.y, acceleration.z);
      }
  • Note that you must load your application onto a real device to test the results

  41. Part 3B – Tilting Device to Move User Image
  • In Part 3B, we would like the student to implement the following functions:
    • Tilt the device to the right-hand side in landscape mode to move the userImage to the right
    • Tilt the device to the left-hand side in landscape mode to move the userImage to the left
  • Hints (a sketch follows this slide):
    • Apply an appropriate offset (obtained from the accelerometer reading) to the center position of the userImage
    • Handle the accelerometer reading carefully when tilting the device
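
  A minimal sketch of Part 3B is given below. Which axis to read and with what sign depends on the landscape orientation the game uses; acceleration.y, the MOVE_FACTOR scaling and the 480-point boundary are all assumptions for illustration, not the lab's required implementation.

  #define MOVE_FACTOR 10.0f                               // assumed scaling between tilt and movement

  - (void)accelerometer:(UIAccelerometer *)accelerometer
          didAccelerate:(UIAcceleration *)acceleration {
      // In landscape mode the screen's horizontal direction maps to the device's
      // y axis; flip the sign if the image moves the wrong way on your device.
      CGFloat offset = acceleration.y * MOVE_FACTOR;      // larger tilt -> faster movement
      CGFloat newX = userImage.center.x + offset;

      // Keep the shooter inside the screen boundary (480 assumed as landscape width)
      if (newX > 0.0f && newX < 480.0f) {
          userImage.center = CGPointMake(newX, userImage.center.y);
      }
  }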
