Figure 4-14. Testing EightBall
Choose the Hardware ➤ Shake Gesture command in the simulator. This command simulates the user shaking their device, which will cause shake motion events to be sent to your app.
Congratulations, you’ve successfully created a shake-motion event handler! Each time you shake your simulated device a new message appears, as shown in Figure 4-15.
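For reference, the heart of a shake-motion handler looks something like this sketch (the -chooseMessage method name here is an assumption; your EightBall code may use a different name):

```objectivec
// A minimal sketch of a shake-motion handler in a view controller.
// -chooseMessage is a hypothetical method that picks a new fortune.
- (BOOL)canBecomeFirstResponder
{
    // The controller must be the first responder to receive motion events.
    return YES;
}

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
{
    if (motion == UIEventSubtypeMotionShake)
        [self chooseMessage];   // display a new answer
}
```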
Figure 4-15. The working EightBall app
Finishing Touches
Put a little spit and polish on your app with a nice icon. Well, at least with the icon you’ll find in the EightBall (Resources) folder. In your project navigator, select the Images.xcassets file, and then select the AppIcon group. With the EightBall (Resources) folder visible, drag the three icon image files into the AppIcon preview area, as shown in Figure 4-16. Xcode will automatically assign the appropriate image file to each icon resource, based on its size.
Figure 4-16. Importing app icons
With that detail taken care of, let’s shake things up—literally—by running your app on a real iOS device.
Testing on a Physical iOS Device
You can test a lot of your app using Xcode’s iPhone and iPad simulator, but there are a few things the simulator can’t emulate. Two of those things are multiple (more than two) touches and real accelerometer events. To test those, you need a real iOS device, with real accelerometer hardware, that you can touch with real fingers.
The first step is to connect Xcode to your iOS Developer account. Choose Xcode ➤ Preferences ... and switch to the Accounts tab. Choose Add Apple ID ... from the + button at the bottom of the window, as shown in Figure 4-17.
Figure 4-17. Adding a new Apple ID to Xcode
Supply your Apple ID and password, and then click the Add button. If you’re not a member of the iOS Developer Program yet, there’s a convenient Join Program ... button that will take you to Apple’s website.
Plug an iPhone, iPad, or iPod touch into your computer’s USB port. Open the Xcode organizer window (Window ➤ Organizer). In the toolbar, switch to the Devices tab. The iOS device you plugged in will appear on the left, as shown in Figure 4-18. If a “trust” dialog appears on your device, as shown on the right in Figure 4-18, you’ll need to grant Xcode access to your device.
Figure 4-18. Device management
Select your iOS device and click the Use for Development button. Xcode will prepare your device for development, a process known as provisioning. This will allow you to build, install, and run most iOS projects directly through Xcode.
Once your device is provisioned, return to your project workspace window. Change the scheme setting from one of the simulators to your actual device. I provisioned an iPhone, so iPhone appears as one of the run destinations in Figure 4-19.
Figure 4-19. Selecting an iOS device to test
Run the EightBall app again. This time, your app will be built, copied onto your iOS device, and the app will start running there. Pretty neat, isn’t it?
The amazing thing is that Xcode is still in control—so don’t unplug your USB connection just yet!
You can set breakpoints, freeze your app, examine variables, and generally do anything you could do in the simulator.
With the EightBall app running, shake your device and see what happens. When you’re done, click the Stop button in the Xcode toolbar. You’ll notice that your EightBall app is now installed on your device. You’re free to unplug your USB connection and take it with you; it is, after all, your app.
Other Uses for The Responder Chain
While the responder chain concept is still fresh in your mind, I want to mention a couple of other uses for the responder chain before you move on to low-level events. The responder chain isn’t used solely to handle events. It also plays an important role in actions, editing, and other services.
In earlier projects, you connected the actions of buttons and text fields to specific objects.
Connecting an action in Interface Builder sets two pieces of information:
- The object that will receive the action (SUViewController)
- The action message to send (-shortenURL:)
It’s also possible to send an action to the responder chain, rather than a specific object. In Interface Builder you do this by connecting the action to the First Responder placeholder object, as shown in Figure 4-20.
Figure 4-20. Connecting an action to the responder chain
When the button’s action is sent, it goes initially to the first responder object—whatever that object is. For actions, iOS tests to see if the object implements the expected message (-loadLocation:, in this example). If it does, the object receives that message. If not, iOS starts working its way through the responder chain until it finds an object that does.
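You can also dispatch an action to the responder chain from code. Sending an action with a nil target tells iOS to start at the first responder and walk the chain for you—a sketch, reusing the -loadLocation: selector from the example above:

```objectivec
// A nil target means "search the responder chain" rather than
// delivering the action to one specific object.
[[UIApplication sharedApplication] sendAction:@selector(loadLocation:)
                                           to:nil
                                         from:self
                                     forEvent:nil];
```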
This is particularly useful in more complex apps where the recipient of the action message is outside the scope of the Interface Builder file. You can only make connections between objects in the same scene. If you need a button to send an action to another view controller, or the application object itself, you can’t make that connection in Interface Builder. But you can connect your button to the first responder. As long as the intended recipient is in the responder chain when the button fires its action, your object will receive it.
Editing also depends heavily on the responder chain. When you begin editing text in iOS, like the URL field in the Shorty app, that object becomes the first responder. When the user types on the keyboard—virtual or otherwise—those key events are sent to the first responder. You can have several text fields on the same screen, but only one is the first responder. All key events, copy and paste commands, and so on, go to the active text field.
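Programmatically, you control which object is the first responder with a pair of UIResponder messages—a quick sketch, assuming a text field outlet named urlField:

```objectivec
// Make the URL field the active (first responder) object;
// the keyboard appears and key events are routed to it.
[self.urlField becomeFirstResponder];

// Later, give up first responder status to dismiss the keyboard.
[self.urlField resignFirstResponder];
```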
Touchy
You’ve learned a lot about the so-called high-level events, the initial responder, and the responder chain. Now it’s time to dig into low-level event handling, and you’re going to start with the most commonly used low-level events: touch events.
The Touchy app is a demonstration app. It does nothing more than show you where you’re touching the screen. It’s useful both to see this in action and to explore some of the subtleties of touch event handling. You’ll also learn a new, and really important, Interface Builder skill: creating custom objects in your interface.
Design
The Touchy app also has a super-simple interface, as depicted in Figure 4-21. Touchy will display the location, or locations, where you’re touching your view object. So that the app isn’t too boring, you’ll jazz it up a little with some extra graphics, but that’s not the focus of this outing.
Figure 4-21. Sketch of Touchy app
The app will work by intercepting the touch events using a custom view object. Your custom view object will extract the coordinates of each active touch point and use that information to draw their positions.
Create a new project—the same single view application template you’ve used in earlier chapters will do—and name it Touchy, as shown in Figure 4-22.
Figure 4-22. Creating the Touchy project
Choose a location to save the new project and create it. In the project navigator, select the project, select the Touchy target, select the Summary tab, and then turn off the two landscape orientations in the supported interface orientations section, so only the portrait orientation is enabled.
Creating a Custom View
You’re going to depart from the development pattern you’ve used in previous apps. Instead of adding your code to the TYTouchViewController class, you’re going to create a new custom subclass of UIView. “Why” is explained in Chapter 11. “How” will be explained right now.
Select the Touchy group (not the project) in the project navigator. From the File menu, or by right/Control+clicking on the Touchy group, choose the New File... command, as shown in Figure 4-23.
Figure 4-23. Creating a new source file
Much like the project template assistant, Xcode provides templates for creating individual files too. You’re going to create a new Objective-C class, so choose the Objective-C Class template in the iOS Cocoa Touch group, as shown in Figure 4-24.
Figure 4-24. Choosing a new file template
Name the new file TYTouchyView, and change its subclass to UIView, as shown in Figure 4-25. Click Next and Xcode will ask where you want to save your file. Make sure the Touchy target is checked. Accept the default location (inside your project folder) and click Create. This will add two new files to your project: TYTouchyView.h and TYTouchyView.m.
Figure 4-25. Naming your new Objective-C class
You’ve successfully created a new Objective-C class! Your class is a subclass of UIView, so it inherits all of the behavior and features of a UIView object, and can be used anywhere a UIView object can.
Handling Touch Events
Now you’re going to customize your UIView object to handle touch events. Remember that the base classes UIResponder and UIView don’t handle touch events. Instead, they just pass them up the responder chain. By implementing your own touch event handling methods, you’re going to change that so your view responds directly to touches.
As you already know, touch events will be delivered to the view object they occurred in. If you didn’t know that, go back and read the section “Hit Testing.” All you have to do is add the appropriate event handling methods to your class. Add the following code to your TYTouchyView.m implementation file, just before the @end statement:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self updateTouches:event.allTouches];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self updateTouches:event.allTouches];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self updateTouches:event.allTouches];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self updateTouches:event.allTouches];
}
Each touch event message includes two objects: an NSSet object containing the touch objects of interest, and a UIEvent object that summarizes the event that caused the message to be sent.
In a typical app, your method would be interested in the touches set. This set, or unordered collection, of objects contains one UITouch object for every touch relevant to the event. Each UITouch object describes one touch position: its coordinates, its phase, the time it occurred, its tap count, and so on.
For a “began” event, the touches set will contain the UITouch objects for the touches that just began. For a “moved” event, it will only contain those touch points that moved. For an “ended” event, it will contain only those touch objects that were removed from the screen. This is very convenient, from a programming perspective, because most view objects are only interested in the UITouch objects that are relevant to that event.
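Touchy won’t take advantage of that convenience, but in a typical view the handler would look more like this sketch (the logging is purely for illustration):

```objectivec
// Typical per-event handling: examine only the touches relevant
// to this event, ignoring any touches in other views.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for ( UITouch *touch in touches )
    {
        CGPoint where = [touch locationInView:self];
        NSLog(@"touch moved to %@", NSStringFromCGPoint(where));
    }
}
```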
The Touchy app, however, is a little different. Touchy wants to track all of the active touches, all of the time. You’re not actually interested in what just happened. Instead you want “the big picture”: the list of all touch points currently in contact with the screen. For that, you turn to the event object.

The UIEvent object’s main purpose is to describe the single event that just occurred; or, more precisely, the single event that was just pulled from the event queue. But UIEvent carries some other interesting information around too. One piece of it is the allTouches property, which contains the current state of all touch points on the device, regardless of what view they are associated with.

So now I can explain what all of your event handling methods are doing. They are waiting for any change to the touch state of the device. They ignore the specific change and dig into the event object to find the state of all active touch objects, which they pass to your -updateTouches: method. This method will record the position of all active touches and use that information to draw those positions on the screen.
So, I guess you need to write that method! Immediately above the touch event handler methods you just added in TYTouchyView.m, add this method to your implementation:
- (void)updateTouches:(NSSet *)set
{
    NSMutableArray *array = [NSMutableArray array];
    for ( UITouch *touch in set )
    {
        switch (touch.phase) {
            case UITouchPhaseBegan:
            case UITouchPhaseMoved:
            case UITouchPhaseStationary:
                [array addObject:[NSValue valueWithCGPoint:
                                     [touch locationInView:self]]];
                break;
            default:
                break;
        }
    }
    touchPoints = array;
    [self setNeedsDisplay];
}
You’ll also want to declare that method in a private interface before the @implementation statement, and define a variable to store the active touch points:
@interface TYTouchyView ()
{
NSArray* touchPoints;
}
- (void)updateTouches:(NSSet*)set;
@end
Now, back to the -updateTouches: method. It starts by creating an empty array object. This is where you’ll store the information you’re interested in. -updateTouches: then loops through each of the UITouch objects in the set and examines its phase. The phase of a touch is its current state: “began,” “moved,” “stationary,” “ended,” or “canceled.” Touchy is only interested in the states that represent a finger that is still touching the glass (“began,” “moved,” and “stationary”). The switch statement matches these three states, obtains the coordinates of the touch relative to this view object, and converts them into an NSValue object (suitable for adding to a collection). The NSValue object is then added to the collection.
When all of the active touch coordinates have been gathered, the new collection is saved in your object’s private touchPoints variable. Finally, your view object sends itself a -setNeedsDisplay message. This message tells your view object that it needs to redraw itself.
Drawing Your View
So far, you haven’t written code to draw anything. You’ve just intercepted the touch events sent to your view and extracted the information you want about the device’s touch state. In iOS, you don’t draw things when they happen. You make note of when something needs to be drawn, and wait for iOS to tell your object when to draw it. Drawing is initiated by the user interface update events I mentioned at the beginning of this chapter.
How drawing works is described in Chapter 11, so I won’t go into any of those details now. Just know that when iOS wants your view to draw itself, your object will receive a -drawRect: message. Add this -drawRect: method to your implementation:
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    [[UIColor blackColor] set];
    CGContextFillRect(context, rect);

    UIBezierPath *path = nil;
    if (touchPoints.count > 1)
    {
        path = [UIBezierPath bezierPath];
        NSValue *firstLocation = nil;
        for ( NSValue *location in touchPoints )
        {
            if (firstLocation == nil)
            {
                firstLocation = location;
                [path moveToPoint:location.CGPointValue];
            }
            else
            {
                [path addLineToPoint:location.CGPointValue];
            }
        }
        if (touchPoints.count > 2)
            [path addLineToPoint:firstLocation.CGPointValue];
        [[UIColor lightGrayColor] set];
        path.lineWidth = 6;
        path.lineCapStyle = kCGLineCapRound;
        path.lineJoinStyle = kCGLineJoinRound;
        [path stroke];
    }

    unsigned int touchNumber = 1;
    NSDictionary *fontAttrs = @{
        NSFontAttributeName: [UIFont boldSystemFontOfSize:180],
        NSForegroundColorAttributeName: [UIColor yellowColor]
    };
    for ( NSValue *location in touchPoints )
    {
        NSString *text = [NSString stringWithFormat:@"%u", touchNumber++];
        CGSize size = [text sizeWithAttributes:fontAttrs];
        CGPoint touchPoint = location.CGPointValue;
        CGPoint textCorner = CGPointMake(touchPoint.x - size.width/2,
                                         touchPoint.y - size.height/2);
        [text drawAtPoint:textCorner withAttributes:fontAttrs];
    }
}
Wow, that’s a lot of new code. Again, the details aren’t important, but feel free to study this code to get a feel for what it’s doing. I’ll merely summarize what it does.
The first part fills the entire view with the color black.
The middle section is a big loop that creates a Bezier path, named after the French engineer Pierre Bézier. A Bezier path can represent practically any line, polygon, curve, or ellipse, or any arbitrary combination of those things. Basically, if it’s a shape, a Bezier path can draw it. You’ll learn all about Bezier paths in Chapter 11. Here, it’s used to draw light gray lines between the touch points. It’s pure eye candy; this part of the -drawRect: method could be left out and the app would still work just fine.
The last part is the interesting bit. It loops through the touch coordinates and draws a big “1,” “2,” or “3” centered underneath each finger that’s touching the screen, in yellow.
Now you have a custom view class that collects touch events, tracks them, and draws them on the screen. The last piece of this puzzle is how to get your custom object into your interface.
Adding Custom Objects in Interface Builder
Select your Main_iPhone.storyboard Interface Builder file and select the one and only view object in the view controller scene. Switch to the identity inspector. The identity inspector shows you the class of the selected object. In this case, it’s the plain-vanilla UIView object created by the project template.
Here’s the cool trick: you can use the identity inspector to change the class of the object to any subclass of UIView that you’ve created. Change the class of this view object from UIView to TYTouchyView, as shown in Figure 4-26. You can do this either by using the pull-down menu or by just typing in the name of the class.
Figure 4-26. Changing the class of an Interface Builder object
Now, instead of creating a UIView object as the root view, your app will create a TYTouchyView object, complete with all of the methods, properties, outlets, and actions you defined. You can do this to any existing object in your interface. If you want to create a new custom object, find the base class object in the library (NSObject, UIView, etc.), add that object, and then change its class to your custom one.
There are still a few properties of your new TYTouchyView object that need to be customized for it to work correctly. With your TYTouchyView object still selected, switch to the attributes inspector and check the Multiple Touch option under Interaction. By default, view objects don’t receive multi-touch events. In other words, the touch event messages (-touchesBegan:withEvent: and so on) will never contain more than one UITouch object, even if multiple touches exist. To allow your view to receive all of the touches, you must turn on the multiple touch option.
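If you ever need to set this option in code instead of Interface Builder, it’s the multipleTouchEnabled property of UIView—for example, in a sketch like this:

```objectivec
// Equivalent to checking "Multiple Touch" in the attributes inspector.
// A custom view loaded from a storyboard can set it in -awakeFromNib.
- (void)awakeFromNib
{
    [super awakeFromNib];
    self.multipleTouchEnabled = YES;   // receive all touches, not just the first
}
```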
Select the Main_iPad.storyboard Interface Builder file and make the same changes you just made to Main_iPhone.storyboard. Now your app is ready to test.
Testing Touchy
Set your scheme to the iPhone or iPad simulator and run your project. The interface is completely—and ominously—black, as shown on the left in Figure 4-27.
Figure 4-27. Running Touchy in the simulator
Click on the interface and the number “1” appears, as shown in the middle of Figure 4-27. Try dragging it around. Touchy is tracking every change to the touch interface, updating its state, and then drawing a number under the exact location of each touch.
Hold down the Option key and click again. Two positions appear, as shown on the right in Figure 4-27. The simulator will let you test simple two-finger gestures when you hold down the Option key. With just the Option key, you can test pinch and zoom gestures. Hold down both the Option and Shift keys to test two-finger swipes.
But that’s as far as the simulator will go. To test any other combination of touch events, you have to run your app on a real iOS device. Back in Xcode, stop your app and change the scheme to your iOS device (iPhone, iPad, or iPod touch—whatever you have plugged in). Run your app again.
Now try out Touchy on your iOS device. Try touching two, three, four, or even five fingers. Try moving them around, picking one up, and putting it down again. It’s surprisingly entertaining.
Advanced Event Handling
There are a couple of advanced event handling topics I’d like to mention, along with some good advice. I’ll start with the advice.
Keep your event handling timely. As you now know, your app is “kept alive” by your main thread’s run loop. That run loop delivers everything to your app: touch events, notifications, user interface updates, and so much more. Every event, action, and message that your app handles must execute and return before the next event can be processed. That means that if any code you write takes too long, your app will appear to have died. And if code you write takes a really, really long time to finish, your app will die—iOS will terminate your app because it’s stopped responding to events.
I’m sure you’ve had an app “lock up” on you; the display is frozen, and it doesn’t respond to touches, or shaking, or anything. This is what happens when an app’s run loop is off doing something other than processing events. It’s not pleasant. Most iOS features that can take a long time to complete have asynchronous methods (like the ones you used in Shorty), so those time-consuming tasks won’t tie up your main thread. Use these asynchronous methods, pay attention to how long your program takes to do things, and be prepared to reorganize your app to avoid “locking up” your run loop.
I’ll demonstrate all of these techniques in later chapters.
Secondly, handling multiple touch events can be tricky, even confusing. iOS does its best to untangle the complexity of touch events and present them to your object in a rational, and digestible, form. iOS provides five features that will make your touch event handling simpler:
- Gesture recognizers
- Filtering out touch events for other views
- Prohibiting multi-touch events
- Providing exclusive touch event handling
- Suspending touch events
Gesture recognizers are special objects that intercept touch events on behalf of a view object. Each recognizer detects a specific touch gesture, from a simple tap to a complex multi-finger swipe. If it detects the gesture it’s programmed to recognize, it sends an action—exactly like the button objects you’ve used in earlier projects. All you need to do is connect that action to an object in Interface Builder and you’re done. This feature alone has saved iOS developers tens of thousands of lines of touch event handling code. I’ll show you how to use gesture recognizer objects in later chapters.
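As a taste of what’s coming, attaching a recognizer takes only a few lines—a sketch, where -handleTap: is a hypothetical action method of your own:

```objectivec
// A sketch: attach a single-tap recognizer in a view controller.
- (void)viewDidLoad
{
    [super viewDidLoad];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}

// The hypothetical action method, sent when the tap is recognized.
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    NSLog(@"tapped at %@",
          NSStringFromCGPoint([recognizer locationInView:self.view]));
}
```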
As I described earlier, the touch event methods (like -touchesBegan:withEvent:) only include the relevant touch objects—those touches that originated in your view object—in the touches parameter. Your code doesn’t have to worry about other touches in other views that might be happening at the same time. In Touchy, this was actually a disadvantage, and you had to dig up the global set of touch objects from the UIEvent object. But normally, you only pay attention to the touches in your view.
You’ve also seen how iOS will prohibit multi-touch events using UIView’s multipleTouchEnabled property. If this property is NO, iOS will only send your view object events associated with the first touch—even if the user is actually touching your view with more than one finger. For the Touchy app to get events about all of the touches, you had to set this property to YES. Set this property to NO if your view only interprets single touch events, and you won’t have to write any code that worries about more than one touch at a time.
If you don’t want iOS to be sending touch events to two view objects simultaneously, you can set UIView’s exclusiveTouch property to YES. If set, iOS will block touch events from being sent to any other views once a touchsequence has begun in yours (and vice versa).
Finally, if your app needs to, you can temporarily suspend all touch events from being sent to a specific view or even your entire app. If you want to make an individual view “deaf” to touch events, set its userInteractionEnabled property to NO. You can also send your application object the -beginIgnoringInteractionEvents message, and all touch events for your app will be silenced. Turn them back on again by sending -endIgnoringInteractionEvents. This is useful for preventing touch events from interfering with something else that’s going on (say, a short animation), but don’t leave them turned off for very long.
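A common pattern is to bracket a short animation with these messages—a sketch (the fade here is just a placeholder):

```objectivec
// Silence all touch events while a brief animation runs,
// then re-enable them in the completion block.
[[UIApplication sharedApplication] beginIgnoringInteractionEvents];
[UIView animateWithDuration:0.3
                 animations:^{ self.view.alpha = 0.0; }
                 completion:^(BOOL finished) {
                     [[UIApplication sharedApplication] endIgnoringInteractionEvents];
                 }];
```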
Summary
By now you have a pretty firm grasp on how messages and events get into your app and how they are handled. You know about the event queue and the run loop. You know that events in the queue are dispatched to the objects in your app. You know that some of them go directly to your objects, touch events use hit testing, and the rest get sent to the first responder.
You’ve learned a lot about the responder chain. The responder chain performs a number of important tasks in iOS, beyond delivering events.
You know how to configure an object to handle, or ignore, specific types of events. You’ve written two apps, one that handled high-level events, and a second that tracked low-level touch events.
Possibly even more astounding, you built and ran your app on a real iOS device! Feel free to run any other projects on your device too. Creating your very own iOS app that you can carry around with you is a very impressive feat!
In the next chapter, you’re going to learn a little about data models and how complex sets of data get turned into scrolling lists on the screen.













