Chapter 14. Working with the Real World

Desktops, laptops, iPhones, and iPads are all physical devices that exist in the real world, whether on your desk, on your lap, or in your hand. For a long time, your apps were largely confined to your computer, and weren't able to do much with the outside world besides instructing a printer to print a document.

Starting with iOS and OS X 10.6, however, things began to change: your code can now learn the user's location, how the device is moving and being held, and how far it is from landmarks.

In this chapter, you'll learn how your programs can interact with the outside world. Specifically, you'll learn how to use Core Location to determine where your computer or device is on the planet, how to use Core Motion to learn how the user is moving and holding the device, and how to use the printing services available on OS X and iOS to work with printers.

Note

Most of the technology discussed in this chapter works on both OS X and iOS. Some of it has an identical API on both platforms (Core Location), some has different APIs on the two platforms (print services), and some is available only on iOS (Core Motion). We'll let you know which technology is available where.

Working with Location

Almost every user of your software will be located on Earth.[1]

Knowing where the user is on the planet is tremendously useful, because it enables you to provide more relevant information. For example, while the recommendations ...
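To give a sense of where we're headed, here is a minimal sketch of receiving location updates with Core Location. It assumes iOS 6 or later (or OS X 10.9 or later), where the locationManager:didUpdateLocations: delegate method is available; the LocationDemo class name is just an example, and a real app would also check whether location services are enabled and handle the user declining access.

#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>

@interface LocationDemo : NSObject <CLLocationManagerDelegate>
@property (nonatomic, strong) CLLocationManager *locationManager;
@end

@implementation LocationDemo

- (void)startUpdatingLocation {
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    // Coarser accuracy uses less power than kCLLocationAccuracyBest.
    self.locationManager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
    [self.locationManager startUpdatingLocation];
}

// Called as location fixes arrive; the most recent fix is last in the array.
- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray *)locations {
    CLLocation *location = [locations lastObject];
    NSLog(@"Latitude: %f, longitude: %f",
          location.coordinate.latitude,
          location.coordinate.longitude);
}

// Called when a location fix can't be obtained (e.g., the user denied access).
- (void)locationManager:(CLLocationManager *)manager
       didFailWithError:(NSError *)error {
    NSLog(@"Failed to get a location: %@", error);
}

@end

Requesting coarser accuracy, as above, lets the system satisfy the request using Wi-Fi or cell positioning rather than keeping the GPS hardware powered, which saves battery.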
