
The iOS applications that continue to draw people’s attention are those that make use of augmented reality. Augmented reality (AR) is related to virtual reality, but rather than replacing the real world, it augments sensory input, most commonly live video, enhancing one’s perception of reality in real time. AR-based apps for iOS serve many different purposes.

Different Applications for Augmented Reality

We often see augmented reality in interactive advertising.  Advertisers print QR codes on real-world items, such as a page in a magazine.  When you view the page through the advertiser’s app on your iPhone, a virtual model of the advertised item is rendered right on the screen.

Word Lens is a great example of an AR app.  It translates between English, French, and Spanish on the fly using the camera view.  There is no interaction involved beyond pointing your camera at the text you’d like translated; the app replaces the text shown on your screen with the translation.

Utilizing Mobile Sensory Input

If you’re going to create an augmented reality app of your own, you’ll need to overlay information on top of the live camera view.  Effective AR applications take advantage of the different sensors available on our devices.  As iOS developers, we have access to the phone’s camera, accelerometer, gyroscope, compass, and GPS.
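
For example, the gyroscope and accelerometer are exposed through the Core Motion framework.  Here is a minimal sketch, assuming the Core Motion framework is linked into the project, that reads raw gyroscope data:

#import <CoreMotion/CoreMotion.h>

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if (motionManager.gyroAvailable) {
    // Ask for updates at roughly 60 Hz.
    motionManager.gyroUpdateInterval = 1.0 / 60.0;
    [motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                               withHandler:^(CMGyroData *gyroData, NSError *error) {
        // Rotation rate is in radians per second around each axis.
        NSLog(@"Rotation rate x: %f", gyroData.rotationRate.x);
    }];
}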

An AR application, such as a car finder, makes use of the camera, GPS, and compass to overlay directional and distance information.  When the user marks the location of a car, the app stores the device’s current latitude and longitude using GPS.  To guide the user back to the car, we continuously calculate the distance between the phone’s current location and the saved position of the car.  Using the compass, we can point the user in the right direction given their current heading.  All of this information is then displayed as an overlay on top of the live camera view.
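
To make the distance and direction math concrete, here’s a minimal sketch.  It assumes carLocation and currentLocation are CLLocation objects you’ve already captured (both variable names are hypothetical):

// Distance in meters between where we are and where the car was parked.
CLLocationDistance distance = [currentLocation distanceFromLocation:carLocation];

// Approximate initial bearing, in degrees from north, from us to the car.
double lat1 = currentLocation.coordinate.latitude * M_PI / 180.0;
double lon1 = currentLocation.coordinate.longitude * M_PI / 180.0;
double lat2 = carLocation.coordinate.latitude * M_PI / 180.0;
double lon2 = carLocation.coordinate.longitude * M_PI / 180.0;

double y = sin(lon2 - lon1) * cos(lat2);
double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(lon2 - lon1);
double bearing = fmod(atan2(y, x) * 180.0 / M_PI + 360.0, 360.0);

Comparing this bearing against the compass heading we retrieve below tells us which way to rotate an on-screen arrow.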

Retrieving Your Current Heading Using the Compass

Accessing heading information is easily accomplished with the Core Location framework.  Assuming we already have a project set up with the Core Location framework linked in, we can add the following to a new UIViewController, named LocationViewController for this example.

Within the interface, we need to import the Core Location framework.  Core Location allows us to retrieve information related to a user’s current location. We’ll also declare this class as the delegate for the CLLocationManager, which we’ve set up an instance variable for:

#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>

@interface LocationViewController : UIViewController <CLLocationManagerDelegate> {
    CLLocationManager *_locationManager;
}

@property (nonatomic, retain) CLLocationManager *locationManager;

@end

Our next step is to initialize and configure the CLLocationManager within the implementation of LocationViewController.  First, set up the getter and setter for locationManager by adding the following to the implementation:

@synthesize locationManager = _locationManager;

Within the viewDidLoad method we’ll initialize our CLLocationManager instance as well as some other properties:

- (void)viewDidLoad
{
    [super viewDidLoad];

    CLLocationManager *locationManager = [[CLLocationManager alloc] init];
    [self setLocationManager:locationManager];
    [locationManager release];

    self.locationManager.delegate = self;

    if ([CLLocationManager headingAvailable]) {
        [self.locationManager startUpdatingHeading];
    }
}

In the code above, we first allocate and initialize a new CLLocationManager and assign it to our instance variable.  We then set our class as the delegate. The last important piece is checking whether the device has a compass by calling the class method headingAvailable, and only then starting heading updates.
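
As an optional refinement not shown above, CLLocationManager’s headingFilter property lets us throttle how often the delegate fires.  For example, to only receive updates when the heading changes by at least five degrees:

// Ignore heading changes smaller than 5 degrees.
self.locationManager.headingFilter = 5.0;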

We can now set up the delegate method to retrieve the compass’ heading information:

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    NSLog(@"Magnetic Heading: %f", newHeading.magneticHeading);
}

Above, we receive the new heading information as a CLHeading object and log its magneticHeading property.  A value of 0 means the device is pointing toward magnetic north, 90 means east, 180 means south, and so on.
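
If raw degrees are hard to read at a glance, a small helper method can map a heading to the nearest compass point.  This is a hypothetical addition, not part of the example above:

// Hypothetical helper: map a heading in degrees to the nearest of 8 compass points.
- (NSString *)cardinalDirectionForHeading:(CLLocationDirection)heading
{
    NSArray *points = [NSArray arrayWithObjects:@"N", @"NE", @"E", @"SE",
                                                @"S", @"SW", @"W", @"NW", nil];
    NSUInteger index = (NSUInteger)floor(fmod(heading + 22.5, 360.0) / 45.0);
    return [points objectAtIndex:index];
}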

We can do quite a bit with this data, but none of it helps much for an augmented reality app unless we can overlay it on the live camera view.  Thankfully, overlaying UI elements over the live camera view is relatively simple.

Overlaying Elements on top of the Camera View

Although I used iOS 5.0 for the example in this post, Apple has officially allowed developers to overlay UI elements on top of the live camera view since iOS 3.1.  The process is relatively simple: you create an instance of UIImagePickerController and then assign a UIView to its cameraOverlayView property.  We can modify our previous example to use such an overlay view. Another good overlay example can be found in Pro iOS 5 Augmented Reality.

In our LocationViewController’s interface, we can add a UILabel, making it look like the following:

#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>

@interface LocationViewController : UIViewController <CLLocationManagerDelegate> {
    CLLocationManager *_locationManager;
    UILabel *_label;
}

@property (nonatomic, retain) CLLocationManager *locationManager;
@property (nonatomic, retain) UILabel *label;

@end

Next, we set up the getter and setter for the label by adding the following to the implementation:

@synthesize label = _label;

Now add the following to the bottom of the viewDidLoad method to configure the UILabel we just added:

UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0,200,320,40)];
label.font = [UIFont boldSystemFontOfSize:18.0f];
label.backgroundColor = [UIColor clearColor];
label.textColor = [UIColor whiteColor];
label.textAlignment = UITextAlignmentCenter;
label.opaque = NO;

[self setLabel:label];
[label release];

We position the UILabel near the vertical center of the screen and set its font size to 18 points. To make it stand out against the camera view, we give it a transparent background and white text. Now, to set up the UIImagePickerController, add the following viewDidAppear: method.

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];

    // Show the live camera feed without the default controls or chrome.
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.showsCameraControls = NO;
    picker.navigationBarHidden = YES;

    // Scale the 4:3 camera preview up so it fills the entire screen.
    picker.cameraViewTransform = CGAffineTransformScale(picker.cameraViewTransform, 1.242f, 1.242f);
    [picker setCameraOverlayView:self.label];

    [self presentModalViewController:picker animated:YES];
    [picker release];
}

We can now make our label display the current magnetic heading to demonstrate a constantly updating overlay.  Change the locationManager:didUpdateHeading: so that we’re setting our label’s text instead of logging it to the console:

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    self.label.text = [NSString stringWithFormat:@"%f°", newHeading.magneticHeading];
}

Besides setting up our UILabel, we’ve configured the UIImagePickerController so that it displays the live camera view: the source type is set to the camera, and we hide the camera controls and navigation bar.  To make the camera view fill the whole screen, we scale the cameraViewTransform property using the CGAffineTransformScale() function.  The cameraOverlayView property accepts any UIView, so we simply point it at our UILabel.

The label now displays our current magnetic heading in real time over the live camera view.

We’ve only scratched the surface of what augmented reality apps can do, but this post should be enough to get your feet wet.  Use it as a starting point for your own unique augmented reality application.

Safari Books Online has the content you need

Take advantage of these Augmented Reality and iOS resources in Safari Books Online:

After you read Pro iOS 5 Augmented Reality, you’ll be able to build augmented reality rich media apps or integrate all the best augmented reality techniques and tools into your existing apps.
The iOS 5 Programming Cookbook, thoroughly updated in response to reader requests and new developments in iOS 5, helps you solve the vexing, real-life issues you’re likely to face when creating apps for the iPhone, iPad, or iPod touch.
In Pro iOS 5 Tools: Xcode, Instruments and Build Tools you will learn how to use the tools in the iOS developer’s toolbox, plus popular third-party tools, to take an app from concept to a final product that is ready to ship.
Building iOS 5 Games: Develop and Design offers real-world examples and actual games the reader can code and play, and is aimed at people who understand programming concepts but are new to iOS game development.

About the author

  Brendan Lim is an accomplished Ruby, iOS, Android, and Mac developer (learn more about him at brendanlim.com). He has spoken and keynoted at various conferences on cutting-edge tech and mobile platforms. He’s a Y Combinator alum and Co-founder of Kicksend (http://kicksend.com), an extremely simple way to instantly send files to the people you know. Brendan is also the co-author of MacRuby in Action.

Tags: Augmented Reality, iOS, QR codes
