Image Handling and iPhone 4

Development for Apple’s iOS – the operating system running on iPhone, iPod Touch, and iPad – is a funny combination of wonderful and maddening. The platform’s native API, called Cocoa Touch, gives an app developer an amazing amount of power and flexibility – and it’s been improving with each version. But that power and flexibility come with a tradeoff: the frameworks perform neat tricks, but at the cost of processing time and memory consumption. This is especially true when accessing a device’s camera, and doubly so on an iPhone 4, which returns huuuuge images.

Creating a Camera Picker

Gaining access to media, including the camera, on an iOS device is incredibly easy. Just create an instance of a UI object called UIImagePickerController, tell it where to go (either the photo library or the camera), and, well, that’s pretty much it!

- (void) popCameraPicker {
    if (picker == nil)
        picker = [[UIImagePickerController alloc] init];

    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.allowsEditing = YES;
    picker.delegate = self;

    [self presentModalViewController:picker animated:YES];
}
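
One guard the snippet above skips: not every device has a camera, so in practice it’s safer to check availability before presenting. A small sketch of a check you could drop into popCameraPicker:

// Fall back to the photo library on camera-less hardware (e.g. early iPod Touch).
if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
}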

The memory squeeze begins when the camera picker is actually presented, at that presentModalViewController: call on the last line. Anecdotally, most apps will receive a Low Memory Warning immediately, and the pressure worsens considerably when a picture is taken (our app, at least, gets a Low Memory Warning Level 2 at that point). The real trouble, though, occurs when the picture is returned to the app.
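
Before getting to that, one general defensive note: when those warnings arrive, a view controller can shed anything it can rebuild later. A minimal sketch – the cachedImages ivar here is hypothetical, purely for illustration:

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // cachedImages is a hypothetical cache owned by this controller;
    // anything that can be re-created later is fair game to drop here.
    [cachedImages removeAllObjects];
}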

You Said You Wanted It, So, Uh, Here’s Your Picture

The delegate callback received when a user takes a picture looks like this:

- (void)imagePickerController:(UIImagePickerController *)currentPicker didFinishPickingMediaWithInfo:(NSDictionary *)info

This looks reasonable: the picker passes in a dictionary containing information about the new image, along with the image itself. Unfortunately, there’s no way to ask for a compressed (JPEG, PNG) or resized version. Nope, it just hands over a UIImage – uncompressed pixel data. And depending on the resolution of the camera built into the device, that UIImage can get quite large. The original iPhone and the 3G use a 2MP camera, the 3GS uses 3.2MP, and the iPhone 4 uses 5MP – fully two and a half times the pixel count of the original. At the usual four bytes per pixel, the iPhone 4’s 2592×1936 output works out to roughly 19MB of bitmap data in memory. The memory squeeze gets pretty incredible on the new iPhone.

Handling the Callback

Let’s say we just received a 1936×2592-pixel image, as returned by iPhone 4. Yikes. The first order of business is to get rid of as much of the image picker object as possible, as quickly as possible:

[picker release];
picker = nil;
[self dismissModalViewControllerAnimated:YES];

That should provide a bit of breathing room – great. Next, let’s get the image out of the dictionary:

UIImage *image = [[info objectForKey:UIImagePickerControllerEditedImage] retain];

(In this case we’re getting the edited image, not the original image, because we’re letting a user pan and zoom to crop the image before selecting it. But this process works either way.)
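
(If allowsEditing were turned off, the equivalent lookup would simply swap in the original-image key – the same one-liner, shown here for illustration:)

UIImage *image = [[info objectForKey:UIImagePickerControllerOriginalImage] retain];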

Here’s the complete implementation of didFinishPickingMediaWithInfo:

- (void)imagePickerController:(UIImagePickerController *)currentPicker didFinishPickingMediaWithInfo:(NSDictionary *)info {
	// throw everything out to try to avoid running out of memory.
	[picker release];
	picker = nil;
	[self dismissModalViewControllerAnimated:YES];

	// get the edited image
	UIImage *image = [[info objectForKey:UIImagePickerControllerEditedImage] retain];

	if ([self setPreviewImage:image]) {
		// success!
	} else {
		// failure.
	}
	[image release];
}

Resize + Display

The ultimate goal, at least for our app, is to show a 600px-wide version of the just-selected image in a UIImageView. To do that, let’s make a new method, called -(BOOL) setPreviewImage:(UIImage *)image. It returns TRUE if it was able to set the preview image, or FALSE if we ran out of memory. Let’s make sure it returns TRUE.

// set the given image as the preview.
// returns true if successful
- (BOOL) setPreviewImage:(UIImage *)image {
	if (image == nil) {
		NSLog(@"image was nil");
		return FALSE;
	} else {
		UIImage *scaledImage = [self scaleImage:image toSize:CGSizeMake(600, 600 / image.size.width * image.size.height)];
		if (scaledImage == nil) {
			NSLog(@"scaledImage was nil");
			return FALSE;
		}
		[self setSelectedImage:scaledImage];
		selectedImageView.image = selectedImage;
		if (selectedImageView.image == nil) {
			NSLog(@"selectedImageView.image was nil");
			return FALSE;
		} else {
			// success!
			return TRUE;
		}
	}
}

This method differs a bit from the conventional wisdom: there are no NSData intermediate objects. It’s commonly argued that an NSData copy of the image pixels is needed to provide a JPEG or PNG intermediate representation during the resize and preview process, but I found it to be a needless third copy of the pixel data. This method avoids it entirely.
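
For contrast, the intermediate-NSData pattern this method skips usually looks something like the sketch below – just an illustration of the conventional approach, not code from our app:

// Round-trip through JPEG data: compress the UIImage, then inflate it back
// into a second UIImage. The NSData and the new UIImage are extra copies of
// essentially the same pixels, on top of the original.
NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
UIImage *roundTripped = [UIImage imageWithData:jpegData];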

The key areas to watch in this method are the nil checks: if the image comes in as nil, that means we’ve run out of memory headroom and the picker released our camera image. Ditto for the check on the UIImageView near the end: if selectedImageView.image comes out as nil, that also means we exceeded our footprint.

The other thing that makes this work is the immediate downscale: scaleImage:toSize: reduces the UIImage we started with to one no more than 600px wide, which shrinks the ongoing memory hit of showing the image to something manageable. Unfortunately, scaleImage:toSize: isn’t something Apple provides, but there’s a great (and fast) Core Graphics-based implementation I’ve found reliable here.
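
For the curious, here’s a rough sketch of what such a Core Graphics-based scale might look like – this isn’t the linked implementation, and it skips orientation handling for brevity:

// Draw the big image into a bitmap context of the target size, then wrap the
// result in a new UIImage. Note: this ignores image.imageOrientation, which a
// production version needs to handle for camera photos.
- (UIImage *) scaleImage:(UIImage *)image toSize:(CGSize)size {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 (size_t)size.width,
                                                 (size_t)size.height,
                                                 8,   // bits per component
                                                 0,   // let CG pick bytes per row
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    if (context == NULL) {
        return nil;   // most likely out of memory
    }

    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image.CGImage);

    CGImageRef scaledRef = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *scaled = [UIImage imageWithCGImage:scaledRef];
    CGImageRelease(scaledRef);
    return scaled;   // autoreleased, which is what setPreviewImage: expects
}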

Wrap Up

It’s unfortunate that the amount of memory available to iOS apps isn’t increasing, because the size of the media they work with sure is! It’s still possible to reliably handle large images, but careful memory management is required. Did these code samples help you out? Are you having other iOS problems? Please feel free to let us know in the comments below!

Continue the conversation by sharing your comments here on the blog and by following us on Twitter @CTCT_API
