Monday, June 29, 2009

DevDiary: The tech behind Piggy



While the recently released Piggy app might not seem like the most technically advanced iPhone app ever produced (and it isn't), there was an interesting technical problem to be solved.

The animation used in Piggy is pre-rendered 3D animation. Normally, when playing a pre-rendered movie, you might just use Apple's built-in movie player and be done with it. With Piggy, however, there are about a dozen different animation sequences, and they can be triggered in any order. There's no way to pre-cache all the movies: a set of 320x480 movies running at 30 frames per second would simply take up far too much memory. And loading the movies on demand with the SDK's movie player wouldn't be ideal either. It wouldn't be a very interactive pig if, after tapping a trigger point, you had to wait a few seconds for the animation to start.

So, as I described a few weeks ago, I converted each frame of animation into a PVRTC texture image. This image format is supported natively by the iPhone's graphics chip, so by using the frames as OpenGL textures I can load any frame of animation from storage on the fly and still get a nice 30 frames per second.

4 comments:

Tony Zale said...

Thanks for the post! I was thinking about implementing a similar "streaming texture" system and was curious if you'd be willing to share a few more details... do you spin off a separate thread to load the next image, or is iPhone file IO quick enough to do this inline during one of your 1/60th of second frames? Did you start by using the PVRTextureLoader example xcode project as the basis of your work and modifying the raw openGL to suit your needs?

Mostly Torn said...

Hi Tony,

I use the standard OpenGL method of having a separate animation thread that is running at the desired frame rate interval.

Then, to load the textures, as part of redrawing the view I simply load one texture at a time from its corresponding file via something like this:


// Build the filename for this frame of the current animation
NSString *fileName = [NSString stringWithFormat:animationList[animationChoice].fileNameTemplate, frame];
NSString *imageFile = [NSString stringWithFormat:@"%@/%@", [[NSBundle mainBundle] resourcePath], fileName];

// Read the raw compressed texture data straight from the app bundle
NSData *texData = [[NSData alloc] initWithContentsOfFile:imageFile];

// Upload to the currently bound texture; no decompression needed,
// since the GPU consumes PVRTC directly (512x512 at 2bpp = 65,536 bytes)
glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_PVRTC_2BPPV1_IMG, 512, 512, 0, [texData length], [texData bytes]);

[texData release];

Tony Zale said...

Thanks for the followup. I tried implementing something similar and while it works, I'm only seeing about 15-18fps (on the simulator). I'm not yet able to run my app on a phone... how did you find the simulator framerate compared to the device? Maybe I will need to try 2bpp PVRTC instead of the 4bpp.

Thanks again.

Mostly Torn said...

The simulator is actually slower than the iPhone for this. The iPhone has hardware acceleration for those types of textures.


All content copyright © 2009  Brian Stormont, unless otherwise noted.   All rights reserved.