Monday, June 29, 2009

DevDiary: The tech behind Piggy



While the recently released Piggy app might not seem like the most technically advanced iPhone app ever produced (and it's not), there was an interesting tech problem to be solved.

The animation used in Piggy is pre-rendered 3D animation. Normally, when playing a pre-rendered movie, you might just use Apple's built-in movie player and be done. However, with the Piggy app there are about a dozen different animation sequences, and they can be triggered in any order. There's no way to pre-cache all the movies - a bunch of 320x480 movies animated at 30 frames per second would just take up way too much memory. And loading the movies on demand using the SDK's supplied movie player wouldn't be ideal either. It wouldn't be a very interactive pig if, after tapping on a trigger point, you had to wait a few seconds before the animation occurred.

So, as I described a few weeks ago, I converted each frame of animation into a PVRTC texture image. This format is handled natively by the iPhone's graphics chip, so by using the frames as OpenGL textures I can load any frame of animation from storage on the fly and still get a nice 30 frames per second.
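For anyone curious what that looks like in practice, here's a rough sketch (not the actual Piggy code) of loading a single raw PVRTC frame and handing it to OpenGL ES. The function name and dimensions are placeholders, and it assumes the files contain raw 4-bpp texturetool output with no header:

#import <Foundation/Foundation.h>
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>

// Load one raw 4-bpp PVRTC frame from the app bundle and upload it to the
// currently bound OpenGL ES texture. Width and height must be powers of two.
static void LoadPVRTCFrame(NSString *name, GLsizei width, GLsizei height)
{
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"pvrtc"];
    NSData *data = [NSData dataWithContentsOfFile:path];
    if (data == nil) return;

    // 4 bits per pixel -> width * height / 2 bytes of compressed data.
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGBA_PVRTC_4BPPV1_IMG,
                           width, height, 0,
                           (GLsizei)[data length], [data bytes]);

    // No mipmaps for these frames, so use plain linear filtering.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}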

Piggy - the iPhone app


In the pursuit of mindless silliness on the iPhone, Stormy Productions presents Piggy, a new iPhone app.
You can buy it on iTunes here for $0.99.

And what is Piggy? It's an interactive pig alarm. Isn't that a helpful description? No? Well, let me be a bit clearer. It's a 3D cell-shaded pig that displays a range of animations and produces a series of pig-like sounds on command. Poke the pig in different spots to produce the desired effect.

And, if that weren't enough (you mean you want more than an interactive pig noise maker app?), you can also use the pig as a timer or alarm. Choose the sound effect you'd like, set the timer, and... wait. You can watch the digits count down from the time of your choosing. Have trouble with numbers? You can learn how to count backwards by watching this app! While that's not the most useful or creative way to use the Piggy timer, how you choose to use it is totally up to you.

Saturday, June 20, 2009

Image Picker Sample

The UIImagePickerController provided by Apple is pretty weird. If all you want to do is grab an image, its usage is very straightforward. But, if you want to first allow the user to zoom and crop the photo, the way the controller works is very unintuitive.

Back in February I struggled with trying to use a UIImagePickerController to allow a user to make such cropping adjustments before having the image returned. Fortunately, I found a good starting point with a post on Apple's iPhone developer forum (read it here). Using that sample code and a lot of tinkering with many sample images, I came up with a method that seems to handle the zooming and cropping of arbitrary images.

Someone else was recently writing about how this process is a real pain in the neck, so I figured it might be helpful to others if I posted some sample code.

So, here's a link to a sample Xcode project which implements a subclass of UIImagePickerController. The code isn't pretty - I wrote it in a hurry several months ago, it did what I needed at the time, and I don't want to revisit it now. Feel free to do whatever you'd like with it.

My key to understanding the whole image picker crop/zoom process and the cropping info it returns is summarized by these comments in my code:

"The crop rectangle returned has been scaled relative to the image being 480×640 regardless of the real image dimensions. And, to make matters worse, it’s a virtual 320×320 square centered on the screen rather than being the whole screen. So, we need to rescale the crop rectangle so it matches the original image resolution and so it’s the full screen."

Perhaps this will be helpful to someone else.

The code works with both iPhone OS 2.x and 3.0, although the sample project file I've included was made specifically for 3.0.

Cheers.

Wednesday, June 17, 2009

My 0.5 seconds of fame

A friend of mine sent me a tip today that I was mentioned in an article at MacNewsWorld.com. Here's the link:

http://www.macnewsworld.com/story/67330.html

It's an article about a local company here in Rhode Island (GLAD WORKS) that is seeing an ever-increasing share of its business come from iPhone development.


Monday, June 15, 2009

Radio Paradise app 1.6 - hopefully next week

I've just finished up some enhancements to the free Radio Paradise app - this brings it to version 1.6, which is now pending review with Apple.

The most noticeable change will be with the user interface, as can be seen in this screenshot:



The app will now buffer approximately 10 minutes of audio and allow rewinding and fast-forwarding within this buffer while still buffering the live stream.

It'll also support true "pause" (again, up to about 10 minutes), so if you accidentally pull out your headphones, or need to stop the music for a few minutes, you can then resume right where you left off!

In the picture above, the gray bar above the rewind/stop/fast-forward buttons represents the audio currently buffered. The red line marks the portion of the buffer you are currently listening to. In this example, the buffer is about 1/4 full and we are listening to the "live" part of the stream.

I'm still not crazy about my "tunemarking" buttons. The + button will allow you to tag a song, buy it on iTunes, email the track info, etc., and the other button will let you manage your list of previously marked songs. I'm admittedly a poor graphic designer (programming is my skill set), so this is the best I've come up with so far for representing the tunemarking concept.

You can read more about the current version of the app here.

Sunday, June 7, 2009

Fast, high quality, small memory footprint. Pick two.

I've finally found a solution to my page-flipping animation problem. The short answer: PVRTC texture maps. PVRTC texture maps are extremely efficient. I've found I can load a 512x512 PVRTC texture directly from the iPhone filesystem for each unique frame of full-screen animation and still maintain 60 frames per second. I repeat - 60 frames per second of full-screen 320x480 page-flipping animation without a single image being pre-loaded or cached. I was very surprised at the speed difference.

That's the short answer. The long answer is that this efficiency doesn't come without some tradeoffs. All the time I was trying to find a good solution to the full-screen page-flipping animation problem, I would find things that worked, but the tradeoffs were usually unacceptable. For instance, I could get 30 FPS animation if I loaded all my frames of animation into RAM first and just used drawRect. Given I had over 1300 frames of animation, this was not acceptable - the RAM requirements would be crazy!

My frames of animation are cell-shaded pictures and, as JPEG images, they only take about 30K each. However, when converted to PVRTC, each image is about 120K using the standard PVRTC compression and about 64K if I use the most aggressive compression. So, the tradeoff now is the app download size. Instead of being about 35 meg, it's now going to be about 80 meg. But this is the most reasonable tradeoff of the bunch. I couldn't have my app use more RAM, and I didn't want to drop the frame rate even lower than 10 FPS, so trading more filesystem space for 60 FPS seems like a decent tradeoff.

Now I just have to go through the process of converting all the images. Unfortunately, Apple's texturetool program is dog slow! It is taking over 4 seconds per 512x512 image on my MacBook Pro. Fortunately, I have batched the process using a Perl script, so I can just let it run overnight.

Saturday, June 6, 2009

DevDiary: Problems with drawRect:

I've been working on a "hobby" app project off and on for the past few weeks. It involves performing full-screen page-flipping animation. Unfortunately, I keep running into system bottlenecks.

I know there must be a good solution, but so far I haven't found it.

Here's the problem. I have about 35 meg worth of cell-style animation, and each image is 320x480 (i.e. it takes up the full iPhone screen). There are about 1300 frames in total. There's no realistic way I can cache all these images in RAM on the iPhone. Since the animation is dynamic, I can't even cache what will be animated next, as it's determined by user interaction.

What I am currently doing is loading each frame of animation on demand. This is much slower than caching the images in RAM first (I can only get about 10 FPS rather than the 30 FPS I'd get if I were able to cache the images). But this isn't the problem. I'm OK with the lower frame rate.

The problem is I can't seem to find a good method for getting low-level access to the image buffer of the current view while still maintaining decent responsiveness for touch events.

I'm currently overriding the drawRect: method of the UIView (I call [image drawAtPoint:CGPointMake(0.0, 0.0)]; to render my image) and then, for my animation routine, I just repeatedly call [view setNeedsDisplay]. This almost works. There are no memory issues at all, since the view is just getting bitmap data written to it rather than holding a handle to an image. The problem is the rendering takes up so much bandwidth that the app no longer responds to touch events when running under iPhone OS 2.x. Oddly, when running under 3.0, the app handles touch events fine. But I'd like to release the app for 2.x, so this approach won't work.
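For reference, the basic shape of that approach looks something like this (a stripped-down sketch, not my actual code - the class name and the frame-loading method are just placeholders):

#import <UIKit/UIKit.h>

// Minimal view that draws one pre-rendered frame at a time.
@interface AnimationView : UIView {
    UIImage *currentFrame;
}
- (void)showFrameAtPath:(NSString *)path;
@end

@implementation AnimationView

- (void)drawRect:(CGRect)rect
{
    // Draw the current frame directly into the view's backing store.
    [currentFrame drawAtPoint:CGPointMake(0.0, 0.0)];
}

- (void)showFrameAtPath:(NSString *)path
{
    // Load each frame on demand rather than keeping 1300 frames in RAM.
    [currentFrame release];
    currentFrame = [[UIImage alloc] initWithContentsOfFile:path];
    [self setNeedsDisplay];
}

- (void)dealloc
{
    [currentFrame release];
    [super dealloc];
}

@end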

I also tried using the view's CALayer and setting its contents to my current image (via the layer's contents property), but that causes severe memory problems with the iPhone OS. Touch events work with this approach, but even though I'm releasing each image after the layer uses it, the memory still hangs around and eventually the iPhone OS kicks my app out for being a bad citizen.

So, I'm now trying to figure out another approach. I'm wondering if using OpenGL ES would work. I've used it in another app for doing animation, but I have no experience with trying to render large full-screen bitmaps. Ideally, what I'd like is to just get raw access to the OpenGL framebuffer, dump my bitmap image there, and then display the framebuffer. I'm hoping I don't have to treat my animation frames as textures. That seems like it would add even more overhead and slow down the frame rate even more.

Anyone have any tips on what might be a good solution?

Friday, June 5, 2009

Adjusting the iPhone master volume programmatically

Looking for a way to adjust the master volume control on the iPhone via a function call? It's been said it can't be done without using the private Celestial framework, but that's not true!

If you are using an MPVolumeView to provide a slider interface for adjusting the iPhone system volume, you can adjust the value of this slider and in turn adjust the master volume on the iPhone.

Here's how I figured it out.

Assuming you already have an instance of the MPVolumeView class, you need to search its subviews to find the MPVolumeSlider view:


UISlider *volumeViewSlider = nil;

// Find the MPVolumeSlider among the MPVolumeView's subviews
for (UIView *view in [volumeView subviews]) {
    if ([[[view class] description] isEqualToString:@"MPVolumeSlider"]) {
        volumeViewSlider = (UISlider *)view;
        break;
    }
}


So, what is an MPVolumeSlider and what can you do with it? Well, here's where Erica Sadun's documentation of the iPhone SDK classes comes in handy - specifically, the page that documents MPVolumeSlider.

If you take a look at that documentation, you can see MPVolumeSlider is a subclass of UISlider. This means you can do all the usual UISlider stuff to it, such as changing the look, color, etc. I've already documented taking advantage of that in a sample Xcode project in a previous blog post.

So, if it's a UISlider, what happens if we try to adjust its slider value via this call?


[volumeViewSlider setValue: 1.0f animated:YES];


Well, the slider does change to the maximum value, but the system volume remains the same. If we then just tap on the thumb control for the slider, the volume immediately jumps to the new level, but we are looking for a solution that requires no user interaction. We are getting close, but don't quite have a complete answer.

Since the volume does change when the slider is tapped, I next tried looking at the targets that might have been registered with the slider's UIControl methods:


NSSet *mySet = [volumeViewSlider allTargets];
NSLog(@"%@", mySet);


No such luck. The set of targets came back empty. So, how is the system volume getting changed when someone drags the slider? Something has to be monitoring the slider, right?

Well, I tried to see if any targets were registered to be notified of any UIControl event:

NSArray *actions = [volumeViewSlider actionsForTarget:nil forControlEvent:UIControlEventAllEvents];
NSLog(@"Actions:%@", actions);

But again, the result was nil.

So, not having any luck there, I took another look at the documentation on Erica's site. I noticed a method on the MPVolumeSlider class called _commitVolumeChange. Hmmm.... could this be useful? Indeed, it is!

A simple pair of calls, as follows, does the trick!

[volumeViewSlider setValue: 1.0f animated:YES];
[volumeViewSlider _commitVolumeChange];


So, this turns out to be a very easy way to adjust the master volume of the iPhone. You should be aware the _commitVolumeChange method is NOT documented, which means Apple could change it at any time. It currently works for all versions of the iPhone OS, but that doesn't mean it won't change in the future.
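Putting the pieces together, a wrapped-up version might look something like this (just a sketch: it assumes you keep the MPVolumeView around in a volumeView variable, and it uses performSelector: to avoid declaring the private method):

#import <MediaPlayer/MediaPlayer.h>

// Set the system volume (0.0 - 1.0) by driving the MPVolumeSlider.
// 'volumeView' is assumed to be an MPVolumeView already in the view hierarchy.
- (void)setSystemVolume:(float)newVolume
{
    UISlider *volumeViewSlider = nil;
    for (UIView *view in [volumeView subviews]) {
        if ([[[view class] description] isEqualToString:@"MPVolumeSlider"]) {
            volumeViewSlider = (UISlider *)view;
            break;
        }
    }
    if (volumeViewSlider == nil) return;

    [volumeViewSlider setValue:newVolume animated:YES];
    // _commitVolumeChange is private and undocumented - use at your own risk.
    [volumeViewSlider performSelector:@selector(_commitVolumeChange)];
}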

Tuesday, June 2, 2009

DevDiary: Time-shifting live radio streaming

Ah, the joys of refactoring code. I took a couple days to rewrite portions of the audio streaming code used in my radio apps to add some more enhancements. In addition to making the code much more modular to allow it to be easily dropped into new projects with no external dependencies, I made a significant improvement.

The code will now allow the live stream to be buffered for up to 10 minutes! This means if you are listening to a radio program and get distracted and miss a bit of what was just played, you can simply rewind the program. Then, if you choose, you can continue to listen to the program slightly time-shifted, or just fast-forward back to the live portion of the stream.

Or, if you need to take a short break from listening but don't want to miss anything, you can pause the program for up to 10 minutes and all the new content will be buffered.

The only limit imposed right now is that the buffer is designed to hold 10 total minutes' worth of audio, and this includes both pausing and rewinding. You can't both pause the program for 10 minutes AND have the ability to rewind 10 minutes into the past. If you pause the program for 5 minutes, you'll only have 5 minutes of the program buffered behind you.
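Conceptually, the buffer behaves like a fixed-size ring of audio packets that pause and rewind both draw from. Here's a rough sketch of the idea (not the actual streaming code - the packet type and capacity are placeholders):

#import <Foundation/Foundation.h>

// Holds roughly 10 minutes of audio packets. Once full, the oldest packet
// is dropped, so pausing and rewinding share the same fixed window.
@interface TimeShiftBuffer : NSObject {
    NSMutableArray *packets;    // oldest packet at index 0
    NSUInteger capacity;        // e.g. enough packets for ~10 minutes
    NSUInteger playIndex;       // which packet plays next
}
- (id)initWithCapacity:(NSUInteger)maxPackets;
- (void)addPacket:(NSData *)packet;
- (NSData *)nextPacketToPlay;
@end

@implementation TimeShiftBuffer

- (id)initWithCapacity:(NSUInteger)maxPackets
{
    if ((self = [super init])) {
        capacity = maxPackets;
        packets = [[NSMutableArray alloc] initWithCapacity:maxPackets];
    }
    return self;
}

- (void)addPacket:(NSData *)packet
{
    [packets addObject:packet];
    if ([packets count] > capacity) {
        // Buffer full: drop the oldest audio. If playback is paused or
        // rewound, this eats into how far back you can still go.
        [packets removeObjectAtIndex:0];
        if (playIndex > 0) playIndex--;
    }
}

- (NSData *)nextPacketToPlay
{
    // While paused this simply isn't called; the writer keeps adding
    // packets until the buffer fills and old audio starts to drop.
    if (playIndex >= [packets count]) return nil;   // caught up to live
    return [packets objectAtIndex:playIndex++];
}

- (void)dealloc
{
    [packets release];
    [super dealloc];
}

@end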

Hopefully my clients will find this new feature useful.

Time Bomb v1.4 now available!

The newest update to the Time Bomb app (version 1.4) is now available on iTunes. This is primarily a bug-fix release; however, one minor change was made to the graphics. Per popular request (and possibly against my better judgement), I removed the game title text from the game-play screen.



So, the bomb screen now looks a bit more like a real time bomb, except for the big oval tool selector in the lower left. Yes, I could make that slide out of view after you select a tool to give an even more realistic time-bomb look, but I intentionally did not do that.

You can read more about the game at TimeBombGame.com.

All content copyright © 2009  Brian Stormont, unless otherwise noted.   All rights reserved.