Philip Hodgetts e-mailed me yesterday, having found my recent CocoaHeads Ann Arbor talk on AV Foundation and, from there, his way to my blog. The first thing this brings up is that I’ve been slack about linking my various online identities and outlets… anyone who happens across my stuff should be able to get to the rest of it easily. As a first step, behold the “More of This Stuff” box at the right, which links to my slideshare.net presentations and my Twitter feed. The former is updated less frequently than the latter, but also contains fewer obscenities and references to anime.

Philip co-hosts a podcast about digital media production, and their latest episode is chock-full of important stuff about QuickTime and QTKit that more people should know (frame rate doesn’t have to be constant!), along with wondering aloud about where the hell Final Cut stands given the QuickTime/QTKit schism on the Mac and the degree to which it is built atop the 32-bit legacy QuickTime API. FWIW, between reported layoffs on the Final Cut team and their key programmers working on iMovie for iPhone, I do not have a particularly good feeling about the future of FCP/FCE.

Philip, being a Mac guy and not an iOS guy, blogged that he was surprised my presentation wasn’t an NDA violation. Actually, AV Foundation has been around since iPhone OS 2.2, but only became a document-based audio/video editing framework in iOS 4. The only thing that’s NDA is what’s in iOS 4.1 (good stuff, BTW… hope we see it Wednesday, even though I might have to race out some code and a blog entry to revise this beastly entry).

He’s right in the podcast, though, that iPhone OS / iOS has sometimes kept some of its video functionality away from third-party developers. For example, Safari could embed a video, but through iPhone OS 3.1, the only video playback option was the MPMoviePlayerController, which takes over the entire screen when you play the movie. 3.2 provided the ability to get a separate view… but recall that 3.2 was iPad-only, and the iPad form factor clearly demands the ability to embed video in a view. In iOS 4, it may make more sense to ditch MPMoviePlayerController and leave MediaPlayer.framework for iPod library access, and instead do playback by getting an AVURLAsset and feeding it to an AVPlayer.
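To make that concrete, here’s a minimal sketch of view-based playback with iOS 4’s AV Foundation. The method and view names are mine, not from any shipping code, but the AVURLAsset / AVPlayerItem / AVPlayer / AVPlayerLayer calls are the real classes involved:

```objc
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: play a movie inside an arbitrary view,
// instead of taking over the whole screen.
- (void)playMovieAtURL:(NSURL *)movieURL inView:(UIView *)containerView
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];

    // AVPlayer has no UI of its own; you supply one by adding an
    // AVPlayerLayer to any layer in your view hierarchy.
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = containerView.layer.bounds;
    [containerView.layer addSublayer:playerLayer];

    [player play];
}
```

Note the design difference from MPMoviePlayerController: the player is just a model-ish controller object, and where (or whether) its video appears on screen is entirely up to you.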

One slide Philip calls attention to in his blog is where I compare the class and method counts of AV Foundation, android.media, QTKit, and QuickTime for Java. A few notes on how I spoke to this slide when I gave my presentation:

  • First, notice that AV Foundation is already larger than QTKit. But also notice that while it has twice as many classes, it only has about 30% more methods. This is because AV Foundation had the option of starting fresh, rather than wrapping the old QuickTime API, and thus could opt for a more hierarchical class structure. AVAssets represent anything playable, while AVCompositions are movies that are being created and edited in-process. Many of the subclasses also split out separate classes for their mutable versions. By comparison, QTKit’s QTMovie class has over 100 methods; it just has to be all things to all people.

  • Not only is android.media smaller than AV Foundation, it also represents the alpha and omega of media on that platform, so while it’s mostly provided as a media player and capture API, it also includes everything else media-related on the platform, like ringtone synthesis and face recognition. While iOS doesn’t do these, keep in mind that on iOS, there are totally different frameworks for media library access (MediaPlayer.framework), low-level audio (Core Audio), photo library access (AssetsLibrary.framework), in-memory audio clips (System Sounds), etc. By this analysis, media support on iOS is many times more comprehensive than what’s currently available in Android.

  • Don’t read too much into my inclusion of QuickTime for Java. It was deprecated at WWDC 2008, after all. I put it in this chart because its use of classes and methods offered an apples-to-apples comparison with the other frameworks. Really, it’s there as a proxy for the old C-based QuickTime API. If you counted the number of functions in QuickTime, I’m sure you’d easily top 10,000. After all, QTJ represented Apple’s last attempt to wrap all of QuickTime with an OO layer. In QTKit, there’s no such ambition to be comprehensive. Instead, QTKit feels like a calculated attempt to include the functionality that most developers will need. This allows Apple to quietly abandon unneeded legacies like Wired Sprites and QuickTime VR. But quite a few babies are being thrown out with the bathwater — neither QTKit nor AV Foundation currently has equivalents for the “get next interesting time” functions (which could find edit points or individual samples), or the ability to read/write individual samples with GetMediaSample() / AddMediaSample().
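For anyone who never used the old API, here’s roughly what one of those “interesting time” calls looks like in legacy 32-bit QuickTime — a sketch, not production code, stepping to the next sample (frame) boundary in a media:

```objc
#import <QuickTime/QuickTime.h>

// Given a Media and a current time (in the media's time scale), find the
// start of the next sample. Nothing like this exists in QTKit or
// AV Foundation today.
TimeValue NextSampleTime(Media media, TimeValue currentTime)
{
    TimeValue interestingTime = 0;
    TimeValue interestingDuration = 0;
    // nextTimeMediaSample asks for the next sample boundary;
    // nextTimeEdgeOK allows the first/last time to be returned too.
    GetMediaNextInterestingTime(media,
                                nextTimeMediaSample | nextTimeEdgeOK,
                                currentTime,
                                fixed1,  // search forward at normal rate
                                &interestingTime,
                                &interestingDuration);
    return interestingTime;
}
```

You could call this in a loop to walk a movie frame-by-frame, or pass different flags to hunt for edit points instead — exactly the kind of sample-level access the newer frameworks don’t (yet) expose.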

Another point of interest comes on one of the last slides, which quotes a macro seen throughout AV Foundation and Core Media in iOS 4:


__OSX_AVAILABLE_STARTING(__MAC_10_7,__IPHONE_4_0);

Does this mean that AV Foundation will appear on Mac OS X 10.7 (or hell, does it mean that 10.7 work is underway)? IMHO, not enough to speculate, other than to say that someone was careful to leave the door open.

Update: Speaking of speaking on AV Foundation, I should mention again that I’m going to be doing a much more intense and detailed Introduction to AV Foundation at the Voices That Matter: iPhone Developer Conference in Philadelphia, October 16-17. $100 off with discount code PHRSPKR.

Not that I think very many people were wondering, but I do have an update to Road Tip nearly ready to go. It’s just going to take a while to get it finished and out the door.

The reason, frankly, is that it is literally not worth my time to work on it. Once I’ve paid my quarterly bills to MapQuest for their data service (without which, I wouldn’t have an app at all), my cumulative return on Road Tip is less than I make in four hours of contract programming. I can only justify time on it based on its intellectual interest, because as a professional endeavor, Road Tip has been a five-figure loss for my family’s finances once I account for expenses and the time I took to develop it.

I’m tempted to say you’ve seen the last of me as an indie iPhone developer, but you never know… I vowed not to write another computer book after the wobbling grind that was Swing Hacks, and yet I’m still at it. Best to say that indie development is now much more of a hobby in my mind, completely subordinate to contract programming and writing.

Still, I can’t help myself: I use Road Tip, and want to make it better, if only for myself. So here’s what I’ve gotten done.

  • iOS 4 foregrounding/backgrounding support – This is the big one, of course, and it’s been interesting. I handle backgrounding by opting out of location updates while backgrounded, to save the battery. This causes a number of problems at wake-up time, mostly related to the fact that any data in memory is almost certainly out of date and useless. I handle foregrounding by forcing the user back to the front screen (clearing out the presumably old data on any result page they might be on), and asking Core Location for new location data. Core Location provides timestamps in CLLocation objects, so it’s easy enough to determine if your data is old. Thing is, while I can get an updated coordinate quickly, the reported course will often be wrong (for reasons described in my earlier post on Road Tip), so I have to hold off on enabling the GUI until I have a higher level of confidence that the data supplied by Core Location is accurate.
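The staleness check itself is straightforward, since CLLocation carries its own timestamp. Here’s a sketch of the idea — the constant, helper method name, and threshold are mine, not Road Tip’s actual code:

```objc
#import <CoreLocation/CoreLocation.h>

// Hypothetical threshold: how old a fix can be before we ignore it.
#define MAX_LOCATION_AGE 60.0  // seconds

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    // Core Location often delivers a cached fix immediately after
    // startUpdatingLocation; the timestamp makes it easy to spot.
    NSTimeInterval age = -[newLocation.timestamp timeIntervalSinceNow];
    if (age > MAX_LOCATION_AGE) {
        return;  // stale cached fix; wait for a fresh one
    }
    // The coordinate is fresh, but the course may still be bogus,
    // so don't enable the GUI on the strength of this update alone.
    [self considerEnablingGUIWithLocation:newLocation];
}
```

The harder part, as noted above, is the course: a fresh coordinate with a wrong heading is still useless to Road Tip, which is why the GUI stays disabled until several updates agree.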

  • HUD improvements – The “spinner” while you’re waiting for data is interminable on EDGE when there’s a lot of data coming back, so I’ve added some status text to give you a better idea of what progress is actually being made. Here’s what that looks like:

  • Freestyle Favorites – The last of the features that I “always meant” to add, this feature lets you add any brand as a favorite, directly from the map screen. What this means is that you’re no longer limited to just the best-known brands from the Settings UI. You can add whatever business names you find along your way. For example, in this screenshot (taken on a family vacation last week), I could use the “make favorite” button to add the “Kahunaville” restaurant inside the Kalahari Resort in Wisconsin Dells.

    Freestyle favorites also work in tandem with the main list of favorites. If you tap “make favorite” on a brand from the canned list (like “Burger King” or “Shell”), it will just set the favorite in the main list, and you’ll see that the next time you go to the Settings app.

    BTW, the one decent critical review I got on Road Tip (as opposed to the usual anonymous 1-star haters) complained about my use of the Settings app. That was a consequence of one of my top design goals: nothing finicky in the main app. Meaning that I wanted the main app to be very “glanceable” and work with coarse gestures, not the kind of UI that requires your full attention. This is, after all, meant for use in a moving vehicle: anything distracting is bad. Heck, if iOS had a speech API, the app would be better off being completely hands-free. Anyways, scrolling through a list of brands to pick favorites is something you can do at home, and therefore can be done in the Settings app (per Apple’s guidance to developers, I might add). But looking at it, I thought that putting a big “make favorite” button (which can also be un-set with a second tap) was a sufficiently coarse, non-distracting action, and potentially very handy.

  • Retina-friendly display – Despite the fact that you should be looking at Road Tip’s screen as little as possible, I did join Glyphish’s Kickstarter project so I could get Retina-display-friendly @2x icons. Kimberly Daniel of Vantage Point Creations also reworked the app icon to have a nice bezel and pre-rendered gloss.

  • Purchase UI improvements – During an in-app purchase talk at CocoaHeads Ann Arbor, our group leader got confused by the long latency in pressing the “buy” button and getting the system-provided confirmation dialog. A new “purchasing…” HUD fills that gap.

All that work is done. So what’s the delay? Bugs, mostly. The backgrounding/foregrounding support has turned up some issues that either are new in iOS 4 or were always there, but never exposed because the app used to get re-launched when it was needed, rather than running indefinitely. The biggest problem at the moment is a bug where the “view all upcoming services” mode continues to return old results from wherever you first searched. Actually, it gets the freeway name right (even if you’ve since turned onto another freeway), but provides exits from the previous search results.

I’m also getting some “junk” characters in some freeway names that aren’t getting stripped out, as seen at the top of this screen.

I think it might take one full day to work through these bugs, and maybe another day to get everything set for an update submission (I have to re-write some of the help text and re-shoot some of the screenshots). So I’ll get to it at some point. I just don’t know when. Probably in the next few weeks… no promises, though.