Quick note about speaking plans for late 2015:
I’ll be speaking at CocoaConf Boston (Sep. 18-19) and CocoaConf San Jose (Nov. 6-7). Boston is going to be a one-track conference, since CocoaConf had such good results with that format in Yosemite. I’ll be bringing my App Extensions class and Video Killed the Rolex Star, which is all about the media APIs that are (and aren’t) on Apple Watch.
Early Bird for Boston ends Friday (July 31), so get on it if you want to go.
Next up on our tour of WWDC 2015 media sessions is the innocently titled Content Protection for HTTP Live Streaming. Sounds harmless, but I think there’s reason for worry.
For content protection, HLS has always had a story: transport segments get one-time AES encryption and can be served from a dumb HTTP server (at CocoaConf a few years back, I demo’ed serving HLS from Dropbox, before it was HTTPS-always). You’re responsible for guarding the keys and delivering them only to authenticated users. AV Foundation can get the keys, decrypt the segments, and play them with no client-side effort beyond handling the authentication. It’s a neat system, because it’s easy to deploy on content delivery networks, as you’re largely just dropping off a bunch of flat files, and the part you protect on your own server is tiny.
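As a sketch of what that story looks like on the wire, here’s a minimal encrypted HLS playlist: the EXT-X-KEY tag tells the client which segments are AES-128 encrypted and where to fetch the key, and that key URI (the host, path, and IV below are all made up for illustration) is the one piece you serve from your own authenticated endpoint rather than the dumb file host:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-KEY:METHOD=AES-128,URI="https://keys.example.com/asset1/key",IV=0x9c7db8778570d05c3f9ae7d1e1a850b1
#EXTINF:10.0,
segment00000.ts
#EXTINF:10.0,
segment00001.ts
#EXT-X-ENDLIST
```

The .ts segments can sit on any CDN as flat files; only the request to keys.example.com needs to check that the user is allowed to watch.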
So what’s “FairPlay Streaming”, then?
I attended WWDC for the first time since 2011, thanks largely to the fact that working for Rev means I need to go out to the office in San Francisco every 6 weeks anyways, so why not make it that week and put my name in the ticket lottery. I probably won’t make a habit of returning to WWDC, and the short supply of tickets makes that a given anyways, but it was nice to be back just this once.
Being there for work, my first priority was making use of unique-to-attendee resources, like the one-on-one UI design reviews and the developers in the labs. The latter can be hit-or-miss based on your problem… we didn’t get any silver bullet for our graphics code, but scored a crucial answer in Core Audio. We’ve found we have to fall back to the software encoder, because the hardware encoder (kAppleHardwareAudioCodecManufacturer) would cause ExtAudioFileWrite() to sometimes fail with the OSStatus kExtAudioFileError_AsyncWriteBufferOverflow. So I asked about that and was told "oh yeah, we don’t support hardware encoding anymore… the new devices don’t need it and the property is just ignored". I Slacked this to my boss and his reaction was "would be nice if that were in the documentation!" True enough, but at least that’s one wall we can stop banging our heads against.
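For anyone hitting the same wall, a minimal sketch of the fallback looks like this, assuming you already have an ExtAudioFileRef open for writing (the function name is mine; the property and constants are AudioToolbox’s). Since the hardware-codec setting is apparently ignored on newer devices anyway, asking for the software encoder up front sidesteps the overflow error entirely:

```c
#include <AudioToolbox/AudioToolbox.h>

// Sketch: steer an ExtAudioFileRef (already created for writing) to the
// software encoder, rather than kAppleHardwareAudioCodecManufacturer,
// to avoid kExtAudioFileError_AsyncWriteBufferOverflow on write.
// Error handling beyond returning the OSStatus is elided for brevity.
static OSStatus preferSoftwareEncoder(ExtAudioFileRef outFile) {
    UInt32 codecManufacturer = kAppleSoftwareAudioCodecManufacturer;
    return ExtAudioFileSetProperty(outFile,
                                   kExtAudioFileProperty_CodecManufacturer,
                                   sizeof(codecManufacturer),
                                   &codecManufacturer);
}
```

Call it right after ExtAudioFileCreateWithURL() and before the first ExtAudioFileWrite(), since codec selection has to happen before encoding starts.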
Speaking of media, now that everyone’s had their fill of “Crusty” and the Protocol-Oriented Programming session, I’m going to post a few blogs about media-related sessions.
We’re so close to finally having iOS 8 SDK Development out the door! We just got the index back on Friday, and it looks great. I spent some time last night going over it, looking for either missing topics or things that didn’t really need to be in there, and found nothing to change. If anything, I came away thinking “did we really write all that?”
And yet I’m kind of wondering: do books even need indexes anymore?