Jeff Bier’s Impulse Response: Unleashing Mobile Multimedia Applications

Posted in Opinion

Smartphones and tablets are becoming ubiquitous.  According to DIGITIMES Research, global smartphone shipments reached 464 million in 2011.  And 63 million tablets were shipped in 2011, according to IDC.  That's over half a billion devices sold in one year, and few analysts doubt that these numbers will grow in 2012.

Not only are these devices increasingly popular, but, at least based on my informal observations, people are spending more and more time using them.  That's no surprise, considering that smartphones and tablets are highly capable, portable, connected products.  Surfing the web?  Check.  Reading email? Check.  Playing games?  Check.  Taking photos and recording video?  Check.  Listening to music?  Check.

The capabilities and popularity of smartphones and tablets have made them the preferred application development platform for many software developers.  And a huge number of useful and entertaining applications are available.  We're even beginning to see sophisticated computer vision applications on mobile devices, such as the Philips Vital Signs app for the iPad 2, which accurately measures the user's heart rate and respiration rate solely based on video images of the user's face and chest.

Another favorite multimedia application of mine is Shazam, which identifies songs based on samples captured with a smartphone or tablet microphone.  Indeed, I believe that multimedia applications are central to the future of smartphones and tablets—for users, equipment manufacturers, chip makers, software developers and network operators.

Back when personal computers were our most frequently used computing devices, multimedia applications were a nice bonus, but for most users they weren't the main attraction of the PC.  For most of us, PCs were about creating and reading documents, presentations, email, and spreadsheets.  And while smartphones and tablets were initially treated as miniature PCs by many users, their "use cases" are rapidly evolving to be much more media-centric.  These days, any time I take an airplane flight, I see multiple passengers watching movies or television programs on their tablets or smartphones.  Millions of consumers use their smartphones as their primary device for listening to music.  And anyone who thought that mobile games were a passing fancy should consider the 300 million minutes spent daily playing just one such game, Angry Birds.

Multimedia functionality is central to smartphone and tablet users for several reasons.  For one thing, we live in a multimedia world, not a textual one, and our mobile devices are out in that world with us, rather than being relegated to our desks.  As a consequence, applications and devices that effectively incorporate multimedia capabilities often yield a more engaging experience for the user, or enable valuable functionality that can't be implemented without multimedia.  Another consideration is that while PCs rely on keyboards and large screens, mobile devices have neither.  So, by necessity, mobile devices must rely on other forms of input and output, such as sound, vision, and motion.  To type a detailed description of a scene using my thumbs on my smartphone screen would take me hours and be very tedious.  To snap a photo of the scene and email it takes seconds.

The connectedness of mobile devices also makes multimedia applications more attractive.  My smartphone is the best-connected device I own, equipped as it is with LTE cellular, WiFi and Bluetooth transceivers.  This—plus the fact that it's the one device that I almost always have with me—makes it the natural platform for collecting and accessing multimedia content.

The importance of multimedia applications on mobile devices has been apparent to application developers for some time.  But developing and deploying multimedia applications on smartphones and tablets has been, and continues to be, very challenging, for several reasons.  First, while the CPU processing power available in high-end mobile devices is impressive, most users don't have the latest high-end devices.  This leaves mobile multimedia application developers with a fairly modest CPU processing power budget, compared to what's available on a run-of-the-mill PC.

Second, the "application processors" that power smartphones and tablets deliver the majority of their processing power not via the CPU, but through a constellation of coprocessors, typically including a graphics processing unit, video processing unit, digital signal processor and image signal processor.  And whereas the vast majority of application processors use ARM CPUs, there's a great deal of diversity in the coprocessors.  This creates a serious dilemma for mobile multimedia application developers:  Multimedia applications typically need all the performance they can get.  And the types of tasks they need to perform are often just the types of tasks that coprocessors like GPUs and DSPs are designed to execute efficiently.  But there are hundreds of different application processors out there, each with a different complement of coprocessors.

In theory, it should be possible to create application programming interfaces (APIs) that abstract the details of the particular set of coprocessors included in a specific application processor.  This would enable the developer of, for example, a tablet video editing application to write the application once and have it run on many different tablet models based on many different application processors, harnessing whatever relevant coprocessors are available in each device.  In practice, however, such APIs have mostly been missing from the mobile space.  This absence has been a key impediment to large numbers of mobile application developers making full use of the chips powering mobile devices.
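To make the idea concrete, here is a minimal sketch of what such an abstraction layer might look like.  Everything here is hypothetical—the function and backend names are illustrative, not drawn from any real mobile API.  The application calls one portable operation (a simple 1-D box blur standing in for a real multimedia kernel), and a dispatch layer routes the work to whichever coprocessor backend a given device happens to expose, falling back to the CPU when no accelerator is available.

```python
# Hypothetical sketch of a coprocessor-abstraction API.
# Backend names ("dsp", "gpu", "cpu") and all functions are illustrative.

def blur_cpu(samples):
    """Portable CPU fallback: 1-D box blur over a list of floats."""
    n = len(samples)
    out = []
    for i in range(n):
        # Average each sample with its immediate neighbors (edges clamp).
        window = samples[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

def blur_dsp(samples):
    """Stand-in for a DSP-accelerated path; a real backend would offload
    to the driver of whatever DSP this application processor includes."""
    return blur_cpu(samples)

# The abstraction layer maps one abstract operation to the concrete
# implementations registered for this device.
BACKENDS = {"dsp": blur_dsp, "cpu": blur_cpu}

def blur(samples, available=("cpu",)):
    """Dispatch to the most capable backend present on this device."""
    for name in ("dsp", "gpu", "cpu"):  # preference order
        if name in available and name in BACKENDS:
            return BACKENDS[name](samples)
    raise RuntimeError("no usable backend")

# The application code is identical on every device; only the probed
# 'available' set differs from one application processor to the next.
print(blur([0.0, 2.0, 4.0, 6.0], available=("dsp", "cpu")))
```

The key design point is that the application never names a specific coprocessor; it expresses *what* computation it needs, and the runtime decides *where* to run it—which is precisely what has been missing for developers facing hundreds of application processors with different coprocessor complements.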

Fortunately, this situation is beginning to change.  In next month's column I'll discuss one such API, which I believe has the potential to enable a big step forward in mobile multimedia applications.  I'll also explore the implications of such APIs for chip and equipment suppliers.

Jeff Bier is president of BDTI and founder of the Embedded Vision Alliance. Post a comment here or send him your feedback.
