Jeff Bier’s Impulse Response: Change or Be Changed

Submitted by Jeff Bier on Thu, 03/18/2010 - 16:00

We’re on the cusp of profound changes in the competitive landscape for embedded processing engines—changes that I believe will “shuffle the deck” with respect to which kinds of architectures are dominant in many applications. 

Consider this scenario: You’re a systems engineer working on the next generation of your digital-signal-processing-oriented product. You need to upgrade processors because the one you’ve got won’t cut it.  Perhaps it’s not fast enough.  Or maybe it doesn’t have sufficient energy efficiency.  In the past, upgrading was simple:  every year or two your current vendor would offer you a new processor very similar to its predecessor, but with significantly better performance, cost/performance, and energy efficiency. This was great, because you were able to tap the added performance without completely changing your software architecture, software development methods, and tools.  Unfortunately, that gentle ride is over for many applications.  This time around, the upgraded chip from your current vendor is a multi-core beast, with a new, complex concurrent programming model. Whoa.

On the one hand, since you’re an engineer, you’re probably very curious about this multi-core stuff, and would love to dive in and learn all about it. On the other hand, since you’re an engineer, your risk radar is very sensitive, and you suspect that becoming an early adopter of multi-core for embedded applications may introduce significant risk into your product development plan.  The chips are complex, the tools are complex, and the software development techniques are in flux.  But because you need more performance or better energy efficiency, you don’t really have a choice.
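
To make that shift concrete, here's a minimal sketch of the programming-model change, written in generic C++ with std::thread rather than any particular vendor's toolchain. Everything in it, from the function names to the four-way core split, is an illustrative assumption, not a real chip's programming model; the point is simply how much bookkeeping appears when one loop becomes four threads.

```cpp
// Illustrative only: the same dot-product kernel written for a single
// core, then hand-partitioned across threads with std::thread. The names
// and the four-core split are assumptions for this sketch.
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// Single-core version: one loop, no shared state to reason about.
// (Assumes a and b have the same length.)
float dot_single(const std::vector<float>& a, const std::vector<float>& b) {
    float acc = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) acc += a[i] * b[i];
    return acc;
}

// Multi-core version: partitioning, thread lifetime, and the reduction
// of per-thread partial sums are now the application programmer's job.
float dot_parallel(const std::vector<float>& a, const std::vector<float>& b,
                   unsigned cores = 4) {
    std::vector<float> partial(cores, 0.0f);  // one slot per thread: no data race
    std::vector<std::thread> workers;
    const std::size_t chunk = a.size() / cores;
    for (unsigned t = 0; t < cores; ++t) {
        const std::size_t lo = t * chunk;
        const std::size_t hi = (t == cores - 1) ? a.size() : lo + chunk;
        workers.emplace_back([&a, &b, &partial, t, lo, hi] {
            for (std::size_t i = lo; i < hi; ++i) partial[t] += a[i] * b[i];
        });
    }
    for (auto& w : workers) w.join();  // synchronization is now explicit
    return std::accumulate(partial.begin(), partial.end(), 0.0f);
}

int main() {
    // With these uniform inputs both versions sum exactly, so the
    // comparison is safe; in general, reordering float additions changes
    // the result slightly, one more thing a parallel port must consider.
    std::vector<float> a(1 << 16, 1.0f), b(1 << 16, 2.0f);
    return (dot_single(a, b) == dot_parallel(a, b)) ? 0 : 1;
}
```

And this toy example sidesteps the hard parts: on a real embedded multi-core device you'd also contend with core-to-core communication, cache behavior, load balancing, and vendor-specific tools, which is exactly why the risk radar goes off.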

Then you start to wonder—if you’re going to be forced to switch to a new kind of processor, why limit yourself to the one your current vendor is offering?  Is there another choice that would be better? Maybe an entirely different type of processing engine? It turns out that there are lots of interesting alternatives, some of which I’ve written about in recent columns.

Intel, for example, has invested a huge amount of money and brain power in developing multi-core chips and tools for IT applications; many of those investments will translate into competitive advantages for Intel in embedded applications.  GPUs are coming on strong, especially in medical imaging—Nvidia in particular has done an excellent job of creating a developer community by offering low-cost boards, free tools, and other developer resources.  And after a major shake-out of parallel processor start-ups in the past year, a few innovators are going strong.  For example, Tilera and picoChip recently attracted substantial capital infusions.

And then there are FPGAs. With the emergence of effective high-level synthesis tools, FPGAs are becoming more accessible to non-FPGA experts—and more attractive as processing engines for some applications. 

Change can be scary, but for many of us, when it comes to processor selection, there is no alternative.  We are going to have to let go of the single-core processor paradigm and get our hands dirty in the parallel processing wilderness. And as we do so, many processor vendors will lose their incumbent’s advantage.

I see this shift every day. A few days ago I had a meeting with an equipment vendor to help choose a new processing engine for a next-generation product. A few years ago the choice would have been obvious—I would have said, “Use a mainstream, high-end DSP processor.” But now, with DSP processors becoming more complex, and with the availability of bigger, faster FPGAs and better tools to leverage them, this company will likely be using an FPGA in lieu of multiple DSP processors.

This is an amazing, exciting time for processing engines. I believe that we will ultimately look back on this period as a huge turning point; in ten years, the processing engine landscape will look vastly different from how it does today.  Systems engineers and processor vendors who realize this first, and act accordingly, have a rare opportunity to re-shuffle the competitive landscape in their favor.

Jeff Bier is the president of Berkeley Design Technology, Inc. (www.BDTI.com), a benchmarking and consulting firm focusing on digital signal processing technology.  Jennifer Eyre White of BDTI contributed to this column.
