Jeff Bier’s Impulse Response—A Different Kind of Verification Crisis

Submitted by Jeff Bier on Wed, 07/22/2009 - 17:00

It often surprises me that chip vendors will spend a huge amount of money (like, say, $50 million) developing a chip without rigorously verifying that it includes all the features the target applications need.

For example, a while back we had a processor vendor ask us to benchmark their new core. As is often the case, the vendor had already done their own evaluation of the chip’s performance, but wanted independent benchmark results to use in their marketing materials. During the benchmarking project they got an unpleasant surprise: one of the features they’d designed to accelerate the core’s performance in a specific application didn’t work the way they’d expected, due to subtle application constraints. As a result, they weren’t getting nearly the performance they had counted on.

Obviously this was a major bummer. On the bright side, however, they were able to use this information to rejigger the architecture and achieve the target performance—and they did it before they ever manufactured the chip. 

This wasn’t the first time we’d seen this sort of problem. It’s very difficult to gauge how a chip’s features will map to a specific application until you’ve actually done the mapping, and that’s a lot of work. There are often obscure requirements (such as operations that must execute in a specific order, or fixed-point arithmetic that must be done in a particular way) that can really bog down performance if they aren’t taken into account in the design.
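To make that concrete, here’s a rough sketch of the kind of fixed-point detail I mean. It’s my own illustration, not drawn from any particular project: some bit-exact standards require that every fractional (Q15) multiply be rounded and saturated in a specific way, and if the core’s multiplier doesn’t do that natively, each “multiply” quietly turns into several instructions.

```c
#include <stdint.h>

/* Hypothetical illustration: a Q15 multiply with round-to-nearest and
 * saturation, as a bit-exact spec might demand. On a core with a native
 * saturating MAC this is one instruction; without one, it's this whole
 * sequence -- the kind of detail that erodes expected performance. */
static int16_t q15_mul_round_sat(int16_t a, int16_t b)
{
    int32_t p = (int32_t)a * (int32_t)b;   /* 32-bit product        */
    p += 1 << 14;                          /* round to nearest      */
    p >>= 15;                              /* rescale to Q15        */
    if (p > INT16_MAX) p = INT16_MAX;      /* saturate, don't wrap  */
    if (p < INT16_MIN) p = INT16_MIN;
    return (int16_t)p;
}
```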

It reminded me of how, back when digital signal processing first started becoming really popular, processor designers would sometimes modify low-cost CPUs to add a single-cycle multiplier without also modifying the memory architecture so that it could support sustained single-cycle multiplication.  It was kind of heartbreaking, really.
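The reason that combination falls flat is easy to see in the inner loop of a FIR filter, sketched below purely as an illustration: each tap needs one multiply-accumulate plus two data fetches (a sample and a coefficient). If the memory system can deliver only one operand per cycle, the shiny single-cycle multiplier mostly sits idle waiting on loads.

```c
/* Illustrative only: FIR inner loop. Each iteration needs 1 MAC and
 * 2 loads, so sustained single-cycle multiplication requires a memory
 * architecture that can supply two operands per cycle. */
static int32_t fir_sample(const int16_t *x, const int16_t *h, int ntaps)
{
    int32_t acc = 0;
    for (int k = 0; k < ntaps; k++)
        acc += (int32_t)x[k] * (int32_t)h[k];  /* 1 MAC + 2 loads per tap */
    return acc;
}
```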

Engineering managers are always talking about how hard it is to verify chips these days, but what they mean is that it’s hard to verify that a chip works the way you’ve designed it. What I’m talking about is a higher-level validation crisis: with the increasing complexity of chips and applications, it’s really hard to make sure that what you’ve designed will meet the needs of the application and perform as you expect, especially if your chip targets many different applications.

Unfortunately, there’s no substitute for implementing a good chunk of the application—before taping out the chip—to make sure that no performance-killing details have been overlooked.

Jeff Bier is the president of Berkeley Design Technology, Inc. (www.BDTI.com), a benchmarking and consulting firm focusing on embedded processing technology. Jennifer Eyre White of BDTI contributed to this column.
