In 2002, in a famous piece of unintentional rhetorical artistry, U.S. Defense Secretary Donald Rumsfeld spoke to reporters about "known knowns" (things you're aware that you know), "known unknowns" (things you're aware that you don't know) and "unknown unknowns" (things you're unaware that you don't know). As every programmer immediately realized, the category that Rumsfeld didn't mention is the "unknown knowns" – things you know but don't realize that you know.
Lately, I've come to the view that unknown knowns are a major cause of crappy development tools.
Just about every week, my colleagues and I at BDTI fire up a new set of development tools for some sort of embedded processor. Often, we're using these tools to create optimized implementations of audio or computer vision algorithms for our clients. Sometimes, we're evaluating the tools—either on behalf of the embedded processor suppliers that develop them, or on behalf of a client selecting among processor options.
Unfortunately, many of the development tools we use for embedded processors are very frustrating. How so? Mainly in two ways: Buggy code and sketchy documentation.
It's long puzzled me that the same chip companies that routinely deliver sophisticated chips without major defects often deliver severely flawed software development tools for use with these chips. Recently I realized that unknown knowns are often to blame.
What happens is this: the engineers developing tools inside the chip company have deep expertise in their tools and their processors. They've been working with these chips and tools for years. The tools have been developed to support working in the ways in which these engineers like to work, and these engineers have developed ways of working that maximize the utility of the tools and avoid weak points. So, naturally, the tools work really well for these engineers.
But typical customers haven't been working with the same tools for years. They don't intuitively avoid weak points. And the tools haven't been developed with their work flows in mind. As a consequence, customers often find these same tools difficult to use and unreliable.
In other words, I believe that customers have frustrating experiences with embedded development tools in large part because the engineers developing these tools (and their colleagues using those in-house-developed tools) don't know what they know. That is, they don't realize that they've evolved their way of working to reflect the tools' strengths and weaknesses – and vice-versa. And they don't realize what's missing from the documentation, because they never looked for it. Why would they, when the information is already in their heads?
The obvious solution would seem to be gathering customer feedback. I often hear suppliers say things like "customers love our tools," and "we get very few complaints about our tools." I have to wonder about the basis for such statements. Because unless the supplier provides exceptionally good technical support for their tools, I'm going to bet that they rarely hear about customers' problems with the tools. If I've got a project to complete and I run into a problem with your tools, chances are I'm not going to take the time to communicate with you about it unless I think there's a real likelihood that you'll help me solve it quickly. Otherwise, I'm just going to search for a workaround and get on with my project. Similarly, potential customers who evaluate a chip but walk away from it based on the tools rarely explain their decision to the would-be supplier.
I wonder what chip company executives would think if they had the opportunity to be "flies on the wall" and hear the chatter in labs where developers are using their tools. I suspect it would be a shock.
I realize that chip companies have to manage costs, and development tools are often seen by their executives primarily as a cost. But tools are also a powerful source of competitive differentiation and supplier reputation – for better or for worse.
So, I'm talking to you, embedded processor companies: Your tools need work. Your engineers, brilliant and well-intentioned though they may be, are inherently unable to see many of the critical weak points in your tools. Perhaps the time has come to start systematically collecting and analyzing real-world data about customers' experiences with your development tools, and using the insights gained to make improvements.
An excellent post, but the challenges you highlighted aren't limited to embedded processor tools. Your comments apply to practically ALL design software, across a variety of technologies.
Often, development software evolves organically, without an overall plan or architecture, pieced together from various point solutions. This isn't a bad initial approach, but at some point during product evolution it becomes confusing and burdensome. Customers are trying to do something useful with the product; tool developers should figure out the aspects that are critical to making the customer successful and attempt to automate the rest. Too many design flows are hopelessly and needlessly complex. Especially for chip companies, the faster your customer is successful, the faster they get to volume manufacturing and the faster you start making real money.
Unfortunately, many of the software engineers creating the development software have never actually used the underlying technology. Perhaps they've exercised a small part of the overall development flow, but they often miss how their particular step affects a much larger effort. Especially for complex products, few software development engineers know the taste of their own particular brand of dog food.
Having been involved in a variety of hardware/software projects, I've seen plenty of cases where the company's engineers don't actually use the company's own software for internal development projects--the same software that they expect their customers to use successfully.
It would be a great luxury if company executives sat down with their customers during a development project. It will likely never happen, given the time involved. However, at a minimum, many of these same companies offer training classes that last a few days to a week. If company executives (and product marketing and software development engineers and ...) sat in on training WITH other customers, I guarantee their eyes would be opened--even those with years of relevant design experience.