Wednesday, July 26, 2006

Innovation Trumps Consistency

During my last days at Red Hat, I recall making a sales call on a very large customer that had completely transformed their customer service costs and application performance by switching from proprietary Unix to Linux on X86. We were laying out the technology roadmap for the next Linux release, which offered yet again more improvements in performance vs. Unix, when a senior IT manager interrupted the presentation:

IT Manager - (loudly) - I really wish we could just go back to the good ol' days of Unix where nothing ever changed. How am I supposed to maintain any consistency when all of these performance innovations keep showing up in the code, and the application developers just can't resist adopting them to ratchet up performance?

Red Hat SE - (calmly) - But isn't that the point? When a new innovation makes the customer experience better or the cost of service lower, adopting that innovation into the application just makes sense.

IT Manager - It's just more work for me. I care about performance and cost, but no one measures the cost of our efforts to support innovation. A slow moving, standard OS lowers the costs for the administrators who have to make the applications work with the rest of the infrastructure.

Red Hat SE - Well, I think innovation is always going to trump consistency. I wish there were more we could do to make it easier for you to stabilize multiple streams of innovation in your datacenter.

This interaction came to mind when I read an interview with the CTO of Amazon, Werner Vogels, about how Amazon enables innovation by giving the developers complete responsibility for delivering a working application service. The developer can adopt any innovation that makes their service more effective, but they have to manage the availability of that service in the datacenter. At some level, I am certain Amazon has guidelines, such as server preferences, OS preferences, management utility preferences, etc. My guess, however, is that these are pushed to the lowest possible level to enable flexibility in the upper portions of the stack in order to maximize application performance.

I am sympathetic to the workload of the datacenter staff, and I have high hopes that virtualization is going to separate forever the management of the box from the delivery of the application. Maintaining consistency of the OS across applications at the expense of application performance is simply unacceptable in a hyper-competitive world.

I think it was Ralph Waldo Emerson who said "A foolish consistency is the hobgoblin of little minds." I think Ralph was a closet application developer before application development was cool.


At 11:27 AM, Blogger Jon Klein said...

I'd like to give a qualified endorsement of your statement.

On the flip side, I can imagine an SE at your favorite bugbear, Microsoft, saying the same thing to customers when they are trying to jam the next release of Office down their throats.

Yet most of us don't use more than 20% of what is in those products already, so we don't really need any more innovation.

I think a better statement is "Innovation that creates true value for customers trumps consistency."

Companies that overinnovate wind up with bloated products that are vulnerable to more streamlined competitors, as Clayton Christensen argued in The Innovator's Dilemma.

At 9:29 AM, Blogger Billy Marshall said...

The only innovation that matters is the one that a particular customer values. That is precisely the problem with the "one size fits all" mentality that dominates the software business these days.

It is funny to me that "soft" ware has become even more rigid than "hard" ware. At least with hardware, you can unplug components (say a disk drive) and replace them with more effective components for your particular task.

It is not about jamming a new release down any throat. It is about customers being able to receive the next improvement that matters to them.
