It’s been more than nine years since Nicholas Carr set off a firestorm of debate in the IT and business communities with his “IT Doesn’t Matter” article (Harvard Business Review, May 2003). I was reminded of Carr’s article last week when I came across a Bloomberg Businessweek piece discussing it. So, I dusted off my copy this past weekend and read it again, and so should you, in light of everything that has happened in the IT industry over the past nine years (an eternity when it comes to technology).
First, ignore the title and just read the article. The title, while attention-grabbing and provocative, does not align with Carr’s central message. He doesn’t actually claim that IT doesn’t matter; in fact, he acknowledges that IT is essential to competition. What he argues (among other things) is that IT is becoming less important as a source of strategic differentiation. Back in 2003, many people reacted strongly to the title and everything it implies (that IT doesn’t play a role in business process innovation, for example) instead of reading the full article. So look past the misleading title and read the whole thing.
Some perspective: Facebook did not exist when Carr’s article was first published. Amazon.com was more than three years away from launching its Elastic Compute Cloud (EC2) service. Apple wouldn’t launch its App Store for another five years, Salesforce.com was still a year away from going public, and Google had yet to introduce Gmail, let alone Google Docs. In other words, the IT landscape has changed dramatically since 2003, and in the general direction Carr predicted in his article:
“More and more, companies will fulfill their IT requirements simply by purchasing fee-based ‘Web services’ from third parties — similar to the way they currently buy electricity or telecommunications services.”
And the trend continues today, with “traditional” enterprise software vendors like Oracle, SAP, and JDA Software announcing “cloud computing” strategies in recent months, and Google launching its own “infrastructure-as-a-service” offering just a couple of weeks ago.
Carr equates what we currently call “cloud computing” with the commoditization of IT. Back in 2003, the arguments over whether IT was indeed becoming a commodity overshadowed two other important points Carr made, which time has proven true:
- “As corporations continue to cede control over their IT applications and networks to third parties, the threats they face will proliferate. They need to prepare themselves for technical glitches, outages, and security breaches, shifting their attention [and IT resources] from opportunities to vulnerabilities.”
- “What’s important…is to be able to separate essential investments from ones that are discretionary, unnecessary, or even counterproductive. At a high level, stronger cost management requires more rigor in evaluating expected returns from systems investments, more creativity in exploring simpler and cheaper alternatives, and a greater openness to outsourcing and other partnerships. But most companies can also reap significant savings by simply cutting out waste.”
On point #1: Every week, there seems to be a new headline about an IT service failure or security breach. Last month, for example, Amazon.com’s EC2 service went down twice, affecting clients such as Instagram, Pinterest, Netflix, and WhatsYourPrice.com (which dumped the service last week). Also in June, hackers broke into LinkedIn and stole more than six million of its members’ hashed passwords. And the list goes on. I’ve written about this topic over the past few years (see “When a Software-as-a-Service Solution Goes Down,” “The Next 9/11: The Risk of a Supply Chain Cyberwar,” and “What If a Cyber Attack Takes Down Your Software-as-a-Service TMS?”), and I agree with Carr: companies need to pay more attention to these vulnerabilities and invest more time and resources in preventing disruptions.
On point #2: When the recession hit in late 2008, many companies pulled back on IT spending, and ever since they have taken a “stronger cost management” approach to IT investments, which has fueled the move toward cloud solutions. For example, as I wrote back in November 2010, instead of upgrading to a new version, a growing number of companies are opting to replace their existing in-house transportation management systems (TMS) with software-as-a-service (SaaS) solutions that provide faster time-to-value, connectivity benefits, and lower upfront costs. One executive I spoke with at the time told me that her incumbent TMS vendor was asking for almost $900,000 in license fees for an upgrade, which was difficult to justify when competitive SaaS offerings cost a fraction of that. Her plan was to implement a SaaS TMS at a single facility to see how it worked. I never followed up with her, but I’m willing to bet that if she took that step toward SaaS, she hasn’t looked back.
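To make that “fraction of the cost” reasoning concrete, here is a minimal back-of-the-envelope sketch in Python. The roughly $900,000 license fee is the only figure taken from the anecdote above; the 20% maintenance rate, the subscription price, and the five-year horizon are purely illustrative assumptions (a real total-cost comparison would also include implementation, hardware, and staffing costs):

```python
# Hypothetical five-year cost comparison between an on-premise TMS upgrade
# and a SaaS alternative. Only the ~$900,000 license fee comes from the
# anecdote above; every other figure is an illustrative assumption.

YEARS = 5

# On-premise upgrade: large upfront license fee plus an assumed 20% annual
# maintenance charge calculated on the license price.
LICENSE_FEE = 900_000
MAINTENANCE_RATE = 0.20

# SaaS: no license fee, just an assumed annual subscription.
SAAS_ANNUAL_SUBSCRIPTION = 120_000

on_premise_total = LICENSE_FEE + LICENSE_FEE * MAINTENANCE_RATE * YEARS
saas_total = SAAS_ANNUAL_SUBSCRIPTION * YEARS

print(f"On-premise upgrade, {YEARS}-year cost: ${on_premise_total:,.0f}")
print(f"SaaS subscription,  {YEARS}-year cost: ${saas_total:,.0f}")
# => On-premise upgrade, 5-year cost: $1,800,000
# => SaaS subscription,  5-year cost: $600,000
```

Under these assumed numbers the SaaS option runs about one-third the five-year cost, and the gap in upfront cash outlay ($900,000 versus $0) is even starker, which is exactly the kind of math that executive was facing.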
Carr has a lot more to say in the article, and I don’t agree with him on everything, but his call for companies to focus more on IT risks and potential disruptions, and to explore simpler and cheaper alternatives (and eliminate waste), rings truer and louder today than it did nine years ago. The problem in 2003 was that everybody was busy arguing over whether IT mattered, which became a philosophical question for some, instead of discussing what would matter with IT moving forward. That conversation has only just begun.