22 March 2012
More than a decade ago, I wrote an essay for one of those EE Times year-in-review editions that would announce its presence with a resounding "thump!" on your desk.
The essay's premise was that Moore's Law had advanced so far, so quickly that pretty soon a single engineer could become the next Intel because the entire design and supply chain would be at his or her fingertips.
Fast-forward a bit: 4-5 years ago, I wrote frequently on my personal blog about the pending rise of the "gig economy," where digital tools and high-speed, ubiquitous connectivity would create work ecosystems across industries in which skilled workers would float in and out of organizations as needs changed. Work would be plentiful (and hours flexible) because there would no longer be geographic boundaries for either companies or workers.
Where's digital utopia?
And while I think some of this is happening (and is one reason behind slow job-creation growth in North America), this digitally enabled utopia is not here yet.
Sure, we're seeing glimpses of it. I was really impressed by two guys using a global design and supply chain to start building an embedded systems product to monitor head trauma. Or two guys building a copper-nanowire business in a Duke University lab. You couldn't have done this a decade ago.
Yet the ever-quickening pace of innovation one would expect doesn't seem to have materialized, despite the dizzying array of digital tools, networks and smarts that have emerged in the past two decades.
Why, for instance, are we still working with that huge email albatross around our necks, a technology that hasn't moved forward in a decade? And why, asks venture capitalist Paul Graham, hasn't search improved markedly?
Forest, trees, blurred vision
It could be that we're at a tipping point, to be sure (the darkest-before-the-dawn kind of thing). In the semiconductor industry, no one is funding anything, so entrepreneurs are figuring out how to do it on their own dime (like Adam Nepp and Scott Wohler and, as you'll read in the coming weeks, Terry West and Greg Lahti).
It could be that there's just a biological regulator on how quickly human beings can understand, internalize and exploit advances in technology. Take social media, for instance: half my friends don't participate and couldn't care less; of the other half, I would say less than 5% are truly engaged with the technology and its networking possibilities. An older demographic, to be sure, but if you look carefully at the so-called digital natives, while they are more likely to glom onto digital technology, they tend to stick with one or two tools and move on to the "next big thing" quite slowly.
So what do you think?
+Are we at a tipping point, and is dawn breaking shortly?
+Or are we slogging, as humans always do, through a slow but inexorable shift in how we deploy hot new technologies?