This article was originally published on InfoWorld.com by Bill Snyder on June 13th, 2013.
What brings this to mind is a talk I heard this week by John Kelly, the head of IBM Research. Kelly and his colleagues are thinking about a new paradigm (sorry for that word, but it fits) of computing. They call it cognitive computing. Part of that paradigm is design. In the first chapter (you can read an excerpt here) of an unpublished book, Kelly and coauthor Steve Hamm explain it this way:
Data processing should be distributed throughout the computing system, rather than centralized in a CPU. The processing and the memory should be closely integrated -- so there will be less shuttling of data and instructions back and forth. And discrete processing tasks should be executed simultaneously, rather than linearly.

As interesting as a fundamentally new computer architecture may be, there's much more to cognitive computing. In his talk at the Computer History Museum in Mountain View, Calif., Kelly said that cognitive systems "can learn, understand natural language, interact with humans, and have a perception of their surroundings."
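To make the last point of that excerpt concrete, here is a minimal software analogy -- not anything from Kelly's book, and the four tasks are made up -- that runs the same work first one task after another and then simultaneously on worker threads. It only sketches the "simultaneously rather than linearly" idea; the merging of memory and processing that Kelly describes happens in hardware and can't be shown in a few lines of Python.

```python
# Minimal analogy: discrete tasks executed linearly vs. simultaneously.
# The task names and sleep times are purely illustrative.
from concurrent.futures import ThreadPoolExecutor
import time

def task(name):
    """Stand-in for a discrete processing task (e.g., scoring one hypothesis)."""
    time.sleep(0.1)  # simulate work
    return f"{name} done"

tasks = ["parse question", "search evidence", "score hypotheses", "rank answers"]

# Linear: each task waits for the previous one to finish.
start = time.time()
linear_results = [task(t) for t in tasks]
print("linear:", round(time.time() - start, 2), "seconds")

# Simultaneous: the same tasks run at once on worker threads.
start = time.time()
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_results = list(pool.map(task, tasks))
print("parallel:", round(time.time() - start, 2), "seconds")
```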
No, that's not just blue sky. Watson, the computer program that humbled the competition in a highly publicized "Jeopardy" match two years ago, represents the first generation of cognitive computing. That version ran on 100 powerful, but off-the-shelf, IBM Power 750 servers.
Watson is already out of the lab and into the field; it's also gotten smaller. IBM has partnerships with Memorial Sloan Kettering Cancer Center and the Cleveland Clinic in cancer treatment and with WellPoint in health care management. Big Blue just announced that Watson will be available as a cloud service called the IBM Watson Engagement Advisor.
By nature, researchers are optimists, and Kelly's prediction that we'll see "alternative materials like graphene, carbon nanotubes and nanowires, and things like phase-change memory and the quantum (probably hybrid) devices emerging in 15 years" may well be too optimistic. He is, of course, charged with boosting IBM's brand through such blue-sky notions. Even so, fundamental changes in computing don't happen very often, so on whatever timeline it actually arrives, this is stimulating stuff to think about.
The debut of Watson as a service
Although IBM would be happy to sell Power systems to anyone who cares to buy them, a cloud-based Watson makes much more sense. But because it is so complex, using Watson in the cloud won't be anything like simply plugging into Amazon Web Services, says Michael Karasick, director of IBM's Almaden Research Center.
"Companies will first bring data to Watson," he tells me -- not data about specific problems, but information that will teach Watson about the company's business. For now, Watson, which can actually read, has been trained in eight or so domains, including call centers and financial services.
Each domain is really a dialect of English (for now, Watson is fluent only in English) that Watson must absorb, and each company's implementation of those domains is a version of that dialect. Training Watson can take weeks or months, depending on the complexity. In an interview with Forbes, IBM said the likes of catalogs, training manuals, product disclosures, emails, customer forums, and call center logs, as well as publicly available feeds and reviews from places like Amazon.com, Yelp, Trip Advisor, and technical support communities, will all be grist for Watson's mill.
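As a rough sketch of what that first step might look like on the customer's side -- the folder layout and the collect_documents helper below are invented for illustration and have nothing to do with IBM's actual interfaces -- a company would gather text from sources like those Forbes lists and assemble it into a training corpus for its domain:

```python
# Hypothetical sketch of assembling a domain-training corpus from the kinds of
# sources named above. The paths and structure are invented for illustration.
from pathlib import Path

SOURCES = {
    "catalogs": Path("data/catalogs"),
    "training_manuals": Path("data/manuals"),
    "call_center_logs": Path("data/call_logs"),
    "customer_forums": Path("data/forums"),
}

def collect_documents(sources):
    """Gather raw text files from each source folder, tagged by origin."""
    corpus = []
    for origin, folder in sources.items():
        for path in folder.glob("*.txt"):
            corpus.append({"origin": origin, "text": path.read_text(encoding="utf-8")})
    return corpus

if __name__ == "__main__":
    corpus = collect_documents(SOURCES)
    print(f"Collected {len(corpus)} documents for domain training")
    # The weeks or months of domain training described above would then be
    # run against a corpus like this one, via whatever interface IBM provides.
```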
When Watson is up and running in the cloud, customers such as ANZ, Celcom, IHS, Nielsen, and Royal Bank of Canada will use it to create a customer-facing feature called Ask Watson. Consumers will be able to access it via Web chats, email, smartphone apps, and even SMS.
One of Watson's most striking features -- and one that will develop much more as cognitive systems roll out -- is its ability to understand unstructured data, which comprises the vast majority of data in the world. Even doctors' notes, which are famously hard to read, can be digested and are becoming a significant part of Watson's medical knowledge, Kelly says.
Quantum computing: Coming sooner than you think
In their book, Kelly and Hamm say, "People who demand the most from computers are already running into the limits of today's circuitry."
They quote Michel McCoy, director of the Simulation and Computing program at the Lawrence Livermore National Laboratory, noting that he is among those calling for a national initiative involving U.S. national laboratories and businesses to come up with radical new approaches to microprocessor and computer system design and software programming. "In a sense, everything we've done up until this point has been easy," McCoy says. "Now we have reached a physics-dominated threshold in the design of microprocessors and computing systems that, unless we do something about it, is essentially going to stagnate progress."
Kelly says part of the answer to that problem is what he calls data-centric computers. "The designers of computing systems have long treated logic and memory as separate elements. Now, they will meld the components together, first on circuit boards and later on single microchips. Also, they'll move the processing to the data, rather than vice versa."
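The principle is easier to show in software than in silicon. In the toy sketch below (the partitions and numbers are invented), the first approach hauls every record to one central place before computing, while the second runs a small summary where each partition of the data already lives and moves only the results back:

```python
# Toy illustration of "moving the processing to the data."
# The three partitions stand in for data resident on separate nodes.

partitions = [
    list(range(0, 1_000)),       # data "living" on node A
    list(range(1_000, 2_000)),   # node B
    list(range(2_000, 3_000)),   # node C
]

# Data-to-processor: copy every record to one place, then compute.
central_copy = [x for part in partitions for x in part]   # expensive shuttling
total_centralized = sum(central_copy)

# Processor-to-data: compute a small summary where each partition lives and
# move only the per-partition results (three integers instead of 3,000 values).
partial_sums = [sum(part) for part in partitions]
total_data_centric = sum(partial_sums)

assert total_centralized == total_data_centric
print(total_data_centric)
```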
Assuming that Kelly is correct, that shift obviously won't happen all at once. We'll see variations on silicon-based or silicon-like transistors over the next five years, he says. Indeed, Watson is really a hybrid of next-generation software and current-generation hardware. Kelly draws an analogy to the first programmable computers of decades ago: they ran on vacuum tubes, the standard hardware of their day, just as today's Watson systems are built on conventional silicon.
As I noted, Kelly is an optimist. "If you asked me five years ago how long it will be until we see quantum computers, I would have said 25 years," he said. "Now," he said as he held up two hands, "you could count the years on a bundle of fingers." The point is, it's really coming.