Quantum computers, machine learning and big data


I can see you thinking "yes, but" already. Yes, but quantum computing is still a ways off, you say. To which I would reply: yes, but not as far off as you probably think. Interestingly, big data is fueling its acceleration. How so? In many cases, such as the Large Hadron Collider at CERN, extreme big data has outgrown current computing capabilities, and a new computing model is urgently needed.

"Moore's law has basically crapped out; the transistors have gotten as small as people know how to make them economically with existing technologies," said Scott Aaronson, a theoretical computer scientist at the Massachusetts Institute of Technology, in an article on Wired.

Computing has coped by using multiple levels of memory and parallel computing via multiple cores. Networks are now distributed across data centers and the cloud. But even those advancements are rapidly reaching their limit.

"Aaronson points out that many problems in big data can't be adequately addressed by simply adding more parallel processing. These problems are 'more sequential, where each step depends on the outcome of the preceding step,' he said in Wired . "Sometimes, you can split up the work among a bunch of processors, but other times, that's harder to do. And often the software isn't written to take full advantage of the extra processors."

So the push is on to use quantum computing and machine learning to get these massive big data projects done.

The Wired article offers rich detail on computing for extreme data projects. And in truth, it does say that quantum computing will not be the ultimate answer for everything, at least not as we know it now. Still, big data is pushing advancements in computing that we might not have considered in its absence. And that alone is worth noting.

For now, big data users are rethinking how to use current computing methods.

"Google's Alon Halevy believes that the real breakthroughs in big data analysis are likely to come from integration--specifically, integrating across very different data sets. 'No matter how much you speed up the computers or the way you put computers together, the real issues are at the data level,'" he said in the Wired article.

In the end, computing will have to reach scales we have never before contemplated. We are only now catching a glimpse of how large data can grow, and it is only a glimpse. Few can truly imagine just how big it will get.

Increasingly, machine learning will have to do the work, because the human mind and imagination will be dwarfed by the data overload. In all likelihood, it will take integration, quantum computing and something else we haven't thought of yet just to keep up.

For more:
- see the Wired article
- view the CERN LHC website

Related Articles:
Are statisticians the modern explorers?
5 big data myth busters
Big data skills give job applicants winning edge in almost any field
Big data strategies: fixed mountains vs. shifting dunes