Classical computers use “bits” of information that can be either 0 or 1. But quantum-information technologies let scientists consider “qubits,” quantum bits of information that are both 0 and 1 at the same time. Logic circuits, made of qubits directly harnessing the weirdness of superpositions, allow a quantum computer to calculate vastly faster than anything existing today. A quantum machine using no more than 300 qubits would be a million, trillion, trillion, trillion times faster than the most modern supercomputer.
In the words of Bob The Angry Flower re: the misuse of the apostrophe, NO! WRONG! TOTALLY WRONG! WHERE'D YOU LEARN THIS? STOP DOING IT!
There is a horrid amount of misinformation flying around about what quantum computing is, what it would be useful for, and what kinds of advantages it would provide over conventional plain old boring regular physics computing. I took the trouble to learn what those differences were when I started work on my current book, and while I can't call myself an expert, I at least understand the issues involved, like the P versus NP problem and so on.
The biggest thing I learned was this: a quantum computer can only accelerate the solution of a small subclass of mathematical problems (mostly stuff like factoring integers), and even then the dramatic speedup applies only to those problems. No, it is not pixie dust that can be magically sprinkled over a CPU; you're better off immersing the CPU in liquid nitrogen and overclocking the hell out of it.
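To make the factoring point concrete, here's a minimal sketch of my own (not from the article): the best-known classical approaches to factoring, like this brute-force trial division, take time that grows exponentially in the bit length of the number, which is exactly why factoring is the poster child for a quantum speedup.

```python
def trial_division(n: int) -> list[int]:
    """Factor n by brute-force trial division.

    The loop runs up to sqrt(n) candidate divisors, which is
    exponential in the number of bits of n -- this is the kind of
    cost a quantum computer can beat for factoring specifically.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(2021))  # → [43, 47]
```

Fine for a toy number; hopeless for the 2048-bit integers used in RSA, which is the whole point.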
The fact that the author of this piece is a professor of physics makes it all the more depressing, but I've learned that expertise in one area rarely maps to others. Being a professor of electrical engineering does not give you a license to pontificate about history, and being a professor of physics does not imply you understand computing, either.
At least it's filed as an opinion piece. Thank heaven for small blessings.
[Addendum 10:44 PM: My computer-science friend chews my ear a bit to tell me that while QC is in fact limited to BQP-class problems such as integer factoring, a good many other mathematical problems can be stated in that form. Cf. Shor's algorithm. That said, I'm convinced that it's deeply irresponsible to talk about QC in such blithe terms this early on. I do, however, retract my comments about the author himself, since I have no idea what form his piece was originally submitted in or whether he even endorses it in this form. Live and learn.]
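One detail worth adding about Shor's algorithm, since it gets glossed over: the only part that runs on quantum hardware is order-finding. Everything else is ordinary classical number theory. The sketch below (my own illustration, with the quantum step replaced by brute force, and assuming the order comes out even) shows that classical skeleton recovering the factors of 15.

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r ≡ 1 (mod n), found by brute force.
    This order-finding step is the only piece Shor's algorithm
    performs on a quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_skeleton(n: int, a: int) -> tuple[int, int]:
    """Recover nontrivial factors of n from the order of a mod n.
    Assumes r is even and a**(r//2) is not ≡ -1 (mod n); for a bad
    choice of a, the real algorithm just retries with another a."""
    r = order(a, n)
    assert r % 2 == 0, "odd order: pick a different a"
    x = pow(a, r // 2, n)
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical_skeleton(15, 7))  # → (3, 5)
```

The order of 7 mod 15 is 4, so 7² = 49 ≡ 4 (mod 15), and gcd(3, 15) and gcd(5, 15) hand back the factors. The quantum magic is entirely in computing r efficiently when n is enormous.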