I think this is a pretty inaccurate simplification of the theory. They correctly point out that a qubit in superposition can be represented as a|0> + b|1>, but a and b aren't bits, they are complex [i]probability amplitudes[/i] (the probabilities of measuring 0 or 1 are |a|^2 and |b|^2). So the claim that N qubits are somehow "equivalent" to 2^N bits is unsubstantiated: the state of N qubits is described by 2^N amplitudes, but measuring it only ever hands you N classical bits.
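To make that concrete, here's a minimal numpy sketch (the particular 3-qubit state is just an arbitrary example I picked): the 2^N amplitudes describe the state, but a measurement samples a single basis state with probability |amplitude|^2, so you only ever get N bits out.

[code]
import numpy as np

# One qubit, a|0> + b|1>: a and b are complex amplitudes, not bits.
# The measurement probabilities |a|^2 and |b|^2 must sum to 1.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)   # amplitudes can even be complex
assert np.isclose(abs(a)**2 + abs(b)**2, 1.0)

# N qubits: the state is a vector of 2^N amplitudes...
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = state[-1] = 1 / np.sqrt(2)    # e.g. (|000> + |111>)/sqrt(2)

# ...but measuring collapses it to ONE basis state, chosen with
# probability |amplitude|^2. You read out N bits, never 2^N.
probs = np.abs(state)**2
outcome = np.random.choice(2**n, p=probs)
print(format(outcome, f"0{n}b"))         # prints "000" or "111": 3 bits total
[/code]

That's the whole point: the 2^N amplitudes evolve during the computation, but you can never read them all out, which is why the "2^N bits" framing is misleading.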
He's closer to the mark at the end, when he mentions that qubits are only advantageous for certain types of computation. But even then, quantum computers aren't parallel in any way analogous to a regular computer, for the same amplitude/bit distinction I mentioned above: the amplitudes aren't 2^N independent results you can read out side by side.