Part 1
Networks and Parallel Computing
Chapter 1
On the Importance of Parallelism for the Security of Quantum Protocols
Marius Nagy and Naya Nagy
1.1 Introduction
1.2 Manipulating Entangled States
1.2.1 Generating entanglement
1.2.2 Distinguishing entangled states
1.2.3 Generalization
1.3 Quantum Bit Commitment
1.3.1 BB84 quantum bit commitment
1.3.2 Quantum bit commitment – within an equivalence class
1.3.2.1 Commit phase
1.3.2.2 Decommit phase
1.3.2.3 Unbinding entanglement
1.4 Oblivious Transfer
1.4.1 Protocol description
1.4.2 Sequential approach: single-qubit measurements
1.4.3 Attack strategy for Alice
1.4.4 Attack strategy for Bob
1.5 Conclusion
References
1.1 Introduction
The concept of parallelism is deeply related to time. The original motivation for the field of parallel computing was to increase the efficiency of computation by “saving” processor time. Specialized metrics such as speedup were developed to quantify the improvement in running time brought by a parallel solution with respect to the best sequential one. Nevertheless, the benefits of a parallel computational solution of a problem extend far beyond just efficiently using precious resources, such as processor time. In numerical computations, more processing units may translate to a more accurate solution [4], while in applications where the computational process is subject to certain constraints, parallelism may simply make the difference between success and failure in performing the task at hand [1, 2].
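As a standard illustration of the speedup metric mentioned above (the notation here is conventional, not taken from this chapter), the improvement of a parallel solution over the best sequential one is commonly quantified as

```latex
S(p) = \frac{T_{\mathrm{seq}}}{T_{p}},
```

where $T_{\mathrm{seq}}$ is the running time of the best sequential algorithm, $T_{p}$ is the running time of the parallel solution on $p$ processors, and the ideal (linear) case is $S(p) = p$.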
In one form or another, parallelism is present in virtually every attempt to increase the efficiency of our computations. Every computing device nowadays possesses multiple processing units (cores) working in parallel to get things done faster. This small-scale parallelism is extended to hundreds and thousands of processors connected together in intricate ways to achieve the impressive performance of supercomputers. Pushing the idea even further, entirely new computing technologies have been proposed that draw their power from the concept of massive parallelism. These nature-inspired models of computation represent yet another step forward in the miniaturization trend we have been witnessing since the inception of computing technologies.
In DNA computing and quantum computing, a bit is realized at the molecular, atomic, or sub-atomic level. DNA computing harnesses the power of the chemical bonds holding together complex DNA strands and exploits their binding properties to perform useful computations [7, 27]. Consequently, the “computer in a test tube” is actually a massively parallel computer employing millions of simple “processing units” in the form of links on a DNA strand.
Quantum computing, on the other hand, exploits the counter-intuitive principles of quantum mechanics that govern the behavior of sub-atomic particles in order to speed up computation. One principle in particular is responsible for the power exhibited by the quantum model of computation: superposition. This principle expresses the ability of a quantum system to be described by a linear combination of basis states, or in other words, a superposition of basis states.
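The superposition principle can be made concrete with a minimal numerical sketch (not part of the chapter): a single qubit represented as a two-component complex state vector, written as a linear combination of the basis states. The equal-amplitude state chosen here is purely illustrative.

```python
import numpy as np

# Basis states |0> and |1> as 2-component complex vectors
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A superposition: (1/sqrt(2)) (|0> + |1>), an equal-weight
# linear combination of the basis states
psi = (ket0 + ket1) / np.sqrt(2)

# The squared magnitudes of the amplitudes give the probabilities
# of observing each basis state upon measurement
probs = np.abs(psi) ** 2
print(probs)        # [0.5 0.5]
print(probs.sum())  # normalization: probabilities sum to 1
```

Measuring such a state collapses it to one of the basis states, which is why a quantum computation must be arranged so that useful information survives the measurement step.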
When applied to a computing problem, superposition endows a quantum computer with the capability to pursue multiple computational paths at the same time, in superposition, providing an advantage over a classical computer. Not every problem admits a solution that benefits greatly from the quantum parallelism intrinsic to superposition of states, but the most notorious example is Shor’s algorithm for factoring.