Just 10,000 quantum bits might crack internet encryption schemes
Weekend reading with TechAptitude!
Quantum networks hold the potential to enable fundamentally new capabilities, including ultra-secure communication with unbreakable encryption and distributed quantum computing, and they hold out the promise of a "Quantum Internet". Check it out!
https://techaptitude.substack.com/p/quantum-networking-ready-for-prime?r=vn8b8 #Quantum #QuantumNetworking #QuantumComputing #Networking #FiberOptics #Optical #Qubits #Photons #QuantumInternet #TechAptitude
Quantum Computer Visualization without Wavefunctions
The math behind quantum computing looks pretty similar to classical probabilistic computing. The only real difference is that the vector you evolve, ψ, is made up of complex-valued probability amplitudes rather than just being a vector of real-valued probabilities. I have been trying to bring the two closer together and make quantum computing look more similar to probabilistic computing.

The update rule for probabilistic computing is the following:

- pₜ₊₁ = Γpₜ

where pₜ is your probability vector, Γ is a stochastic matrix describing the probabilistic logic gate, and pₜ₊₁ is the updated probability vector after the probabilistic logical operation is applied.

Now, if you take ψ and break it up into two real-valued vectors such that xₜ = ℜ(ψ) and yₜ = ℑ(ψ), and then transform xₜ and yₜ from Cartesian coordinates to polar coordinates, you get two new vectors: pₜ and φₜ. Notice how we have pₜ. This very much interested me, so I tried to write down an update rule for this pₜ directly, without going back to the original ψ formalism, and, interestingly, the rule ends up looking like this:

- pₜ₊₁ = Γpₜ + f(φₜ)

It is identical to the probabilistic computing update rule plus an additional non-linear term that depends upon φₜ. I didn't realize this until later, but apparently pilot wave theory does a similar thing: by putting ψ into polar form, it gets a classically evolving probability distribution plus an additional non-linear term called the quantum potential.

This is more similar to probabilistic computing, but now I need some way to make sense of φₜ. This problem is easier than making sense of ψ, since φₜ doesn't carry information that looks like possible paths the way pₜ does, so, regardless of the interpretation, we have already bypassed the problem of devolving into multiverse woo. I found that a good way to visualize φₜ as a property of the system is to treat it as a "connective" property. Imagine plotting each bit in a quantum computer as a node on a hypergraph.
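Stepping back to the update rule for a moment: the split into a stochastic part and a phase-dependent part can be checked numerically. Below is a minimal sketch (my own illustration, not the author's simulator code) for a single qubit and a Hadamard gate, taking Γ to be the matrix of squared gate magnitudes, which is one concrete way the decomposition works out for a unitary gate:

```python
import numpy as np

# Sketch (my own, not the author's code): for a unitary gate U acting on
# psi = sqrt(p) * exp(i * phi), the updated probabilities split into
#   p_{t+1} = Gamma p_t + f(phi_t),  with Gamma_ij = |U_ij|^2.

p = np.array([0.25, 0.75])            # intrinsic probabilities
phi = np.array([0.0, np.pi / 3])      # connective phases
psi = np.sqrt(p) * np.exp(1j * phi)   # amplitudes in polar form

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

p_next = np.abs(H @ psi) ** 2         # exact quantum update
Gamma = np.abs(H) ** 2                # doubly stochastic matrix of |U_ij|^2
interference = p_next - Gamma @ p     # the non-linear, phase-dependent term

print(Gamma @ p)       # what a purely probabilistic gate would predict
print(interference)    # correction driven by the phases; entries sum to zero
```

Because Γ here is doubly stochastic, the interference term always sums to zero: the phases redistribute probability without creating or destroying any.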
Inside each node sits the bit's "intrinsic" property: a bit value of either 0 or 1. Then, you draw all the edges on the hypergraph and place numbers on each edge, each ranging from -2π to 2π. These numbers are weighted connections and represent the connective properties of the system. Below is an illustration using 3 qubits.

[https://lemmygrad.ml/pictrs/image/1cd0a3a0-0646-4fcb-8eaa-ef0b3e9b5f2c.png]

Note that the entire ontic state of the system (the intrinsic and connective properties) is real-valued. This is the key distinction between probabilistic and quantum computing: quantum computing also has a deterministically evolving set of connective properties between the bits, which I refer to as the phase network, that you have to keep track of, and the state of this network can alter the behavior of logic gates, meaning that the same logic gate could have different stochastic behavior on the bits when the network is in a different state.

What is particularly interesting is that if you represent it in this form, there is no longer a "collapse of the wavefunction." Measurement can just be represented as a Bayesian update on pₜ, since pₜ is epistemic. Since φₜ is ontic, you do not need to touch it when you make a measurement. We thus reduce the "collapse of the wavefunction" to a classical knowledge update, just like in probabilistic computing.

Of course, we all know that systems behave differently if you measure them vs if you don't. But in this framework, this is explained slightly differently. It is not that, if you measure something, you immediately "collapse" anything. Rather, you can demonstrate that if particle A interacts with particle B such that the interaction is entirely passive on A but has the effect of recording A's value onto B, then it will influence the future nomology of A. By nomology I mean that the intrinsic properties of A do not immediately change as a result of this, but its behavior in a future interaction can change.
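The measurement-as-Bayesian-update rule above is just classical conditioning on the epistemic distribution. Here is a small sketch (my own illustration, with a hypothetical `measure_update` helper, not the site's code): the bitstrings inconsistent with the outcome are zeroed out and the rest renormalized, and nothing phase-related is touched.

```python
import numpy as np

# Sketch (my own illustration): measurement as a Bayesian update on the
# epistemic distribution p over bitstrings. The ontic phase network is
# simply never referenced here -- it is left alone by measurement.

def measure_update(p, qubit, outcome, n_qubits):
    """Condition p (length 2**n_qubits, index = bitstring) on `qubit`
    having the given outcome; return the renormalized posterior."""
    mask = np.array([((i >> qubit) & 1) == outcome
                     for i in range(2 ** n_qubits)])
    posterior = np.where(mask, p, 0.0)
    return posterior / posterior.sum()

# Example: 2 qubits with a correlated distribution -- only |00> and |11>
# are possible. Observing qubit 0 as 1 moves all the weight to |11>.
p = np.array([0.5, 0.0, 0.0, 0.5])
p_after = measure_update(p, qubit=0, outcome=1, n_qubits=2)
print(p_after)  # all weight on index 3, i.e. |11>
```

Learning the value of one qubit also sharpens our knowledge of the other, exactly as in ordinary probabilistic inference, with no physical "collapse" invoked.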
I refer to this influence on future behavior as the "information sharing principle" (ISP): even if an interaction is passive, if it shares information about a system's state, then the stochastic laws can lead to a different future evolution of the system. Hence, in something like the Mach-Zehnder interferometer, measuring the which-way information can be said to passively reveal the location of the photon on a particular arm of the interferometer, but by sharing its information with the measuring device, the photon alters how it is later affected by the second beam splitter. You thus do not need to explain this "observer effect" with an immediately perturbing interaction, as if you physically "collapsed" something by looking at it. To fully illustrate the evolution of the ontic states, however, I would need some method of computing transitional probabilities. Transitional probabilities are a form of conditional probabilities where we condition on the immediately preceding state. For example, rather than asking the probabilities that the qubits are in a particular state at time t=4, we ask what the probabilities are that the qubits are in a particular state at time t=4 while conditioning on knowledge of the qubit values at time t=3. In traditional quantum mechanics, asking for transitional probabilities doesn't even make sense, because the theory denies that systems have real values when you are not looking, and so there are no values at t=3 to condition on. Of course, if you tried to measure those values at time t=3, you would alter the behavior of the computer due to the ISP, and it wouldn't be the same algorithm anymore.
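The Mach-Zehnder point can be made concrete with a few lines of linear algebra. This is my own illustration (using a Hadamard-style beam splitter convention, which is one common choice): with the phase relation between the arms intact, the second beam splitter sends the photon to one detector with certainty; once which-way information has been shared with another system, the later statistics become 50/50, with no perturbing "kick" required.

```python
import numpy as np

# Sketch (my own illustration): a Mach-Zehnder interferometer as two
# beam splitters, comparing coherent passage against the case where
# which-way information was recorded after the first splitter.

BS = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # beam splitter (Hadamard form)
psi_in = np.array([1.0, 0.0])                  # photon enters on arm 0

# No which-way measurement: the two arms stay phase-coherent.
p_coherent = np.abs(BS @ (BS @ psi_in)) ** 2

# Which-way information recorded after the first splitter: each arm is
# then fed through the second splitter separately and the results mixed.
p_arms = np.abs(BS @ psi_in) ** 2              # 50/50 over the two arms
p_decohered = sum(p_arms[k] * np.abs(BS @ np.eye(2)[k]) ** 2
                  for k in range(2))

print(p_coherent)   # [1, 0]: detector 0 fires with certainty
print(p_decohered)  # [0.5, 0.5]: the interference is gone
```

Nothing in the "measured" branch pushes on the photon; the only difference is that its arm information now exists elsewhere, and the subsequent statistics change accordingly.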
If quantum computing really is a kind of probabilistic computing, then we should be able to imagine that if we were Laplace's demon and could observe the qubits at time t=3, we could condition on them without altering the future evolution of the system via the ISP, and then use that to update our probabilities for the state the system will be in at time t=4. It turns out we can, indeed, do this, and it was first figured out by John Bell, who showed that you can fit QFT to a stochastic process. The neat thing about transitional probabilities is that once we have a way to compute them, we can evolve the system stochastically: at each set of logic gates applied to the qubits, we compute the transitional probabilities, then choose a new configuration of the qubits weighted by those transitional probabilities. The system will then evolve such that it has a definite bit value at every step of the algorithm, but if you ran the algorithm over and over again, each intermediate step would reproduce the intermediate Born rule statistics. I have put this all together into a website. It includes a simulator for a 3-qubit quantum computer and a drag-and-drop interface to place quantum logic gates onto the tracks, and you can press Play to run them. On the left-hand side is the "epistemic state," which is pₜ, the statistical distribution over the bits at that moment in the algorithm. On the right-hand side is the "ontic state," which is both φₜ (the connective properties, represented by edges on the hypergraph) and the bit values of the qubits (the intrinsic properties, represented by a 0 or 1 within each node).
[https://lemmygrad.ml/pictrs/image/23d4b9c2-944a-4363-9290-1cc9a88ee944.png]

As you play around with the drag-and-drop interface and make different circuits, you will see that the qubits always possess an ontic state, that a measurement is just a Bayesian knowledge update on the epistemic state and does not affect the ontic state in the moment of measurement, and that you get correct transitional probabilities. The link is at:

- https://www.stochasticnetwork.com/ [https://www.stochasticnetwork.com/]

Please click the "EXPLANATION" button on the top-right if you want to see a technical explanation of all the equations and such I am using. I do not use ψ at all in the entire code of the simulator; it evolves the state directly using pₜ and φₜ. The proper equations for that are in the document you can find under "EXPLANATION" if you want to see them. Note that if you have any questions or concerns, they are probably already addressed in that document.
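As a side note, the trajectory-sampling idea described above can be sketched generically. This is my own illustration, not the site's code: the actual transitional probabilities come from the equations in the EXPLANATION document, which I am not reproducing here, so the transition matrices below are placeholder classical ones. The point is only the sampling loop: every run carries a definite configuration at every step, and repeated runs reproduce the per-step marginals.

```python
import numpy as np

# Sketch (my own illustration; the real transition matrices would come
# from the simulator's equations): evolve the system as a stochastic
# trajectory, drawing a new definite configuration at each gate layer
# weighted by the transitional probabilities given the previous one.

rng = np.random.default_rng(42)

def sample_trajectory(p0, transitions):
    """p0: initial distribution over configurations.
    transitions: list of stochastic matrices T, T[i, j] = P(next=i | prev=j).
    Returns one trajectory of definite configurations, one per time step."""
    state = rng.choice(len(p0), p=p0)
    path = [state]
    for T in transitions:
        state = rng.choice(T.shape[0], p=T[:, state])
        path.append(state)
    return path

# Toy example: 1 bit starting in 0, a "coin flip" gate, then a NOT gate.
coin = np.array([[0.5, 0.5], [0.5, 0.5]])
NOT = np.array([[0.0, 1.0], [1.0, 0.0]])
runs = [sample_trajectory([1.0, 0.0], [coin, NOT]) for _ in range(10000)]

# Each run has a definite bit value at every step; across many runs the
# intermediate step reproduces the chained marginals (50/50 after the coin).
freq = np.mean([r[1] for r in runs])
print(freq)  # close to 0.5
```

Swapping in phase-dependent transitional probabilities in place of these fixed matrices is, as I read it, exactly what the simulator does.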
Reading material for a break in the sun: Alexander Glätzle, co-founder of planqc, explains in an interview how the Garching #Startup plans to build a quantum chip with around 1,000 #Qubits made of neutral atoms, and why business (#Wirtschaft) and research (#Forschung) are gradually getting in sync on #quantencomputing. Table Media has also collected some interesting market figures on quantum computing. Worth taking a look at in a quiet moment: https://table.media/ceo/ceotalk/wann-lassen-sich-quantencomputer-kommerziell-nutzen-und-wofuer-alexander-glaetzle

A Munich start-up aims to develop commercially usable quantum computers by 2030. The decisive factor for industrial use is error correction, which turns laboratory devices into powerful machines.
Illusory Cloning: A Revolution in Quantum Backup

Quantum Cloning's Cryptographic Bypass

Quantum Backup Without Breaking the Theorem

https://journals.aps.org/prl/abstract/10.1103/y4y1-1ll6 Well… that's not something I expected to read today.
A new Physical Review Letters paper claims a way to effectively clone qubits.
The trick: the copies are quantum-encrypted. You can make many of them, but the decryption key only works once â so at any moment only one usable quantum state exists, keeping the no-cloning theorem intact.
Interesting idea for quantum backups and distributed quantum data.
Probably worth investigating more deeply…
adding it to the already far too long TODO list. #QuantumPhysics #QuantumInformation #Qubits #QuantumComputing #Physics #AcademicMastodon