He/Him/His
I'm a Software Engineer and a teacher.
There's no feeling quite like the one you get when you watch someone's eyes light up learning something they didn't know.
Hey! I'm Dan!
I have been coding professionally for over 10 years and have had an interest in cybersecurity for just as long!
I love learning new stuff and helping others
Location
Brighton / London, UK
Education
Edinburgh Napier (Postgrad Cert Advanced Security & Digital Forensics)
Suppose there is a very, very big box of candies that all taste different. Someone asks you to choose your favorite flavor. If you work like a regular computer, then you need to try all the flavors one by one before you know.
But if you work like a quantum computer, then it's really weird but really convenient. You try to put all the candies in your mouth at once, but when you do, you realize there is only one candy you have been putting in your mouth, and it is the best one. All of that in one go.
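The candy analogy loosely describes quantum search (Grover's algorithm), where the amplitudes of the wrong answers interfere away. Here's a hypothetical toy illustration in plain Python that just tracks the state-vector amplitudes classically; it is not real quantum code:

```python
import math

# Toy simulation of Grover's search over a small box of "candies".
# We track the quantum state's amplitudes directly in a list.

def grover_search(n_items, marked):
    # Start in an equal superposition: every candy is equally likely.
    amps = [1 / math.sqrt(n_items)] * n_items

    # The standard iteration count is about (pi/4) * sqrt(n_items).
    iterations = max(1, int(math.pi / 4 * math.sqrt(n_items)))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amps[marked] = -amps[marked]
        # Diffusion: reflect every amplitude about the mean.
        mean = sum(amps) / n_items
        amps = [2 * mean - a for a in amps]

    # Measurement probabilities are the squared amplitudes.
    return [a * a for a in amps]

probs = grover_search(4, marked=2)
print(probs)  # the marked item ends up with probability ~1.0
```

A classical search needs to check the items one by one (on average n/2 of them); here the marked item dominates after only ~sqrt(n) iterations, which is the actual quantum speedup hiding behind the candy story.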
Coding since 11yo, that makes it over 30 years now ~~~
Have a PhD in Comp Sci ~~~
Love to go on bike tours ~~~
I try to stay as generalist as I can in this crazy wide place coding is at now.
Many people can and have written papers, blogs, and even books on the topic of Quantum Computing. The level of background knowledge required to understand Quantum Computing is high, but not unachievable. The topic is so vast that no single comment could truly explain it.
Now I'm no authority on Quantum Computing, but the
TL;DR is that Quantum Computing is a new paradigm of computing, based on the mathematics of nature, and it can be very probabilistic in nature. In essence this allows algorithms that Classical Computing finds hard or very resource-intensive. That said, Quantum Computing will not fully replace Classical Computing, but will run alongside it.
If you want to learn more about Quantum Computing:
Learn how to research if you don't already know, and then do a lot of research. But be careful: don't believe everything you read, and verify the information you're reading.
If you live near a university, check whether they run evening lectures open to the public, as many do; they may well have a few on Quantum Computing.
If you're up to it, engage with the online community learning about Quantum Computing.
Practice and play with Quantum Computing; it's often free. IBM, for example, provides free access to their cloud quantum computers.
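Before signing up anywhere, it can help to see what a tiny circuit actually computes. Here's a hand-rolled plain-Python sketch (not IBM's actual API) of the classic Bell-state circuit you might queue up on such a service:

```python
import math
import random

# Toy two-qubit simulator: the state is 4 amplitudes for
# |00>, |01>, |10>, |11> (index = 2*q0 + q1).

state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# Hadamard on qubit 0: mixes basis states that differ in q0.
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[2]), h * (state[1] + state[3]),
         h * (state[0] - state[2]), h * (state[1] - state[3])]

# CNOT (control q0, target q1): flips q1 when q0 is 1,
# i.e. swaps the |10> and |11> amplitudes.
state[2], state[3] = state[3], state[2]

# Sample measurements: only "00" and "11" ever show up -- the two
# qubits are entangled, so their outcomes always agree.
probs = [abs(a) ** 2 for a in state]
counts = {}
for _ in range(1000):
    r, acc = random.random(), 0.0
    for basis, p in zip(["00", "01", "10", "11"], probs):
        acc += p
        if r < acc:
            break
    counts[basis] = counts.get(basis, 0) + 1
print(counts)  # roughly {'00': ~500, '11': ~500}
```

Running the same two-gate circuit on real hardware gives you noisy versions of these counts, which is a nice way to see both entanglement and today's hardware limitations at once.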
This video from Microsoft research goes over the actual fundamental math that explains qubits and quantum computing: youtube.com/watch?v=F_Riqjdh2oM&li...
If you have an understanding of linear algebra, it isn't too hard to follow along.
Probably the shortest and easiest definition is the following:
in classical computing we use electricity to model bits (0s and 1s) and then perform computations
in quantum computing we use quantum elements (there are different implementations possible: photons, superconducting stuff, etc.) and their quantum mechanical phenomena (entanglement, superposition, etc.) to model something called qubits (complex linear combinations of 0s and 1s) and then perform computations
Basically, it is not better or faster or whatever; it's simply a different computational model. This is not black magic. There have been lots of buzzwords recently, but really this is just another model of computation, which is pretty damn interesting to learn and quite mind-bending :)
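"Complex linear combinations of 0s and 1s" can be made concrete in a few lines. A minimal sketch, assuming the standard Hadamard gate; this is an illustration, not any particular library's API:

```python
import math

# A qubit is a pair of complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1. Measuring yields 0 with probability |a|^2.

def measure_probs(a, b):
    return abs(a) ** 2, abs(b) ** 2

# Start in the |0> state.
a, b = 1 + 0j, 0 + 0j

# Apply a Hadamard gate: an equal superposition of |0> and |1>.
h = 1 / math.sqrt(2)
a, b = h * (a + b), h * (a - b)

p0, p1 = measure_probs(a, b)
print(p0, p1)  # each close to 0.5: "both 0 and 1" until measured
```

The whole model is just linear algebra on these amplitude vectors; the quantum-mechanical phenomena (superposition, interference, entanglement) are what make that linear algebra physically realizable.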
A coworker shared this with me recently:
smbc-comics.com/comic/the-talk-3
Learning Quantum computing is totally doable, even without most of the maths that people think is necessary. I've been recording myself while learning here: youtube.com/channel/UC-2knDbf4kzT3...
Hope it's helpful ;)
Thanks!
LOL thanks for this! I didn't know what it is, and kinda getting it.
I've read your comment, but feel like I didn't :|
I'm currently reading Quantum Computing Since Democritus by Scott Aaronson. I really like his writing style and his ability to not dumb down quantum computing, but also make it accessible.
As best I understand it, quantum computing relies on the weird ability of electrons to be "neither here nor there" to create a tri-state bit: 0, 1, or 2. By working in ternary instead of binary, data can be processed at much higher speeds (and stored in less memory space).
The hardware difficulty with quantum computing is that you have to keep things at darn near 0 Kelvin, so the electron's "neither here nor there" state can actually stay locked in place.
At least, that's the "explain like I'm 5" explanation.
Although this "ternary" explanation is simple and seems to provide some intuition about why a quantum computer is faster, it's incorrect. Quantum bits are fundamentally different from binary and ternary bits.
Ternary bits are actually possible in classical computers. Ternary bits are (almost) never used because reliable binary bit hardware was developed first.
Some people have built ternary computers, which are super cool though!
Here's a simple explanation of quantum bits that is totally true, but unfortunately doesn't provide much intuition, or help you understand why quantum is faster than classical: A quantum bit is a pair of real numbers (let's call them a and b) such that a² + b² = 1. That's it. The simple and somewhat unsatisfying fact is that normal intuition doesn't really apply to quantum computing.
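That pair-of-numbers definition can be poked at directly. A toy sketch of the real-amplitude picture described above, where every valid qubit is a point on the unit circle and a gate is anything that keeps it there, e.g. a rotation:

```python
import math

# A (real-amplitude) qubit: numbers a, b with a^2 + b^2 = 1,
# i.e. a point on the unit circle. theta parametrizes all of them.
theta = 0.3
a, b = math.cos(theta), math.sin(theta)
assert abs(a**2 + b**2 - 1) < 1e-12

# Measurement: outcome 0 with probability a^2, outcome 1 with b^2.
p0, p1 = a**2, b**2
print(round(p0, 3), round(p1, 3))  # 0.913 0.087

# A single-qubit gate must preserve a^2 + b^2 = 1 -- for example,
# a rotation of the circle by an angle phi.
phi = 0.5
a, b = (math.cos(phi) * a - math.sin(phi) * b,
        math.sin(phi) * a + math.cos(phi) * b)
assert abs(a**2 + b**2 - 1) < 1e-12
```

(In full generality a and b are complex numbers with |a|² + |b|² = 1, but the real-number version already shows the key constraint.)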
Here's a fantastic video that explains how a quantum bit leads to quantum speedup, entanglement, and quantum teleportation using the math.
I know less about quantum computer hardware, but your hardware difficulty explanation seems on point.
I certainly had picked up some incorrect information, then. Thanks for the insight!
No worries, and thanks for your original comment. I think it's great to talk about common misunderstandings, that way everyone comes out with a better understanding.
Also, this video is just a bit math-heavy, but I managed to get through it (even though I didn't get all the details):
lol I just posted the same thing. Great video.
Great minds think alike 😉
If you learn best by doing, IBM has an interface to an actual real-life quantum computer: quantum-computing.ibm.com/support
Once you've made an account, you can design your circuit and queue it up to simulate, or run it on a real one.
Yep, we're in the future 🤖
Cool thread!
You can also check out my post taking a look at the available QC Python packages: medium.com/coinmonks/qubits-a-sham...