Quantum computing is, arguably, the biggest step in computing technology since the development of the microprocessor in the early 1970s. Once quantum computing becomes routine enough that you can have a desktop PQC (personal quantum computer), we'll truly have entered the fifth generation of computing.* For now, though, we'll have to be content with IBM's newly released IBM Q System One, the world's first commercially available quantum computer.
The article linked above might raise a few questions, though. In particular, it notes that the System One has only 20 qubits, and it implies that those qubits stay useful for no "more than 100 microseconds". What does all of that mean? How do 20 qubits compare to 20 bits? Is 100 microseconds bad? IBM hasn't put a price tag on the System One and seems intent on selling access to it via the cloud... will the cost be worth it?
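To get a rough feel for the 20-qubits-versus-20-bits question: a classical 20-bit register holds exactly one of 2^20 values at a time, while a general 20-qubit state is described by 2^20 complex amplitudes. Here is a back-of-the-envelope sketch in plain Python (simple arithmetic, not a quantum simulator; the 16-bytes-per-amplitude figure assumes double-precision complex numbers):

```python
# Bookkeeping for an n-qubit state vector versus an n-bit register.
# Assumes each amplitude is stored as a double-precision complex number
# (2 x 8 bytes); this is an illustration, not a simulation.

n = 20
dim = 2 ** n                      # dimension of the n-qubit state space
bytes_per_amplitude = 16          # one complex number at double precision

print(dim)                        # 1048576 basis states / amplitudes
print(dim * bytes_per_amplitude)  # 16777216 bytes (~16 MiB) to write the state down
```

A 20-bit register, by contrast, needs all of 20 bits to write down. That exponential gap in descriptive cost is one crude way to see why classical machines struggle to simulate even modest quantum systems.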
To most people, quantum computing seems like a cutting-edge field, but its theoretical foundations were laid in the early 1980s. Only now are materials science and engineering beginning to catch up to the theory, and quantum computing promises to be a major industry in the coming decades. If you want to get ahead of the curve, there's no better time than the present to start learning.
In this series of articles, I'd like to outline the basics of quantum computing -- superposition, qubits, quantum logic gates -- and the potential applications of quantum machines. I am by no means an expert in the field, so if you spot any inconsistencies or errors, please let me know in the comments.
* Remember: AI is just glorified statistics.