Don’t Ask What Quantum Computers Could Do

Do this instead…

Do you want to get started with Quantum Machine Learning? Have a look at Hands-On Quantum Machine Learning With Python.

Recently, we have seen computers, classical computers, do amazing things. Computers have learned to drive cars and beat world champions at chess and Go. The latest thing everyone is talking about is ChatGPT.

ChatGPT is based on GPT-3.5, a refined version of OpenAI's Generative Pre-trained Transformer 3 (GPT-3) network, first released in 2020. ChatGPT is capable of producing human-like text. Training the underlying GPT-3 model is estimated to have taken 355 years on a single GPU (Graphics Processing Unit) and to have cost $4.6 million. By parallelizing the work across many GPUs, the training finished much faster in wall-clock time. However, it still required an enormous amount of computing resources.
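To get a feel for these numbers, here is a rough back-of-the-envelope sketch in Python. The GPU count is an assumption chosen purely for illustration, and the calculation ignores the communication overhead that real distributed training incurs.

```python
# Back-of-the-envelope sketch (assumed numbers): how spreading work across
# many GPUs turns GPU-years into a shorter wall-clock time.

gpu_years = 355      # estimated single-GPU training time for GPT-3
num_gpus = 1024      # assumed cluster size, purely illustrative

# Ideal linear speedup: total work divided by the number of workers.
wall_clock_years = gpu_years / num_gpus

print(f"Ideal wall-clock time: {wall_clock_years:.2f} years "
      f"(about {wall_clock_years * 12:.1f} months)")
# -> roughly 0.35 years, about 4 months, in this idealized scenario
```

In practice, distributed training never scales perfectly, so the real wall-clock time is longer than this idealized figure; the point is simply that the total amount of computation stays enormous no matter how it is divided.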

Such advances in computing would not have been possible without the transistor. First created at Bell Laboratories in 1947, the transistor enabled the proliferation of integrated circuits, the core element of any hardware device.

Over the years, transistors have become smaller and smaller as advances in semiconductor technology have enabled ever finer manufacturing processes. Miniaturization has continued at a steady pace for decades. But the size of transistors is now approaching the physical limit of individual silicon atoms.

This means that the exponential progress we know as Moore's Law is about to end.
