In my work as a quantum engineer, I wear two hats. At the National Physical Laboratory (NPL) in London, where this photo was taken in April, I research quantum metrology, the scientific study of measurements based on quantum-physics principles. The instrument in this image is a dilution refrigerator, which allows us to cool our semiconductor quantum devices to 0.007 kelvin: that’s a fraction of a degree above absolute zero (−273.15 °C), a temperature that, in nature, exists nowhere in the Universe.
In experiments at the NPL, we clock the transfer of single electrons so accurately that I know exactly how
Computer science continues to break boundaries today. Wearable electronic devices, self-driving cars, and video communications shape our lives on a daily basis.
The history of computer science provides important context for today’s innovations. Thanks to computer science, we landed a person on the moon, connected the world with the internet, and put a portable computing device in six billion hands.
In 1961, George Forsythe coined the term “computer science.” Forsythe defined the field as programming theory, data processing, numerical analysis, and computer systems design. Only a year later, the first university computer science department was established, at Purdue University.
Scientists from the University of New South Wales in Sydney, Australia, have announced a major breakthrough in quantum computing.
To date, quantum scientists and computer engineers have been able to build only proof-of-concept quantum processors that work with just a few spin qubits, the quantum equivalent of the classical bit.
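To see what distinguishes a qubit from a bit, here is a minimal sketch (not from the article, just a generic textbook illustration): a classical bit holds exactly 0 or 1, while a qubit's state is a normalized two-component complex vector, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes.

```python
import numpy as np

# A classical bit is definitely 0 or definitely 1.
classical_bit = 0

# A qubit state is a|0> + b|1> with |a|^2 + |b|^2 = 1.
# Example: an equal superposition of 0 and 1.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measuring the qubit gives 0 or 1 with these probabilities.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5]
```

Controlling one such state is routine; the engineering challenge the UNSW team addresses is applying control operations across millions of physical qubits at once.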
Now, new research published in the journal Science Advances has identified a technique which the researchers claim will enable them to control millions of these qubits.
The team describes its design as the “missing jigsaw piece” in quantum computer architecture.