In quantum communication, the central challenge has always been prolonging the entangled state shared by particles. Because quantum information is carried by entangled particles, the length of time entanglement can be sustained limits the distance the information can travel.
Today's quantum communication systems send these particles over direct optical-fiber connections, a rather limited approach because fibers absorb light in ways that can disrupt the entanglement needed to carry quantum information.
Building a quantum internet, essentially a fiber-linked network of routers that can store quantum information, requires routers able to store and resend entangled particles. A team of researchers from the University of Vienna in Austria, led by Ralf Riedinger, has reportedly built such a router.
This device is a nanomachine capable of receiving and storing quantum information sent through ordinary fiber-optic cables. It contains a pair of silicon resonators, nanofabricated using electron-beam lithography and plasma reactive-ion etching: tiny silicon beams that vibrate like guitar strings.
Scientists have been working for years to make quantum computing a reality, and if progress made in 2017 is anything to go by, it doesn’t look like such a far-off dream anymore. This year has seen a number of advancements proving quantum computing is within reach, from the construction of a quantum simulator to the physicists who made quantum states using light.
This work is notable because quantum computing has the potential to transform our technology (and our world) in remarkable ways, and it's all due to how these computers process information. As a brief recap, traditional computing uses bits to carry information in a state of 1 or 0 (on or off), whereas a quantum bit can represent 1 and 0 simultaneously, letting a quantum computer explore an exponentially larger space of possibilities at once.
“Quantum computers will let us solve certain problems that we think are inaccessible to ordinary classical computers, like breaking encryption and calculating properties of molecules and other materials,” Andrew Childs, a professor at the University of Maryland’s Department of Computer Science and co-director of the Joint Center for Quantum Information and Computer Science (QuICS), said to Futurism over email.
“While they only appear to give an advantage for very specific kinds of problems, some of those problems have significant applications that might not be attainable without a quantum computer,” he added.
Fred Chong, Seymour Goodman Professor in the Department of Computer Science at the University of Chicago, took it one step further, saying there are “totally unknown uses for quantum computers that we will discover once we have larger ones.” Childs echoed some of Chong’s opinion, explaining that such applications are “hard to predict with any accuracy.”
Challenges to Overcome
There are multiple examples of quantum computing breakthroughs in recent months and years, but don’t take that to mean scientists and researchers have a clear understanding of the technology in its current form. There’s a lot more that we don’t know.
To that end, as exciting as the future of quantum computing is, Chong doesn’t foresee the technology being used for practical purposes in the extremely near future — no quantum-enhanced iPhones or Macs are coming within the next year. He does believe that the technology will likely arrive during his lifetime. “Optimistically, we may demonstrate a practical application in five years,” Chong said. “Even more likely in 10 years.”
When asked what areas could use more attention, Childs pointed to quantum simulation, noting that much of the focus is on simulating chemistry. Instead, he would like to see simulations of simpler condensed matter systems. Not only could these be achieved with lower overhead, Childs believes they could also have an invaluable impact on the scientific community. Chong, however, would like to see a shift toward software, as hardware development has garnered more attention; Intel’s chip is proof of that.
“It is going to be a huge need and we need to catch up both in terms of developing that software and in terms of training people that can work on this problem,” said Chong.
Needless to say, those invested in quantum computers, both academically and financially, have their work cut out for them, even more so if they want the technology to go mainstream and eventually displace the classical computing that powers every device we own. To that end, there are a number of challenges to overcome in both the short and long term. Childs sees the development of systems capable of outperforming classical computers as one goal that needs to be achieved.
“Developing experimental systems that can convincingly outperform classical computers at some task (not necessarily a useful one) is an important short-term goal that seems to be on the horizon,” he said. “But the field is full of challenges on both the theoretical and the experimental sides.”
Another short-term goal proposed by Chong is the development of 100-qubit computers. For a sense of where we are currently, one of the largest quantum systems out there uses only 51 qubits. There is, of course, D-Wave’s 2,000-qubit system, but it is a quantum annealer built for optimization problems rather than a general-purpose design. After we’ve managed 100 qubits, the next step is 1,000 qubits, both of which will hopefully be multi-functional. “These will require a lot of precise engineering,” noted Chong.
It’ll be interesting to see what the next big advancement for quantum computing will be, and when the average person will be able to benefit from the technology. Regardless of how long it takes, or how much work is required, one thing is clear: we’re incredibly close to the age of quantum computing…assuming, that is, that we’re not already living in it.
At the IEEE Industry Summit on the Future of Computing in Washington D.C. on Friday, IBM announced the development of a quantum computer capable of handling 50 qubits (quantum bits). This breakthrough puts IBM on the cutting edge of quantum computing research, as a 50-qubit machine is so far the largest and most powerful quantum computer ever built.
Seen by experts as the future of advanced computing, a quantum computer performs rather differently from a traditional computer. Instead of processing information using binary bits that are either 0 or 1, a quantum computer uses qubits, which can be a 0 and a 1 simultaneously. This is made possible by the quantum effects known as entanglement and superposition.
Aside from the 50-qubit machine, IBM has a 20-qubit quantum computing system that’s accessible to third-party users through its cloud computing platform. IBM managed to maintain the quantum state in both systems for 90 microseconds. That may seem short (because it is), but it’s already a record in this growing industry, where one of the biggest challenges is sustaining the life of qubits.
“We are really proud of this; it’s a big frickin’ deal,” IBM’s director for AI and quantum computing Dario Gil, who made Friday’s announcement, told the MIT Technology Review.
A Step Closer
IBM has been making significant advances in quantum computing ever since its researchers helped create the field of quantum information processing. But it isn’t the only one in the race to build working quantum computers. Google and Intel are also developing their own quantum computing systems, and San Francisco-based startup Rigetti wants to revolutionize the field. Meanwhile, Canadian quantum computing company D-Wave has already developed a couple of quantum computers, which have been used by NASA and Google.
A 50-qubit machine can perform extremely difficult computational tasks; Google has suggested that this many qubits could outclass the most powerful supercomputers. Even so, IBM’s machine isn’t yet ready for widespread, commercial, or personal use. Like all of today’s quantum computers, IBM’s 50- and 20-qubit systems still require highly specialized conditions to operate.
Furthermore, as University of Maryland professor Andrew Childs pointed out to MIT Tech Review, IBM hasn’t yet published the details of their new machine in a peer-reviewed journal. “IBM’s team is fantastic and it’s clear they’re serious about this, but without looking at the details it’s hard to comment,” he said, adding that more qubits doesn’t necessarily translate to a leap in computational ability. “Those qubits might be noisy, and there could be issues with how well connected they are.”
At the very least, this development is bringing us one step closer to a future where quantum computing transforms how we process information and helps us to solve many of the world’s most difficult problems. IBM is set on making their quantum computer work, and they’re expected to announce an upgrade to their quantum cloud software today. “We’re at world record pace. But we’ve got to make sure non-physicists can use this,” Gil told the MIT Tech Review.
In 1935, Albert Einstein famously collaborated with physicists Boris Podolsky and Nathan Rosen on a paper designed to illuminate the weaknesses in quantum mechanics, a branch of physics focused on the very small. Though the phenomenon of “entanglement” was so new the term hadn’t even been coined yet, Einstein wasted no time voicing his skepticism of it, calling the “spooky action at a distance” simply impossible.
Nearly a century later, we now have abundant proof that entanglement is possible. In fact, we’re now on the brink of building the quantum computers that will allow us to use the phenomenon to answer some of humanity’s greatest questions.
Quantum computers handle and process information in a way far different from that of classical computers.
While a classical computer works with bits as information placeholders, a quantum computer works with quantum bits (qubits). While bits carry information in either a 0 or 1 state, qubits can be 0s and 1s at the same time thanks to quantum superposition.
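As an illustrative sketch (a classical numpy simulation, not tied to any machine described here), a qubit's state can be written as two complex amplitudes, and a Hadamard gate produces the equal superposition the article describes:

```python
import numpy as np

# A qubit's state is a two-component vector of complex amplitudes.
ket0 = np.array([1, 0], dtype=complex)   # the |0> basis state

# A Hadamard gate rotates |0> into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are squared amplitude magnitudes: 50/50 here.
probs = np.abs(psi) ** 2
print(probs)  # → [0.5 0.5]
```

Squaring the amplitudes gives the probability of reading a 0 or a 1, which is all a measurement ever returns.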
Meanwhile, entanglement links particles so that a measurement on one is instantly reflected in the state of the other, no matter the distance between them. On its own, this correlation cannot be used to send messages faster than light, but it is a key resource that quantum networks exploit to move and protect information.
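The famous correlations can be simulated classically for intuition. In this hedged sketch (the numbers are illustrative, and nothing here is real entanglement), sampling a Bell state shows the two qubits always agreeing while each side alone looks random:

```python
import numpy as np

rng = np.random.default_rng(42)

# Bell state (|00> + |11>)/sqrt(2), written as amplitudes over the
# four joint outcomes 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1], dtype=float) / np.sqrt(2)
probs = bell ** 2  # measurement probabilities: [0.5, 0, 0, 0.5]

outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# The two qubits always agree with each other...
assert set(outcomes) <= {"00", "11"}

# ...yet each side alone just sees a fair coin, which is why the
# correlation by itself can't carry a message.
print(np.mean([o[0] == "1" for o in outcomes]))  # close to 0.5
```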
“Nobody is saying ‘never’ anymore,” Scott Totzke, Chief Executive of the Canadian firm Isara Corp., told The Wall Street Journal. “We are in the very, very early days, but we are well past the science-fiction point.”
So, what can we expect from the future of quantum computing?
William “Whurley” Hurley, chair of the IEEE Quantum Computing Working Group, elaborated in an email to Futurism:
It’s my personal belief that quantum computing will help us make sense of the deluge of data we find ourselves creating to solve some very interesting problems. There are systems generating billions of data sets a day, and those might be the solution to some critical problems affecting society, but we can’t possibly begin to [work] through [all the data]. To me, that’s extremely exciting.
In short, the future of quantum computing will see us solving some of the most complex questions facing the world today, and not just in fields like physics.
“I think there’s a huge opportunity for quantum computing to disrupt a number of industries,” said Hurley. “However, I believe that the financial, pharmaceutical, and security industries will see the most change in the shortest amount of time.”
There will always be reasons to temper our optimism around any new technological advance. Still, I am incredibly excited about the potential positive outcomes of quantum computing. From finding new cures to diseases to helping discover new particles, I think this is the most excited I have been in my entire career. We should be focusing our energy as much on the positive outcomes as we currently are the negative ones.
A New Kind of Internet
So-called quantum communications is one of the more interesting applications planned for the future of quantum computing.
While researchers have managed to create quantum computers, they are all bulky and only work in specialized environments. Therefore, the future of quantum computing may include the adoption of alternatives to the systems currently being created in cutting-edge laboratories.
The path forward may be through the creation of specialized computer chips capable of functioning like quantum computers but sans the state-of-the-art components those systems require. The quantum computing revolution could also well begin with the average person accessing a quantum computer through a cloud service, such as those IBM and Google have created.
“I think you’re going to see general availability of simulators and emulators that are useful in the next 24 months,” said Hurley. Still, those systems would just be a placeholder for the real deal, which he sees arriving in the not-so-distant future: “I would love to see a universal quantum computer available as early as three to five years out, and I believe that could be possible.”
Has the Higgs boson been rediscovered? Perhaps technically, with the creative application of a computer capable of sorting through massive quantities of data. By making computation faster, researchers are hoping that quantum computers might aid in the exploration of our natural world and expand our knowledge of the known universe.
In a proof-of-principle study, a team of physicists used a quantum circuit to sift through mountains of data from particle-smashing experiments, like those that led to the observation and formal discovery of the Higgs boson. Because it is both a recent and major discovery, the team used the Higgs boson observation as a testing ground for this machine, built by quantum-computing company D-Wave. The general idea was that, with this increased ability to consume and sort data, they could more easily “find” the Higgs boson. In the end, the method proved to be on par with conventional methods rather than more efficient.
But despite what might seem like a setback, this development still signifies progress. Kyle Cranmer, a physicist at New York University who wasn’t involved in the work, said, “Before this point, people were aware that this maybe someday will be relevant. This makes it look like maybe it is.” He finds it refreshing that, instead of more traditional mathematical approaches, the team is using a quantum machine to try to solve a practical physics problem.
In the initial (true) discovery of the Higgs boson at the European Organization for Nuclear Research (CERN), physicists working on the ATLAS and CMS experiments distinguished, very simply put, between photons and non-photons in the wake of proton collisions. They also trained machine-learning algorithms on simulated data to do the same.
The team searching for quantum solutions obtained a D-Wave machine, wanting to see if it could be instructed to search for the photon signature of the Higgs particle. The point wasn’t to replicate the excitement of the initial discovery or to prove they could do the same; it was to determine whether quantum computing could actually be a useful tool in physical exploration. Simply showing that it would be possible was “the coolest part” of this work, according to Cranmer.
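For intuition: D-Wave-style annealers minimize quadratic objectives over binary variables (QUBO problems), and studies like this one recast a physics task in that form. The toy sketch below, with coefficients made up purely for illustration, brute-forces such a problem classically:

```python
from itertools import product

# Toy QUBO (quadratic unconstrained binary optimization): minimize
# sum of Q[i, j] * x[i] * x[j] over binary vectors x. Annealers like
# D-Wave's minimize objectives of exactly this form; here, with only
# three variables, exhaustive search is trivial.
Q = {
    (0, 0): -1.0, (1, 1): -2.0, (2, 2): 2.0,   # linear terms (diagonal)
    (0, 1): 2.0, (1, 2): -1.0,                 # pairwise couplings
}

def energy(x):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

# Try all 2**3 assignments and keep the lowest-energy one.
best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))  # → (0, 1, 0) -2.0
```

Real problems have far too many variables for brute force, which is the gap annealing hardware aims to close.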
Intel has announced that it has successfully fabricated a 17-qubit superconducting test chip for quantum computing. The superconducting chip has been submitted to the company’s quantum research partner QuTech for further testing.
Quantum computing has the potential to be a truly revolutionary technology, providing a currently unprecedented amount of computational power. However, the qubits that underlie the hardware are notoriously fragile — Intel’s implementation requires an operating environment that maintains a temperature of 20 millikelvin to function.
The new chip boasts an improved design that provides better reliability, enhanced thermal performance, and a reduction of the amount of radio frequency interference between qubits. It also introduces a scalable interconnect scheme that makes it possible to exchange between 10 and 100 times more signals in and out of the chip, compared to a wirebonded alternative.
Crucially, Intel has employed processes, materials, and designs that will allow it to scale up its packaging for quantum integrated circuits, which are much larger than standard silicon chips. This could prove to be an important step in moving from the production of components to a fully fledged quantum computer.
Next Generation of Superconducting
“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” commented Dr. Michael Mayberry, the corporate vice president and managing director of Intel Labs, in a press release.
While the field of quantum computing has made significant advances in recent years, there is still plenty of work to be done before a large-scale universal quantum computer is viable. Intel is keeping its options open, continuing research into spin qubits in silicon even while focusing on superconducting qubits.
Theoretical research has propelled quantum computing forward by leaps and bounds over the past decade, but Intel’s investigations into the practical side of fabrication are essential for the next stage of the process.
Quantum theory predicts entanglement: that huge numbers of atoms can be intertwined by quantum correlations, across distances or inside macroscopic structures. Until recently, however, “predicts” was the key word, as hard experimental evidence was lacking. That evidence has now been presented by University of Geneva scientists, who demonstrated the entanglement of 16 million atoms in a one-centimeter crystal.
Achieving entanglement hasn’t been the real challenge for physicists looking to generate empirical proof of the concept; researchers can create entangled photons by splitting a single photon. It is the observation and recording of entanglement that has proven next to impossible until now, and with one caveat, as UNIGE applied physics group researcher Florian Fröwis explained in a press release about the team’s research: “But it’s impossible to directly observe the process of entanglement between several million atoms since the mass of data you need to collect and analyze is so huge.”
Therefore, Fröwis and his team took inventory of the measurements they were able to take and, of those, which might generate the evidence they were searching for. They settled on analyzing the statistical properties of the light re-emitted by the crystal in a single direction. This was how the team was able to show the entanglement of 16 million atoms, rather than a few thousand.
Quantum networks will be essential to data protection in the future because they make it possible to send a signal and immediately detect any interception of that signal by a third party. Sending and receiving these kinds of signals requires quantum repeaters, which can link entangled atoms in a strong quantum relationship despite great distances between them. These repeaters house crystal blocks supercooled to 270 degrees Celsius below zero and enriched with rare-earth atoms. Once a photon penetrates one of these blocks, entanglement is created.
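To see interception detection in spirit, here is a deliberately simplified BB84-style sketch (an illustration of the general quantum key distribution idea, not the repeater scheme described above):

```python
import random

random.seed(1)
n = 2000

# The sender encodes random bits in randomly chosen bases, and the
# receiver independently picks a random basis for each measurement.
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("XZ") for _ in range(n)]
bob_bases   = [random.choice("XZ") for _ in range(n)]

# Rounds with mismatched bases are discarded; matched rounds form the
# sifted key. Publicly comparing a sample of it reveals the error rate
# an interceptor would unavoidably introduce.
sifted = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
print(len(sifted) / n)  # roughly half the rounds survive sifting
```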
Particle entanglement is at the heart of the coming revolutions in quantum computing and quantum encryption, which will themselves be driving everything from artificial intelligence to personalized medicine. And while this is high-level stuff, it all depends on the entanglement of atoms at the quantum level, which this research has demonstrated on an unprecedented scale.
When we say an emerging technology represents a “paradigm shift,” it’s often hyperbole. In the case of quantum computers, it’s an understatement.
In traditional computing (everything from PCs to ATMs to smartphones), all data is represented in bits that exist in one of two states: 0 or 1, off or on. In quantum computing, bits can be 0, 1, or both at the same time. That might not seem significant, but it means quantum computers can perform vastly more complex computations. For example, it is hoped that quantum computers could check millions of lines of software code in seconds, making reliable aircraft, cars, MRI scanners, and the like more efficient to produce.
Scientists are eager to use quantum computers to analyze microbes so they can create new vaccines, which quantum computers could then be used to optimize to reduce unwanted side effects. Some scientists believe that quantum computers are essential for achieving breakthrough prevention and treatment protocols in healthcare. According to Donald Parsons, a New York State Department of Health research physician, “Without quantum computers, new DNA sequencing data, the learning of the specific activities of the folded conformations of proteins, and the search for new drugs by docking algorithms, are being held back from full clinical application.”
Quantum computing enables models that can more accurately predict future demand. Consider smart cities: brownouts and blackouts are complex phenomena with dozens of variables. In part, they are caused by utilities’ reliance on limited historical data and traditional computer models to allocate tax dollars to grid upgrades. What if cities could predict, years in advance, which neighborhoods will have households with one or two electric vehicles (drawing 10 kW or more at night, five times the current average)? That would give utilities a much-needed head start on funding, designing, and deploying the additional infrastructure.
If you were to shop for a quantum processor today, one vendor might advertise that its model has 2048 qubits, while another says its equivalent model has 50 qubits. “Confusions exist on what quantum computing or a quantum computer means,” says Hidetoshi Nishimori, a Tokyo Institute of Technology professor and a member of the IEEE P7130 working group. “This partly originates in the existence of a few different models of quantum computing. It is urgently necessary to define each keyword.”
A lack of common definitions for fundamental terms such as “qubit” isn’t just a comparison-shopping headache. Competing to own the lexicon wastes time and resources that would be better-spent refining and applying quantum computing technologies. A set of common definitions puts the world one big step closer to enjoying the innovations that quantum computing can enable.
By Whurley (William Hurley), chair, IEEE Quantum Computing Working Group.
Quantum computing: it’s the brass ring of the computing world, promising the ability to exponentially outperform and out-calculate conventional computers. A quantum computer with a mere 50 qubits would outclass the most powerful supercomputers in the world today. Surpassing the limits of conventional computing, a feat known as achieving quantum supremacy, has been a difficult road. Now, a team of physicists at the University of California, Santa Barbara (UCSB) and Google has demonstrated a proof of principle for a quantum computer that may mean quantum supremacy is only months away.
Quantum states are difficult to isolate and sustain, so the practical task of shielding quantum processing machinery from outside interference has been the sticking point keeping quantum supremacy out of reach. However, to demonstrate quantum supremacy, a computer system doesn’t need to be an all-purpose quantum dynamo; it just needs to show one quantum capability beyond the capacity of conventional systems.
To do that, the Google and UCSB team’s strategy comes down to qubits. Qubits differ from ordinary bits (the smallest unit of data in a computer) because they can exist in superposition. Each ordinary bit can be either a 1 or a 0 at any given time, but a qubit can be both at once. Two ordinary bits have 2² potential configurations, but again, only one at a time; two qubits hold that same potential all at once. Adding qubits expands the potential exponentially, so 50 qubits represent 2⁵⁰ (roughly 1.1 quadrillion) values, an amount a traditional computer would need petabyte-scale memory to store.
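That memory claim is easy to sanity-check. Assuming 16 bytes per complex amplitude (an implementation detail of typical classical simulators, not a figure from the article), the arithmetic looks like this:

```python
# Storing the full state vector of n qubits on a classical machine takes
# 2**n complex amplitudes at 16 bytes each (two 64-bit floats), which is
# where the petabyte-scale figure comes from.
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(state_vector_bytes(50) / 1e15)  # ~18 petabytes for 50 qubits
```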
Quantum Supremacy In Action
The team’s plan, then, isn’t to create a fully functional quantum computer, but to instead create a system that can support 49 qubits in superposition reliably. If it can do that, so the theory goes, the rest is relatively easy.
Their system is a series of nine superconducting qubits, consisting of nine metal loops cooled to a low temperature with current flowing through them in both directions simultaneously. They were able to show that the supported qubits represented 512 numbers at once, and that the results were reliable, without an accompanying exponential increase in errors.
This is much lower than the number of qubits needed to declare supremacy, but it’s a promising result. The next step will be to create a 50-qubit chip and test if its errors increase at the manageable pace seen in the nine-qubit experiment.
If the team is right, they may achieve quantum supremacy in a matter of months. If they do, the applications will be staggering. We can expect to see machine learning take place exponentially faster, and artificial intelligence progress much more rapidly. If it does, we may see the singularity approaching long before most predicted.
Quantum computers will make personalized medicine a reality, parsing out the function of every protein in the human genome and modeling their interactions with all possible complex molecules very quickly. We will see simulation-based climate change solutions come to light, and find new chemistry-driven solutions to carbon capture. We are likely to see huge leaps in material science and engineering that allow us to create better magnets, better superconductors, and much higher energy density batteries. And we are almost certainly going to see more technological advances through biomimetics as we find ourselves achieving more insights into natural processes, such as photosynthesis.
In other words, the idea that quantum supremacy will change everything isn’t just hype.
The money will be split between a hardware and a software team, with the former expected to have the funding renewed for five years and the latter for three. However, renewal is contingent upon the DOE’s future budget, according to Jonathan Carter, Berkeley Lab’s computing sciences area deputy.
Quantum computers are worlds ahead of the kind you probably use daily. Instead of encoding information in bits representing specific states, quantum computers encode information in qubits, which can represent multiple states simultaneously. As Carter explained to The Daily Californian, if traditional bit encoding is analogous to walking 50 equal paces and covering an ordinary stretch of ground, qubit encoding is like taking a three-foot step, then a nine-foot step, then a 27-foot step, and so on, with each added qubit multiplying the ground covered.
This $3 million in annual funding from the DOE will go a long way toward making this geometric analogy a quantum computing reality. Berkeley Lab’s hardware team will use their half to focus on constructing a physical quantum computer. Meanwhile, the software team will use their $1.5 million annually to construct algorithms and investigate optimal methods of programming a quantum computer.
“I am confident that quantum computing will become a reality,” Head of Computational Chemistry Bert de Jong told The Daily Californian, and if the DOE funding is renewed as expected, he has every right to be.
Today’s working quantum computers already outpace traditional computers on certain specialized tasks, but a pair of researchers from the University of Tokyo think they’ve found a way to make these remarkable machines even more powerful. In a paper published in Physical Review Letters, Akira Furusawa and Shuntaro Takeda detail a novel approach to quantum computing that should allow their machine to perform a far greater number of computations than other quantum computers.
At the center of their new method is a basic optical quantum computing system — a quantum computer that uses photons (light particles) as quantum bits (qubits) — that Furusawa devised in 2013.
This machine occupies roughly 6.3 square meters (67 square feet) and can handle only a single pulse of light; increasing its capabilities would normally require connecting several of these large units together. So instead of expanding the system’s hardware, the researchers devised a way for one machine to accommodate many pulses of light via a loop circuit.
In theory, multiple light pulses, each carrying information, could go around the circuit indefinitely. This would allow the circuit to perform multiple tasks, switching from one to another by instant manipulation of the light pulses.
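The loop idea is, in spirit, time-multiplexing: many pulses share one processing unit by cycling past it repeatedly. A rough software analogy (purely illustrative; the real system is optical hardware) looks like this:

```python
from collections import deque

# Four "pulses" circulate past a single processing station; each pass
# around the loop applies one operation to the pulse at the front.
pulses = deque({"id": i, "ops_done": 0} for i in range(4))

for _ in range(3 * len(pulses)):   # three full trips around the loop
    p = pulses.popleft()           # the pulse arrives at the processor
    p["ops_done"] += 1             # one operation is applied
    pulses.append(p)               # the pulse re-enters the loop

# One processor has now applied three operations to every pulse.
assert all(p["ops_done"] == 3 for p in pulses)
```

The design trade is hardware footprint for time: one unit does the work of many, as long as the pulses can keep circulating without degrading.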
The Power of Qubits
Unlike traditional binary bits, which are either a one or a zero, qubits are quantum systems that can be a one, a zero, or both at the same time. Qubits allow quantum computers to perform computations much faster than regular computers can, but most quantum computing platforms today can manipulate only a dozen or so qubits. Earlier this year, a team of Russian researchers revealed a quantum computer that could handle 51 qubits, a huge breakthrough in the field.
Furusawa and Takeda believe they’ve managed to go well beyond this, asserting in a press release that one of their circuits is theoretically capable of processing over a million qubits. That sort of computing power is unlike anything we’ve ever experienced before. It would be enough to solve the greatest computing problems of today, facilitating breakthroughs in medical research or handling large datasets to improve machine learning models.
The next step is for Furusawa and Takeda to translate their theory into a working model. “We’ll start work to develop the hardware, now that we’ve resolved all problems except how to make a scheme that automatically corrects a calculation error,” Furusawa said, according to The Japan Times. If it works as expected, this system will truly live up to its moniker as the “ultimate” quantum computing method.
Now, a team of Canadian and U.S. researchers has taken an all-important step towards making quantum networks more realistic, affordable, and secure. Their work, published in the journal Quantum Science and Technology, explains the benefits of using measurement-device-independent quantum key distribution (MDI-QKD) systems.
The team used commercially available and relatively inexpensive components, like distributed feedback (DFB) lasers and field-programmable gate arrays (FPGAs), to build their experimental system. According to senior author Dr. Qiang Zhou, this MDI-QKD system allows quantum bits (qubits) to generate keys in random states, making them harder to individually identify.
Securing Quantum Networks
Still, making QKD systems completely secure will remain a difficult challenge for a while. As fellow author Raju Valivarthi explained, the components used to build QKD systems never fully match the idealized devices assumed in security proofs, allowing those with the technical knowledge to gain access to secure information using “blinding attacks.”
“So-called ‘blinding attacks’ exploit vulnerabilities of single photon detectors (SPDs) to open a side-channel, via which an eavesdropper can gain full information about the (assumed-to-be) secure key,” he said.
Professor Wolfgang Tittel, the research group’s leader, remained optimistic about his team’s work, saying, “our experimental demonstration paves the way for MDI-QKD-based star-type quantum networks with kbps secret key rates spanning geographical distances of more than 100km.”
Making quantum networks more secure may take some time, but it’s an invaluable step forward. Quantum computers, as well as AI, are expected to be our best defense in cybersecurity, and they’ll only remain that way if they themselves can’t be compromised. It’s unclear when they’ll be available to consumers, but expect them to change the world in radical ways once they are introduced.
Engineers have modelled the interactions between subatomic components of a complex molecule using a quantum computer, making a significant leap forward in our modelling of chemical reactions.
The simulations were carried out by IBM on superconducting hardware, and the milestone pushes quantum computing into new territory for what the technology can achieve.
The molecule in question was beryllium hydride – or BeH2. It’s not the fanciest molecule in town, but there’s still a lot going on between those two hydrogens and the single beryllium for a computer to figure out.
Last year, Google engineers simulated the bonding of a pair of hydrogen atoms on its own quantum computer, demonstrating a proof of principle in the complex modelling of the simplest arrangement of energies in molecules.
Molecular simulations aren’t revolutionary on their own – classical computers are capable of some pretty detailed models that can involve far more than three atoms.
But even our biggest supercomputers can quickly struggle with the exponential nature of keeping track of quantum interactions of each new electron involved in a molecule’s bonds, something which is a walk in the park for a quantum computer.
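That exponential growth can be made concrete with a back-of-envelope sketch. The figure of 16 bytes per complex amplitude is an assumption (it matches double-precision complex numbers); the point is only that the storage cost doubles with every qubit:

```python
# Rough illustration: simulating n qubits classically means storing 2**n
# complex amplitudes. Assuming 16 bytes per amplitude (double-precision
# complex), the memory cost doubles with every qubit added.

def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Bytes needed to hold the full 2**n amplitude vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits: {state_vector_bytes(n):,} bytes")
# 30 qubits already needs 16 GiB; 50 qubits needs 16 PiB.
```

Under this toy accounting, a supercomputer with petabytes of memory tops out at around 50 qubits of exact simulation, which is why quantum hardware becomes attractive at that scale.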
Achieving Quantum Supremacy
These revolutionary devices have been big news of late, with big players in the information technology world investing heavily in the race for quantum supremacy – the line in the sand where quantum computers become truly practical tools that surpass the power of traditional computing systems.
For a quick 101: quantum computers are devices that use a particle’s two discrete states in specific kinds of calculations, much like the 1 or 0 in binary code.
Specifically, this property has a blurred in-between state called a superposition, the nature of which can be applied in calculations that would take a classical computer a long time to run through.
This makes quantum computers a big deal for some things, such as finding supersized prime numbers or – as in this case – crunching the numbers on particle interactions within a molecule.
Unlike those solar-system style diagrams your high school chemistry teacher drew on the board, electrons don’t behave like little spheres whizzing around a nucleus.
Instead they exist in a mind-bending state of possibilities that only get more complicated as you add more particles into their surroundings.
This constitutes what’s called a many-body problem in physics, and even just a few particles in one or two dimensions demands some hardcore problem solving.
Usually, physicists will find shortcuts. One such simplification is the Monte Carlo method, which applies a statistical sampling process to solve rule-based problems.
When it comes to increasing numbers of charged particles, these kinds of shortcuts can quickly fall apart.
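For a flavor of what statistical sampling looks like in practice, here is the classic toy example, estimating pi from random points; the seed and sample count are arbitrary choices for illustration:

```python
# Monte Carlo in miniature: sample random points in the unit square and
# count how many land inside the quarter circle; the fraction approaches
# pi/4 as the number of samples grows.
import random

def estimate_pi(samples, seed=0):
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    inside = sum(1 for _ in range(samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * inside / samples

print(estimate_pi(100_000))  # close to 3.14, within a few hundredths
```

The same idea, sampling configurations instead of computing all of them, is what lets classical machines approximate many-body quantum systems, until the particle count grows too large.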
Having a working quantum computer can potentially provide a neat way to avoid these problems.
The thing is, even the latest quantum computers are small and prone to making mistakes.
As cutting edge as it is, the seven qubit device used in this study still relied on delicate states that could only be used in calculations for microseconds, leaving little time for lengthy processes.
The goal was to come up with an efficient algorithm that would describe the arrangement of particles in molecules with three atoms, including lithium hydride and beryllium hydride.
“Our scheme contrasts from previously studied quantum simulation algorithms, which focus on adapting classical molecular simulation schemes to quantum hardware – and in so doing not effectively taking into account the limited overheads of current realistic quantum devices,” IBM researchers explain on their blog.
Ultimately this means we’ll be better prepared for the next generation of quantum computers that tackle bigger molecules.
It’s hoped that one day, we’ll have such detailed solutions to various many-body problems that we’ll be able to predict the interactions of compounds far more accurately, pointing the way to improved drugs or spotting obscure side effects before clinical trials even begin.
Eventually, the sky will be the limit for quantum computers. But even the biggest and best devices will be giant paperweights without the right software to drive them.
The word “quantum” sounds so advanced and complex that people tend to get hyped up about anything attached to it. While not every quantum breakthrough elicits a positive response, in the case of a so-called quantum internet, people have a reason to be excited.
In the simplest of terms, a quantum internet would be one that uses quantum signals instead of radio waves to send information. But let’s explain that a bit further.
The internet as we know it uses radio frequencies to connect various computers through a global web in which electronic signals are sent back and forth. In a quantum internet, signals would be sent through a quantum network using entangled quantum particles.
Researchers have recently made significant progress in building this quantum communication network. China launched the world’s first quantum communication satellite last year, and they’ve since been busy testing and extending the limitations of sending entangled photons from space to ground stations on Earth and then back again. They’ve also managed to store information using quantum memory. By the end of August, the nation plans to have a working quantum communication network to boost the Beijing-Shanghai internet.
Leading these efforts is Jian-Wei Pan of the University of Science and Technology of China, and he expects that a global quantum network could exist by 2030. That means a quantum internet is just 13 years away, if all goes well.
Quantum Web Surfing?
So, what does a quantum internet mean for regular internet users? As far as typical internet surfing goes, probably not much.
It’s highly unlikely that you’ll be using the quantum internet to update your social media feed, for one. “In many cases, it doesn’t make a lot of sense to communicate quantum mechanically,” University of Washington physicist Kai-Mei Fu told WIRED. For such things, regular internet communication is enough.
The quantum internet would excel, however, at sending information securely. Through what’s known as quantum encryption or quantum cryptography, people would be able to send “unhackable” data over a quantum network. This is because quantum cryptography uses a mechanism called quantum key distribution (QKD), which means an encrypted message and its keys are sent separately. Tampering with such a transmission causes it to be automatically destroyed, with both the sender and the receiver notified of the situation.
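The key-distribution idea can be sketched with the textbook BB84 protocol. The toy simulation below is an illustration of the principle only, modeling photon bases as coin flips rather than any real hardware: when an eavesdropper measures the photons in transit, roughly a quarter of the shared key bits stop matching, which is exactly the tell-tale disturbance QKD relies on.

```python
# Toy BB84 sketch: Alice sends bits encoded in random bases, Bob measures
# in random bases, and they keep only positions where the bases matched.
# An eavesdropper measuring in her own random basis disturbs the photons,
# pushing the error rate in the sifted key toward 25 percent.
import random

def bb84_error_rate(n, eavesdrop=False, seed=1):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        photon_basis = a_basis
        if eavesdrop:
            e_basis = rng.randint(0, 1)
            if e_basis != a_basis:       # wrong basis: outcome randomized
                bit = rng.randint(0, 1)
            photon_basis = e_basis       # photon re-sent in Eve's basis
        # Bob reads the bit faithfully only if his basis matches the photon
        bob_bits.append(bit if b_basis == photon_basis else rng.randint(0, 1))
    # Sifting: keep positions where Alice's and Bob's bases agreed
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    return sum(a != b for a, b in sifted) / len(sifted)

print(bb84_error_rate(10_000))                  # 0.0: clean channel
print(bb84_error_rate(10_000, eavesdrop=True))  # ~0.25: intruder exposed
```

By comparing a random sample of their sifted bits, the two parties can detect that elevated error rate and throw the key away before using it.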
A quantum internet could also speed up access to a working quantum computer by putting quantum computing in the cloud. Instead of trying to get your hands on a physical quantum computer, which we still haven’t quite managed to make publicly available, you could access one through the cloud.
A regular personal computer could transmit or access quantum-encrypted information through this cloud-based quantum computer. At the very least, you could send “unhackable” emails. “Users might not want to send their information classically, where it could be eavesdropped,” Fu told WIRED.
Essentially, a quantum internet would most likely become a specialized branch of the regular internet, one we would only connect to for specific tasks. However, even if the quantum internet doesn’t work the same way the current internet does, one thing is for sure: the cutting-edge technology has the potential to benefit everyone, from hardcore physicists to regular Joes streaming the latest (not leaked) episode of Game of Thrones.
A photonic Bose-Einstein condensate forms when individual photons are collected in a single location, cooled, and brought together to create what is known as a super-photon. Recently, physicist Martin Weitz — of the Institute of Applied Physics at Germany’s University of Bonn — set out to conduct an experiment with a newly made one.
In this new experiment, Weitz and his team were able to create “wells” that allowed super-photons to flow from one well to the next, an achievement that could one day lead to much-anticipated quantum computing.
The team accomplished this task by bouncing a laser between two mirrors, passing the light through a pigment between the mirrors that cooled it and turned it into a super-photon. Before introducing the laser light, a polymer was mixed in with the cooling pigment. This polymer allowed Weitz to tune the experiment’s refractive index using heat; increasing the temperature would let longer light wavelengths travel back and forth between the two mirrors.
By inducing different temperature patterns, Weitz’s team was able to induce a pseudo-warping effect in the polymer, creating “wells” at certain points that had a different refractive index than the polymer as a whole. The team then found that the super-photon would flow into the wells, just as a liquid might flow into a hollow space.
“The special thing is that we have built a kind of optical well in various forms, into which the Bose-Einstein condensate was able to flow,” Weitz said in a press release. “With the help of various temperature patterns, we were able to create different optical dents.”
Another Step Towards Quantum Circuits
Following the creation of the photonic Bose-Einstein condensate, Weitz’s team of researchers observed the behavior of two adjacent optical wells. By adjusting the temperature of the polymer, the light in both wells came to have similar energy levels, thereby allowing the super-photon to move from one to the other.
The work done by Weitz and his group could also lead to more advanced lasers, such as those used for welding or drilling.
Computing applications of this technology aren’t expected for quite a while, but some believe the first true quantum computers may debut as early as next year. It was only in July that two Swedish PhD students broke a quantum computing record, nudging us slightly closer to such a reality.
It’s currently a race to see who gets us to that point first, but it’s only a matter of time before we figure out how to create the right machines capable of handling quantum circuits. When we do, whole new aspects of our universe may become open to us, as our computer systems inevitably become faster and more powerful.
Today, encryption is based on traditional mathematics. For now, it is mostly safe from hacking, but quantum computing would be able to completely change encryption as we know it. China is therefore hoping to transform encryption using quantum cryptography, and QKD technology in particular. QKD uses photons to transmit data, allowing two users in different places to jointly produce a common string of random bits called a secret key.
This kind of encryption is unhackable because there is no way to copy a photon in a precise enough way for hacking purposes, and measuring a photon would disturb it, clueing in the users about the disruption.
An Unhackable Future
This technology could have huge implications in cybersecurity. Businesses would be safer online, and e-commerce would be free of problems caused by hacking and identity theft. The tech would also make it far more difficult for governments to spy on private communications — something that global leaders and agencies around the world have a vested interest in.
China’s breakthrough transmission traveled about 1,200 kilometers from space to Earth, making it up to 20 orders of magnitude more efficient than an optical fiber of the same length would be. The transmission also reaches far beyond the previously understood limit of several hundred kilometers. This advancement is part of China’s overall push to become a major presence in space by 2030, a plan that includes reaching Mars by 2020.
China envisions a future with ground-based QKD networks and a global satellite system interacting to form a powerful, worldwide secure network. This transmission is the first step toward making this vision a reality.
One of the most interesting (and confusing) phenomena in quantum physics is quantum entanglement. We observe this quantum effect when we see entangled particles affect each other regardless of distance. For example, when measuring the state of one particle instantly influences the state of another particle far away, we have quantum entanglement.
Einstein was disturbed by this, and didn’t like the idea that quantum entanglement might violate the speed of light if the particles were somehow sending each other information faster than light could travel. Therefore, he developed the idea of local realism, which assumes a pre-existing value for any possible measurement of a particle — an objective value a particle must have. This theory is based on the idea of locality, the principle that there is a minimum amount of time it takes for distant objects to influence each other, and realism, the idea that objects exist whether or not they are measured.
In the 1960s, physicist John Bell developed a famous test to determine whether particles really do influence each other in the way quantum entanglement suggests. In the Bell test, a pair of entangled particles are sent in different directions toward different locations. A device measures the state of each particle in each location, and the settings of each device are set at random; this way it’s impossible for device one to know the setting of device two at the time of measurement, and vice versa.
If quantum entanglement is real, then local realism shouldn’t hold, and the Bell inequality should be violated. If scientists do observe violations of the Bell inequality, it means that quantum mechanics violates locality, realism, or both — making local realism incorrect. In recent research, physicists have reported some of the best evidence to date that quantum entanglement exists, and that the quantum world is free of the constraints of local realism. Researchers performed a Bell inequality test that was, essentially, loophole-free, and demonstrated that two atoms one-quarter of a mile apart shared correlations probably caused by quantum entanglement. According to local realism, this should be impossible.
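The violation can be checked numerically with the CHSH form of Bell’s inequality. The angles below are the standard textbook choice, and E(a, b) = -cos(a - b) is quantum mechanics’ prediction for a singlet pair; any local realist model must keep |S| at or below 2. This is a worked illustration of the inequality, not the experiment described above.

```python
# CHSH check: quantum correlation for a singlet pair, E(a, b) = -cos(a - b),
# evaluated at the standard detector angles. Local realism bounds |S| by 2;
# quantum mechanics reaches 2*sqrt(2).
import math

def E(a, b):
    """Predicted correlation for measurements at angles a and b."""
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two detector settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two detector settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2.828..., clearly above the classical bound of 2
```

Loophole-free experiments like the one described above measure S from real detector clicks and find it above 2, ruling out local realist explanations.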
A quantum simulator isn’t a full-blown quantum computer; let’s get that out of the way first. The main difference is that the former is built to solve only one equation model while the latter is able to perform — theoretically — any equation put to it. This quantum simulator could model, for example, the minute behavior of molecules and drugs, and researchers working at Harvard University recently announced that they’ve made the largest one yet, operating with 51 qubits.
Instead of using photons like many quantum computer researchers do, the Harvard team’s qubits are each made from a single rubidium atom. Trapped in place using lasers, information is programmed into these qubits by modulating the laser beam.
A Model for Quantum Computers
Qubits are at the heart of quantum computing. While conventional computers rely on bits of 0s and 1s to process information, quantum computers use qubits, each of which is capable of being a 0 and a 1 at the same time. This allows quantum computers to handle information faster. The difficulty is in keeping the qubits stable.
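The “0 and 1 at the same time” idea can be made concrete in a few lines of plain Python. This is just a state-vector sketch of superposition, not any particular lab’s hardware:

```python
# A qubit as two complex amplitudes: the measurement probabilities are the
# squared magnitudes of the amplitudes, and they must sum to 1.
import math

def measure_probs(state):
    """P(0) and P(1) for a single-qubit state [amp0, amp1]."""
    return [abs(amp) ** 2 for amp in state]

h = 1 / math.sqrt(2)
plus = [h, h]                # equal superposition: both 0 and 1 at once
print(measure_probs(plus))   # both outcomes come up half the time
```

A classical bit would have one amplitude equal to 1 and the other 0; the superposition state carries weight on both outcomes until it is measured.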
Currently, Google is working on what could be the largest quantum computer, which would run on a 49-qubit chip. The Harvard simulator, built by a team led by physicist Mikhail Lukin, beats that with 51 qubits. While the simulator is designed to handle one problem at a time, the method used could translate into a full-blown quantum computer.
“The full-blown quantum computer is the hardest system to get right,” Simon Devitt from Macquarie University in Sydney told New Scientist. Quantum simulators are also costly to build, Devitt noted, which could limit the potential applications of Lukin’s technology to just inside the physics lab for now.
Nevertheless, the achievement is a breakthrough. It shows that it is possible to develop a quantum computing system using 51 qubits. And the more qubits there are, the more powerful a quantum computer can be. It may still take time, though, before this could translate to a universal quantum computer.
The students used 8,192 of the 9,688 Intel Xeon Phi processors on Cori, NERSC’s newest supercomputer, for the largest of their simulations. Unfortunately, they could not run an even larger simulation using all of the supercomputer’s nodes, as doing so would have risked crashing the system.
The Quantum Revolution
Quantum computing has the potential to revolutionize the entire world by increasing the processing power of computers by orders of magnitude. However, two questions have thus far stumped quantum computer creators: how to create machines with sufficient processing power and how to scale those machines for mass production.
While large-scale quantum computers are not yet available, their performance can be inferred using quantum compilation frameworks and estimates of potential hardware specifications. However, without testing and debugging quantum programs on small scale problems, their correctness cannot be taken for granted. Simulators and emulators … are essential to address this need.
The potential uses for quantum computers once they are developed are seemingly infinite. While most center on complex data analysis, which classical computers can only perform very slowly or not at all, others have considered even more innovative uses for quantum systems.
Kindred has hypothesized that a robotic exoskeleton capable of managing the work of four people could be powered using a quantum computer. A molecule has been modeled successfully using one, paving the way to computing entire chemical systems, and Google has considered using quantum computing to enable their autonomous vehicle to distinguish cars from other objects more effectively.
Truly, the era of the quantum computer is just on the horizon, and once we reach it, every computer system we use will have the potential to become faster and more powerful.
We’re all familiar with the occasional heating from our gadgets and devices, especially when we ramp up usage. Usually, in the case of desktop and laptop computers, there’s a fan somewhere to keep the temperature inside from frying circuits. More modern computers have dropped the fans and now rely instead on materials that dissipate heat or batteries that don’t heat up that much.
This necessity to keep things cool is a similarity that quantum computers share with their classical cousins. The difference is, quantum computers need cooling at the nanoscale level — keeping the temperatures of quantum bits (“qubits”) from rising uncontrollably. To solve this, researchers from Aalto University in Finland have designed the first standalone refrigeration device for quantum circuits.
The nanoscale refrigerator Möttönen and his colleagues designed works by manipulating electrons. “Here we demonstrate direct cooling of a superconducting resonator mode using voltage-controllable electron [tunneling] in a nanoscale refrigerator,” the researchers, led by Mikko Möttönen, wrote in their study published in the journal Nature Communications.
A Cool Solution
In short, hotter electrons jump to a superconducting lane, leaving the cooler electrons in a photon resonator — a device that works as a qubit. This process cools both the resonator and the electrons over time. “It’s kind of like a gate similar to Maxwell’s Demon, where you only allow electrons with energy above a certain threshold to cross,” Caltech’s Spiros Michalakis told New Scientist, commenting on the device.
This technology could help quantum computers run faster and more efficiently by solving a problem that lies at the heart of tech that uses qubits. Qubits are able to store more information than binary bits in classical computing by being a 0 and a 1 at the same time, thanks to a phenomenon known as superposition. However, qubits must start in their low-temperature ground state to perform calculations, which becomes difficult because they tend to heat up in computers. So, to run multiple algorithms, qubits have to go through a cooling cycle.
Once the nanoscale refrigerator gets built, tested, and found to be sufficient to cool qubits, we may be a step closer to commercial quantum computers. “Maybe in 10 to 15 years, this might be commercially useful,” Möttönen told New Scientist. “It’s going to take some time, but I’m pretty sure we’ll get there.”
Google is maintaining its edge in the world of quantum computing. Its 20-qubit processor is currently undergoing tests, and the company appears to be on schedule to have its working 49-qubit chip ready by the end of 2017 as promised. Until it began trialing the 20-qubit chip, Google’s most powerful quantum chip was the 9-qubit effort from 2015.
Traditional computer bits are binary, only existing as either 0 or 1; they’re like light switches that are either on or off. Qubits, on the other hand, can be 0 or 1 like regular bits, but can also have quantum properties that allow them to exist in a superposition where they are both 0 and 1 simultaneously. This makes qubits potentially far more powerful, because instead of figuring something out by trying each option one by one, they can simultaneously compute more than one possibility.
Google’s 49-qubit chip will allow it to develop a quantum system that can solve problems far beyond the capacity of ordinary computers: Google calls this goal quantum supremacy. The 20-qubit system that the Google quantum computing team is now working on currently boasts a “two-qubit fidelity” of 99.5 percent; the higher the rating, the fewer errors the system makes. Quantum supremacy demands not only a 49-qubit system, but also sufficient accuracy to achieve a two-qubit fidelity of at least 99.7 percent, which Google is on track to deliver by the end of 2017.
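To see why a few tenths of a percent of fidelity matter, consider a deliberately simplified error model (my own illustration, not Google’s): assume a circuit succeeds only if every one of its two-qubit gates succeeds independently, each with the quoted fidelity.

```python
# Simplified model: with per-gate fidelity f, a circuit of n gates
# succeeds with probability f**n, so small per-gate gains compound.

def circuit_success(gate_fidelity, n_gates):
    return gate_fidelity ** n_gates

for f in (0.995, 0.997):
    print(f"fidelity {f}: {circuit_success(f, 500):.3f}")
# At 500 gates, 99.7 percent fidelity yields roughly triple the
# success probability of 99.5 percent.
```

The depth of 500 gates is an arbitrary choice for illustration, but the compounding effect it shows is generic: error rates that look nearly identical per gate diverge sharply over long computations.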
Quantum Computing, Quantum Speed
Google isn’t alone in the quest to advance quantum computing. In 2016, IBM was running a 5 qubit computer, but by May 2017, it was offering beta access to its 16 qubit platform to the public for testing purposes. Furthermore, qubits aren’t the only consideration for actually achieving working quantum computers; error correction and scaling will also be critical to quantum systems. However, if Google does achieve quantum supremacy, it will be a major step forward.
Traditionally, measuring quantum states is a tedious affair. The technique used, called quantum tomography, requires measuring multiple copies of the quantum state in various ways, in order to count all possible outcomes and arrive at a full set of probabilities. Although important in testing quantum systems, this is not very practical. That’s why researchers from the Center for Quantum Technologies at the National University of Singapore and the California Institute of Technology devised a much simpler method.
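For a single qubit, the reconstruction step of tomography can be sketched directly: measure the Pauli expectation values over many copies, then assemble the density matrix. The ideal expectation values used below are an assumption chosen for illustration (they correspond to the |+> state), not data from the study.

```python
# Toy single-qubit tomography: given measured Pauli expectation values
# <X>, <Y>, <Z>, rebuild the density matrix rho = (I + <X>X + <Y>Y + <Z>Z)/2.
# The 2x2 matrices are plain nested lists; no quantum library assumed.

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def reconstruct(ex, ey, ez):
    """Density matrix from the three measured Pauli expectations."""
    return [[(I[r][c] + ex * X[r][c] + ey * Y[r][c] + ez * Z[r][c]) / 2
             for c in range(2)] for r in range(2)]

# Ideal expectations for the |+> state, assumed here for illustration:
rho = reconstruct(ex=1.0, ey=0.0, ez=0.0)
print(rho)   # every entry is 0.5 (up to a vanishing imaginary part)
```

Even in this one-qubit toy, three separate measurement settings are needed, and the number of settings grows exponentially with more particles, which is exactly the impracticality the researchers set out to avoid.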
In a study published in the journal Nature Communications, the researchers proposed a measurement system that can pinpoint the fingerprint of any two-particle entangled quantum state. This device-independent way of certifying quantum states, among other things, “can bound specific quantities like the amount of randomness, the length of the secret key in quantum cryptography,” according to the study.
But first, a bit of background: Quantum entanglement is a phenomenon in which two particles are held in a multitude of undecided outcomes or possibilities, and a change in one affects the other, regardless of distance. As such, entanglement is at the heart of quantum technologies like quantum computing and quantum cryptography — as well as the possibility of quantum teleportation.
For instance, quantum computing relies on entangled particles known as quantum bits (qubits). These qubits hold quantum states capable of being either a 0 or 1, in terms of carrying information. Building on existing findings regarding qubits, the team extended their work to higher-dimensional qubits known as qudits — capable of storing more information, not just 0s and 1s, but a 0, 1, 2, 3, 4, and so on.
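A back-of-envelope way to see the appeal of qudits (my own illustration, not the team’s analysis): a d-level system has log2(d) bits’ worth of distinguishable basis states, so each extra level packs more information into a single particle.

```python
# Distinguishable basis states per carrier: a qubit (d=2) holds 1 bit's
# worth of outcomes; a d-level qudit holds log2(d) bits' worth.
import math

for d in (2, 4, 8, 16):
    print(f"d={d}: {math.log2(d):.0f} bits per carrier")
```

This is only a counting argument about basis states, but it shows why moving from qubits to qudits raises the information density of a quantum system.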
Securing Quantum Technologies
The general problem is determining whether quantum systems actually work and can deliver on the properties expected of them. “I like to see our work as bringing the power of testing quantum devices to the consumers who use them,” NUS researcher Goh Koon Tong explained to Phys.org. “Currently, only those who build the devices or understand the engineering aspect of them can perform the test.”
The team hopes that it would be possible for engineers and consumers of quantum technologies to perform such tests in the future. They encourage other researchers to develop ways to incorporate their device-independent checks that would allow for self-testing quantum technologies. According to researcher Valerio Scarani from NUS, there’s already interest. “Of all my work in the past five years, this has attracted the most attention,” he said.
This would allow engineers to spot errors in quantum technologies and devices that don’t perform what they promise to do. That’s especially crucial, since quantum computing is poised to be the future of information processing, which could improve the way we handle problems and conduct research in various fields. Likewise, quantum cryptography is also being promoted as the future of cybersecurity.
Quantum computing, if you are not already familiar, is simply a type of computation that uses qubits to encode data instead of traditional bits (1s and 0s). In short, it allows for the superposition of states, in which data can be in more than one state at a given time.
So, while traditional computing is limited to information belonging to only one or another state, quantum computing widens those limitations. As a result, more information can be encoded into a much smaller type of bit, allowing for much larger computing capacity. And, while it is still in relatively early development, many believe that quantum computing will be the basis of future technologies, advancing our computational speed beyond what we can currently imagine.
It was extremely exciting, then, when researchers from MIT, Harvard University, and Sandia National Laboratories unveiled a simpler way of using atomic-scale defects in diamond materials to build quantum computers in a way that could possibly allow them to be mass produced.
For this process, defects are the key. They are precisely and perfectly placed to function as qubits and hold information. Previous processes were difficult, complex, and not precise enough. This new method creates targeted defects in a much simpler manner. Experimentally, the defects created were, on average, within 50 nanometers of their ideal locations.
The significance of this cannot be overstated. “The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it,” says Dirk Englund, an associate professor of electrical engineering and computer science, in an interview with MIT. “We’re almost there with this. These emitters are almost perfect.”
A Quantum Future
While the reality of quantum computers, let alone mass produced quantum computers, is still a bit of a ways off, this research is promising. One of the main remaining hurdles is how these computers will read the qubits. But these diamond defects aim to solve that problem because they naturally emit light, and since the light particles emitted can retain superposition, they could help to transmit information.
The research goes on to detail how refining these diamond materials improved the amplification of the qubit information. By the end, the researchers found that the light emitted was approximately 80 to 90 percent as bright as possible.
If this work eventually leads to the full creation of a quantum computer, life as we know it would change irrevocably. From completely upending modern encryption methods to allowing us to solve previously “unsolvable” problems, our technology and infrastructure would never be the same. Moreover, the limitations that currently exist in how we store and transmit information would shatter, opening new opportunities for—as yet—unimaginable exploration.
Due to their complexity, quantum computers are still largely inaccessible for the average person, which is why developers and programmers jumped at the chance to test out IBM’s five qubit quantum computing processor when the company offered the public free access to it last year, running more than 300,000 experiments on the cutting-edge machine.
Now, the company is taking the tech to the next level, announcing yesterday that it has built and tested its two most powerful platforms for quantum computing to date: the 16 qubit Quantum Experience universal computer and a 17 qubit commercial processor prototype that will serve as the core for its IBM Q commercial system.
IBM’s 16 qubit processor will make far more complex computations possible without breaking a symbolic quantum sweat. Once again, the company is hoping that developers, programmers, researchers, and anyone working in the field will make use of the platform. To that end, anyone interested in using it for experiments to help usher in the age of quantum computing is encouraged to visit GitHub’s Software Development Kit to request beta access. Otherwise, they can simply access the IBM experience library to play around with the technology.
Of course, IBM is far from satisfied with just 16 or 17 qubits. The company hopes to significantly ratchet up the power with a goal of achieving a 50 qubit quantum computing platform — or maybe one with even more power — in the next few years.
Beta Testing and Beyond
Quantum computing technology has the capacity to solve extraordinarily complex problems — problems that in many cases may be difficult for us to even conceive of right now. This potential has been propelling research forward at a remarkable rate, with researchers smashing through milestone after milestone along the path toward commercial quantum computing.
In August 2016, a quantum logic gate with an amazing 99.9 percent precision was achieved, removing a critical theoretical benchmark. Meanwhile, researchers used microwave signals to encode quantum computing data, offering an alternative to optical solutions. In October 2016, researchers used silicon atoms to produce qubits that remained in stable superposition 10 times longer than any qubits before them.
However, as each technical barrier has fallen, the need for public collaboration has become more apparent. In January, Canadian quantum computing company D-Wave open-sourced its own quantum software tool, Qbsolv, allowing programmers to work on a quantum system whether or not they had any prior experience with quantum computing. With IBM now offering an even more powerful system for experimentation, the public now has at its disposal a tool that could lead to remarkable advancements in nearly every field imaginable. As experts have announced, we truly are now living in the age of quantum computing.
In classical computing, information is stored in bits that are read by physical phenomena like electricity. You might recognize them as 1’s and 0’s, also called binary code. In quantum computing, it’s stored in quantum bits, or “qubits.” However, computers aren’t the only way we can store information: chemistry is also capable. Scientists at the Institute of Physical Chemistry of the Polish Academy of Sciences (IPC PAS) in Warsaw have developed a way in which chemical droplets can store information like bits and qubits in a one-bit chemical memory unit called the “chit.”
The chit is made up of three droplets. Between the droplets, chemical reactions take place, circulating cyclically and consistently. This memory is rooted in the Belousov-Zhabotinsky (BZ) reaction, which proceeds in an oscillatory manner: each reaction creates the reagents necessary for the next, continuing ad infinitum. These reactions are helped along by a catalyst — ferroin — which causes a color change. There is also a second catalyst — ruthenium — which makes the reaction light sensitive: shining blue light on the reaction stops it from oscillating. That’s important, because it allows researchers to control the process.
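The oscillatory kinetics behind the chit can be illustrated with the two-variable Oregonator, a standard simplified model of BZ chemistry. The sketch below is a minimal forward-Euler integration with textbook-style parameter values; it is illustrative only and is not fitted to the IPC PAS droplet system.

```python
# Minimal sketch of BZ-style oscillation using the reduced (two-variable)
# Oregonator model. Parameters are illustrative, not from the chit experiment.

def simulate_oregonator(eps=0.04, q=0.0008, f=1.0, dt=1e-4, t_max=30.0):
    """Integrate the reduced Oregonator with forward Euler.

    x ~ scaled concentration of the autocatalytic species (HBrO2),
    z ~ scaled concentration of the oxidized catalyst (e.g. ferriin).
    """
    x, z = 0.1, 0.1
    xs = []
    for _ in range(int(t_max / dt)):
        dx = (x * (1.0 - x) - f * z * (x - q) / (x + q)) / eps
        dz = x - z
        x += dt * dx
        z += dt * dz
        xs.append(x)
    return xs

xs = simulate_oregonator()
# Count upward crossings of a threshold: each crossing is one "beat"
# of the chemical oscillator, the kind of cyclic activity the chit exploits.
beats = sum(1 for a, b in zip(xs, xs[1:]) if a < 0.3 <= b)
```

The spikes repeat indefinitely because each burst regenerates the reagents for the next one, mirroring the self-sustaining cycle described above.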
The chit essentially allows for “chemical computing.” So, instead of traditional bits, the components are all chemical. While quantum computing continues to advance, this brand new type of computing could create an entirely new way to store, read, and transfer information.
Everything from smartphone technology to classified digital files depends on our ability to store and read information — the basis of computing. Completely changing the very base of most technology that we rely upon today could have incredible consequences. Perhaps technologies that are currently being developed to battle climate change could face major upgrades and modifications. Perhaps the devices and vehicles that we use to explore space will go through changes as well. This type of advancement could completely revolutionize so much of the technology that we know, and in ways we may not even yet be able to imagine.
Caltech physicists at the Institute for Quantum Information and Matter have discovered the first 3D quantum liquid crystal. This is a new state of matter they expect will have applications in ultrafast quantum computing, and the researchers believe this discovery is just the “tip of the iceberg.”
The molecules of standard liquid crystals flow freely as if they were a liquid, but stay directionally oriented like a solid. Liquid crystals can be made artificially, like those in display screens of electronic devices, or found in nature, like those found in biological cell membranes. Quantum liquid crystals were first discovered in 1999; their molecules behave much like those in regular liquid crystals, but their electrons prefer to orient themselves along certain axes.
The electrons of the 3D quantum liquid crystals exhibit different magnetic properties depending on the direction they flow along a given axis. Practically speaking, this means that electrifying these materials changes them into magnets, or changes the strength or orientation of their magnetism.
The research team expects that 3D quantum liquid crystals might advance the field of designing and creating more efficient computer chips by helping computer scientists exploit the direction that electrons spin. The 3D quantum liquid crystal discovery could also advance us along the road toward building quantum computers, which will decrypt codes and make other calculations at much higher speeds thanks to the quantum nature of particles.
Achieving a quantum computer is a challenge, because quantum effects are delicate and transient. They can be changed or destroyed simply through their interactions with the surrounding environments. This problem may be solved by a technique requiring a special material called a topological superconductor — which is where the 3D quantum liquid crystals come in.
“In the same way that 2D quantum liquid crystals have been proposed to be a precursor to high-temperature superconductors, 3D quantum liquid crystals could be the precursors to the topological superconductors we’ve been looking for,” Caltech assistant professor of physics David Hsieh, principal investigator on the new study, said in an interview for a Caltech press release.
“Rather than rely on serendipity to find topological superconductors, we may now have a route to rationally creating them using 3D quantum liquid crystals,” Hsieh lab postdoctoral scholar John Harter, the lead author of the new study published in Science, said in the press release. “That is next on our agenda.”
Time crystals can exist — it’s already been proven. Previously, two teams of researchers created their own time crystals that bend the laws of space and time. One of these, from the University of Maryland, used a chain of charged particles called ytterbium ions. Meanwhile, the other team from Harvard University created an artificial lattice using synthetic diamond.
Both setups demonstrated the quantum system behind such an object, and both produced new materials that work as time crystals.
Since it was first conceptualized in 2012 by physicist and Nobel laureate Frank Wilczek, the time crystal has made its way out of the theoretical and into the real world. It took — you guessed it — time, because time crystals seemed essentially impossible. Simply put, a time crystal is like a conventional crystal, but with an added twist: instead of a lattice that repeats only in space, it also repeats in time, breaking what’s called time-translation symmetry.
Because of this unique quality, time crystals exhibit movement while remaining in their lowest-energy, or ground, state. This bizarre phenomenon makes them the first example of a non-equilibrium phase of matter.
Applications in Quantum Technologies
Now, the Harvard researchers, led by physics professors Mikhail Lukin and Eugene Demler, want to understand more about this new kind of material. Its physics has been proven, yes, but a lot more remains to be uncovered about it. “There is now broad, ongoing work to understand the physics of non-equilibrium quantum systems,” Lukin explained in an interview for a Harvard press release.
Indeed, research into materials like time crystals is helping us understand the physics of both the quantum world and our own. Such scientific anomalies often lead to knowledge that allows us to transform these worlds. In this case, time crystals have provided Lukin and the other researchers with an avenue to develop new technology — such as precision measurement tools, quantum sensors, and even atomic clocks.
A particularly interesting potential application of time crystals is in quantum computing. Quantum computers are going to be the next generation of computing machines, as they will be more powerful, more precise, and more efficient than existing computers. Their use of quantum bits (or qubits) to process data and information is the key: instead of relying on bits of 0s and 1s, quantum computers use qubits that can be both 0 and 1 at the same time because of a quantum phenomenon called “superposition.” Time crystals can, perhaps, help develop these into workable models.
“This is an area that is of interest for many quantum technologies,” Lukin said, “because a quantum computer is basically a quantum system that’s far away from equilibrium. It’s very much at the frontier of research…and we are really just scratching the surface.”
Once realized, quantum computers could revolutionize the way we do research by solving the most complex problems in a flash.
Scientists at the University of East Anglia (UEA) have discovered that when photons are created in pairs, these fundamental light particles can emerge from different locations. This may have a significant impact on both the study of quantum physics and quantum computing, because up until now scientists believed that these kinds of photon pairs originated from single points in space.
“When the emergent pairs equally share the energy of the input, this is known as degenerate down-conversion, or DDC. Until now, it has been assumed that such paired photons come from the same location. Now, the identification of a new delocalized mechanism shows that each photon pair can be emitted from spatially separated points, introducing a new positional uncertainty of a fundamental quantum origin.”
Quantum entanglement is critical to a range of quantum applications, from computing to teleportation. These findings, published in Physical Review Letters, illustrate that photons are not the pinpoint-accurate, sharply defined bits of light we’ve conceived of in the past, and that there are limits to spatial resolution.
“Everything has a certain quantum ‘fuzziness’ to it, and photons are not the hard little bullets of light that are popularly imagined,” UEA’s David Andrews concluded in the press release.
Since Rigetti Computing launched three years ago, the Berkeley- and Fremont-based startup has attracted a host of investors — including the private American venture capital firm Andreessen Horowitz (also known as A16Z). As of this week, Rigetti Computing has raised a total of $64 million after successfully closing Series A and Series B funding rounds.
The startup is attracting investors primarily because it promises to revolutionize quantum computing technology: “Rigetti has assembled an impressive team of scientists and engineers building the combination of hardware and software that has the potential to finally unlock quantum computing for computational chemistry, machine learning and much more,” Vijay Pande, a general partner at A16Z, said when the fundraising was announced.
Quantum Problem Solving
Quantum computers are expected to change computing forever in large part due to their speed and processing power. Instead of processing information the way existing systems do — relying on bits of 0s and 1s operating on miniature transistors — quantum computers use quantum bits (or qubits) that can be both 0 and 1 at the same time. This is thanks to a quantum phenomenon called superposition. In some existing versions of quantum computers, this has been achieved using individual photons.
“Quantum computing will enable people to tackle a whole new set of problems that were previously unsolvable,” said Chad Rigetti, the startup’s founder and CEO. “This is the next generation of advanced computing technology. The potential to make a positive impact on humanity is enormous.” This translates to computing systems that are capable of handling problems deemed too difficult for today’s computers. Such applications could be found everywhere from advanced medical research to improved encryption and cybersecurity.
How is Rigetti Computing planning to revolutionize the technology? For starters, they’re building a quantum computing platform for artificial intelligence and computational chemistry, which can help overcome the logistical challenges that currently plague quantum computer development. They also have an API for quantum computing in the cloud, called Forest, which recently entered private beta testing.
Rigetti expects it will be at least two more years before their technology can be applied to real world problems. But for interested investors, investing in such a technological game-changer sooner rather than later makes good business sense.
While many developers focus on increasing the intelligence of artificial intelligence (AI) algorithms, IBM is eyeing a different area of technology: quantum computing.
Quantum computers are going to be game changers, bringing with them faster data processing and information handling. This increase in speed is made possible through the use of quantum bits (or qubits) instead of the binary bits that current computers employ. Qubits rely on the quantum phenomenon of superposition, which allows them to be 0 and 1 at the same time. This ability to exist in multiple states at once enables qubits to process information more quickly.
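The “0 and 1 at the same time” idea can be made concrete in a few lines of code: a qubit is a pair of amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes. This stdlib-only sketch shows the arithmetic, not how IBM’s hardware actually operates:

```python
import math
import random

def measure(amp0, amp1, rng):
    """Collapse a qubit with amplitudes (amp0, amp1).

    Assumes normalization: |amp0|^2 + |amp1|^2 == 1.
    Returns 0 with probability |amp0|^2, else 1.
    """
    p0 = abs(amp0) ** 2
    return 0 if rng.random() < p0 else 1

rng = random.Random(0)
# Equal superposition, like a Hadamard gate applied to |0>.
amp0 = amp1 = 1 / math.sqrt(2)

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(amp0, amp1, rng)] += 1
# Each individual shot gives a definite 0 or 1; the superposition shows up
# only in the statistics, which come out roughly 50/50.
```

The power of a real quantum computer comes from interfering many such amplitudes across entangled qubits, which this single-qubit toy does not capture.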
Now, IBM is pushing for the development of a truly universal quantum computer, and to that end, it has launched IBM Q, “an industry-first initiative to build commercially available universal quantum computers for business and science.”
A Neat Trick and More
Through IBM Q, the company hopes to improve its current quantum computing models by enlisting the help of others interested in the field. IBM is updating its quantum computing cloud service with a new application program interface (API) designed to give developers and programmers who don’t have a background in quantum physics the ability to create interfaces between IBM’s cloud-based quantum computer and traditional computers.
The computing industry giant hopes that these updates will encourage researchers and other interested parties to use their experimental quantum computing system to build more sophisticated applications. “While technologies like AI can find patterns buried in vast amounts of existing data, quantum computers will deliver solutions to important problems where patterns cannot be seen and the number of possibilities that you need to explore to get to the answer are too enormous ever to be processed by classical computers,” IBM explained.
IBM’s goal is to build quantum systems with roughly 50 qubits in the next few years. Once we have those, we’ll be able to truly begin to harness the power of quantum computing, and the applications are endless. Everything from medicine and finance to cloud security and even the modern technological era’s golden child of AI will be faster and more advanced.
If you aren’t already, you’re likely soon to find yourself looking forward to the day when quantum computers will replace regular computers for everyday use. The computing power of quantum computers is immense compared to what regular desktops or laptops can do. The downside is, current quantum computing technology is limited by the bulky frameworks and extreme conditions it requires in order to function.
Quantum computers need specialized setups in order to sustain and keep quantum bits — the heart of quantum computing — working. These “qubits” are particles in a quantum state of superposition, which allows them to encode and transmit information as 0s and 1s simultaneously. Most computers run on binary bit systems, which use either 0s or 1s. Since quantum computers can use both at the same time, they can process more information faster. That said, sustaining the life of qubits is particularly difficult, and researchers are trying to find ways to prolong it using various techniques.
Survival of the Qubits
Now, for the first time ever, two quantum computers have been pitted against one another. One is a chip developed by IBM that uses qubits made from superconducting materials. The other is a chip designed by the University of Maryland that relies on electromagnetic fields to trap ytterbium ions, which serve as its qubits. Although they used different methods, both chips ran algorithms the same way and worked with just five qubits.
Because both were still modest in power, the test couldn’t really show which had better qubits. While IBM’s quantum computer proved to be faster, it was also less reliable, and its qubits broke down much more easily than the University of Maryland’s. The latter’s qubits were interconnected — thanks to the nature of ytterbium ions — which made them capable of sharing information with each other directly. IBM’s, on the other hand, needed a central hub to swap information.
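Why connectivity matters: a two-qubit operation can only act on physically linked qubits, so on a sparser layout, qubit states must first be swapped toward each other. The toy below (illustrative five-qubit topologies, not the chips’ actual layouts) counts the SWAPs needed as the graph distance between two qubits minus one:

```python
from collections import deque

def swaps_needed(edges, a, b):
    """SWAPs required before a two-qubit gate on (a, b): BFS distance minus 1."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == b:
            return max(dist - 1, 0)
        for nxt in adj[node] - seen:
            seen.add(nxt)
            queue.append((nxt, dist + 1))
    raise ValueError("qubits not connected")

# Fully connected 5 qubits vs. a star topology with qubit 0 as the hub.
full = [(i, j) for i in range(5) for j in range(i + 1, 5)]
star = [(0, i) for i in range(1, 5)]
```

On the fully connected layout any pair interacts directly (zero SWAPs), while on the star, two outer qubits must route through the hub, and every extra SWAP is an extra chance for errors to creep in.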
Still, it was a valuable experiment, and definitely a sign of improving quantum computing technology. It also stands to help researchers figure out which qubit technology would be more efficient and viable for further development. “For a long time, the devices were so immature that you couldn’t really put two five-qubit gadgets next to each other and perform this kind of comparison,” said Simon Benjamin, a University of Oxford physicist who wasn’t part of the study. “It’s a sign that this technology is maturing.”
There are only a few quantum computers currently in existence, like D-Wave’s controversial quantum computers. But, without a doubt, when quantum computers become more accessible, these powerful machines will change the world. This technology is a game changer for many industries, and an international team of scientists unveiled today the first practical blueprint to build your own quantum computer.
Quantum computers operate at a different level compared to today’s regular computers, due primarily to how they process and encode information. Regular computers use bits (0s and 1s) to represent data. Quantum computers, on the other hand, rely on quantum bits (or qubits) that can be both 0 and 1 at the same time. This is due to a strange quantum phenomenon called “superposition,” which lets quantum computers explore many computational paths at once. Superposition, however, is incredibly difficult to sustain.
“For many years, people said that it was completely impossible to construct an actual quantum computer. With our work we have not only shown that it can be done but now we are delivering a nuts and bolts construction plan to build an actual large-scale machine,” said Winfried Hensinger, lead researcher from the University of Sussex (UK), which worked on this project with Google, Aarhus University (Denmark), RIKEN (Japan), and Siegen University (Germany). They detail their research in the journal Science Advances.
Accessing the Future
According to lead author Bjoern Lekitsch, “It was most important to us to highlight the substantial technical challenges as well as to provide practical engineering solutions.” The team’s blueprint features an innovative design that allows actual qubits to be transmitted between individual quantum computing modules. These modules, instead of being connected via fiber optics, rely on connections generated by electric fields which transport ions from one module to another. These interconnected quantum computing modules create a machine that’s capable of achieving incredibly high levels of processing power.
However, this quantum computer won’t be replacing your home computer just yet. The blueprint is for building quantum computers at an industrial scale. They are intended to be built around sophisticated vacuum apparatus and integrated quantum computing silicon microchips in which the ions are stored using electric fields, and they are likely to take up entire buildings, not just space on an office desk.
Still, this is a huge step toward making quantum computers more universally accessible. “The availability of a universal quantum computer may have a fundamental impact on society as a whole. Without doubt it is still challenging to build a large-scale machine, but now is the time to translate academic excellence into actual application, building on the UK’s strengths in this ground-breaking technology. I am very excited to work with industry and government to make this happen,” Hensinger said. The team has yet to construct a prototype using their blueprint.
Such a machine would absolutely revolutionize our ability to solve the most complex of science’s problems, develop superior lifesaving medicines, help explain the universe’s deepest mysteries, and more. With a publicly available blueprint, more scientists could actually collaborate to improve this technology, which would bring us closer to more practical quantum computers. Eventually, quantum computing could truly become available to all.
Within quantum mechanics, there has been an observable limit (the quantum backaction limit) on how low you can cool an object experimentally. Up until very recently, that limit had not been challenged by conventional laser cooling techniques. However, physicists at the National Institute of Standards and Technology (NIST) have now cooled an object to a temperature below this quantum limit. Theoretically, the novel technique they used could even cool objects to absolute zero.
The object, a microscopic vibrating aluminum drum, was cooled to less than one-fifth of one quantum. NIST researchers had previously cooled the drum to one-third of one quantum using a related technique called sideband cooling: applied microwave tones make the drum beat about 10 million times per second, generating photons (light particles). As the photons leaked out of the electromagnetic cavity, they took phonons (mechanical units of energy) with them, and this departure of energy created the extreme cooling.
In the more recent experiment, the researchers were able to cool the drum to such a low temperature by using what is called squeezed light to power the beating of the drum. As described in the most recent issue of Nature, the researchers squeezed the electromagnetic vacuum whose quantum fluctuations are what limit the lowest reachable temperatures through conventional methods.
The “squeezing” refers to a technique in which unwanted fluctuations are moved out of a useful property of the light and into a property irrelevant to the experiment. This new technique eliminates the generally accepted limit on cooling, and it even applies to objects that are more difficult to cool, like those that are large or that operate at low frequencies.
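Numerically, “moving fluctuations” looks like scaling down the noise in one quadrature of the light while scaling it up in the other, so the total uncertainty product is preserved. This stdlib-only illustration uses Gaussian noise and a toy squeezing parameter; it is a cartoon of the idea, not a model of the NIST apparatus:

```python
import math
import random

def variance(samples):
    """Plain population variance of a list of numbers."""
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

rng = random.Random(1)
n, r = 100_000, 1.0  # r is the (hypothetical) squeezing parameter

# Vacuum fluctuations in the two quadratures X and P (variance 1/2 each).
x = [rng.gauss(0.0, 0.5 ** 0.5) for _ in range(n)]
p = [rng.gauss(0.0, 0.5 ** 0.5) for _ in range(n)]

# Squeezing: shrink the X noise by e^-r while amplifying the P noise by e^+r.
x_sq = [v * math.exp(-r) for v in x]
p_sq = [v * math.exp(r) for v in p]
```

The squeezed quadrature ends up far quieter than the vacuum, while the product of the two variances is unchanged, which is exactly why squeezed light can drive cooling past a limit set by ordinary vacuum fluctuations.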
Better Quantum Computing
“The colder you can get the drum, the better it is for any application,” according to NIST physicist John Teufel. “Sensors would become more sensitive. You can store information longer. If you were using it in a quantum computer, then you would compute without distortion, and you would actually get the answer you want.”
This experiment has shown that objects can be cooled below the previously accepted quantum limit (even, theoretically, as low as absolute zero). This could have a drastic impact on research and technology. Increasing sensor sensitivity could allow for more control and precision in research, especially in rising fields like nanotechnology.
Quantum computers are theoretically capable of solving problems that are currently “unsolvable.” Currently, they are limited by distortion created by high temperatures, but this advancement completely eliminates this issue, at least theoretically. One day, the development of this light squeezing supercooling method could be viewed as a major milestone on the path to a highly advanced future where research lacks the limitations it faces today.
Quantum computing sounds really complicated, and it is for pretty much everyone except maybe those developers trained in quantum physics, advanced mathematics, or, ideally, both. Seeing as how specialized their training is, those people are usually already involved in some quantum computing project. But now, seeing the value of pooling more minds into developing the technology, D-Wave, the Canadian company behind the quantum computers being tested by Google and NASA, has decided to open-source one of its software tools.
Essentially, quantum computers are really powerful computers that use quantum mechanics to store and process information. Today’s computers use bits of 0s and 1s to store information. A quantum computer, on the other hand, utilizes a strange phenomenon called “quantum superposition” to encode 0s and 1s into quantum bits (or qubits). These qubits are quantum particles that spin in two directions at once, allowing them to be a 0 and 1 at the same time.
Sounds complicated, right? That’s because it is. What makes it even more complicated is the fact that sustaining superposition is still tricky. This so-called “problem of qubits” (i.e., extending their life) is the subject of many ongoing studies. D-Wave has managed to make it possible, but only in very extreme environments and under conditions that make everyday use of quantum computers by regular people impractical, if not impossible.
However, the company wants to change that, and they want you to help them.
Quantum Computing for All
A software tool known as Qbsolv allows developers to program D-Wave’s quantum computers even without knowledge of quantum computing. It has already made it possible for D-Wave to work with a bunch of partners, but the company wants more. “D-Wave is driving the hardware forward,” Bo Ewald, president of D-Wave International, told Wired. “But we need more smart people thinking about applications, and another set thinking about software tools.”
To that end, D-Wave has open-sourced Qbsolv, making it possible for anyone to freely share and modify the software. D-Wave hopes to build an open source community of sorts for quantum computing. Of course, to actually run this software, you’d need access to a piece of hardware that uses quantum particles, like one of D-Wave’s quantum computers. However, for the many who don’t have that access, the company is making it possible to download a D-Wave simulator that can be used to test Qbsolv on other types of computers.
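The class of problem Qbsolv works on is QUBO: quadratic unconstrained binary optimization, i.e., minimizing x^T Q x over vectors of bits. For tiny instances you can check answers by brute force on an ordinary computer. The sketch below (hypothetical helper names, not Qbsolv’s actual API) encodes max-cut on a triangle graph as a QUBO and solves it exhaustively:

```python
from itertools import product

def qubo_energy(Q, x):
    """Energy x^T Q x for a QUBO matrix Q and bit vector x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_qubo(Q):
    """Exhaustively minimize over all 2^n bit assignments (tiny n only)."""
    n = len(Q)
    best = min(product((0, 1), repeat=n), key=lambda x: qubo_energy(Q, x))
    return list(best), qubo_energy(Q, best)

# Max-cut on a triangle: cutting edge (i, j) contributes x_i + x_j - 2*x_i*x_j,
# so we minimize the negative of the total cut.
edges = [(0, 1), (0, 2), (1, 2)]
Q = [[0] * 3 for _ in range(3)]
for i, j in edges:
    Q[i][i] -= 1
    Q[j][j] -= 1
    Q[i][j] += 2

solution, energy = brute_force_qubo(Q)
```

Brute force blows up exponentially with the number of bits, which is precisely the regime where D-Wave hopes its annealing hardware will take over.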
This open-source Qbsolv joins an already-existing free software tool called Qmasm, which was developed by one of Qbsolv’s first users, Scott Pakin of Los Alamos National Laboratory. “Not everyone in the computer science community realizes the potential impact of quantum computing,” said mathematician Fred Glover, who’s been working with Qbsolv. “Qbsolv offers a tool that can make this impact graphically visible, by getting researchers and practitioners involved in charting the future directions of quantum computing developments.”
D-Wave’s machines might still be limited to solving optimization problems, but that’s a good place to start with quantum computers. Like D-Wave, IBM has developed its own working quantum computer, demonstrated as far back as 2000, while Google has teamed up with NASA to make their own. Eventually, we’ll have a quantum computer that’s capable of performing all kinds of advanced computing problems, and now you can help make that happen.
The advent of better quantum computers is something to be excited about. Research on developing more practical quantum computers abounds, including work aimed at making them viable as consumer products, not just the enterprise versions that exist today.
As a brief recap of this tech, quantum computers operate on a faster and more efficient level of computing. This is made possible by the use of quantum bits (or qubits) to carry information. Qubits encode 0 and 1 in two distinguishable quantum states. Unlike the binary bits used in classical computers, qubits are capable of processing vastly more data and information, largely because they can function as both 0 and 1 at once.
Qubits, however, have a short lifespan and require rather extreme conditions to sustain the “superposition” and “entanglement” they rely on. Much of quantum computing research has been devoted to solving “the problem of qubits.” As soon as this hurdle is definitively overcome, it would only be a matter of time before practical quantum computers are realized.
A Risk to Today’s Encryption
All this is well and good. Their improved processing power makes quantum computers the ideal tools for research, and even solving questions currently unanswered due to the lack of adequate equipment. Quantum computing will revolutionize a number of important fields, including medicine and astronomy.
But it looks like it will also change cybersecurity—thanks to how quantum computing is expected to be absurdly good at cracking the complex mathematical problems that form the backbone of major encryption approaches today.
According to a recent report by the Global Risk Institute, there is a “one in seven chance that some of the fundamental public-key cryptography tools upon which we rely today will be broken [by emerging quantum computing technologies] by 2026 and a 50% chance by 2031.”
Breaking cryptography isn’t like the individual computer hacks and cyber attacks we see today. Cryptography, according to the report, is a fundamental building block of cybersecurity, and it takes many years to replace. Basically, cryptography protects online transactions, emails, and financial and medical records—all of which could be rendered vulnerable by quantum computers.
Quantum for a Quantum
Of course, the threat isn’t there yet. More importantly, people are beginning to pay attention, including the NSA. The threat to encryption posed by quantum computing isn’t unsolvable. The same mechanism that makes it vulnerable can also turn it “quantum computing-proof,” so to speak.
There is such a thing as quantum cryptography, which uses photon-based qubits to securely transmit information encoded into the quantum states of particles. This quantum communication makes it possible for the recipient to detect attempts to intercept incoming messages. And it isn’t exactly new.
Its applications include what’s called quantum key distribution (QKD). Basically, it uses quantum communication to securely share keys, which are then used to encrypt and decrypt messages sent over conventional networks. Unfortunately, low bandwidth makes the system currently untenable, despite it having been demonstrated to work in several cities.
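The best-known QKD scheme, BB84, boils down to a few steps: the sender encodes random bits in randomly chosen measurement bases, the receiver measures in his own random bases, and the two keep only the positions where their bases happened to match. A stdlib-only toy (no eavesdropper, no channel noise, names are illustrative):

```python
import random

rng = random.Random(42)
n = 1000

# Alice picks a random bit and a random basis ('+' rectilinear, 'x' diagonal)
# for each photon she sends.
alice_bits = [rng.randint(0, 1) for _ in range(n)]
alice_bases = [rng.choice("+x") for _ in range(n)]

# Bob measures each photon in his own random basis. When bases match he
# recovers Alice's bit exactly; when they differ, quantum mechanics gives
# him a uniformly random result.
bob_bases = [rng.choice("+x") for _ in range(n)]
bob_bits = [bit if a_basis == b_basis else rng.randint(0, 1)
            for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases (never the bits) and keep matching positions,
# yielding a shared secret key roughly half the length of the transmission.
keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in keep]
bob_key = [bob_bits[i] for i in keep]
```

Had an eavesdropper intercepted and re-measured the photons, roughly a quarter of the sifted bits would disagree, which is exactly the tamper-detection the article describes.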
This is just one possible workaround. Other methods are being developed, including code-based cryptography and lattice-based cryptography. In any case, there’s time: in the same way that quantum computing is still being refined, network infrastructure can be improved to allow for quantum-secure cryptography.
A very rarely occurring quantum behavior was observed by physicist Martin Mourigal inside a new exotic crystal, according to a study published in the journal Nature Physics. The new crystal showed signs of peculiar behavior that led researchers to believe it contains a quantum spin liquid (QSL), something that would make it an invaluable discovery.
Quantum materials “exhibit exciting physical phenomena whose description requires new quantum mechanical models to be developed,” according to Oxford University. Because these materials exhibit unusual behavior, successfully replicating these in less extreme conditions “could lead to technologies far more advanced than those available today,” CIFAR explains. All of these are part of the unusual realm of quantum physics.
And QSLs are one such example of this strange behavior. According to Mourigal and his team, QSL “is an exotic state of matter in which electrons’ spins are quantum entangled over long distances, but do not show magnetic order in the zero-temperature limit.”
Entanglement, Spins, and Computing
The synthetic crystal Mourigal’s team examined is an ytterbium compound (YbMgGaO4) and is most likely full of quantum physics’ unusual behaviors, most notable of which is entanglement, which is at the heart of Mourigal’s research. In an entangled state, electrons become tied to one another despite physical distance, so much so that actions applied to one are reflected in the other.
Massive entanglement, like that found in the crystal, turns the system of electrons into a quantum spin liquid — a collective state of the electrons’ spins in the crystal. “In a spin ‘liquid,’ the directions of the spins are not tidily aligned, but frenzied, although the spins are interconnected, whereas in a spin ‘solid’ the spin directions have a neat organization,” Mourigal explained.
Entanglement is the same principle behind quantum bits (or qubits), which are at the center of quantum computing. This makes Mourigal’s crystal interesting for quantum computing researchers. Given time and more research, a material with a QSL could make for perpetual qubits.
“Imagine a state of matter where this entanglement doesn’t involve two electrons but involves, three, five, 10 or 10 billion particles all in the same system,” said Mourigal. “You can create a very, very exotic state of matter based on the fact that all these particles are entangled with each other.”
In short, it can be a quantum computing researcher’s dream material.