Here’s a great explainer video on why Microsoft is focusing its efforts on topological qubits.

“Our teams are combining theoretical insights with experimental breakthroughs to develop both the hardware and the software that will enable quantum computers to fundamentally transform the face of computing.”

In this episode, Seeker tackles the question that’s on everyone’s mind: what will it take to have quantum internet in our homes?

Yes, Virginia, a quantum internet is in the works.

The U.S. Department of Energy recently rolled out a blueprint describing research goals and engineering barriers on the way to a quantum internet.

The DOE’s latest blueprint for a quantum internet in the U.S. has four key milestones. The first is to make sure quantum information sent over current fiber optic cables is secure. The next steps are to establish entangled networks across colleges or cities, then throughout states, and finally across the whole country.
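
To make that first milestone concrete, here is a minimal, purely classical sketch in Python of the idea behind quantum key distribution (the BB84 protocol). This is my own illustrative toy, not anything from the DOE blueprint: it simulates the basis-sifting step only, with no eavesdropper detection and no error correction.

```python
import random

# Toy BB84-style key sifting (classical simulation, no real qubits).
# Alice encodes random bits in randomly chosen bases; Bob measures in
# his own random bases; only positions where the bases match are kept.

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("XZ") for _ in range(n)]
bob_bases   = [random.choice("XZ") for _ in range(n)]

# If Bob guessed Alice's basis, he reads her bit exactly; otherwise
# his outcome is a 50/50 coin flip and the position is discarded.
bob_results = [bit if a == b else random.randint(0, 1)
               for bit, a, b in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (never bits) and keep matches.
sifted_key = [r for r, a, b in zip(bob_results, alice_bases, bob_bases)
              if a == b]
print("sifted key:", sifted_key)
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the qubits, so Alice and Bob can detect tampering by sacrificing part of the sifted key; that physical guarantee is what the fiber-optic milestone is after.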

Quantum computers are advancing at a breathtaking pace, but this progress may soon stall due to a force that is, quite literally, out of this world.

Cosmic rays streaming down to Earth could interfere with the integrity of the information in these quantum computers.

An MIT study has measured how much cosmic rays could interfere with quantum computers.

In this video, Sabine Hossenfelder explains how public key cryptography works on the internet today, using RSA as an example; what risk quantum computers pose for internet security; what post-quantum cryptography is; how quantum key distribution works; and what quantum cryptography is. (A toy RSA sketch follows the chapter list below.)

Video contents:

  • 0:00 Intro
  • 0:31 Public Key Cryptography
  • 2:43 Risk posed by Quantum Computers
  • 4:03 Post Quantum Cryptography
  • 5:31 Quantum Key Distribution
  • 10:25 Quantum Cryptography and Summary
  • 11:16 NordVPN Sponsor Message
  • 12:28 Thanks
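
Since the first half of the video walks through public key cryptography, here is a toy RSA round trip in Python with deliberately tiny primes. It is a sketch of the textbook math only, nowhere near production cryptography; factoring n is exactly the step that Shor’s algorithm on a quantum computer would make easy, which is the risk the video covers.

```python
# Toy RSA with tiny primes, just to make the public-key idea concrete.
p, q = 61, 53
n = p * q                   # public modulus: 3233
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, coprime with phi
d = pow(e, -1, phi)         # private exponent: 2753 (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
decrypted = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(ciphertext, decrypted)       # 2557 42
```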

You’ve likely noticed an uptick in content related to quantum computing over the last few months.

This article in Forbes sheds some light on why.

“We can have quantum impact right now,” says Krysta Svore, General Manager of Quantum Software at Microsoft. “Quantum-inspired solutions allow you to have improvements today,” she adds. Svore received her Ph.D. in computer science from Columbia and a B.A. in mathematics from Princeton, where she first encountered quantum computing in a seminar on cryptography and realized that “there is a different model of computation that could unlock solutions to problems that we couldn’t expect to unlock with classical computers.”

Quantum is coming. Get ready.


A few weeks ago, I posted this article on LinkedIn about the imminent arrival of Quantum Computing.

Given IBM’s recent announcement and the launch of the Impact Quantum Podcast, I thought it would be a good idea to repost here.

Winter is coming — an AI Winter that is.

If you’re not familiar with the term “AI Winter,” it refers to a period when AI innovation stalled due, in large part, to a lack of available processing power. That shortfall led to a lack of innovation, which in turn resulted in a lack of funding, effectively freezing work in the field.

AI Innovation & Computation

We like to think that we are immune to hardware limitations. After all, your phone has millions of times more processing power than NASA had at its disposal during the Apollo 11 program. If your web server is under heavy load, just pony up more money and scale up in the cloud. Access to computing power is generally not an issue.

However, at the cutting edge of research and development, we may be reaching the outer limits of what’s feasible, or even possible. Recently, MIT researchers sounded the alarm that “deep learning is approaching computational limits.”

Sound alarmist? Maybe even far-fetched?

Think again.

Consider this: the GPT-3 model has 175 billion parameters. Depending on whom you believe, it cost somewhere between $4 million and $12 million to train. Either way, that’s a lot of money.

However, it was money well spent. GPT-3 represents a milestone in the field of Natural Language Processing. Its immediate predecessor, GPT-2, could predict and generate text with uncanny human-like ability. It had 1.5 billion parameters and would likely pass the Turing Test. Imagine how well GPT-3 would do.

This Has Happened Before and It Will Happen Again

The Turing test, developed in 1950 by computing pioneer Alan Turing, was designed to assess a machine’s ability to demonstrate intelligence equivalent to, or indistinguishable from, human intelligence. In the test, a human judge evaluates natural language conversations between a human and a machine designed to produce human-like responses.

It surprises many people to learn that AI has been around for decades. In fact, we are in the seventh decade of AI research. Since Alan Turing, there have been advances and, more importantly, stalls in AI research. They tend to come in cycles. The first stall came in the 1960s and 1970s, when, among other reasons, DARPA cut funding to research programs not tied directly to “mission-oriented direct research.” The 1980s saw the rise of “expert systems” and early work on neural networks. However, the costs of such systems were prohibitive due to processing power constraints.

Sound familiar?

The longest AI Winter to date lasted from the late 1980s until the early 2010s, although one could argue that the early days of Big Data were part of this cycle, which would push the end of the winter back to the mid-to-late 2000s.

While actual AI research and innovation had stalled, science fiction AI blossomed. The Terminator, Star Trek: The Next Generation’s Commander Data, and, of course, Star Wars’ R2-D2 and C-3PO were all pop culture mainstays.

Prolog Textbook.

My First Brush with AI

It was during this AI Winter that I had my first brush with AI. In the mid-1990s, as a computer science student, I had the chance to take a class on AI with a noted researcher in the field.

He touted the wonders of a programming language called Prolog. To hear him tell it, this was the programming language and paradigm of the future. It was inevitable.

After working through various class projects, I kept waiting for the “big reveal” of Prolog’s innate intelligence. It never came. In fact, the final project ended up being a case study in recursion, with a little bit of logical inference thrown in.

SkyNet it was not.

I finished the class deflated and concluded that AI was merely science fiction. To be completely honest, this experience made me skeptical whenever I heard of advancements in AI. After seeing an early demo of computer vision at the DC Tech Fair in 2015, I was not impressed. Instead, I suspected that there must be some underwhelming explanation behind it all.

I could not have been more wrong.

A Quantum Leap Forward

Since 2015, I have done a much better job of keeping an open mind.

In that light, when whispers of “slowing innovation” in AI research started, I didn’t dismiss them right away. As luck would have it, later that month I was at a conference organized by Microsoft Research and attended a talk that included a look at why the future is quantum.

The presentation opened my eyes to a new model of computing, one that could radically change the world we live in by solving problems that we simply cannot solve with current tools.

Needless to say, I was excited. Immediately, I recorded a podcast episode about what I had just seen. In it, you can hear the breathless excitement of someone who has just had an “aha” moment.

This excitement led me to explore quantum computing further and, fortunately, there are already a number of SDKs on the market to try out. Unfortunately, after firing up my first Q# project, I was not exactly clear on what to do next.

A New Way of Thinking

Quantum computing adds new logic gates and new types of algorithms, which, as of now, require some familiarity with quantum physics. Imagine if you needed to know electrical engineering to write code. At one time, that was a prerequisite. We are just so far removed from the bits nowadays that we forget about all the underlying infrastructure.
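
To give a flavor of those new gates, here is a minimal sketch in Python using Qiskit (a different SDK from the Q# mentioned earlier; I am assuming the qiskit package is installed). It builds a two-qubit Bell pair with a Hadamard and a CNOT, roughly the quantum equivalent of “hello, world.”

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # Hadamard: puts qubit 0 into an equal superposition
qc.cx(0, 1)   # CNOT: entangles qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # {'00': 0.5, '11': 0.5}
```

The two qubits always agree when measured, yet neither outcome is decided in advance; that correlation has no classical counterpart, and it is part of why writing quantum algorithms still demands some physics.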

Yes, Quantum Computing is coming, and it will require new skills.

It will likely elevate physicists to rock star status just as data science did for statisticians. It may just avoid another AI Winter and, more importantly, change the world as we know it.

What do you think?

Let me know in the comments below.

ExplainingComputers just posted his annual Quantum Computing update.

Quantum computing review, including Google’s quantum supremacy claims, quantum cloud developments (QCaaS), trapped ion quantum computing, and a brief look at Python quantum coding!

More information on quantum computing can be found on his web page at https://www.explainingcomputers.com/quantum.html

Chapters:

  • 00:00 Introduction
  • 00:41 Quantum basics
  • 01:56 Quantum supremacy
  • 05:19 Quantum cloud computing
  • 08:10 Trapped ions
  • 09:53 Quantum coding
  • 11:09 Getting there!