noospheer

The Quantum Graph


Why Turing Machines are Quantum

Quantum computing (QC) is a flexible model of computation, and there are many physical means of implementing it. One obvious means is classical simulation: in other words, performing QC by running ‘quantum software’ on conventional hardware.

Many believe that fundamentally new hardware, such as this 7-qubit trapped-ion QC, must be designed in order to exploit the power of QC. And while new scalable hardware is the end game for the field, it is not necessarily true that new hardware must be designed to achieve scalability. The question of whether Turing machines (classical computers) can efficiently emulate universal QC is entirely open. This problem is formally known as P vs BQP.

Noospheer argues that classical computers and quantum computers are polynomially equivalent (P = BQP), i.e. that the device you’re reading this blog with is a scalable quantum device. We can make this argument based on a few well-accepted findings:

  1. Certain quantum gates (“Cliffords”) acting on certain quantum states (“stabilizers”) are classically efficient to simulate. This is shown by the well-known and well-loved Gottesman-Knill theorem and has been tested in code with GraphSim and other software; a minimal sketch of the idea follows this list. Recently, this theorem was extended to cover all “positive Wigner states”, which include both the stabilizer states and “bound” states.
  2. Clifford circuits plus a non-Clifford gate result in universal QC. Whereas the Gottesman-Knill setting is sub-universal, adding a single non-Clifford gate type to a Clifford circuit yields BQP computing (BQP = universal QC — the ability to run any polynomial-time quantum algorithm). Everyone wants BQP computers.
  3. As Bravyi and Kitaev showed in 2004, magic states allow for universal QC with only Clifford gates. Magic states are a special resource which can be consumed to apply the action of a non-Clifford gate in a Clifford-only circuit. However, since magic states are “pure, non-Pauli eigenstates”, they are slow to simulate classically.
  4. It is known that noise can be applied to make the action of universal QC circuits classically easy to simulate. However, when applying noise to a system, there must be some kind of error correction or the end result is erroneous.
  5. Quantum error correction (QEC) is typically performed at each gate, so if it were used that way in a classical simulation, the noise would be removed as soon as it is applied and the system would again be inefficient to simulate. Therefore QEC must be applied at the end of the computation in order to avoid exponential overhead from correction. “Post-computation QEC” was previously thought hard, yet it has recently been shown to be doable.
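
To make point 1 concrete, here is a minimal, self-contained sketch (Python, standard library only) of the tableau idea behind the Gottesman-Knill theorem: each stabilizer generator of the state is stored as a signed Pauli string, and Clifford gates act by conjugating those generators, so the cost per gate is linear in the number of qubits rather than exponential. It is a toy illustration, not a substitute for GraphSim, and measurement is omitted.

    # Minimal stabilizer-tableau sketch of the Gottesman-Knill idea (toy code).
    # Each stabilizer generator of an n-qubit state is a row of X bits, Z bits and a sign;
    # Clifford gates update the generators by conjugation in O(n) time per gate,
    # never touching a 2^n-dimensional state vector.

    def new_tableau(n):
        """|0...0> is stabilized by Z_0, ..., Z_{n-1}."""
        return [{"x": [0] * n,
                 "z": [1 if j == i else 0 for j in range(n)],
                 "sign": 0}
                for i in range(n)]

    def hadamard(tab, a):
        for g in tab:
            g["sign"] ^= g["x"][a] & g["z"][a]            # H: Y -> -Y
            g["x"][a], g["z"][a] = g["z"][a], g["x"][a]   # H: X <-> Z

    def phase_s(tab, a):
        for g in tab:
            g["sign"] ^= g["x"][a] & g["z"][a]            # S: Y -> -X
            g["z"][a] ^= g["x"][a]                        # S: X -> Y

    def cnot(tab, a, b):
        for g in tab:
            g["sign"] ^= g["x"][a] & g["z"][b] & (g["x"][b] ^ g["z"][a] ^ 1)
            g["x"][b] ^= g["x"][a]
            g["z"][a] ^= g["z"][b]

    def show(tab):
        for g in tab:
            pauli = "".join("IXZY"[x + 2 * z] for x, z in zip(g["x"], g["z"]))
            print(("-" if g["sign"] else "+") + pauli)

    # Prepare a Bell pair: H on qubit 0, then CNOT 0 -> 1.
    tab = new_tableau(2)
    hadamard(tab, 0)
    cnot(tab, 0, 1)
    show(tab)   # prints +XX and +ZZ, the stabilizers of (|00> + |11>)/sqrt(2)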

Therefore: Implement a Clifford circuit (easy). Use a classically prepared magic state to inject a non-Clifford gate (such as a T gate) into the circuit to achieve universality. Apply just enough noise to turn the T-gate output into an easy-to-simulate ‘barely negative’ bound state (a state with only a very small amount of Wigner negativity), avoiding the slowdown. Repeat this process until the computation has evolved to completion. Finally, apply post-computation QEC to recover the error-free answer to your quantum question.
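
The injection step, at least, is standard and easy to check. The following NumPy calculation (a sketch, not part of any claimed simulator) verifies the usual T-gate teleportation gadget on a two-qubit state vector: only Clifford operations (a CNOT, a Z measurement and a conditional S correction) act on the circuit, and the non-Clifford content enters solely through the magic state |A> = T|+>. It says nothing about the noise or post-computation QEC steps, which are the contested part of the argument.

    # T-gate injection by gate teleportation, checked on a 2-qubit state vector.
    import numpy as np

    w = np.exp(1j * np.pi / 4)
    T = np.diag([1, w])                           # the non-Clifford gate we want
    S = np.diag([1, 1j])                          # Clifford correction
    CNOT = np.array([[1, 0, 0, 0],                # control = data (qubit 0),
                     [0, 1, 0, 0],                # target  = ancilla (qubit 1)
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    psi = np.array([0.6, 0.8j])                   # arbitrary data qubit a|0> + b|1>
    magic = np.array([1, w]) / np.sqrt(2)         # |A> = (|0> + e^{i pi/4}|1>)/sqrt(2)

    state = CNOT @ np.kron(psi, magic)            # entangle data with the magic state
    for outcome in (0, 1):                        # both Z-measurement results on ancilla
        branch = state.reshape(2, 2)[:, outcome]  # unnormalized data qubit afterwards
        data = branch / np.linalg.norm(branch)
        if outcome == 1:
            data = S @ data                       # apply the Clifford correction
        overlap = abs(np.vdot(T @ psi, data))     # compare with the ideal T|psi>
        print(outcome, round(overlap, 12))        # prints 1.0 for both outcomes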

P = BQP

Classically efficient universal quantum circuit

In order to be proven, this result must be implemented as classical software. If successful, it will have applications ranging from cryptography to finance, energy, medical care and beyond.


On the power of Turing machines: Random notes from the quantum information community


Over the last year and a half, I (Jordan Ash) have had correspondence with many quantum information students, postdocs, faculty, government personnel and business people at institutions around the world. Noospheer’s short-term goal is to code a piece of software which allows for efficient, universal quantum computation using classical hardware. Here are snippets of those conversations.

Basically, the majority of quantum information people are in agreement with Mr. Chips:

[…]

RI

Thanks for your interest in our paper.

SS

I would like to know something more about your project.

MK

Would it be possible to state a formal problem?

JD

I’m afraid I’m not an expert on complexity classes and hence I am unable to evaluate your work.

AY

Simulation of Quantum systems on conventional computers is a classical problem.

MF

Suppose such a simulation was achieved.

YW

I follow the logic of your claim

CS

This sounds like a challenging project for the quantum computing community.

AT

I would be interested to hear more about it.

SP

What is the objective of your approach? Is it to show that any quantum evolution (so in particular any quantum computation) can be efficiently simulated classically?

RVM

Got it, will take a look.

JB

From what you’ve told me, graduate school would be good times for you. If you’re serious about your project, I think a PhD working towards it would straighten things out.

NJ

I wish you the very best of luck with this, it sounds exciting.

TW

The best known general classical simulations of quantum computations take exponential time (if you want the correct answer with high probability or with certainty). To be practical, you would need a polynomial-time simulation, and there is an enormous gap between polynomial time and exponential time. Many very smart researchers have thought about the problem, and nobody has uncovered any evidence to suggest that arbitrary quantum computations can be simulated classically with less than exponential time overhead. Thus the default is to assume that it’s not doable. The basic issue is that a quantum computation can have exponentially many computation paths which are all maintained implicitly on a quantum computer (if one is ever actually built), but a classical computer would essentially need to keep track of all these paths explicitly.

DD

My take is that [it from bit] all depends on the misconception that information is a logically prior concept to the laws of physics. The reverse is the case.

AB

I think you are mistaken, and the basic unit of information in our world is the qubit and not the bit – but of course, this is just my opinion.

BR

Good luck, though.

EH

I wish you good luck with your project.

ND

We would love to hear more about your emulations as they develop.

AC

I don’t think this is likely to be a fruitful direction — to run quantum algorithms, you need to have a quantum computer.

GV

I stopped working on quantum computing many years ago.

DJ

I’m not the person to talk to regarding the logic of the argument: my initial reaction is skeptical, since various smart people have been pursuing these ideas for two decades or more; someone like Daniel Gottesman or Richard Cleve at Waterloo might well be able to either point out a fallacy or confirm the importance of your work much more reliably than me.

DG

There are a number of different potential methods of classical simulation of a quantum system, and the exponential factors enter into different places in different types of simulations.

MVDN

The commuting circuit setting is an example where the choice of measurement drastically affects the hardness of classical simulations.

AF

The global phase depends on the measurement results.

DARPA DSO

Your white paper was reviewed against the published criteria – we do NOT recommend that it be expanded into a full proposal.

NASA JPL MSL

Sounds like some interesting work.

DL

You’re right of course that there is no room for belief when there is a mathematical proof. […] If you can prove P=BQP it would be an amazing accomplishment, almost as big a deal as P=NP.

SA

Go ahead, implement it, use it to factor 1000-digit numbers and break the RSA cryptosystem, and become world-famous!

NW

Also, somewhat surprisingly, factoring represents one of the few known cases of a problem that is known to be in the complexity class NP but is not NP complete.  Therefore, even if an efficient classical algorithm is found for factoring then there would be no consequences for complexity theory because it would not show that P is the same as NP.

SV

I guess you could try to advertise on various QI sites, e.g. Quantiki.

JA

Suddenly, A Wild Chess Game Appears.

♜♞♝♛♚♝♞♜

♟♟♟♟♟♟♟♟

♙♙♙♙♙♙♙♙

♖♘♗♕♔♗♘♖

LY

Doesn’t white win after the first step?

HA

To be honest I think that the argument is very trivial, i.e. a direct extension of the GK theorem.

NCJ

The point of magic states is that they enable one to teleport a non-Clifford gate using only Clifford gates and the magic state.

TY

In some sense, if a magic state is stored classically then it is automatically reusable; just store it in memory. But the decoding is still a problem.

AF

Any state can be converted into any other state by some series of gates.

CF

If you have a universal gate set, then I think any state will enable UQC.

VV

By a perfect computation space you mean no noise?

VM

Thanks for the clarification!

SVA

I myself am interested in building classical computer software for simulating quantum systems from both academic and business perspectives (in addition to my professorial activities, I have two companies, one of them devoted to developing scientific software.)

GR

Good luck with your effort. People trying new things is what drives progress.

AB

I was skimming the arxiv and thought this might be relevant. http://arxiv.org/abs/1212.0506

FF

Anyway, I hope you can get funding and start your business soon and I am always looking forward to work with you.

MG

You will have to wait for the paper, it’s complicated to explain…

[/…]

The Real Risk Takers… Bureaucrats?

Doing business with the Illuminati; logo of the now-defunct Information Awareness Office (IAO)


According to Wikipedia, modern venture capital goes back to the post-World War 2 era, when weary American soldiers needed support to help their business ideas take flight. This was a key factor in the 1950s US boom and economic dominance which lasted until recently.

Of course, venture capital goes way back in time. Columbus famously lobbied the Spanish crown for years before they gave in and funded his vision to establish a trade route with India by sailing West, at the risk of falling off the face of the Earth. His voyage provided hard evidence for our planet’s roundness, and also happened to introduce Europeans, Africans and eventually Asians to the Americas on a large scale. For better or worse, venture capital is generally a full half of the equation in any successful discovery or invention.

Every venture has associated risk. The more developed an endeavor is, the less risky. I.e., a company which is making money is better proven than a company which hasn’t yet seen a penny of revenue. Of course, higher risk may imply higher reward. If you invested $10,000 in Google when it was founded in late 1998, you would see a greater return on investment than if you invested the same amount in Google yesterday.

Venture capital geared for unproven businesses is called early stage or seed stage capital. Investors at this stage mitigate their risk by looking for things such as impressive teams and working prototypes. What if a team doesn’t have a working prototype just yet? Institutional venture capital is then a highly unlikely source of funding. Angel (individual) investors are one option; the other is government. If your idea is truly awesome but unproven, there are a number of big-spending governments which will give you money to realize your vision.

Organizations which invest at this stage include Defence Research & Development Canada (DRDC), the United States’ Intelligence Advanced Research Projects Activity (IARPA), and Britain’s Defence Science and Technology Laboratory (Dstl). All these agencies are concerned with having a leg up on their “adversaries”, and all are interested in spending large sums on high-risk, high-reward research projects. According to IARPA,

“Failure is completely acceptable as long as…

  • It is not due to failure to maintain technical and programmatic integrity.
  • Results are fully documented.”

The British Ministry of Defence accepts funding proposals for systems which are at a Technology Readiness Level (TRL) of 3-5. This means the tech is at worst still being proven feasible, and at best is just mature enough to be demonstrated. Despite some initial reservations one might have about working directly with the military industrial complex, the solicitation/RFP process is open. Further, 21st century Western governments don’t have the gall/cojones/chutzpah to ask for exclusive rights to a piece of tech. In other words, they will fund open source projects if it means they can avoid being surprised by the unexpected arrival of novel, disruptive technology.

If you are in the midst of building something “not just evolutionary, but revolutionary”, and are being turned down by risk-averse investors, then look for government defense contract money. They’re blowing so much cash on BS these days, they might as well spend some of it on a good cause like your idea.

An Open Note To “Disruptive” Venture Capitalists

Over the last year I’ve had correspondence with ~20 investors from around our little planet: in person, over the phone/Skype and via email. Each one is unique and beautiful like a snowflake, and like a snowflake each one melts when exposed to heat. Some of the investors I’ve spoken with are ‘mid’ and ‘late stage’, meaning they put money into companies that have already generated some revenue. Others are ‘seed’ and ‘early stage’ investors; these are the ones I have a bone to pick with, as they are supposedly the most open to risk. (In fairness, criticism and rejection are among the most valuable things an entrepreneur can hope for.)

Venture capitalists (VCs) will tell you they require the following before they will consider investing: 1) a large and preferably untapped market to address, 2) a diverse team of dedicated, tenacious and experienced people, 3) some kind of traction such as strong relationships with potential customers, and 4) a product/service that is cool and does something original.

Yeah? Then put your money where your mouth is, people!!!

 

June 2013 editor’s note: VCs are wise.

Dear Sir: Plea$e give us ¥our Mon€y

Yesterday I tracked down mining tycoon Rob McEwen. He famously turned Goldcorp, a relatively small Canadian gold mining company, into the world’s second largest — from a $50 million to a $10 billion market cap. Rob spoke at a Scotiabank mining conference at the King Edward hotel in Toronto. After unsuccessfully trying to haggle my way into the non-public conference ballroom, I sat in a stately lobby with fat Greek columns, next to a giant Christmas tree, and waited for him to come out. Recognizing his face from a Bloomberg interview, I spotted him and followed him upstairs to the mezzanine. I approached Mr. McEwen at the snack table and got my introduction in before the nice gatekeeper lady who had denied me earlier came to shoo me away. Luckily he’s a cool guy; we went back down to the lobby and sat near the pine tree to discuss a mixture of medicine and technology.

McEwen is familiar with what can come of good data and good software. In 2006 Canadian Business magazine named him the ‘Most Innovative CEO’. He’s also an investor, and as Noo Corp is seeking investment, I thought it wise to pitch him.

Athabasca tar sands, Alberta

Harvesting natural resources is probably an older profession than prostitution. It’s the backbone of any given civilization – famous or forgotten. When supplies collapse, so do empires. In recent times, as the Earth’s population has exploded, demand for things like copper or cadmium has increased dramatically. As a result, natural resource extraction has gained a well-deserved reputation as being downright dirty [July ’13 edit: it is truly ugly]. Yet with companies like Planetary Resources looking to the heavens as the next great frontier for the elements we use, it seems fair to say that our appetite for raw stuff is insatiable.

As a side note, before the San Francisco / Bay Area became Silicon Valley, it was the site of a great mining rush, as pointed out in this fascinating talk (thanks Pete).

Back to the story, McEwen’s insight was that the mining industry is mostly aloof regarding its use of data. This was a light bulb moment: noospheer has finally found its niche. Unifying the entirety of a given mining corporation’s data — both geospatial and logistical — implies an increase in efficiency. Overlaying this with open data on a given parcel of land from the wider network implies an increase in awareness. Efficiency and awareness in this context means less environmental impact. By integrating multiple data fields gathered from increasingly non-invasive exploration technologies, we can get more out of this planet while scarring her surface less. In the future, we should one day be able to teleport gold right from within the Earth’s crust.

Noospheer’s vertical is the natural resource mining industry: it’s a mess, and its data is too. Clean up the data, clean up the world. We’ve now met and spoken with numerous mining software companies, seeking feedback, raw test data for piloting the system, and agreement to run the beta once it’s ready.

Was Noo Corp successful with the ask? I’ll let you know.   ~Jordan

[July ’13: not yet]

Executive Bot

If you wanted to get rich, how would you do it? I think your best bet would be to start or join a startup. That’s been a reliable way to get rich for hundreds of years. The word “startup” dates from the 1960s, but what happens in one is very similar to the venture-backed trading voyages of the Middle Ages.

Paul Graham, Hackers & Painters

Web software companies are interesting from an economics perspective because they maximize leverage. If a piece of software becomes popular, the company behind it often becomes enormously successful.

Like any business, web companies have expenses. The two cold goals of any corporation are to turn a profit and simultaneously reduce overhead. In the recent past, a web-based company would have to buy and operate pricey server hardware. New web companies don’t bother maintaining hardware; they pay other companies such as Amazon to do that. A cluster of 4 GPUs (6,144 total cores) running at 100% utilization, with 1 TB of storage, 1 TB of data transfer in and 1 TB of data transfer out, comes to well under $2,000 for the month. A great example of a company that uses ‘the cloud’ to host its services is SoundCloud.

Besides compute/bandwidth costs, a web company must also maintain a payroll. At last check, Google was paying 53,546 individuals. With an average starting salary of $82k, that’s roughly $366m in pay per month. Also, bigger companies tend to get involved in expensive lawsuits.
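
As a sanity check, the payroll estimate above is just arithmetic on the two figures quoted (headcount and average salary); nothing else is assumed, and the same back-of-envelope style applies to the cloud bill once vendor rates are plugged in.

    # Back-of-envelope check of the payroll figure quoted above.
    headcount = 53_546        # individuals on Google's payroll, as quoted
    avg_salary = 82_000       # USD per year, the quoted average starting salary
    monthly_payroll = headcount * avg_salary / 12
    print(f"~${monthly_payroll / 1e6:.0f}M per month")   # ~$366M, matching the text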

So, with hardware being cheap and easy to deploy, how can a web company reduce its employee and lawsuit overhead? Open source, of course! Open source companies generally avoid lawsuits because they are rarely accused of patent infringement. They also don’t need as many employees, for two reasons: 1) the use of open code written by others, and 2) free development from the wider community. By using cloud hosting services and open source, a startup may keep its core team small, outsource hardware, and in theory remain nimble and efficient well into maturity.

With minuscule expenses and some sort of scaling model (catch-22?), the coins flow freely. The business can avoid the inevitable clinging and bloat that comes with success.

The next question is, “will the role of corporate Officer soon be non-human?”

Assumptions (tsk tsk)

Human beings like to make assumptions. Assumptions seem to make the world a less demanding place to understand. Ironically, no kind of human is more susceptible to falling prey to assumptions than the scientist. In the strict definition, a scientist prides him/herself on proof through rigorous observation of well-constructed experiments.

However, if a question is hard, it’s difficult to devise an experiment which can answer it. When this is the case, many scientists fall back on beliefs which they assume to be true. This is unfortunate, as the universe is mysterious and assumptions generally turn out to be false (e.g., “all heavenly bodies orbit the Earth”).

A modern scientific assumption is that factoring is hard. Factoring involves taking a number and determining the prime numbers that multiply together to equal it; in the cryptographic setting, the number is the product of exactly two large primes. For example, 21 factors into 3 times 7. Despite the simplicity of the statement, no efficient way to solve the problem is known, so if the original number is sufficiently large, a classical computer may take billions of years to get the answer. Due to this presumed hardness, factorization is at the heart of public-key cryptography (PKC).
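
For a sense of scale, here is the obvious approach, trial division, in a few lines of Python. It handles 21 instantly, but its running time grows with the square root of N, which is exponential in the number of digits; that is why an RSA-sized modulus is out of reach for this method.

    # Naive factoring by trial division: instant for 21, hopeless for RSA-sized numbers,
    # because the loop runs up to sqrt(N), i.e. exponentially long in the digit count.
    from math import isqrt

    def trial_division(n):
        for d in range(2, isqrt(n) + 1):
            if n % d == 0:
                return d, n // d
        return None              # n is prime (or 1)

    print(trial_division(21))    # (3, 7)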

PKC is responsible for keeping computer systems secure. Virtually all institutions encrypt data with PKC, and virtually all ‘secure’ online transactions and digital signatures rely on it. Symmetric-key cryptography (SKC – encryption with a shared secret key) doesn’t directly rely on PKC, but SKC keys must be shared by both parties, which means a secure channel has to be established to exchange the key in the first place — unless you share it off-network. In practice, then, SKC needs factoring to be hard too.
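
To show exactly where factoring enters, here is a toy RSA key pair with textbook-sized primes (real moduli are thousands of bits); anyone who can factor n back into p and q can recompute the private exponent d and decrypt at will. The numbers below are illustrative only.

    # Toy RSA: the public key is (n, e); the private exponent d is protected
    # only by the difficulty of factoring n back into p and q.
    p, q = 61, 53
    n = p * q                      # 3233 -- public modulus
    phi = (p - 1) * (q - 1)        # 3120 -- secret; requires knowing p and q
    e = 17                         # public exponent
    d = pow(e, -1, phi)            # 2753 -- private exponent (Python 3.8+ modular inverse)

    message = 65
    cipher = pow(message, e, n)            # encrypt with the public key
    assert pow(cipher, d, n) == message    # decrypt with the private key
    print(cipher, pow(cipher, d, n))       # 2790 65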

Shor’s quantum algorithm factors in polynomial time (fast). Thus, given a large enough machine to run it on, Shor’s discovery ruins current security systems. However, it is a quantum algorithm and requires a quantum computer. Recently, a compiled version of Shor’s algorithm was run on a small quantum device and succeeded in factoring the number 21. PKC, of course, uses numbers vastly larger than 21. So, it would be interesting to see whether Shor’s algorithm can be efficiently implemented on a classical computer, such as an iPhone.
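
For intuition about what the quantum part actually does, here is the classical number-theoretic skeleton of Shor’s method applied to 21 (a sketch for illustration). The order-finding loop below is brute force and takes exponential time in general; the entire quantum speedup consists of replacing that one step with the QFT-based period-finding circuit.

    # The classical skeleton of Shor's algorithm, factoring 21 with a brute-force
    # (exponential) order finder standing in for the quantum period-finding step.
    from math import gcd
    from random import randrange

    def find_order(a, N):
        """Smallest r > 0 with a^r = 1 (mod N): the step a quantum computer speeds up."""
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_skeleton(N):
        while True:
            a = randrange(2, N)
            g = gcd(a, N)
            if g > 1:                        # lucky guess already shares a factor
                return g, N // g
            r = find_order(a, N)
            if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                f = gcd(pow(a, r // 2, N) - 1, N)
                if 1 < f < N:
                    return f, N // f

    print(shor_skeleton(21))   # e.g. (3, 7)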

It is proven here and extended here that at least one major component of Shor’s factorization algorithm – the quantum Fourier transform (QFT) – can run efficiently on classical computers. The harder part of Shor’s is the modular exponentiation (ME) step applied in superposition; simulating that step is commonly assumed to be intractable for classical computers. However, it is also now claimed proven that ME can be broken down and reconstructed in terms of the QFT. If so, the totality of Shor’s algorithm can run on an iPhone. If you find this alarming, rest assured such a breakthrough would bring QKD much closer to fruition.

There are deep subtleties in the papers cited above, which we are carefully investigating. Should our assumption that factoring is classically easy prove true, we will publish our findings here.

System Map

Last week was the Free Software and Open Source Symposium 2012, hosted and organized by Seneca @ York + CDOT. In the spirit of openness, here is a bird’s-eye view of noospheer’s components.

Graph generated with yEd

Breakdown

‘The system’ is designed as an open source solution stack, intended to put to rest 4 important problems in computer science:

  1. Integration ~ merging data from many disparate sources, while retaining the integrity of each original source.
  2. Distribution ~ efficiently spreading this federated set of information across a potentially infinite number of web-enabled devices.
  3. Privacy ~ using emulated quantum cryptographic schemes to provide data security better than the current state of the art.
  4. Visualization ~ making intuitive sense, in-browser, out of complex data; so all users can ask advanced questions and get insightful answers [note: we further intend to make the network fully accessible to those with special needs].

System map components (from the graph above): Graphical User Interface (GUI), Data Sources, Application Programming Interface (API), Security, Storage, Optimization.

Collapsing the Polynomial Hierarchy is ‘Dangerous’

Not only is P vs. NP the defining question of our field; it’s one of the deepest questions ever asked for which we’d know how to recognize an answer. (In other words, one of the deepest questions in NP.)

Aaronson, p.1


A rigorous and therefore undeniable mathematical proof demonstrating a collapse of the polynomial hierarchy (PH) would have a large number of big implications for the world beyond math and science.

  1. First, in terms of math and science: a collapse of the PH is closely related to efficient classical emulation of quantum systems (via post-IQP circuitry); in other words, regular computers may be fully capable of being quantum computers. The mainstream understanding — as alluded to in the post-IQP paper — is the rather naive one that quantum computation requires processors which demonstrate quantum effects as we currently define them. The mainstream is missing the point: computer science is, at its core, linguistics. If we know the math (which we do quite well), and assuming the Church-Turing thesis is actually universal, then we just need to teach our machines how to run the calculations (by writing insanely great code). One is then free to imagine a planet (such as this one) suddenly bustling with quantum computers, including your laptops, cell phones and satellites. Ta-da! [For the skeptic: W, X, Y, Z, 1, 2, 3 (pIQP demands a linear sample rate for PH collapse). It is also important to note that a collapse of the PH does not necessarily imply P = NP; however, the two are inextricably related and both are considered ‘improbable’.]
  2. What about our institutions/ourselves — specifically with regard to information privacy? The only factor which maintains a state of separation among entities in a shared space is unshared data. That said, with point 1 in mind, the first thing a security guy/gal might consider is the compromise of a ubiquitous encryption method called RSA. [Fun fact: RSA was published 7 years after P vs NP was formalized.] Long story short, if point 1 becomes a reality, then the invincible armor shielding governmental, financial, corporate, and an increasing amount of personal data from ‘adversaries’ suddenly meets Kryptonite. Please consider this another early public WARNING: any software we write which runs Shor’s factoring algorithm (or any quantum algorithm) in polynomial time (efficiently) may render some encryption methods crackable, and will be published and distributed as a proof under language outlined in the Free Berkeley Software Distribution (BSD) license.
  3. Point 2 outlines only the atomic tip of the iceberg. Implications for medicine, energy, and time travel abound as well.

Dangerous, or benign? Probably both.