noospheer

The Quantum Graph


Why Turing Machines are Quantum

Quantum computing (QC) is a flexible model of computation and there are many physical means of implementing it. One obvious means of implementation is by classical simulation. In other words, performing QC by running ‘quantum software’ on conventional hardware.
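What "quantum software on conventional hardware" means can be made concrete with a toy statevector simulator (a sketch of ours, not noospheer's actual code): store all 2^n amplitudes and update them gate by gate. Here it prepares a two-qubit Bell state:

```python
import math

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of a statevector (a list of 2**n amplitudes)."""
    out = [0j] * len(state)
    for i, amp in enumerate(state):
        b = (i >> q) & 1                  # current value of qubit q in basis state i
        base = i & ~(1 << q)
        for nb in (0, 1):                 # new value of qubit q
            out[base | (nb << q)] += gate[nb][b] * amp
    return out

def apply_cnot(state, control, target):
    """Flip `target` in every basis state where `control` is 1."""
    out = [0j] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << target) if (i >> control) & 1 else i
        out[j] += amp
    return out

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]                     # Hadamard

state = [1, 0, 0, 0]                      # two qubits, |00>
state = apply_1q(state, H, 0)             # superpose qubit 0
state = apply_cnot(state, 0, 1)           # entangle: (|00> + |11>)/sqrt(2)
```

The catch, and the crux of this whole post, is that the state list doubles with every added qubit, so this brute-force style is exponential.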

Many believe that fundamentally new hardware must be designed in order to exploit the power of QC, such as this 7-qubit trapped-ion QC. And while new scalable hardware is the end game for the field, it is not necessarily true that new hardware is required for scalability. The question of whether Turing machines (classical computers) can efficiently emulate universal QC remains entirely open. This problem is formally known as P vs BQP.

Noospheer argues that classical computers and quantum computers are polynomially equivalent (P = BQP), i.e. that the device you’re reading this blog with is a scalable quantum device. We base this argument on a few well-accepted findings:

  1. Certain quantum gates (“Cliffords”) acting on certain quantum states (“stabilizers”) are classically efficient to simulate. This is shown by the well-known/loved Gottesman-Knill theorem and has been tested in code with GraphSim and other software. Recently, the theorem was extended to cover all “positive Wigner states”, which include both the stabilizer states and “bound” states.
  2. Clifford circuits plus any one non-Clifford gate yield universal QC. Whereas the Gottesman-Knill theorem covers only a sub-universal gate set, adding a non-Clifford gate type lifts it to BQP computing (BQP = universal QC, i.e. the ability to run any quantum algorithm efficiently). Everyone wants BQP computers.
  3. As Bravyi and Kitaev showed in 2004, magic states allow for universal QC with only Clifford gates. Magic states are a special resource which can be consumed to apply the action of a non-Clifford gate within a Clifford-only circuit. However, since magic states are “pure, non-Pauli eigenstates”, they are slow to simulate classically.
  4. It is known that noise can be applied to make the action of universal QC circuits classically easy to simulate. However, when noise is applied to a system, some kind of error correction is needed or the end result is erroneous.
  5. Quantum error correction (QEC) is typically performed at each gate. If that were done in a classical simulation, the noise would be removed as soon as it was applied, and the ability to simulate efficiently would be lost. QEC must therefore be applied at the end of the computation in order to avoid exponential overhead. “Post-computation QEC” was previously thought hard, yet it has recently been shown to be doable.
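To make point 1 concrete, here is a bare-bones sketch (our own naming, no measurement handling) of the standard stabilizer tableau update rules: instead of 2^n amplitudes, track n Pauli-string generators and conjugate them through each Clifford gate, which is only polynomial work.

```python
def new_tableau(n):
    """Stabilizer generators of |0...0>: a Z on each qubit."""
    return [{'x': [0] * n, 'z': [int(i == j) for j in range(n)], 'r': 0}
            for i in range(n)]

def h(g, q):                       # Hadamard: swaps X and Z on qubit q
    g['r'] ^= g['x'][q] & g['z'][q]
    g['x'][q], g['z'][q] = g['z'][q], g['x'][q]

def cnot(g, a, b):                 # CNOT with control a, target b
    g['r'] ^= g['x'][a] & g['z'][b] & (g['x'][b] ^ g['z'][a] ^ 1)
    g['x'][b] ^= g['x'][a]
    g['z'][a] ^= g['z'][b]

def pauli(g):
    """Render a generator as a signed Pauli string."""
    return '+-'[g['r']] + ''.join('IXZY'[x + 2 * z] for x, z in zip(g['x'], g['z']))

gens = new_tableau(2)
for g in gens:                     # same Bell-state circuit: H then CNOT
    h(g, 0)
    cnot(g, 0, 1)
print([pauli(g) for g in gens])    # -> ['+XX', '+ZZ'], the Bell-state stabilizers
```

Each gate costs a handful of bit operations per generator rather than a pass over an exponentially large amplitude list; GraphSim and similar packages are industrial-strength versions of this idea.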

Therefore: implement a Clifford circuit (easy). Use a classically prepared magic state to inject a non-Clifford gate (such as a T gate) into the circuit and achieve universality. Apply sufficient noise to turn the T-gate output into an easy-to-simulate ‘barely negative’ bound state (a state with a very small amount of Wigner negativity), avoiding slowdown. Repeat this process until the computation has evolved to completion. Finally, apply post-computation QEC to recover the error-free answer to your quantum question.
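The magic-state injection step of this recipe can be checked with a toy two-qubit calculation (a sketch of ours; the noise and QEC steps are not modelled). Consuming |A⟩ = (|0⟩ + e^{iπ/4}|1⟩)/√2 with only a CNOT, a measurement, and a conditional S gate applies a T gate to the data qubit:

```python
import cmath, math

w = cmath.exp(1j * math.pi / 4)              # e^{i*pi/4}
a, b = 0.6, 0.8                              # data qubit |psi> = a|0> + b|1>

# Joint state of (data, magic), with magic state |A> = (|0> + w|1>)/sqrt(2)
state = {(d, m): amp_d * amp_m
         for d, amp_d in ((0, a), (1, b))
         for m, amp_m in ((0, 1 / math.sqrt(2)), (1, w / math.sqrt(2)))}

# Clifford step: CNOT with data as control, magic as target
state = {(d, m ^ d): amp for (d, m), amp in state.items()}

def data_after(outcome):
    """Unnormalised data state after measuring the magic qubit,
    with the conditional S correction applied on outcome 1."""
    post = {d: state[(d, outcome)] for d in (0, 1)}
    if outcome == 1:
        post[1] *= 1j                        # S gate: |1> -> i|1>
    return post

# Either measurement outcome leaves the data qubit proportional to T|psi>
target = {0: a, 1: w * b}                    # T|psi>
for outcome in (0, 1):
    post = data_after(outcome)
    ratio = post[0] / target[0]
    assert abs(post[1] / target[1] - ratio) < 1e-9
```

Every operation after the state preparation is Clifford; the non-Clifford "magic" lives entirely in the consumed resource state, which is exactly what point 3 above says.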

P = BQP

Classically efficient universal quantum circuit

In order to be proven out, this result must be implemented as classical software. If successful, it will have applications ranging from cryptography to finance, energy, medical care and beyond.


On the power of Turing machines: Random notes from the quantum information community

Q8

Over the last year and a half, I (Jordan Ash) have had correspondence with many quantum information students, postdocs, faculty, government personnel and business people at institutions around the world. Noospheer’s short-term goal is to code a piece of software which allows for efficient, universal quantum computation using classical hardware. Here are snippets of those conversations.

Basically, the majority of quantum information people are in agreement with Mr. Chips:

[…]

RI

Thanks for your interest in our paper.

SS

I would like to know something more about your project.

MK

Would it be possible to state a formal problem?

JD

I’m afraid I’m not an expert on complexity classes and hence I am unable to evaluate your work.

AY

Simulation of Quantum systems on conventional computers is a classical problem.

MF

Suppose such a simulation was achieved.

YW

I follow the logic of your claim

CS

This sounds like a challenging project for the quantum computing community.

AT

I would be interested to hear more about it.

SP

What is the objective of your approach? Is it to show that any quantum evolution (so in particular any quantum computation) can be efficiently simulated classically?

RVM

Got it, will take a look.

JB

From what you’ve told me, graduate school would be good times for you. If you’re serious about your project, I think a phd working towards it would straiten things out.

NJ

I wish you the very best of luck with this, it sounds exciting.

TW

The best known general classical simulations of quantum computations take exponential time (if you want the correct answer with high probability or with certainty). To be practical, you would need a polynomial-time simulation, and there is an enormous gap between polynomial time and exponential time. Many very smart researchers have thought about the problem, and nobody has uncovered any evidence to suggest that arbitrary quantum computations can be simulated classically with less than exponential time overhead. Thus the default is to assume that it’s not doable. The basic issue is that a quantum computation can have exponentially many computation paths which are all maintained implicitly on a quantum computer (if one is ever actually built), but a classical computer would essentially need to keep track of all these paths explicitly.

DD

My take is that [it from bit] all depends on the misconception that information is a logically prior concept to the laws of physics. The reverse is the case.

AB

I think you are mistaken, and the basic unit of information in our world is the qubit and not the bit – but of course, this is just my opinion.

BR

Good luck, though.

EH

I wish you good luck with your project.

ND

We would love to hear more about your emulations as they develop.

AC

I don’t think this is likely to be a fruitful direction — to run quantum algorithms, you need to have a quantum computer.

GV

I stopped working on quantum computing many years ago.

DJ

I’m not the person to talk to regarding the logic of the argument: my initial reaction is skeptical, since various smart people have been pursuing these ideas for two decades or more; someone like Daniel Gottesman or Richard Cleve at Waterloo might well be able to either point out a fallacy or confirm the importance of your work much more reliably than me.

DG

There are a number of different potential methods of classical simulation of a quantum system, and the exponential factors enter into different places in different types of simulations.

MVDN

The commuting circuit setting is an example where the choice of measurement drastically affects the hardness of classical simulations.

AF

The global phase depends on the measurement results.

DARPA DSO

Your white paper was reviewed against the published criteria – we do NOT recommend that it be expanded into a full proposal.

NASA JPL MSL

Sounds like some interesting work.

DL

You’re right of course that there is no room for belief when there is a mathematical proof. […] If you can prove P=BQP it would be an amazing accomplishment, almost as big a deal as P=NP.

SA

Go ahead, implement it, use it to factor 1000-digit numbers and break the RSA cryptosystem, and become world-famous!

NW

Also, somewhat surprisingly, factoring represents one of the few known cases of a problem that is known to be in the complexity class NP but is not NP complete.  Therefore, even if an efficient classical algorithm is found for factoring then there would be no consequences for complexity theory because it would not show that P is the same as NP.

SV

I guess you could try to advertise on various QI sites, e.g. Quantiki.

JA

Suddenly, A Wild Chess Game Appears.

♜♞♝♛♚♝♞♜

♟♟♟♟♟♟♟♟

♙♙♙♙♙♙♙♙

♖♘♗♕♔♗♘♖

LY

Doesn’t white win after the first step?

HA

To be honest I think that the argument is very trivial, i.e. a direct extension of the GK theorem.

NCJ

The point of magic states is that they enable one to teleport a non-Clifford gate using only Clifford gates and the magic state.

TY

In some sense, if a magic state is stored classically then it is automatically reusable; just store it in memory. But the decoding is still a problem.

AF

Any state can be converted into any other state by some series of gates.

CF

If you have a universal gate set, then I think any state will enable UQC.

VV

By a perfect computation space you mean no noise?

VM

Thanks for the clarification!

SVA

I myself am interested in building classical computer software for simulating quantum systems from both academic and business perspectives (in addition to my professorial activities, I have two companies, one of them devoted to developing scientific software.)

GR

Good luck with your effort. People trying new things is what drives progress.

AB

I was skimming the arxiv and thought this might be relevant. http://arxiv.org/abs/1212.0506

FF

Anyway, I hope you can get funding and start your business soon and I am always looking forward to work with you.

MG

You will have to wait for the paper, it’s complicated to explain…

[/…]

The Real Risk Takers… Bureaucrats?

Doing business with the Illuminati; logo of the now-defunct Information Awareness Office (IAO)


According to wikipedia, modern venture capital goes back to the post-World War 2 era when weary American soldiers needed support to help their business ideas take flight. This was a key factor in the 1950s US boom and economic dominance which lasted until recently.

Of course, venture capital goes way back in time. Columbus famously lobbied the Spanish crown for years before they gave in and funded his vision to establish a trade route with India by sailing west. His gamble never reached India, but it happened to introduce Europeans, Africans and eventually Asians to the Americas on a large scale. For better or worse, venture capital is generally a full half of the equation in any successful discovery or invention.

Every venture has associated risk. The more developed an endeavor is, the less risky. I.e., a company which is making money is better proven than a company which hasn’t yet seen a penny of revenue. Of course, higher risk may imply higher reward. If you invested $10,000 in Google when it was founded in late 1998, you would see a greater return on investment than if you invested the same amount in Google yesterday.

Venture capital geared for unproven businesses is called early stage or seed stage capital. Investors at this stage mitigate their risk by looking for things such as impressive teams and working prototypes. What if a team doesn’t have a working prototype just yet? Institutional venture capital is then a highly unlikely source of funding. Angel (individual) investors are one option, the other is government. If your idea is truly awesome but unproven, there are a number of big-spending governments which will give you money to realize your vision.

Organizations which invest at this stage include Defence Research & Development Canada (DRDC), the United States’ Intelligence Advanced Research Projects Activity (IARPA), and Britain’s Defence Science and Technology Laboratory (Dstl). All these agencies are concerned with having a leg up on their “adversaries”, and all are interested in spending large sums on high-risk, high-reward research projects. According to IARPA,

“Failure is completely acceptable as long as…

  • It is not due to failure to maintain technical and programmatic integrity.
  • Results are fully documented.”

The British Ministry of Defence accepts funding proposals for systems which are at a Technology Readiness Level (TRL) of 3-5. This means the tech is at worst still being proven feasible, and at best is just mature enough to be demonstrated. Despite some initial reservations one might have about working directly with the military industrial complex, the solicitation/RFP process is open. Further, 21st century Western governments don’t have the gall/cojones/chutzpah to ask for exclusive rights to a piece of tech. In other words, they will fund open source projects if it means they can avoid being surprised by the unexpected arrival of novel, disruptive technology.

If you are in the midst of building something “not just evolutionary, but revolutionary”, and are being turned down by risk-averse investors, then look for government defense contract money. They’re blowing so much cash on BS these days, they might as well spend some of it on a good cause like your idea.

An Open Note To “Disruptive” Venture Capitalists

Over the last year I’ve had correspondence with ~20 investors from around our little planet; in person, over the phone/skype and via email. Each one is unique and beautiful like a snowflake, and like a snowflake each one melts when exposed to heat. Some of the investors I’ve spoken with are ‘mid’ and ‘late stage’, meaning they put money into companies that have already generated some revenue. Others are ‘seed’ and ‘early stage’ investors; these are the ones I have a bone to pick with as they are supposedly the most open to risk. — (In fairness, criticism and rejection are among the most valuable things an entrepreneur can hope for)

Venture capitalists (VCs) will tell you they require the following for them to consider investment: 1) a large and preferably untapped market to address, 2) a diverse team of dedicated, tenacious and experienced people, 3) some kind of traction such as strong relationships with potential customers, and 4) a product/service that is cool and does something original.

Yeah? Then put your money where your mouth is, people !!!

 

June 2013 editor’s note: VCs are wise.

Dear Sir: Plea$e give us ¥our Mon€y

Yesterday I tracked down mining tycoon Rob McEwen. He famously turned Goldcorp, a relatively small Canadian gold mining company, into the world’s second largest — from $50 million to $10 billion market cap. Rob spoke at a Scotiabank mining conference at the King Edward hotel in Toronto. After unsuccessfully trying to haggle my way into the non-public conference ballroom, I sat in a stately lobby with fat Greek columns next to a giant Christmas tree and waited for him to come out. Recognizing his face from a Bloomberg interview, I followed him upstairs to the mezzanine. I approached Mr. McEwen at the snack table and got my introduction in before the nice gatekeeper lady who had denied me earlier came to shoo me away. Luckily he’s a cool guy; we went back down to the lobby and sat near the pine tree to discuss a mixture of medicine and technology.

McEwen is familiar with what can come of good data and good software. In 2006 Canadian Business magazine named him the ‘Most Innovative CEO’. He’s also an investor, and as Noo Corp is seeking investment, I thought it wise to pitch him.

Athabasca tar sands, Alberta

Harvesting natural resources is probably an older profession than prostitution. It’s the backbone of any given civilization – famous or forgotten. When supplies collapse, so do empires. In recent times, as the Earth’s population has exploded, demand for things like copper or cadmium has increased dramatically. As a result, natural resource extraction has gained a well-deserved reputation as being downright dirty [July ’13 edit: it is truly ugly]. Yet with companies like Planetary Resources looking to the heavens as the next great frontier for the elements we use, it seems fair to say that our appetite for raw stuff is insatiable.

As a side note, before the San Francisco / Bay Area became Silicon Valley, it was the site of a great mining rush, as pointed out in this fascinating talk (thanks Pete).

Back to the story, McEwen’s insight was that the mining industry is mostly aloof regarding its use of data. This was a light bulb moment: noospheer has finally found its niche. Unifying the entirety of a given mining corporation’s data — both geospatial and logistical — implies an increase in efficiency. Overlaying this with open data on a given parcel of land from the wider network implies an increase in awareness. Efficiency and awareness in this context means less environmental impact. By integrating multiple data fields gathered from increasingly non-invasive exploration technologies, we can get more out of this planet while scarring her surface less. In the future, we should one day be able to teleport gold right from within the Earth’s crust.

Noospheer’s vertical is the natural resource mining industry: it’s a mess, and its data is too. Clean up the data, clean up the world. We’ve now met and spoken with numerous mining software companies, seeking feedback, raw test data for piloting the system, and agreement to run the beta once ready.

Was Noo Corp successful with the ask? I’ll let you know.   ~Jordan

[July ’13: not yet]

System Map

Last week was the Free Software and Open Source Symposium 2012, hosted and organized by Seneca @ York + CDOT. In the spirit of openness, here is a bird’s eye view of noospheer’s components. (Click for full size)

Graph generated with yEd

Breakdown

‘The system’ is designed as an open source solution stack, intended to put to rest 4 important problems in computer science:

  1. Integration ~ merging data from many disparate sources, while retaining the integrity of each original source.
  2. Distribution ~ efficiently spreading this federated set of information across a potentially infinite number of web-enabled devices.
  3. Privacy ~ using emulated quantum cryptographic schemes to provide data security better than the current state of the art.
  4. Visualization ~ making intuitive sense, in-browser, out of complex data; so all users can ask advanced questions and get insightful answers [note: we further intend to make the network fully accessible to those with special needs].

Graphical User Interface (GUI)

Data Sources

Application Programming Interface (API)

Security

Storage

Optimization

Why should your firm grant Noo Corp venture capital?

This August, David Sacks made a post on Facebook which caused some stir in the venture capital community.

I think silicon valley as we know it may be coming to an end. In order to create a successful new company, you have to find an idea that (1) has escaped the attention of major Internet companies, which are better run than ever before; (2) is capable of being launched and proven out for ~$5M, the typical seed plus series A investment; and (3) is protectable from the onslaught of those big companies once they figure out what you’re onto. How many ideas like that are left?

This blog post addresses Sacks’ three points:

  1. The vast majority of tech companies are focused on creating cool new products – ‘killer apps’. No issue there, but two problems have resulted: (a) a deluge of apps/services on the web that don’t communicate with one another; and (b) a massive disarray of data generated by these technologies. Noospheer recognizes that in order for the web to make its next leap of usability, data from disparate sources must be fully integrated and apps should tie into one another far more easily. We address the deep challenges associated with these problems from an entirely new angle: classical-quantum computing (performing quantum operations by efficiently utilizing commodity hardware). Our project argues that this is an approach which has ‘escaped the attention of major Internet companies’. — As an additional point, the revenue generation potential associated with a data integration platform is enormous. As yet, there is no eBay of data. See the model.
  2. Our ask is $1.4m CAD. Including the seed we raised previously, we can prove out the design for less than a third of Sacks’ magic number.
  3. Being built on top of battle-tested open source parts, the software is fully protected by the open source paradigm. For an argument in favor of open source from a business perspective, see this.

The search for venture capital has taken this project around the world — North America, Europe and Asia. As yet, we have met resistance and continually refine our tactics. We seek investor(s) willing to take relatively small risk on a solid team for major reward.

Small Data

Big data is big buzz these days. Essentially, there’s a lot of data out there and it’s a massive, attic-style mess — crap is strewn all over the place!

Companies in this space love to talk about peta-, exa-, zetta- or yottabytes (1 petabyte = 1 million gigs). Yet the entirety of Wikipedia English is ~200gb. Seemingly forgotten in the big data world is the process of normalization, in which a data set is shrunk without losing any of the information’s fidelity.

With some simple record deduplication, combined with a generic spatial schema that works with tabular and graph data, noospheer will make big data a lot smaller — thus permitting far greater informational diversity and complexity than currently achieved.
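As a toy sketch of the deduplication step (the record fields here are invented for illustration, not noospheer's schema): normalize the identifying fields into a key, then keep one row per key.

```python
def norm_key(rec):
    """Normalise the fields that identify a record (hypothetical fields)."""
    return (rec['site'].strip().lower(), rec['lat_long'])

def dedupe(records):
    """Keep only the first copy of each logically identical record."""
    seen, out = set(), []
    for rec in records:
        key = norm_key(rec)
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out

rows = [
    {'site': 'Red Lake Mine ', 'lat_long': (51.07, -93.79), 'assay': 12.1},
    {'site': 'red lake mine',  'lat_long': (51.07, -93.79), 'assay': 12.1},
    {'site': 'Hollinger',      'lat_long': (48.47, -81.32), 'assay': 3.4},
]
deduped = dedupe(rows)    # 3 rows in, 2 distinct records out
```

Real-world record linkage needs fuzzier matching than exact normalized keys, but the shape of the savings is the same: fewer rows, no lost information.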

You can download wiki here.

Ghost in the Martian

Total Recall, 1990

Save the Moon landings, Mars Science Lab aka Curiosity is NASA’s crowning technical achievement. The complexity of its entry, descent and landing (EDL), the rugged capability of its large and flexible design, and the high ingenuity of its instrumentation all come together as a complete package for deep exploration of the Sun’s 4th planet. Mars is no longer such a distant frontier thanks to missions like this one.

With hardware that goes to other worlds come design challenges, such as having to radiation-harden the computer. Since it currently takes many years to plan and build a robotic space probe, the processor and memory specs are well behind the cutting edge back here on Earth. The MSL RCE (rover compute element) consists of two identical computers (one as backup): a 200MHz single-core processor, 256MB of RAM and 2GB of flash. The top data transfer rate between Earth and the rover via the Mars Reconnaissance Orbiter (MRO) is 2Mbit per second. All quite modest, but still damn impressive given that Mariner 4 returned a total of 634kB, including 22 images, back in 1965.

The Mini Cooper-sized vehicle runs on a plutonium-powered generator that puts out 125 watts – for comparison, a mid-range computer power supply from your local hardware store will channel 5 times that!

Given that computational power and raw electricity are limited, the focus becomes how you use them. The MSL software team writes and uploads new programs on a sol-by-sol basis, so the rover is not exactly autonomous – other than its hazard-avoidance capability. It faithfully goes exactly where Earth tells it and carries out precise instructions for imaging, surveying, sampling and so on. Given that the one-way time delay between Earth and Mars ranges from 4 to 22 minutes, controlling Curiosity presumably consists of sending over long, linear instruction sequences.

As a software development outfit, Noo Corp has tasked itself with squeezing hidden potential out of typical computing hardware by emulating qubits. We recognize that the great challenges of modern computer science are relatively simple problems for quantum hardware. Therefore, writing classical code that takes efficient advantage of quantum math is the answer to many tough and important questions. Such questions include, ‘can Curiosity independently carry out advanced science with only general instructions from Earth?’ In other words, can software alone turbo boost the rover’s intelligence, and thereby transform her into a full blown scientist – not just a lowly researcher? 😉

We argue, ‘yes’.

Mars’ first closeup, 15 July 1965

Mission Updates

Classical Cat, Quantum Mouse

(12:15:11 PM) bradass87: hypothetical question: if you had free reign [sic] over classified networks for long periods of time … say, 8–9 months … and you saw incredible things, awful things … things that belonged in the public domain, and not on some server stored in a dark room in Washington DC … what would you do? […]

With the UK threatening to forcibly extricate Wikileaks’ Julian Assange from the Ecuadorian embassy in London, it’s plain to see governments get… concerned, if state secrets become public domain.

When it comes to sensitive digital information, the world has relied on encryption for keeping secrets. The dominant encryption scheme is RSA, first made public in 1977. It relies on a hard problem: there is no known classical way to efficiently factor large numbers. In other words, despite knowing exactly how to do it, cracking RSA can take thousands of years with current techniques. However, in 1994, MIT mathematician Peter Shor showed that a quantum computer could factor numbers in polynomial time (milliseconds instead of millennia).
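For the curious, textbook RSA fits in a few lines. The entire scheme leans on the secrecy of the two primes; recovering them from their public product is the part that is classically slow once the primes are hundreds of digits long. A toy run with tiny primes:

```python
# Textbook RSA with toy primes (real keys use primes hundreds of digits long).
p, q = 61, 53
n = p * q                          # public modulus; factoring n breaks the scheme
phi = (p - 1) * (q - 1)
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # private exponent: modular inverse (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)            # encrypt: m^e mod n
plain = pow(cipher, d, n)          # decrypt: c^d mod n, recovers msg
```

Anyone who can split n = 3233 back into 61 × 53 can recompute d and read the traffic, which is exactly what an efficient factoring algorithm, quantum or classical, would allow at full key sizes.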

Lucky for the entities that use RSA to shield their data from prying eyes, there has yet to be a practical, scalable quantum computer and thus their data is safe for the time being.

Unfortunately or fortunately, depending on your side of the fence, it is only a matter of innovation (creativity + effort + time) before quantum computing is scalable and the world’s encrypted info is completely naked. Of course, the twist is that far stronger means of keeping data private have emerged in the quantum research space.

Whether you’re an individual who just wants their online activity to remain anonymous, or an agency with hard intelligence on a pending Martian invasion — don’t be too paranoid, but don’t delude yourself into believing security is forever.