
Are Quantum computers just analog computers under a new name?

There is plenty of scientific buzz about quantum computing, but almost no engineering explanation.


I'm just old enough to remember the Analog computer (with patch cords and amplifiers) in the back corner of my university electronics lab, with an Intel 8008 in the other.


At that time 'analog' was coming to an end, and digital computing was coming to the masses with all its new languages and special logic (Pascal and Algol being the new kids on the block, set to replace the venerable FORTRAN).


Now I see that Quantum computing is all the rage, if only someone could get it working and fathom how to programme it. However, the question remains: "What is the 'it' of which we speak?"


I would posit that what we have is just a new way of interconnecting an 'analog' computer, where the 'feedback'/coding is meant to take the initial random noise, amplify and select the appropriate components, and finally stabilise on some particular bias level that indicates our solution. Hopefully with minimal energy or power consumed by the computation (apart from the cost of running the refrigerator at near 0 K).


Where is the engineering explanation and conceptualisation of Quantum computing? And is it just a new-fangled analog computer?


Thoughts...
  • It's more like a strange sort of digital computer. The numbers in a quantum computer are held as a series of qubits, each of which may be 0, 1 or indeterminate. So you take your initial data, feed it through a series of logic gates to implement the algorithm, and then read out the answer. The act of reading the answer forces any remaining indeterminate qubits to go to 0 or 1.


    If your algorithm is any good, then the result you read out is probably the correct one.  If in doubt, run the same algorithm several times, and pick the most common result.
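
    A minimal sketch of that flow in plain Python (no real quantum library; the single rotation gate and the angle 2.2 are purely illustrative choices): hold the amplitudes, apply a gate, let the readout collapse them, and take a majority vote over many runs.

        import collections
        import math
        import random

        # One qubit held as two complex amplitudes: state = a|0> + b|1>
        def ry(state, theta):
            # A single rotation gate; a real algorithm chains many gates over many qubits.
            a, b = state
            c, s = math.cos(theta / 2), math.sin(theta / 2)
            return (c * a - s * b, s * a + c * b)

        def measure(state):
            # Reading out forces the qubit to 0 or 1, with probability |amplitude|^2.
            a, b = state
            return 0 if random.random() < abs(a) ** 2 else 1

        def run_once():
            state = (1 + 0j, 0 + 0j)      # prepare |0>
            state = ry(state, 2.2)        # the 'algorithm': bias the qubit towards 1
            return measure(state)         # readout collapses any leftover superposition

        # "Run the same algorithm several times, and pick the most common result."
        shots = collections.Counter(run_once() for _ in range(1000))
        print(shots)                      # roughly 80% of runs give 1
        print("answer:", shots.most_common(1)[0][0])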
  • Part of the issue is that there is a lot of smoke and mirrors, with no suitable analogies provided for the magic "Qubits" and their connections. Try looking at any of the general articles that aren't 'hard nuclear physics' and it's almost all whitewash that is not suitable for engineers (that's A1/A2 on the CPD scale ;-), hence the engineering question.


    In many ways, folks familiar with analog had similar issues with the change to digital (Boole was long dead before his 'ridiculous' maths came back in vogue ;-).


    In some ways the quantum phenomena are just 'noise' (probability distributions and other analogies) and the Qubit (0,1) end points are just saturated/stabilised amplifiers (with perhaps a pair of cross-coupled amps for memory), i.e. we could simulate them in analog.
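
    To put a little flesh on that: between measurements the two amplitudes of a single qubit obey a pair of coupled linear ODEs (the two-level Schrodinger equation), which is exactly the sort of system the old integrator/summer patch panels were wired up to solve. A rough numerical sketch of what two such 'integrators' would be doing (plain Python, crude Euler stepping, with the drive strength set to 1 in arbitrary units as an assumption for illustration):

        # Two coupled 'integrators', one per amplitude, driving each other:
        #   da/dt = -i*(omega/2)*b
        #   db/dt = -i*(omega/2)*a
        omega = 1.0                # drive strength (arbitrary units)
        a, b = 1 + 0j, 0 + 0j      # start in |0>
        dt, t = 0.001, 0.0

        while t < 3.14159:         # integrate for half a Rabi cycle
            a, b = a - 1j * (omega / 2) * b * dt, b - 1j * (omega / 2) * a * dt
            t += dt

        # The populations slew smoothly from |0> towards |1>, like an amplifier
        # settling, rather than snapping between discrete values.
        print("P(0) =", round(abs(a) ** 2, 3), " P(1) =", round(abs(b) ** 2, 3))

    The discrete 0/1 only appears at readout; up to that point it is continuous dynamics of exactly the kind the analog machines integrated.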


    Another issue in the discussions is 'entanglement', which is discussed as if the key aspect is to try to entangle quantum phenomena, when (as I understand it) the problem is to disentangle the quanta sufficiently that each is separate, and then to [very] carefully make them interact (become entangled in just the intended way).


    So the analog computer 'simulation' will, as it did in the past, after a transient phase, stabilise on the expected system configuration/state. Entanglement then becomes the patch-cord interconnectivity between the quanta / analog process blocks.


    Those who have never used an analog computer probably don't quite get it, but it (Qubit computing) does feel like it has a lot of equivalences. Plus, historically, there were considerations of noise levels and stability requirements to ensure the analog system actually settled on a solution.


    The old analog computers usually worked with single voltage/current signals, while some quantum phenomena may need multiple levels (e.g. the quaternion view of spins & states, as per Maxwell), but the idea is there.



    Are you aware of any good engineer-level explanations, with suitable analogies, that take the computation mode and method to a technical level?
  • Here's how I understand superposition and entanglement.


    Superposition is the state where a particle hasn't "decided" what state it's in yet. Suppose we have a particle that could be "spin up" or "spin down". We create one and put it in a vacuum chamber so it can't meet any other particle. What state is it in? Nobody knows. According to quantum physics, it's in both states at the same time until someone lets it out of the box and measures it. At that point it "collapses" to one state - whatever your measurement says.
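
    In code terms (a toy sketch only, not any real library's API): before the box is opened the description is two amplitudes, not a hidden 0 or 1, and the "decision" only happens when you sample it.

        import random

        # "Spin up"/"spin down" held as two amplitudes - an equal superposition:
        amp_up = amp_down = 0.5 ** 0.5

        def open_the_box():
            # Measurement: the superposition collapses to one definite outcome,
            # with probability given by the squared amplitudes.
            return "up" if random.random() < amp_up ** 2 else "down"

        print([open_the_box() for _ in range(10)])   # a random mix of 'up' and 'down'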


    Entanglement is when you create two particles such that they must be in different states (so if one is spin up, the other must be spin down). Put both in separate boxes, so that each is also in a superposition. Put a stamp on one box and post it to the other side of the world. As soon as you peek at the one in the box you still have, to see its state, you instantly know that the particle in the other box must be in the other state. The "message" that changed the other particle from a superposition to a known state somehow travelled instantly, not at the speed of light.
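
    And the bookkeeping for the two boxes, as described above, looks roughly like this (just a sketch of the described correlation; it deliberately skips the measurement-basis subtleties that separate genuine entanglement from a classical pair of opposite socks):

        import random

        def make_entangled_pair():
            # One shared record standing in for the joint superposition of
            # (up, down) and (down, up); same-state combinations are impossible.
            return {"outcome": None}

        def peek(pair, which):
            # The first peek at either box settles the joint state for both.
            if pair["outcome"] is None:
                pair["outcome"] = random.choice([("up", "down"), ("down", "up")])
            return pair["outcome"][which]

        pair = make_entangled_pair()
        here = peek(pair, 0)        # open the box you kept...
        abroad = peek(pair, 1)      # ...and the posted box is now determined too
        print(here, abroad)         # always opposite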
  • Hi Richard Johnson, as you note you had "no knowledge of analogue", by which I presume you mean 'no knowledge of analog computers', which was, generically, part of the point of my question. It is like looking back at all the explanations, over the years, decades and centuries, of how the human brain works: what you find is that the explanations match the principal technology of the day, so we now see the brain as a digital computer, rather than a late-19th-century clockwork mechanical device (can't find the BBC article on that history at the moment).


    I was guessing that, actually, the vast majority of the working population of engineers and scientists, particularly those under about 60 (who would not yet have been 18, and starting their tertiary education, back in 1975), will have completely missed the ideas and methods of analog computing, and the speed at which it calculated rather complex things (relatively speaking). But then again, analog could not compute 2 x 2 and get 4.00000!


    (For an easy analog computation, think of a mesh of electrical resistors that emulates the flow of product through various channels with linear flow characteristics: simply apply voltages at the edges and instantly measure the flows in any branch of the network - e.g. Rosen's theorem for mesh networks.)
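
    For anyone who never met that trick, here is the same mesh done the slow digital way - straightforward nodal analysis in numpy on a hypothetical four-node network (the component values are made up for illustration; in the analog version the resistors themselves 'compute' all of this the instant the edge voltages are applied):

        import numpy as np

        # A hypothetical 4-node mesh: resistance (ohms) between node pairs.
        resistors = {(0, 1): 1.0, (0, 2): 2.0, (1, 2): 1.0, (1, 3): 2.0, (2, 3): 1.0}
        fixed = {0: 10.0, 3: 0.0}                 # voltages applied at the 'edges'
        free = [n for n in (0, 1, 2, 3) if n not in fixed]

        # Kirchhoff's current law at each free node: sum over branches of (Vn - Vm)/R = 0.
        G = np.zeros((len(free), len(free)))
        rhs = np.zeros(len(free))
        for (m, n), r in resistors.items():
            g = 1.0 / r
            for x, y in ((m, n), (n, m)):
                if x in fixed:
                    continue
                i = free.index(x)
                G[i, i] += g
                if y in fixed:
                    rhs[i] += g * fixed[y]
                else:
                    G[i, free.index(y)] -= g

        volts = dict(fixed)
        volts.update(zip(free, np.linalg.solve(G, rhs)))
        for (m, n), r in resistors.items():
            print(f"branch {m}-{n}: {(volts[m] - volts[n]) / r:+.3f} A")

    The digital version has to assemble and solve a matrix; the physical resistor network settles to the same node voltages essentially instantly, which was precisely the appeal.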


    At the moment the whole 'Quantum' thing is stuck in an 'it's too complicated for the likes of you' way of describing it, and the osmosis is preventing the diffusion of the knowledge.



    At the moment I'm not a believer in the absolute 'instant'-ness of quantum computing (Simon Barker's point). Certain aspects may have a mathematically instant effect, in the way that 1+1=2 is instantaneous and reversible, but the thermodynamics is not in favour of quantum computing being that instant. It appears as if the physicists are hiding that under it being a 'measurement' issue, which is just a conceptual split (just like heliocentricity was useful to Galileo's maths ;-).


    Most of the 'noise' in old analog computers was, at the limit, quantum in nature; it just looks like they have cut down the size (inertia) of the items being 'processed', so they can respond (interact) more quickly. Which then begs the question: why so cold? Cooling usually makes everything respond more slowly. Maybe they are meeting in the middle (ambient quantum phenomena and interactions are too quick, so slow/cool them down until we can see them making their guesses!)

     


  • Arya Ray, you said "for n qbits will give rise to the result of storing (2^n) classical bits". Now part of my question is to make sure folks get their heads up out of their digital-computer thinking and look a bit wider. If I remember Shannon (sampling theorem) correctly, he used a similar argument regarding the number of analog measurements sampled and the sampling/aliasing effects.
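
    For what it's worth, the 2^n is usually meant on the classical-simulation side: describing the joint state of n qubits takes a vector of 2^n complex amplitudes (even though a measurement only ever hands back n ordinary bits). A trivial sketch of why brute-force state-vector simulation runs out of memory somewhere in the mid-40s of qubits:

        # Classical storage needed for a full n-qubit state vector
        # (one complex amplitude = 16 bytes at double precision).
        for n in (1, 2, 10, 30, 45):
            amplitudes = 2 ** n
            print(f"{n:2d} qubits -> {amplitudes:>16,} amplitudes,"
                  f" ~{amplitudes * 16 / 1e9:.3g} GB")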


    As I discuss the 'quantum' phenomena more deeply, I start to see a number of cases where folk don't have a breadth of equivalent scenarios from which to explain what is going on, so they resort to smoke-and-mirrors discussions. In particular, the use of mathematics is an abstraction that leaves behind the real-world realities. So when we reverse the maths we can pick an alternate reality (i.e. different phenomena) that uses the same mathematical abstraction.


    You may notice that all engineering formulae are of the sort F=m.a, or V=I.R, etc., where we create a local linear formula with a suitable parameter of proportionality which we hold 'fixed'. If we end up with a formula with exponentials or powers we call that science ;-). Slightly hypocritically, we do allow certain square-law effects, whereupon we call it Energy...


    Thus, back to Quantum: I do see quantum phenomena as just faster electrical phenomena, while electrical phenomena are just faster mechanical phenomena (think of mechanical image stabilisation versus electronic image stabilisation - moving mirrors have higher inertia than the electrons/bits within the electronic image). Thus I expect, eventually, to see that Quantum computers do have a 'speed limit'.


    I've not seen any good lecture material on Quantum..


    If I understand correctly, one of the 'entanglement' techniques is photon splitting, where a stream of photons, each of energy E, is split into photon pairs of ~E/2 with matching characteristics, and half are sent. (It's like creating biased 2p coins, splitting them into pairs of 1p biased coins and then trying to measure the bias from the halves you receive - the randomness of the initial 2p coin toss hides the entanglement, and because of attenuation you don't even receive your full half.)
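
    The coin version of that is easy to put numbers on (a toy Monte Carlo of the analogy as stated, not of the real photon-pair optics; the bias and loss figures are invented for illustration): no single received half tells you anything, the bias only emerges statistically, and attenuation simply shrinks the sample you get to average over.

        import random

        bias = 0.6              # hidden bias of each original '2p' coin (illustrative)
        loss = 0.5              # fraction of your halves lost to attenuation
        received = []

        for _ in range(100_000):
            toss = random.random() < bias      # the shared, correlated outcome
            if random.random() > loss:         # did your '1p' half actually arrive?
                received.append(toss)

        print(f"received {len(received)} halves,"
              f" estimated bias ~ {sum(received) / len(received):.3f}")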

  • Interestingly, the same topic came up in the Computing magazine's thought piece by Peter Cochrane, "Quantum computing - a return to analogue computers?", with a few follow-on discussions.


    As I noted in one of the replies, there are also some lectures online from this year's FOSDEM 2019 conference: https://fosdem.org/2019/schedule/track/quantum_computing/