
Anecdotal, but I’ve talked to some physicists working in the field of quantum computing, and some of them think that it’s possible that advancements in AI will provide somewhat efficient solutions to some computational problems (namely in the NP class), and the solutions will be “good enough” for actual businesses (e.g. in logistics) and researchers (e.g. in chemistry), to a degree that it might negatively affect future funding for quantum computing research. And the pace of advancement in AI will continue to accelerate, while the pace of advancements in quantum computing is notoriously slow.


Yep, I'm very interested in "good enough" optimization techniques using ML, to massively speed up optimization problems that require a lot of computation that doesn't parallelize easily. I'm not an expert in it, but I work adjacent to it, and it seems like a promising direction to me.


Quantum has always gotten lots of funding for techniques with big promises that haven't delivered yet. I'm not even saying they're misleading us, so much as that it has had basically no payoff compared to HPC, FPGAs, and even analog computing.

The bigger concern of companies like that is if someone bankrolls dirt-cheap FPGAs or Adapteva-style cores with open architectures, targeted by a toolchain like Synflow's or Cray's Chapel. From there, domain-specific applications (esp. optimized kernels) targeting those tools. Then, MPP-style hardware as cheap as commodity servers to scale it up and out. I'm talking about engineering the thing by combining proven strategies, not doing new research.

Even $100 million put into such systems would deliver more value in more areas than $1 billion put into quantum computing. If not, we’d at least get huge speed-ups with the parallel architectures for a while and then the QC folks eventually deliver something better in some areas. I’d be happy both ways so long as one is in my building or it’s in the cloud for $2.30/hr. :)


Quantum doesn’t help for NP problems.

In the case of codebreaking, where QC may have an advantage, AI is unlikely to provide an alternative because modern AI is all based on finding probabilistic patterns and cryptography is explicitly designed to be resistant to that kind of attack.


^ Just pointing out, the parent didn't say NP-hard or -complete, so they weren't wrong (but could be misunderstood). But saying Quantum doesn't help for NP isn't right either.


Fair enough!


> Quantum doesn’t help for NP problems.

Notice that the GP is about the other way around: good-enough classical solutions to NP-hard problems would absolutely make quantum computers useless.

And in the case of physics simulation (the actually valuable use for a quantum computer), there's no guarantee that you can't find a probabilistic solution.


Can you please elaborate?


The class of problems where quantum computing gives an advantage is called BQP. It includes things like factoring, but it is not known to include NP-complete problems like the traveling salesman.


Thank you!


I have not programmed a quantum computer before, so I'm relying only on intuition here. My current (ignorant) impression is that it is massively parallel in the same way that the surface of the water in your cup is massively parallel.

To program a serial algorithm on a quantum computer would be to miss the point. You could encode the interactions between clumps of qubits to function as nodes in a wave simulation.

There would be few enough nodes in the wave that it would not be equivalent to a true wave simulation. It would be a discretization, but it would function as an analog, ASIC-like proxy for the real thing. If the groups of qubits are smaller than the true nodes in a wave surface, your computation would be faster than the real thing, but lower precision.

In a classical computer this would have to be done with buffers, or a very narrow set of parallel deterministic programs; otherwise it's impossible. (Examples: a subset of cellular automata rules, gravity sort.)

Is any of that right, or am I completely off base?


Quantum is not parallel.

Quantum is quantum. We did a few quantum gate simulations in college back in the day.

The gist is that entangling bits (say b0, b1, b2, b3 are all entangled) has special properties. You can almost perform retroactive computing: you can send these values to quantum gates to define properties, like b0 + b1 = b2 * b3.

Later, you unentangle the values and they will 'snap' into some true state, like b0,b1,b2,b3 == 1111; but the values 0001 or 0010 are also possible.

There is nothing 'parallel' about this. It's just a quantum operator using the properties of entanglement / disentanglement for the purposes of computation.
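The constraint view above can be sketched classically with a brute-force enumeration. This is a toy illustration, with the assumption that "+" is read as boolean OR and "*" as boolean AND (a reading that makes all three example values 1111, 0001, and 0010 consistent). Measurement of the entangled register could only ever "snap" into one of the listed states:

```python
from itertools import product

# Enumerate which classical assignments of (b0, b1, b2, b3) satisfy
# the toy constraint b0 + b1 == b2 * b3, reading "+" as OR and "*"
# as AND. These are the only states a measurement could reveal.
satisfying = [
    "".join(map(str, (b0, b1, b2, b3)))
    for b0, b1, b2, b3 in product((0, 1), repeat=4)
    if (b0 | b1) == (b2 & b3)
]
print(satisfying)  # ['0000', '0001', '0010', '0111', '1011', '1111']
```

The quantum computer doesn't literally loop like this, of course; the point is only which outcomes remain possible.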

-------

I'd say that quantum is closer to retroactive computing than to parallel computing. Operations seemingly go backwards in time thanks to entanglement.

That doesn't mean quantum is useful for all algorithms, but it seemingly has applications in cryptography and 3SAT / NP-complete problems and... lol, simulations of quantum effects.

--------

Ultimately, quantum is only useful if you can get enough bits entangled at a time. For now, it's faster for standard computers to brute-force all 2^32 bit combinations than to use lasers + diamonds to entangle 32 bits and perform the operations.

Quantum needs more entanglement at lower cost to become competitive. But maybe it happens. At the very least, quantum computers do exist, but who knows how long it will be before they are competitive.


One of the challenges of quantum computing is getting the useful results back into "classical" mode, where you can use them. That's one of the hurdles to getting better theoretical performance than a classical computer. What you describe is more akin to how a GPU works, but in this case, you can't "just get" the result back from the GPU.


Well, that sounds like good news. Are there any practical applications for quantum computing other than breaking commonly used encryption, which would push the modern world even further towards a complete surveillance state?


The #1 proposed use of quantum computers is... simulating quantum effects.

Modern quantum simulation on supercomputers is an exponential (O(k^n)) kind of operation. But a quantum computer can simulate any quantum effect in just one step, O(1) time. Because, ya know... a quantum computer is just a computer that has isolated quantum effects and allows a programmer to control them easily.
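The exponential cost on the classical side is easy to see: a straightforward statevector simulation stores one complex amplitude per basis state, i.e. 2^n of them for n qubits. A quick back-of-the-envelope sketch:

```python
# An n-qubit pure state needs 2**n complex amplitudes, so classical
# statevector simulation doubles its memory with every extra qubit.
# 16 bytes per amplitude assumes the usual complex128 layout.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits: {amplitudes:>16,d} amplitudes (~{gib:,.1f} GiB)")
```

Around 50 qubits, the memory alone exceeds what any supercomputer has, which is why exact classical simulation hits a wall so quickly.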


Isn't O(1) an oversimplification? In comparison wouldn't electrical computers then also be able to simulate all electrical effects in O(1) Time?


> In comparison wouldn't electrical computers then also be able to simulate all electrical effects in O(1) Time?

Yes and OpAmps / analog electrical circuits were way faster than digital computers for decades because of this.

When you know that the current through a diode is exponential in the diode's voltage, you can do silly things like calculate logarithms using OpAmps and diodes in a mere nanosecond or so.
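A rough sketch of that idea, using the idealized Shockley diode relation I = Is*(exp(V/Vt) - 1): with the diode in the op-amp's feedback path, the output settles near Vout = -Vt * ln(Vin / (Is * R)). The component values (Is, Vt, R) below are illustrative assumptions, not anything from the comment:

```python
import math

# Idealized log amplifier: a diode in the op-amp feedback path gives
# an output proportional to the log of the input voltage.
def log_amp_vout(vin, r=10e3, i_s=1e-12, v_t=0.026):
    # Vout = -Vt * ln(Vin / (Is * R)), the ideal log-amp transfer curve
    return -v_t * math.log(vin / (i_s * r))

# The log property in action: doubling the input always shifts the
# output by the same constant, -Vt * ln(2).
delta = log_amp_vout(2.0) - log_amp_vout(1.0)
print(round(delta, 5))  # ≈ -0.01802
```

The analog circuit "computes" this continuously, at the speed of the electronics, which is the decades-long head start over digital being described.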

Most 1980s synths used OpAmps as the basis of the math / calculations for signal processing, because digital computers just weren't fast enough to compete yet. Those circuits still work today.

Alas, digital computers are too cheap, accurate, and fast these days so nearly everything is digital now.

----------

I guess your point, though, is that maybe a quantum computer (or an OpAmp / electrical computer) can only physically simulate the effects that the hardware contains.

Ex: an electrical computer cannot simulate BJTs unless it has a BJT somewhere. So it would still come down to the computer's physical loadout.

Similarly, a quantum simulation on a quantum computer would only have some subset of primitives implemented.


I'm not in the field, but my understanding is that it's going to make it easier to do parallel computations, and it will also allow us to find the inputs that lead to a particular output in some types of equations. Factoring numbers is only one of the "practical" equations.

Materials science and chemistry (e.g. predicting how drugs may behave in our bodies) will benefit, and other computationally heavy "try out many permutations and see which ones fall through" tasks may become a bit easier. Climate modelling, supply chain logistics, financial modelling, and of course AI.


I think AI and quantum computing are really the same thing, the difference being the same as girltalk in contrast with boytalk.

What I'm saying is that there's no quantum computing; but what I actually mean is that I simply do not understand whatever 'quantum' means in the context of computing, given my own opinions of what computing really is (I have a philosophical opinion).

In my view (as distorted and twisted as it is), what quantum computing devices really do is sensing or measuring. Computing is classical in nature, and no amount of hand waving will change this.

Whatever happens to logic in a qubit? If both A and B are anything between 0 and 1, what's the negation of such a thing?


>whatever happens to logic in a qubit? if both A and B are anything between 0 and 1, what's the negation of such a thing?

The problem is the pop-sci explanation, where qubits are "everything between 0 and 1 at the same time", which is super wrong and misleading.

Your qubits are in states that have "amplitudes". You do things that change the amplitudes, and that happens to do computation. From the amplitudes of your qubits, you can figure out what the probabilities of observing different outcomes are.

It's not that the qubit is anything between 0 and 1. It's in a precise state that results in percentage probabilities for different outcomes.

The negation of true 80% of the time is false 80% of the time.
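That last line can be made concrete with a toy amplitude vector. This is a minimal sketch, not any quantum SDK's API: the quantum NOT gate (Pauli-X) simply swaps the two amplitudes, so the outcome probabilities swap too.

```python
import math

# A qubit state is a pair of amplitudes (a0, a1) with
# |a0|^2 + |a1|^2 = 1; measuring yields 0 with probability |a0|^2.
state = (math.sqrt(0.2), math.sqrt(0.8))   # P(0) = 0.2, P(1) = 0.8

# The Pauli-X ("quantum NOT") gate swaps the two amplitudes, so a
# state that reads "true" 80% of the time becomes one that reads
# "false" 80% of the time.
negated = (state[1], state[0])
probs = tuple(round(abs(a) ** 2, 10) for a in negated)
print(probs)  # (0.8, 0.2)
```

So "negation" is perfectly well defined on a qubit; it just acts on amplitudes rather than on a single classical truth value.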


To be fair, in some quantum computers it does work that way. For example, if you count a beam of photons following the same path with (almost) all possible polarization combinations as a qubit, that is indeed closer to "everything between"; however, it has limited utility compared to the broader view.


Apologies in advance—I don't know how to say this in a loving way, but I also don’t mean to pass judgment—couldn’t this just be a skill issue? Why not look into how quantum computing works and resolve that once and for all?

https://quantum.country/ is a pretty approachable starting point if you’re curious. The math is no more intimidating than any other skilled discipline.


You can just say it; it's not worse than getting downvoted for having off-color opinions.

"Aren't you just too stupid to understand?"

Maybe I am... maybe I just think differently, with another brand of depth.

What I wonder now, given another comment about amplitudes, is what happens to combinatorics? (And alphabets, and languages as sequences of strings from finite alphabets.) But it's just simpler to dismiss me as a foolish idiot.


People responded with civility which, in retrospect, you may not have deserved.


I'm just as ignorant as you about it. The quantum part doesn't seem to matter to me; I don't see why it's fundamentally different from making a computer out of any other substrate. Light, sand, chemicals: at the end of the day you compute a function by following a set of steps, which may or may not have constraints of seriality.

I think von Neumann machines can't handle purely parallel algorithms efficiently. Even GPUs. I think that's what the "all qubits interacting simultaneously for n steps after being put in initial conditions" is about. It's purely parallel.

It could be the case that information can't be extracted before it is computed, and that for some algorithms the peak-efficiency serial version equals the best-efficiency purely parallel version. It could also be the case that that's not true, and that classes of quantum algorithms could be better.

At the end of the day, the way I see it, substrate is just an engineered means of applying ops per unit time.


The quantum part does matter in the sense that certain kinds of "maths" are not directly accessible in a classical computer. You can sort of simulate Shor's algorithm on a classical computer but you cannot get the same low complexity. Of course, whether the quantum computer uses substrate A or B doesn't matter (other than practically).

The only way I ever found an access to understanding quantum computing is by doing the math, as other pop-sci explanations don't really reflect what is happening (at least for me).
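For a feel of what Shor's algorithm actually speeds up, here is the classical brute-force version of its core subroutine, order finding. This is a sketch for intuition: the quantum algorithm finds the period r in polynomial time via the quantum Fourier transform, while this linear scan can take time exponential in the bit-length of N.

```python
# Order finding: the smallest r > 0 with a**r == 1 (mod n).
# Shor's algorithm replaces this scan with a polynomial-time
# quantum period-finding step.
def order(a, n):
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

print(order(2, 15))  # 2^4 = 16 ≡ 1 (mod 15), so the order is 4
```

Once the order is known, factors fall out classically: gcd(2^(4/2) - 1, 15) = 3 and gcd(2^(4/2) + 1, 15) = 5, which is why period finding is the whole game.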


Regarding the negation, you can consider it in a few different ways, depending on your use case.

If your qubit is "all values", the negation is the lack of any value, e.g. "a measurement that results in 0".

However, in most cases, you will have a segment within the possibilities. E.g. one qubit can hold "vertically polarised light at frequencies between X and Y at phase P"; then the negation could be "horizontally polarised light at that frequency and phase", or, more commonly, the same polarization and frequency but with the phase 180 degrees off P. That way, if you add them together, you get cancellation. Another negation is "all other possible combinations".
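The 180-degree case can be checked with two complex amplitudes. A small sketch (the phase value here is an arbitrary choice):

```python
import cmath

# Two amplitudes at phase P and P + 180 degrees sum to zero:
# exactly the cancellation described above.
p = 0.7                              # some arbitrary phase, in radians
a = cmath.exp(1j * p)                # amplitude at phase P
b = cmath.exp(1j * (p + cmath.pi))   # amplitude at phase P + 180°
print(abs(a + b) < 1e-12)  # True: the two amplitudes cancel
```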

However I must note that when you call a qubit "all possible combinations" that's not a typical qubit value.

Think of it like this: "binary" is "0 or 1", but a "binary value" is only one of either 0 or 1. A qubit value is, depending on your interpretation, photons of some polarisation, frequency, and phase, or electrons of a particular spin, or some combination of those. Some of those can have multiple combinations; for example, a qubit value could be the combination of a bunch of photons at different values. Or an electron with an "unknown spin". Or an electron with an unknown spin that's the complement of another electron with another unknown spin, where measuring either reveals the other, and so on. So only some qubit values will be "all possible combinations", i.e. "when observed, it could literally be anything".

The question of "what was the value before you observed it" is, well, controversial, and to most people almost irrelevant, and even "not good to consider".



