
Always programming. During holidays I like to get familiar with new technologies to see what they offer. Exploratory programming I guess you could call it.


Yep!


That should not matter. Quantum computers promise to do things that classical computers will never be able to do. A 'proper' QC with on the order of a few thousand qubits can perform calculations that even a known-universe-sized classical computer will find intractable. Optimisation quickly becomes irrelevant when you scale these problems just a little bit more.


> Quantum computers promise to do things that classical computers will never be able to do. A 'proper' QC with on the order of a few thousand qubits can perform calculations that even a known-universe-sized classical computer will find intractable.

Your first statement is correct, but Shor's algorithm (the one I suspect you are referring to when you talk about a universe-sized computer) is not an example of one. We do not know that there is no classical factoring algorithm as good as or better than Shor's, and we have no quantum algorithms yet that provably give an exponential speedup over every possible (perhaps still undiscovered) classical algorithm, which is what the universe-sized-computer claim needs. We do have algorithms with more modest speedups that are provably unbeatable by classical algorithms; Grover's search algorithm is one example, but its speedup is only quadratic. That means that if a quantum computer solves a search problem in time T, a classical computer can match it in time on the order of T^2.

> Optimisation quickly becomes irrelevant when you scale these problems just a little bit more.

The problem with quantum computers isn't optimization of algorithms so much as optimization of strategies for dealing with errors (which sometimes look a lot like algorithms). With classical hardware, reasonably accurate logic gates were figured out with far less effort early on, and their reliability didn't fall off quickly as designs grew. Our quantum computing technologies, on the other hand, all have crippling flaws so far. The most common one (superconducting qubits and ion-trap designs both suffer from it) is that as we try to make the computer bigger, it needs an unscalable amount of error correction and eventually stops working altogether. Other approaches (linear optical quantum computers, for example) can scale up without getting worse per se, but the gates we can build are already so unreliable that we still need too much error correction to scale. So optimizing our error-correction strategies is one avenue being explored.


I do not believe there are any problems known to be solvable by QC, and unsolvable by classical computers.

edit: I was trying to be polite, but Scott Aaronson has spilled quite a lot of blog ink denouncing remarks like the parent post as utter nonsense.


> I do not believe there are any problems known to be solvable by QC, and unsolvable by classical computers.

The space between unsolvable by classical computers and solvable practically by classical computers is... significant.


You might have misunderstood my comment (or Scott's writings?)

Classical computers are limited in what they can compute because the universe has limits. A 10,000-qubit quantum computer can factor fairly large numbers. 10,000 qubits is pretty small; one day it will hopefully fit inside a small room.

To factor those same numbers using a classical computer you'd have to build a computer the size of the entire known universe and run it until the heat death of the universe. Obviously this is not possible, even in principle.

Theoretically any QC computation can be simulated by a classical computer but in our limited universe you quickly run into the wall where the classical computer just becomes too big (bigger than the entire universe) and too slow (slower than the life of the universe).
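The blow-up is easy to make concrete: a brute-force statevector simulation of n qubits must store 2^n complex amplitudes. A rough back-of-the-envelope sketch (assuming 16 bytes per double-precision complex amplitude, and ~10^80 atoms in the observable universe as a reference point):

```python
# Rough cost of brute-force statevector simulation of n qubits:
# the state is a vector of 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one complex number at double precision

def simulation_bytes(n_qubits: int) -> int:
    """Memory needed to hold the full state vector."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

# ~50 qubits is already at the edge of the largest classical machines:
petabytes = simulation_bytes(50) / 1e15
print(f"50 qubits: {petabytes:.0f} PB")  # 50 qubits: 18 PB

# Even at one byte per atom in the observable universe (~10**80 atoms),
# memory runs out well before 300 qubits:
ATOMS_IN_UNIVERSE = 10 ** 80
n = 0
while simulation_bytes(n) <= ATOMS_IN_UNIVERSE:
    n += 1
print(f"Universe-sized memory exhausted at {n} qubits")
```

The doubling per qubit is the whole story: each added qubit doubles the vector, so a linear increase in machine size is an exponential increase in simulation cost.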


I definitely reread your comment several times and kept stewing on the word "intractable" since I'd seen it used somewhere else to talk about problems in NP-Complete. I assumed if I was misinterpreting what you were saying, it would hinge on that word.

Aside from factoring, what kinds of things do we think will meaningfully change if we get general QC? Cryptographers are already preparing for the post-quantum world. Who else needs to be preparing?

Everything I think I understand about QC suggests that a practical breakthrough will not cause any changes in society, other than the abandonment of RSA. Am I missing something?


Since you seem to already be familiar with integer factoring, isn't factoring large integers an example of something "solvable by QC, and unsolvable by classical computers"?


Classical computers can factor large integers.

Before this thread, I knew that Shor's algorithm and Grover's algorithm were two very important QC results. I knew that Shor's algorithm means that a QC would be able to decrypt anything that was ever encrypted with RSA. (ECC schemes are likely just as vulnerable, so cryptographers are looking at purely hash-based schemes: http://pqcrypto.org/hash.html)
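For context on why Shor's algorithm breaks RSA: the quantum part only finds the period r of a^x mod N; the factors then fall out with a classical gcd step. A toy sketch of that classical step, with the quantum period-finding replaced by brute force so it only works for tiny N:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Brute-force stand-in for the quantum subroutine:
    smallest r > 0 with a**r == 1 (mod n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_step(n: int, a: int) -> tuple[int, int]:
    """Given the period r of a mod n, recover factors of n via
    gcd(a**(r//2) +/- 1, n). Fails for unlucky choices of a."""
    r = find_period(a, n)
    assert r % 2 == 0, "odd period: pick another a"
    y = pow(a, r // 2, n)
    assert y != n - 1, "trivial square root: pick another a"
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_step(15, 7))  # (3, 5)
```

Everything above is classically cheap; the only hard part is `find_period`, which is exactly what the quantum Fourier transform speeds up.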

What I learned this morning based on hints in this thread:

1) Grover's algorithm is a far more modest speedup compared to Shor's algorithm

2) Shor's algorithm only factors integers, but Grover's algorithm can more generally invert a function (which still threatens many currently used cryptographic functions)

So I'd guess that Grover's algorithm is the sort of thing people are talking about as useful in machine learning. There are probably other QC results that are worth getting excited about for people working with neural networks. The Google/Microsoft workshop this weekend has a number of sessions on quantum annealing, as well.

A big reason "unsolvable by classical computers" is such a silly way to phrase things is (paraphrasing Dr. Aaronson here) that simulated annealing techniques on classical computers are already producing such good results without QC. In the previous Shtetl-Optimized post to this one, he posts a Powerpoint deck for a talk he gave at the same conference, and it is quite instructive (but still enough over my head that I'm sitting on HN asking dumb questions).


> Classical computers can factor large integers.

I mean, since the best-known quantum algorithm for factoring integers is asymptotically faster than the best-known classical algorithm for factoring integers, isn't there some definition of "large" for which this is no longer true?

(I'm assuming you mean "classical computers can factor large integers effectively", since the class of all problems solvable by a classical computer is exactly the same as the class of all problems solvable by a quantum computer)


> the class of all problems solvable by a classical computer is exactly the same as the class of all problems solvable by a quantum computer

I started this sub-thread because I'm pretty sure this statement is true. But the details are tricky.

> the best-known quantum algorithm for factoring integers is asymptotically faster than the best-known classical algorithm for factoring integers

The only reason current commercial cryptography works is that classical computers can't factor large integers effectively. But a quantum speedup for factoring isn't particularly profound, since factoring is just a small part of computing.


> > the class of all problems solvable by a classical computer is exactly the same as the class of all problems solvable by a quantum computer

> I started this sub-thread because I'm pretty sure this statement is true. But the details are tricky.

Actually that is one of the first problems that was solved when we started looking into quantum computers. Our models of classical and quantum computation are both Turing complete and only Turing complete. Any classical computer can simulate a quantum computer, and any quantum computer can simulate a classical one. A quantum computer's only advantage is in speed, and we don't yet have proof that there is an exponential speed benefit that cannot be circumvented by finding a better classical algorithm.

Since asymmetric cryptography generally relies on an exponential barrier between the process of using the encryption and the process of breaking the encryption, that is what we really need to completely break modern asymmetric cryptography.
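To make that "exponential barrier" concrete: using RSA costs a couple of fast modular exponentiations, while the only known generic attack without the private key is factoring the modulus. A toy illustration with absurdly small numbers (real keys use 2048-bit moduli, where the trial-division line below would never finish):

```python
# Toy RSA with tiny primes -- illustration only, not remotely secure.
p, q = 61, 53
n = p * q            # public modulus
phi = (p - 1) * (q - 1)
e = 17               # public exponent, coprime with phi
d = pow(e, -1, phi)  # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)    # encrypting: one fast modexp
plain = pow(cipher, d, n)  # decrypting with the private key: also fast
assert plain == msg

# An attacker without d must factor n. Trial division is exponential
# in the bit-length of n; for 2048-bit moduli it is hopeless.
f = next(k for k in range(2, n) if n % k == 0)
assert {f, n // f} == {p, q}
```

The asymmetry is the point: the legitimate operations scale polynomially in the key size, while every known classical attack scales superpolynomially.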


It might provide a rapid, general approach to non-convex optimization for neural nets.

And that changes everything, probably more than anything since (the iPhone|the internet|computers|penicillin|the industrial revolution|fire) depending on how optimistic you are. It'll change things a lot, anyway.

See http://research.microsoft.com/en-us/events/qml/


Imagine simulating a human brain. I'm not sure how massive a computer would need to be today to simulate the neurons, but an efficient implementation can be made to be the size of.. Well.. A brain.

I could see this causing massive changes in society. An artificially intelligent simulation of a super-smart human, one that can be tuned toward a specific problem area and made much more focused and efficient than a purely biological brain could be...


Is there a specific reason a quantum computer would be better at simulating a human brain than a classical computer?


Well, simulating the chemistry in the brain would, I think, involve simulating some quantum-mechanical processes, which a quantum computer might be better equipped to handle?

That's the argument I've heard, at least.

I don't know how that works when the qubits are being used to represent other sorts of variables than bits, though?


lololomg replied to you about Grover's algorithm.

https://en.wikipedia.org/wiki/Grover%27s_algorithm

For some reason, his comment was killed.


Grover's algorithm is a quantum algorithm for searching an unsorted database with N entries in O(N^1/2) time and using O(log N) storage space.
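For a feel of where the O(N^1/2) comes from, here is a minimal pure-Python statevector sketch of Grover's algorithm for N = 16 entries: after about (pi/4)*sqrt(N) ≈ 3 rounds of "phase-flip the marked amplitude, then invert everything about the mean", the marked entry dominates the measurement probabilities.

```python
import math

N = 16          # database size (4 qubits)
marked = 11     # the index we are searching for

# Start in the uniform superposition.
amps = [1 / math.sqrt(N)] * N

iterations = round(math.pi / 4 * math.sqrt(N))  # ~3 for N = 16
for _ in range(iterations):
    amps[marked] = -amps[marked]          # oracle: phase-flip the target
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]   # diffusion: invert about the mean

probs = [a * a for a in amps]
print(iterations, max(range(N), key=probs.__getitem__))  # 3 11
```

A classical search over the same unsorted 16 entries needs ~N/2 probes on average; Grover gets there in ~sqrt(N) amplitude-amplification rounds, which is the quadratic speedup discussed upthread.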


Not proven. The best currently known algorithms for factoring on a classical computer take asymptotically longer than the best known quantum ones. That doesn't necessarily mean it's a fundamental limit of the universe. More generally we have no proof that BQP =/= P.


One floating point operation per second

One flops

Two giga- floating point operations per second

Two gigaflops


Acronyms get pluralized separately from their contents, though. You wouldn't say "two CPU" or "two SSD" or "two taser."


But you would add an 's' to both of your examples to make two central processing units (CPUs) and two solid state drives (SSDs). I agree, it's a weird rule but it certainly sounds better.

When the pluralization happens in the middle of the acronym it becomes weird -- one FLOPS, two FLOPsS?


Both of my three examples?

Anyway, you're right, they would all get an S at the end either way.

A better example would be "captcha." You'd say "two captchas," not "two captcha" or "two captscha."

For a less polite example that works the same way, consider MILF.


So ...

Two gigaflosps?


Ideally you'd want your RDBMS to refuse to compile comparisons like x = y in cases where either x or y could be null. You would have to handle nulls explicitly in boolean expressions.


That would be sensible if SQL dialects usually offered a sane comparison that returns true/false reasonably for null comparisons. Since they don't, and = is the only available equality test, I'm going to stick to my original point that null comparison in SQL is dumb and wrong for 99% of use-cases and compiler warnings would be solving the wrong problem.


MySQL and Vertica, at least, support <=> as an operator that treats NULL as an ordinary, distinct value; Postgres spells the same thing IS NOT DISTINCT FROM.
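For anyone who hasn't been bitten by this: under SQL's three-valued logic, x = NULL evaluates to NULL (not false), so a NULL row silently drops out of both a condition and its negation. A quick sketch using Python's stdlib sqlite3 (SQLite spells the null-safe comparison IS rather than <=>):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (x INTEGER)")
con.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (None,)])

# x = 2 and x <> 2 both evaluate to NULL for the NULL row,
# so that row appears in neither result:
eq  = con.execute("SELECT count(*) FROM t WHERE x = 2").fetchone()[0]
neq = con.execute("SELECT count(*) FROM t WHERE x <> 2").fetchone()[0]
print(eq, neq)  # 1 1 -- one row is missing from both

# The null-safe comparison (IS in SQLite, <=> in MySQL,
# IS NOT DISTINCT FROM in Postgres) treats NULL as a value:
safe = con.execute("SELECT count(*) FROM t WHERE x IS NOT 2").fetchone()[0]
print(safe)  # 2 -- the NULL row is now counted
```

This is exactly the case a linter rule could flag: any bare = or <> against a nullable column.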


I've been working on a linter for SQL - that's probably not a bad rule to toss in the mix...


They should have called it 'speedball'


It's not a default sub, you have to actually subscribe to get it.


It was a default sub last year, possibly with a different mission statement. Or they were just into murder at the time. I mean, lots of popular websites are.


Mission statement was always the same. It's just no longer a default.

Note that users who signed up before the default change took place still have the old defaults!


I love stories like this. By studying what happens to the brain when something goes wrong we gain a sliver of insight into how the brain works. In this case he was able to store memories but he could not retrieve them. When the cyst was drained he almost immediately gained access to all those memories again. Fascinating!


Better experience than a hotel or serviced apartment? Good luck with that! Hotels and serviced apartments are literally designed from the ground up for a great customer experience. Some random person's house or apartment is just not going to be able to compete on customer experience.


Sometimes it is just a better experience staying in a real apartment or home instead of a hotel. For example, you can pick up local ingredients and cook your own food. You can get a private space with multiple separate bedrooms.

You say "serviced apartment"--is that a European thing? I've never heard of it. If I was traveling I would look to AirBnB or VRBO to find places that are not hotels.


In Australia you get serviced apartments everywhere. They are basically apartments that get serviced (cleaned, new linen etc) regularly like hotel rooms. You get separate bedrooms, kitchen, etc. just like a normal residential apartment.

Example: http://www.questapartments.com.au/


I change my username every few months


You can do that anyway, and some people do in fact replace their controller chips with modified clones to boost performance.

http://www.chipexpress.com/

