
Base 16 (or base 10, as they would call it) is the perfect base: http://www.intuitor.com/hex/

I'm standing my ground on optimal base, but I will absolutely be using those hex pronunciations in future

The "dividing things by two" argument makes a lot of sense! And if you need ⅓ and ⅕, they aren't too bad either: .5555 and .3333 repeating.

Sexagesimal (Base 60) is the way to go. Plenty of history behind it, and it expresses large numbers in far fewer digits than decimal.

Once quantum computers are possible, is there actually anything else they can do, any other real-world applications besides breaking crypto and number theory problems, that they can do much better than regular computers?

Yes, in fact they might be useful for chemistry simulation long before they are useful for cryptography. Simulations of quantum systems inherently scale better on quantum hardware.

https://en.wikipedia.org/wiki/Quantum_computational_chemistr...


More recently it's turned out that quantum computers are less useful for molecular simulation than previously thought. See: https://www.youtube.com/watch?v=pDj1QhPOVBo

The video is essentially an argument from the software side (ironically she thinks the hardware side is going pretty well). Even if the hardware wasn't so hard to build or scale, there are surprisingly few problems where quantum algorithms have turned out to be useful.


It is tough to beat classical computers. They work really well, and a huge amount of time (including some of mine) has gone into developing fast algorithms for them to do things they're not naturally fast at, such as quantum chemistry.

At 15:00, she says "quantum computers are surprisingly good at [...] quantum simulations [of electron behavior]", which would seem to contradict you.

One theoretical use case is “Harvest Now, Decrypt Later” (HNDL) attacks, or “Store Now, Decrypt Later” (SNDL). If an oppressive regime saves encrypted messages now, they can decrypt later when QCs can break RSA and ECC.

It's a good reason to implement post-quantum cryptography.

Wasn't sure if you meant crypto (btc) or cryptography :)


I will never get used to ECC meaning "Error Correcting Code" or "Elliptic Curve Cryptography." That said, this isn't unique to quantum expectations. Faster classical computers or better classical techniques could make various problems easier in the future.

What do you want it to mean?

I think he meant Error Correcting Code (just that one, not the "or" with Elliptic Curve Cryptography)

From TFA: ‘One more time for those in the back: the main known applications of quantum computers remain (1) the simulation of quantum physics and chemistry themselves, (2) breaking a lot of currently deployed cryptography, and (3) eventually, achieving some modest benefits for optimization, machine learning, and other areas (but it will probably be a while before those modest benefits win out in practice). To be sure, the detailed list of quantum speedups expands over time (as new quantum algorithms get discovered) and also contracts over time (as some of the quantum algorithms get dequantized). But the list of known applications “from 30,000 feet” remains fairly close to what it was a quarter century ago, after you hack away the dense thickets of obfuscation and hype.’

It turns out they're not so useful for chemistry. https://www.youtube.com/watch?v=pDj1QhPOVBo

I believe the primary, most practical use would be compression. Devices could have quantum decoder chips that give us massive compression gains which could also massively expand storage capacity. Even modest chips far before the realization of the scale necessary for cryptography breaking could give compression gains on the order of 100 to 1000x. IMO that's the real game changer. The theoretical modeling and cryptography breaking that you see papers being published on is much further out. The real work that isn't being publicized because of the importance of trade secrets is on storage / compression.

Someone just has to figure out how to actually implement middle out compression on a quantum computer.

> compression gains on the order of 100 to 1000x.

This feels like woo-woo to me.

Suppose you're compressing the text of a book: How would a quantum processor let you get a much better compression ratio, even in theory?

If you're mistakenly describing the density of information on some kind of physical object, that's not data compression, that's just a different storage medium.


Pretty sure quantum algorithms can't be used for compression.

But what if you weight this by usage time? The Firefoxes without extensions might be hardly ever used

> Post published 4 hours ago

Am I the only one who dislikes these relative times and prefers absolute date stamps?

Especially "1 year ago" (for something that was 23 months ago)


Relative times are nice for recent times (e.g. "5 minutes ago" is better than "2025-12-18 13:03"), but they should "decay" into absolute times for anything that isn't fairly recent - like a week or two, perhaps.
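
A minimal Python sketch of that decay rule (the two-week cutoff and the format strings are my own placeholder choices):

    from datetime import datetime, timedelta

    def format_age(ts: datetime, now: datetime,
                   decay: timedelta = timedelta(weeks=2)) -> str:
        """Relative wording for recent timestamps, absolute beyond decay."""
        age = now - ts
        if age >= decay:
            return ts.strftime("%Y-%m-%d %H:%M")  # decay into an absolute time
        if age < timedelta(hours=1):
            return f"{int(age.total_seconds() // 60)} minutes ago"
        if age < timedelta(days=1):
            return f"{int(age.total_seconds() // 3600)} hours ago"
        return f"{age.days} days ago"

    now = datetime(2025, 12, 18, 13, 8)
    print(format_age(datetime(2025, 12, 18, 13, 3), now))  # 5 minutes ago
    print(format_age(datetime(2025, 11, 1, 9, 0), now))    # 2025-11-01 09:00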

It varies by use case. I can think of e.g. an SRS flash card where your next review is in 2 years. I honestly don't care if 2 years here means 21 months or 28 months, and I especially don't care if the next review is on the 21st of February 2028 at 13:52. All I want to know is that the next review is so far in the future it may not actually happen.

That's a fair point. I'm thinking of the use case of formatting a past date on something like a social media post/comment. (For example, a comment on HN - which uses a rather long cutoff for relative dates.)

I agree with you, I also prefer absolute date stamps, partly because the page might be printed out, etc. However, the HTML <time> element would allow that to work, if it were implemented in a way that supports it.

It is particularly annoying in a screenshot or printed document. I rarely print onto paper, but occasionally, I will "print" an interesting blog post into a PDF.

I like it but I think the granularity needs to be fixed. For example, the cutoff points should be 21+ months -> 2 years instead of 13+ months -> 1 year.

So basically you want the cutoff to be > 1.66 of the next unit before you display in that unit. That means 40 hours, 2 days; 11 days, 2 weeks; 6 weeks, 2 months; 20 months, 2 years.
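
Spelled out as a small Python sketch of that heuristic (the parent comment rounds some of these figures loosely):

    # Switch to the next-larger unit once the elapsed time exceeds
    # roughly 1.66x of that unit.
    FACTOR = 1.66
    cutoffs = {"2 days": (24, "hours"), "2 weeks": (7, "days"),
               "2 months": (4.35, "weeks"), "2 years": (12, "months")}
    for label, (size, unit) in cutoffs.items():
        print(f"{label} from ~{FACTOR * size:.1f} {unit}")
    # 2 days from ~39.8 hours
    # 2 weeks from ~11.6 days
    # 2 months from ~7.2 weeks
    # 2 years from ~19.9 months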

I'm annoyed by things moving unbidden, especially in clickable interfaces. This element being all over Slack, chat apps, etc. means that things are always shifting around slightly and at unpredictable times.

Agreed! What were we using before Let's Encrypt again? Maybe just plain HTTP


Mostly Verisign, which required faxing forms and eye-watering amounts of money. Then Thawte, which brought down prices to a more manageable US$500 per host or so. Which might seem excessive, but was really peanuts compared to the price of the 'SSL accelerator' SBus card that you also needed to serve more than, like, 2 concurrent HTTPS connections.

And you try telling young people that ACME is a walk in the park, and they won't believe you...


And then sketchy resellers for Verisign/Thawte, which were cheap but invariably had websites that ironically did not inspire confidence in typing in your credit card number.


As GP posited, because of this headache, lots of web traffic was plain ol' HTTP. Let's Encrypt is owed a lot of credit for drastically reducing plain ol' HTTP.


I was using StartCom's StartSSL, which offered free 1-year certificates, at least for my personal sites.


They were great in the beginning, and then when you issued a few more certs than they liked you were asked to pony up some $$$, and then when you did that and actually "verified" who you were on a personal international phone call, you got a grace, and then issued a few more, they decided they didn't like you so they would randomly reject your renewals close to the expiration date, and then they got bought out by some scummy foreign outfit which apparently caused the entire CA to be de-listed as untrustworthy in all major browsers. Quite the ride.

Also, the only website I've ever encountered that actually used the HTML <keygen> tag.


Self signed certs. I wasn't paying.


Some of them were not expensive but it was not convenient at all.


SSL/TLS via expensive and hard to work with providers and tooling. Let's Encrypt made it free and easy to maintain.


Either you used HTTP, or self-signed if you did not mind the warning. I remember there being one company that did offer free certificates that validated, but can't remember the name of it.


> I remember there being one company that did offer free certificates that validated, but can't remember the name of it

You're probably thinking of StartSSL, and it was a bit of a pain to get it done.


I believe it was StartSSL and/or WoSign back then


The pros were using client-side encryption :D


Hmm, nothing about quantum computing in there?


> The table occupies at most 32 GiB of memory.

This constraint allows making a linear array of all 4 billion values, with the key as array index, which fits in 16 GiB. Another 512 MiB is enough to have a bit per key indicating present or not.

Perhaps text strings as keys and values would give a more interesting example...
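
Roughly like this (my own sketch of the scheme above, scaled down to 16-bit keys so it runs in a few hundred KiB; with 32-bit keys the two arrays would be the 16 GiB and 512 MiB mentioned above):

    KEY_BITS = 16                              # 32 in the real scheme
    values = bytearray(4 * (1 << KEY_BITS))    # one 32-bit slot per possible key
    present = bytearray((1 << KEY_BITS) // 8)  # one presence bit per possible key

    def put(key: int, value: int) -> None:
        values[4 * key:4 * key + 4] = value.to_bytes(4, "little")
        present[key >> 3] |= 1 << (key & 7)

    def get(key: int) -> int | None:
        if not present[key >> 3] & (1 << (key & 7)):
            return None
        return int.from_bytes(values[4 * key:4 * key + 4], "little")

    put(12345, 42)
    assert get(12345) == 42 and get(54321) is None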


> a linear array of all the 4 billion values, with the key as array index, which fits in 16 GiB

The hash table has the significant advantage of having a much smaller minimum size.

> Perhaps text strings as keys and values would give a more interesting example

Keep reading to "If keys and values are larger than 32 bits"


That's not a constraint as much as the worst case size.

If you actually only have a handful of entries in the table, it is measurable in bytes.

A linear array of all 4 billion possible values will occupy 16 GiB (of virtual memory) upfront. We have then essentially replaced the hash table with a radix tree --- the one made up of the page directories and page tables of the VM system. If only the highest and lowest value are present in the table, then we only need the highest and lowest page (4 kB or whatever) to be mapped. It's not very compact for small sets: storing N numbers in random locations could require as many as N virtual memory pages to be committed.
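
For example (a Linux-specific Python sketch relying on anonymous mappings being demand-paged; assumes memory overcommit is enabled):

    import mmap

    # Reserve 16 GiB of virtual address space; no physical pages are
    # committed until they are first written.
    SIZE = (1 << 32) * 4
    table = mmap.mmap(-1, SIZE)

    # Touching only the lowest and highest 32-bit slots commits roughly
    # two 4 KiB pages, even though the mapping spans 16 GiB.
    table[0:4] = (123).to_bytes(4, "little")
    table[SIZE - 4:SIZE] = (456).to_bytes(4, "little")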


This hashtable implements a multiset. Not (merely) a simple set.


> Someone from the operational team just learned that business relies only on the first group to be successful.

Is there any possibility the presence of the people who are there just for fun still encourages/increases the size of the first group?


Imo, yes. That's where gamblers come from. They also provide plausible deniability to the gamblers.


Flash is still a big loss imho; the ecosystem of games, movies and demonstration thingies was amazing, and they were accessible for many people to create. Unlike Java applets, which slowed the main browser UI thread to a crawl if they loaded at all (they usually didn't), Flash didn't have such slowdowns.

One exception is early-2000s Runescape: that was Java in the browser, but it always loaded, with no gray screen and hanging browser. They knew what they were doing.


Many of the old games and movies still play back well with Ruffle installed (https://ruffle.rs/). Newgrounds embeds it by default for old interactive flash media that they couldn't convert directly to video.

It's not a perfect fit, but it works. The speed of Ruffle loading on a page is similar to that of Flash initializing, so you can arguably still make flash websites and animations to get the old look and feel if you stick to the Ruffle compatibility range. The half-to-one-second page freeze that was the norm now feels wrong, though, so maybe it's not the best idea to put Flash components everywhere like we used to do.

Runescape proved that Java could be a pretty decent system, but so many inexperienced/bad Java developers killed the ecosystem. The same is true on the backend, where Java still suffers from the reputation the Java 7 monolithic mega projects left behind.


It's good that we have the runtime to run old Flash games. What we lost is an extremely easy environment for authoring/creating them. Nothing has come even close since Flash. Not just game, but any kind of interactions and animations on the web.


Where did the tools go?

Can a person not run Flash authoring tools with an era-appropriate operating system in a VM or something?


That's the only way, I guess. What I meant is that with the death of Flash, nothing appeared that offers the same tools for any other web technology


I think perhaps what was lost is mostly this: Macromedia. They had a knack for making content creation simple. Flash was just one of the results of this: It let people create seemingly-performant, potentially-interactive content that ran almost universally on the end-user computers of the time -- and do it with relative ease because the creation tools existed and were approachable.

Macromedia also provided direction, focus, and marketing; more of the things that allowed Flash to reach saturation.

Someone could certainly come up with an open JS stack that accomplishes some of the same things in a browser on a modern pocket supercomputer. And countless people certainly have.

But without forces like marketing to drive cohesion, simplicity, and adoption then none of them can reach similar saturation.


Having to run it in a VM already makes it less approachable


Can it already vertically and horizontally center unknown-beforehand-length multi-line text in a single html element, just like non-CSS table cells could already in 1995?


> Can it already vertically and horizontally center unknown-beforehand-length multi-line text in a single html element, just like non-CSS table cells could already in 1995?

Non-CSS table cells have never been able to do that – you need a wrapping <table> at minimum for browsers to render it how you want, a <tr> for it to be valid HTML, and <tbody> comes along for the ride as an implied element. So that's four elements if you want to centre vertically with <td> or <th>. If you wait until the year 2000, then you can get that down to three elements by switching from HTML to XHTML because <tbody> is no longer implied in XHTML.

CSS, on the other hand, has been able to do what you want since 1998 (CSS 2) with only two elements:

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
    <title>.</title>
    <style type="text/css">

        html,
        body {
            height: 100%;
        }

        .outer {
            display: table;
            width: 100%;
            height: 100%;
        }

        .inner {
            display: table-cell;
            vertical-align: middle;
            text-align: center;
        }

    </style>
    <div class="outer">
        <div class="inner">
            Test<br>
            Test<br>
            Test<br>
            Test<br>
            Test<br>
            Test
        </div>
    </div>
(I’m using a <style> element here for clarity, but you can do the same thing with style attributes.)

https://www.w3.org/TR/1998/REC-CSS2-19980512/


align-content: center;

(supported on block elements since sometime last year)


Thanks, seems to work at first sight in combination with text-align for the horizontal alignment!

That means I may finally not need line-height or multi-element tricks for this anymore

Interesting that this finally arrived only a year ago!

I wonder what made them finally decide to support it, all these years after CSS's creation in 1996.

A button never looks professional if the text in it isn't centered; this was a really needed feature and I still can't understand why it took that long

Edit: it does seem worrying that, for me, this property centers vertically but not horizontally, while its description doesn't mention vertical at all: "The items are packed flush to each other in the center of the alignment container along the cross axis."


> The items are packed flush to each other in the center of the alignment container along the cross axis

You're right, the entire Values section seems to still be worded exclusively for flexboxes. The description at the top adds "or a grid or block-level element's block axis".


width: fit-content; margin: auto;


That changes the width, I guess I should have specified fixed width


What is a fixed width that does not have a fixed value?


I mean elements with a width set in pixels, ems or some other unit. Setting width to 'fit-content' would override the width you set and then the element may overlap others to the right of it


Then you just do width: <width in ems> em; ? I thought you didn't want to specify a width.

