Hacker News | prolurker's comments

Table layout is generally complex and slow; merely hiding the text with CSS feels very fast.

Once layout is involved, even without looping to set innerHTML, it becomes a lot slower.


With control over the mirror list you can prevent certain users from getting updates, which is a security problem, but without being able to sign packages the danger is limited.


I would argue this particular example has more to do with the radix parameter of parseInt being optional and having complex behavior.

More generally, the fact that JavaScript functions accept any number of arguments, regardless of the parameters in the function declaration, makes passing functions around quite error prone.
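The classic illustration of this: map calls its callback with three arguments (value, index, array), and parseInt silently takes the element's index as its radix.

```javascript
// map passes (value, index, array); parseInt takes (string, radix),
// so the element's index is silently used as the radix.
const result = ["1", "2", "3"].map(parseInt);
// parseInt("1", 0) → 1   (radix 0 falls back to 10)
// parseInt("2", 1) → NaN (radix 1 is invalid)
// parseInt("3", 2) → NaN ("3" is not a base-2 digit)
console.log(result); // → [ 1, NaN, NaN ]
```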

I always use anonymous functions or 'bind' to explicitly match the parameters unless all functions involved are curried.

The other reason to avoid passing 'naked' functions around too happily is the behavior of 'this'.
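A minimal sketch of that pitfall (the counter object here is made up for illustration): a method extracted from its object loses its 'this', and bind restores it.

```javascript
'use strict';

const counter = {
  count: 0,
  increment() { this.count += 1; }
};

// Passing the "naked" method detaches it from its object:
const naked = counter.increment;
// naked(); // TypeError: `this` is undefined in strict mode

// bind pins `this` to the object, making the function safe to pass around:
const bound = counter.increment.bind(counter);
bound();
console.log(counter.count); // → 1
```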

I also find that most engine optimizations focus on simple, explicit code. Nothing like using the less common, more dynamic features of the language to hit deoptimizations.


Yep, ["1", "2", "3"].map(x => parseInt(x)) works.


That's an interesting attack vector; section 3 of the RFC recommends ignoring the directive unless it's a secure connection, which would mitigate that kind of problem.

Another solution would be to use an unpredictable versioning scheme so the attacker can't anticipate the names of the resources.


The attacker who buys a bunch of domains and legitimately owns them for a period of time wouldn't have any issue getting SSL certificates for them.


Correct, but even if not explicitly stated, the cached entries should be associated with the certificate's fingerprint and immediately discarded once the certificate expires or changes.
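As a rough sketch of that idea (the key format and names are hypothetical, not from any spec), incorporating the fingerprint into the cache key means a certificate change implicitly evicts all the old entries:

```javascript
// Hypothetical sketch: include the certificate fingerprint in the
// cache key, so entries cached under an old certificate can never
// be matched again once the certificate changes.
function cacheKey(origin, path, certFingerprint) {
  return `${certFingerprint}|${origin}${path}`;
}

const oldKey = cacheKey('https://example.com', '/app.js', 'ab:cd:ef');
const newKey = cacheKey('https://example.com', '/app.js', '12:34:56');
console.log(oldKey !== newKey); // → true
```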


Certificates often change for legitimate reasons, e.g. Let's Encrypt certificates, which must be renewed every 3 months.


That would be ok.


cache-control: private doesn't seem to imply that a resource won't ever change, and on page refresh browsers have to check whether the resource has been updated; immutable would avoid the cascade of 304 responses.


That's the entire point of the expiration time. Use a 2-year range and it's effectively immutable. No content stays on a device forever anyway, and headers can easily be set to a smaller time frame, or to must-revalidate if the content owner wants it.

Browsers mistakenly continue checking for new copies within the expiration time, when they shouldn't. Fixing poor implementations with more standards never works well.


The problem is that servers are allowed to update their resources at any time, without waiting for any particular expiration time. So when a user instructs their browser to refresh the page, usually expecting the most up-to-date version, the browser has to choose between serving the still valid, but possibly outdated, cached version or actually checking whether the resource has been updated.

Immutable makes it clear that the server won't update the resource in place and will handle updates by generating a new one, so the browser can happily skip checking those resources on page refresh.
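A sketch of the header a server would send for such never-updated-in-place resources (the one-year max-age and the helper name are illustrative):

```javascript
// Sketch: build the Cache-Control value for a resource whose URL
// changes whenever its content does. The one-year max-age is illustrative.
function immutableCacheControl(maxAgeSeconds = 31536000) {
  // "immutable" tells the browser not to revalidate on a normal
  // reload while the response is still fresh.
  return `public, max-age=${maxAgeSeconds}, immutable`;
}

console.log(immutableCacheControl()); // → public, max-age=31536000, immutable
```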


I don't see a problem: browsers should honor the expiration time and use the cached copy if it's still valid.

It's up to the server to use proper headers. Why say a file is OK to cache for years if it actually isn't? If the same URL will change content, then use shorter cache times, require active revalidation and/or etag checks, or just use the typical cache-busting query-string parameters.

This "immutable" flag is unnecessary.


You disagree with the choices the specs have made.

From the cache-control RFC:

   When a response is "fresh" in the cache, it can be used to satisfy
   subsequent requests without contacting the origin server, thereby
   improving efficiency.

From the immutable RFC:

   Clients SHOULD NOT issue a conditional request during the response's
   freshness lifetime (e.g., upon a reload) unless explicitly overridden
   by the user (e.g., a force reload).


What exactly do I disagree with? The specs are fine; it's the implementations (the browsers) that are broken, which is what I've been saying throughout this thread.

If the implementation is faulty, what is another spec going to solve? Again, there is no need for an "immutable" flag, because the existing cache headers already express everything that's necessary.


Browsers are free to make a request "just in case" with cache control. With immutable, they are strongly discouraged from doing so. My point is, browsers aren't broken according to the spec if they make those just in case requests.


That is 100% on the browser to optimize. It dilutes the point of the existing cache header if we need to add a flag to say "we're actually really sure about this expiration time".


And many people are almost certainly going to find that they actually need to either recall an old immutable thing, or mutate it.

Also, I will certainly want to clear out my browser's cache on a regular basis. I do not want it keeping immutable things just because they shouldn't ever change.


You can't 'recall' something you already sent out to browsers, and if you need to mutate then it's easy to make a new URL.

This header won't make browsers cache data any differently. It skips a step when the cache is being read from.


But in the current world, you can serve new content on the conditional check that caches currently do.

That said, I am ultimately for this. I think. There is plenty of data showing that this is a low hanging fruit to hit.


The conditional check that they do sometimes. Now half your users see the new version and half see the old version. Not much of a recall.


Still more of a recall than will be possible in the new world. And you can always detect the old code and prompt users to refresh. (Typically happens on a restart.)

Again, though, I am ultimately for this. I just remain skeptical of any panacea.


You sound as if people who consciously set an immutable header, or set cache expiration time header to 5 years, do not know what they are doing.

Should we optimize the web for clueless server operators?


With the number of folks that fix local development by clearing caches... yes, I am comfortable claiming that. :)

