+1 recommended reading for "In Defense of Food". It's way outside my regular reading tastes (pop-ish science + nutrition related), but I enjoyed every moment of it, and it's definitely changed the way I think about eating.
You could still keep a record of all calls made, and the tower(s) used to make each call, without getting down to this level of granularity. I might be wrong here, but it seems that they're storing a stream of triangulated location data for a given handset, regardless of the actual network activity like calling.
Let's say the carriers only stored call records with a tower ID for each call. If a dispute came up every so often because a call happened to be routed through a nearby tower that sat in a different toll bracket (overcharging the customer), I'm sure the carrier would happily write off that charge if you disputed it, which is probably what they'd do right now anyway.
There must be another reason they're keeping it, even if it's just a case of it being super-cheap to store, and they think that they might figure out something to do with it later.
The discussion mentioned that they use it to model traffic patterns and coverage areas, which helps them plan capacity and new towers. I'm sure they use it for other things as well.
I'd also like to know whether this information can be or has ever been used in court as evidence. "Where were you on the night of such and such...?" may become a question of the past. Frightening.
Oh yeah, it does get used in court in the US. I followed a neighbor's murder trial nearly a decade ago, and the prosecution presented evidence that his cell tower connections contradicted his story about whether he was in a certain state at the time. It was by no means a substantial part of their case, though; they were just piling things on. But yes, it's definitely admissible, and I assume it happens all the time.
Falls under the category of circumstantial evidence, which can certainly tip the balance beyond the "reasonable doubt" criterion if there is enough of it.
The new layouts for both gizmodo & lifehacker seem quite buggy to me (Mac, latest Chrome). Most obviously, there is a grey line that tracks its way up the screen as I scroll past 'the fold'. And I agree with earlier posters: user test before release (qualitative), and split test as part of the release (quantitative).
I think the better all-round solution (in terms of privacy, functionality, and maintenance) is to go with a free private git hosting provider like assembla.com. I have tried both the Dropbox solution and setting up git+ssh on a VPS, and Assembla is now my go-to for personal and side projects.
Because it's incredibly dangerous to think your machines are more secure just because you own them. How much time per day do you spend making sure your machine and everything on it is secure?
We employ teams of people to keep your code safe on GitHub.
For the time being, I will personally continue to build #!-only websites, designed exclusively for javascript-enabled browsers. Maintaining two versions (along the lines of progressive enhancement) is just too much work (maintaining and testing HTML view templates as well as jQuery or mustache templates) considering how few people lack javascript. I wouldn't let a release go live without running a selenium suite across it in any case. My perspective would be different, I imagine, if I worked on a large team that could 'afford' to take the progressive enhancement route.
The point of progressive enhancement is specifically to not maintain two versions. You have just one version of your code for all user agents, and then you enhance it for agents that have better features.
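As a rough sketch of what that looks like in practice (the class name, URL, and use of jQuery here are illustrative assumptions, not a prescription):

    // Markup that works in any browser:
    //   <a class="comments-link" href="/posts/42/comments">View comments</a>
    // Enhancement layer, which only runs where javascript is available:
    $('a.comments-link').click(function (e) {
      e.preventDefault();
      // pull just the #comments fragment of the linked page into the current one
      $('#comments').load(this.href + ' #comments');
    });

Agents without javascript follow the link and get the full comments page; everyone else gets the in-place load. One code path, enhanced where possible.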
Now worrying about having to build your templates twice -- once in your back end language, and then in JavaScript -- is a valid concern. Code duplication is never good. However, mustache is supported by many many languages (http://mustache.github.com/) which means that you really can build your templates just once and call out to very similar code to populate them with data.
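To make that concrete (a hedged sketch; the template, field names, and the Ruby call in the comment are assumptions about one possible stack), the same template string can be rendered by mustache.js in the browser and by whatever mustache port your back end uses:

    // shared template, e.g. kept in templates/user_card.mustache
    var template = '<li class="user">{{name}} ({{karma}})</li>';

    // client side, with mustache.js
    var html = Mustache.render(template, { name: 'pd', karma: 1234 });
    $('#users').append(html);

    // server side, the equivalent call in (say) the Ruby port would be
    //   Mustache.render(template, :name => 'pd', :karma => 1234)

The data-population logic stays trivial on both sides; only the template itself is shared.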
Yes, you would need to run tests against the site with and without JavaScript (or CSS or XHR or any other feature that you were using to progressively enhance your page, but how far you go is really up to you), but if that's automated, it isn't much effort on your part.
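Since a selenium suite is already in the picture, here is one way the no-javascript pass might look (a sketch using the Node selenium-webdriver bindings; the URL, selector, and the Firefox javascript.enabled preference are assumptions, so adapt it to whatever bindings and browser you actually drive):

    var { Builder, By } = require('selenium-webdriver');
    var firefox = require('selenium-webdriver/firefox');

    async function checkCommentsWithoutJs() {
      // ask Firefox not to run scripts, so only the fallback markup is exercised
      var options = new firefox.Options();
      options.setPreference('javascript.enabled', false);

      var driver = await new Builder()
        .forBrowser('firefox')
        .setFirefoxOptions(options)
        .build();
      try {
        await driver.get('http://localhost:3000/posts/42');
        // throws if the plain-HTML fallback is missing
        await driver.findElement(By.css('#comments li'));
        console.log('comments render without javascript');
      } finally {
        await driver.quit();
      }
    }

    checkCommentsWithoutJs();

Run the same assertions once with that profile and once without, and the progressive enhancement check becomes just another job in the suite.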
I'll be the first to admit that it's fine to cheat in the code to give your users a better experience, but I don't think it's a good idea purely to take development shortcuts or cut costs. If our roads were built that way (and in India they are), you'd end up with potholes every six months (and in India we do).
>> I will personally continue to build #!-only websites, designed exclusively for javascript enabled browsers
We as web developers just spent the last 10 years as slaves to IE6, saying "oh how I wish developers would/could develop for standards, not just with platform X in mind". And here we are again, setting ourselves up for an even worse situation than the IE6 problems. We have sites designed with only (iphone | IE6 | javascript-capable) in mind. "It's too much work" is the same excuse given time and time again for sites designed only for platform X, but it's not a good reason when the platform you're delivering on is, by definition, a network of variable-capability platforms.
It's fine if we want to try to push the state of the art of rich internet apps, but at what point do we stand back and realize that we're not building a website (a collection of HTML documents on the world wide web) but rather delivering a javascript executable to our user that just happens to be renderable in most modern web browsers?
I don't mean to single you out; it's something we all have to deal with. But is anyone at the standards organizations listening to the pulse of the new web? If people want to deliver applications to users via a URI, why do we have to include all of the extra baggage of HTTP/HTML/CSS/Javascript?
If we as the artists of the web are going to break the implied contract of what the WWW is, we should at least be honest with ourselves and work toward a real solution rather than trying to staple on yet another update to HTML to try its hardest to pretend to be its big brother Mr. Desktop App.
There's a thing called "usability". A web application may need javascript; a website presenting documents and information has no excuse not to work in pure HTML.
Typical crap: you can't access nvidia.com with lynx/links anymore. When your goddamn proprietary nvidia driver doesn't play well with a kernel upgrade, you no longer have any way to download a new one without running X11, though that used to work not too long ago.
I take almost the polar opposite approach to you, always providing a safe fallback to plain ol' HTML. I feel it's a defensive style and avoids a single point of failure. I run NoScript, since it makes most pages load much faster. No offense, but I don't want to run your code; I'd rather read your content.
Are there still content-heavy sites? Not to be facetious here, but it seems user demand is aligned with smaller and smaller, more and more active bits of content served by web applications. At some point you may have a lot of content, however it's not so much heavy as it is highly interactive.
Can you clarify something - what benefit does the #! syntax give your site specifically? The reason for using this seems to be misunderstood by the original article.
Not trolling. But I admit that I'm overstating my position a little bit to see if any interesting discussion comes of it. The reality of my current situation (a one, sometimes two person team) has meant that I have indeed been following the convention I just described. That's not to say I want to, but as I mentioned, I will probably continue to for the foreseeable future without too much hesitation.
I worked in biometrics for a few years. I was on a fingerprint team, but I was part of a wider team that was also responsible for forensic-quality facial recognition technology (a world leader). To put it bluntly, the technology is still very, very, very weak. When it works, it's really impressive, but that's normally during a demo against a nicely curated database. There are some facial technologies that can "work" from a distance (and they are already used in airports), but the success rate is low and the original image still needs to be of decent fidelity. Often the results from the automated matching are shortlisted for comparison by human operators. Maybe I was blinded by the awesomeness of fingerprint technology (it still blows my mind thinking about it).
Combine this general lack of automated matching awesomeness with the fact that people age, can wear glasses, grow a beard, etc., and we are still many years away from this kind of capability.
Still, it's always fun to think about the possibilities that biometrics can and will provide.
I'd also like to hear some pragmatic CSS professionals weigh in on this topic. From what I've seen so far, I agree that some of the resets are over the top, but I'm not sure whether the opposition to them is a result of purism over pragmatism. I really just want to speed up CSS development...
> I really just want to speed up CSS development...
Meyer's reset is not supposed to speed up development. On top of ensuring basic consistency, Meyer's goal is (was?) to stop you taking default styles for granted, to make you think more about the document and re-create all of those styles for it deliberately.
If you just want consistency, but don't want to spend extra time re-creating basic defaults, then the unbolding of strong, the unitalicizing of em, and a few other such resets don't make sense.
I'm finding that they are less relevant than they used to be. We are beginning to have more consistency across browsers, fortunately. I've been working on projects lately that don't employ them with very minimal issues.
Never use them, never will. I just don't see any benefit, and I hate how reset stylesheets pollute the Firebug or Web Inspector CSS panels with an inheritance nightmare.