I read in an earlier HN thread on this: "this is a classic example of a data-driven product decision", i.e. we can reduce costs by $x if we just shut off goo.gl links, instead of actually wondering how this would impact the customers.
Also helps that they are in a culture which does not mind killing services on a whim.
The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.
I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.
They also might have wanted to use the domain for something else.
The nature of something like this is that the cost to run it naturally goes down over time. Old links get clicked less so the hardware costs would be basically nothing.
As for the actual software security, it's a URL shortener. They could rewrite the entire thing in almost no time with just a single dev. Especially since it's strictly hosting static links at this point.
It probably took them more time and money to find inactive links than it'd take to keep the entire thing running for a couple of years.
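To put the "single dev could rewrite it" claim in perspective, a frozen shortener really is just a lookup plus a redirect. Here's a minimal sketch in Python, assuming the short-code-to-URL mapping has already been exported somewhere (the dict, port, and handler here are made up for illustration, not Google's actual setup):

```python
# minimal read-only redirect service: look up the path, 301 to the target
from http.server import BaseHTTPRequestHandler, HTTPServer

# hypothetical frozen mapping; in reality this would be loaded from an
# exported snapshot of the shortener's database
LINKS = {
    "/abc123": "https://example.com/some/long/page",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = LINKS.get(self.path)
        if target:
            self.send_response(301)                 # permanent redirect
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404, "unknown short link")

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

Obviously the real thing needs abuse handling, logging, and Google-scale serving, but the core of a static-links-only shortener is about this small.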
My understanding from conversations I've seen about Google Reader is that the problem with Google is that every few years they have a new wave of infrastructure, which necessitates upgrading a bunch of things about all of their products.
I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.
If a product has an active team maintaining it they can handle the upgrade. If a product has no team assigned there's nobody to do that work.
My understanding is that (at least at one point) binaries older than about six months were not allowed to run in production. But APIs are "evolving" irregularly so the longer you go between builds the more likely something is going to break. You really need a continuous build going to stay on top of it.
Best analogy I can think of is log-rolling (as in the lumberjack competition).
Google is famously a monorepo and is basically the gold standard of CI/CD.
What does happen is APIs are constantly upgraded and rewritten and deprecated. Eventually projects using the deprecated APIs need to be upgraded or dropped. I don't really understand why developers LOVE to deprecate shit that has users but it's a fact of life.
Secondhand info about Google only, so take it with a grain of salt.
Simple: you don't get promoted for maintaining legacy stuff. You do get promoted for providing something new that people adopt.
As such, developing a new API gets more brownie points than rebuilding a service that does a better job of providing an existing API.
To be more charitable, having learned lessons from an existing API, a new one might incorporate those lessons learned and be able to do a better job serving various needs. At some point, it stops making sense to support older versions of an API as multiple versions with multiple sets of documentation can be really confusing.
I'm personally cynical enough to believe more in the less charitable version, but it's not impossible.
I agree this is an overriding incentive that hurts customers & companies. I don't think there's an easy fix. Designing & creating new products demonstrates more promotion-relevant capabilities than maintaining legacy code does.
> I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.
They deprecate internal infrastructure stuff zealously and tell teams they need to be off of such and such by this date.
But it's worse than that because they'll bring up whole new datacenters without ever bringing the deprecated service up, and they also retire datacenters with some regularity. So if you run a service that depends on deprecated services you could quickly find yourself in a situation where you have to migrate to maintain N+2 redundancy but there's hardly any datacenter with capacity available in the deprecated service you depend on.
Also, how many man-years of engineering do you want to spend on keeping goo.gl running? If you were an engineer would you want to be assigned this project? What are you going to put in your perf packet? "Spent 6 months of my time and also bothered engineers in other teams to keep this service that makes us no money running"?
> If you were an engineer would you want to be assigned this project?
If you're high flying, trying to be the next Urs or Jeff Dean or Ian Goodfellow, you wouldn't, but I'm sure there are many thousands of people able to do the job who would just love to work for Google, collect a paycheck on a $150k/yr job, and do that for the rest of their lives.
I'd like to encourage you to consider the following two perspectives --
1. A senior Google leader telling the shareholders "we've asked 1% of our engineers, that's 270 people, costing $80M/year, to work on services that produce no revenue whatsoever." I don't think it would pass that well.
2. A Google middle manager trying to figure out if an engineer working exclusively on non-revenue projects is actually being useful or otherwise; this is made more complex by about 30% of the workforce trying to go for the rest and vest option provided by these projects.
> A senior Google leader telling the shareholders "we've asked 1% of our engineers, that's 270 people, costing $80M/year, to work on services that produce no revenue whatsoever." I don't think it would pass that well.
The business case for this is that Google loses a bunch of money in b2b (cloud mostly, potentially AI in future) because professional users (developers etc) don't believe that products will be supported. Every time Google shuts down a service like this, that perception is reinforced. We're investing this money into these services to change our brand perception and help us make more money in future.
As a bonus, this kind of cultural change would also force them to rebuild their engineering systems (and promotional systems) to make this easier. This may not have mattered for Search/Ads but it will matter if they actually care about winning in cloud and AI.
A Google shareholder that shortsighted might as well ask why they have an HR department or custodians to maintain the offices; after all, those don't generate income either.
The manager in the trenches can tell if there's actual work happening. Moving goo.gl from the internal legacy system to the new supported one doesn't magically happen; code needs to change for it to keep working after the old system gets shut off.
A lot of Google infra services are built around the understanding that clients will be re-built to pick up library changes pretty often, and that you can make breaking API changes from time to time (with lots of notice).
But if you never retire the old, then you're endlessly supporting systems, forever. At some point, it does become cheaper to migrate everything to the new.
You know how Google deprecating stuff externally is a (deserved) meme? Things get deprecated internally even more frequently and someone has to migrate to the new thing. It's a huge pain in the ass to keep up with for teams that are fully funded. If something doesn't have a team dedicated to it eventually someone will decide it's no longer worth that burden and shut it down instead.
I think the concern is someone might scan all the inactive links and find that some of them link to secret URLs, leak design details about how things are built, link to documents shared with 'anyone with the link' permission, etc.
> I think the concern is someone might scan all the inactive links
How? Barring a database leak I don't see a way for someone to simply scan all the links. Putting something like Cloudflare in front of the shortener with a rate limit would prevent brute-force scanning. I assume Google built the shortener semi-competently (generating codes randomly), which would make it pretty hard to find valid links in the first place.
Removing inactive links also doesn't solve this problem. You can still have active links to secret docs.
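For a sense of scale, here's the napkin math on brute-force scanning, assuming 6-character codes over [a-zA-Z0-9] and a hypothetical rate limit of 100 requests/second (both numbers are assumptions for illustration, not goo.gl's actual parameters):

```python
# back-of-the-envelope: how long would it take to enumerate the keyspace?
ALPHABET = 62        # a-z, A-Z, 0-9 (assumed alphabet)
CODE_LEN = 6         # assumed code length
RATE = 100           # assumed allowed requests per second

keyspace = ALPHABET ** CODE_LEN                     # ~5.7e10 possible codes
years = keyspace / RATE / (3600 * 24 * 365)
print(f"{keyspace:,} codes, ~{years:.0f} years to enumerate at {RATE} req/s")
```

And even an unthrottled scanner only hits valid links in proportion to how densely the keyspace is actually populated.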
> I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.
Yeah, I can't imagine it being a huge cost saver? But I'm guessing the people who developed it have long since moved on, and it stopped being a cool project. And depending on the culture inside Google, it just doesn't pay career-wise to maintain someone else's project.
I think the problem with URL shorteners like Google's that include the company name is that, to the layperson, there is possibly an implied level of safety.
Here is a service that basically makes Google $0 and confuses a non-zero number of non-technical users when it sends them to a scam website.
Also, in the age of OCR on every device they make basically no sense. You can take a picture of a long URL on a piece of paper then just copy and paste the text instantly. The URL shortener no longer serves a discernible purpose.
Sure it can. It takes X people Y hours a day/week/month to perform tasks related to this service, including planning and digging up the context behind it. Those X people make Z dollars per year. It's an extremely simple math equation.
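Something like this, with every number made up purely for illustration:

```python
# napkin math for "what does keeping the service staffed actually cost?"
people = 2                   # hypothetical engineers who touch the service
fraction_of_time = 0.25      # hypothetical share of their year spent on it
cost_per_engineer = 400_000  # hypothetical fully loaded cost per year

annual_cost = people * fraction_of_time * cost_per_engineer
print(f"${annual_cost:,.0f} per year")   # $200,000 with these made-up numbers
```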
Goo.gl didn't have customers, it had users. Customers pay, either with money or their personal data, now or in the future. Goo.gl did not make any money or have a plan to do so in the future.
Why is it evil? Assume that a free URL shortener is a good thing and that shutting one down is a bad thing, and note that every link shortener has costs (not just the servers -- constant moderation, as scammers and worse use them) and no revenue. The only possible outcome is for them all to eventually shut down, causing unrecoverable linkrot.
Given those options, an ad seems like a trivial annoyance to anyone who very much needs a very old link to work. Anyone who still has the ability to update their pages can always update their links.
The monetary value of the goodwill and mindshare generated by such a free service is hard to calculate, but definitely significant. I wouldn't be surprised if it was more than it costs to run.
Which raises the obvious question -- why make a service that you know will eventually be shut down because of said economics. Especially one that (by design) will render many documents unusable when it is shut down.
While I generally find the "killed by Google" thing insanely short-sighted, this borders on straight-up negligence.
There was a time in Google where anything seemed possible and they loved doing experimental or fun things just for doing them. The ad machine printed money and nobody asked questions. Over time they started turning into a normal corporation with bean counting.
I always figured most of the real value of these URL-shortening services was as a marketing/tracking metric. That is, sort of equivalent to the "share with" widgets sites provide, which conveniently also dump tons of analytics to the services.
I'll be honest, I was never in an environment that would benefit from link shortening, so I don't really know whether any end users actually wanted them (my guess is mainly Twitter), and I always viewed these shortened links with extreme suspicion.
One of the complaints about Google is that it's difficult to launch products due to bureaucracy. I'm starting to think that's not a bad thing. If they'd done a careful analysis of the cost of jumping onto this URL-shortener bandwagon, we wouldn't be here. Maybe it's not a bad thing they move slower now.
I would bet that the salaries paid to the product managers behind shutting this down, during the time they worked on shutting it down, outweigh the annual cost of running the service by an order of magnitude.
At this point, anyone depending on Google for anything deserves to get burned.
I don't know how much more clearly they could tell their users that Google has absolutely no respect for users without drone shipping boxes of excrement.
If companies can spend billions on AI with nothing in return and be okay with that, in the sense of giving stuff away for free (okay, I'll admit not completely free since you are the product, but still free),
then they should also be okay with keeping the goo.gl links, honestly.
Shutting them off sounds kind of bad for goodwill, but this is literally Google; the one thing Google is notorious for is killing its products.
This is basically modern SV business. This old data is costing us about a million a year to hold onto. KILL IT NOW WITH FIRE.
Hey, let's also dump 100 billion dollars into this AI thing without any business plan or ideas to back it up this year. HOW FAST CAN YOU ACCEPT MY CHECK!
Can't dig this document up right now, but in their Chrome dev process they say something along these lines: "even if a feature is used by 0.01% of users, at scale that's a lot of users. Don't remove it until you've made sure the impact is negligible".
At Google scale I'm surprised [1] this is not applied everywhere.
Yup, 0.01% of users at scale is indeed a lot of users.
This is exactly why many big companies like Amazon, Google and Mozilla still support TLSv1.0, for example, whereas all the fancy websites would return an error unless you're using TLSv1.3 as if their life depends on it.
In fact, I just checked a few seconds ago with `lynx`, and Google Search even still works on plain old HTTP without the "S", too — no TLS required whatsoever to start with.
Most people are very surprised by this revelation, and many don't even believe it, because it's difficult to reproduce this with a normal desktop browser, apart from lynx.
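If you want to reproduce the plain-HTTP observation without lynx, something like this works (a quick sketch using only the Python standard library; it deliberately doesn't follow redirects, so you can see whether the response is served directly over HTTP or bounced to HTTPS):

```python
# probe a site over plain HTTP (port 80, no TLS) without following redirects
import http.client

conn = http.client.HTTPConnection("www.google.com", 80, timeout=10)
conn.request("GET", "/search?q=test")
resp = conn.getresponse()
print(resp.status, resp.reason)            # 200 = served over plain HTTP
print(resp.getheader("Location"))          # non-None = redirected (likely to HTTPS)
conn.close()
```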
By contrast, this also shows just how out of touch Walmart's digital presence really is, because somehow they deem themselves important enough to mandate TLSv1.2 and the very latest browsers, unlike all the major ecommerce heavyweights, and deny service to anyone who doesn't have the latest device with all the latest updates installed, breaking even slightly outdated browsers that do support TLSv1.2.
So bizarre. Embedded links, docs, social posts, stuff that could be years and years old, and they're expecting recent traffic to them? Why do they seem to think their link shortener is only being used for something like someone's social-profile linktree? Some marketing person's bizarre view of how the web is being used.
> We understand these links are embedded in countless documents, videos, posts and more, and we appreciate the input received.
How did they think the links were being used?