What amazes me is that this wasn't the original plan. What product manager thinks "the best thing for our customers is to delete their data!"?

> We understand these links are embedded in countless documents, videos, posts and more, and we appreciate the input received.

How did they think the links were being used?



I read in an earlier HN thread on this: "this is a classic example of a data-driven product decision", aka we can reduce costs by $x if we just stop serving goo.gl links, instead of actually wondering how this would impact customers.

Also helps that they are in a culture which does not mind killing services on a whim.


The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.

I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.

They also might have wanted to use the domain for something else.


How much of a burden could this really be?

The nature of something like this is that the cost to run it naturally goes down over time. Old links get clicked less, so the hardware costs would be basically nothing.

As for the actual software security, it's a URL shortener. They could rewrite the entire thing in almost no time with just a single dev. Especially since it's strictly hosting static links at this point.

It probably took them more time and money to find inactive links than it'd take to keep the entire thing running for a couple of years.


"How much of a burden could this really be?"

My understanding from conversations I've seen about Google Reader is that the problem with Google is that every few years they have a new wave of infrastructure, which necessitates upgrading a bunch of things about all of their products.

I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.

If a product has an active team maintaining it they can handle the upgrade. If a product has no team assigned there's nobody to do that work.


My understanding is that (at least at one point) binaries older than about six months were not allowed to run in production. But APIs are "evolving" irregularly so the longer you go between builds the more likely something is going to break. You really need a continuous build going to stay on top of it.

Best analogy I can think of is log-rolling (as in the lumberjack competition).


Google famously runs a monorepo and is basically the gold standard of CI/CD.

What does happen is APIs are constantly upgraded and rewritten and deprecated. Eventually projects using the deprecated APIs need to be upgraded or dropped. I don't really understand why developers LOVE to deprecate shit that has users but it's a fact of life.

Second-hand info about Google only, so take it with a grain of salt.


Simple: you don't get promoted for maintaining legacy stuff. You do get promoted for providing something new that people adopt.

As such, developing a new API gets more brownie points than rebuilding a service that does a better job of providing an existing API.

To be more charitable, having learned lessons from an existing API, a new one might incorporate those lessons learned and be able to do a better job serving various needs. At some point, it stops making sense to support older versions of an API as multiple versions with multiple sets of documentation can be really confusing.

I'm personally cynical enough to believe more in the less charitable version, but it's not impossible.


I agree this is an overriding incentive that hurts customers and companies, and I don't think there's an easy fix. Designing and creating new products demonstrates more promotion-relevant capabilities than maintaining legacy code does.


> I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.

Arrival of new does not necessitate migration.

Only departure of old does.


They deprecate internal infrastructure stuff zealously and tell teams they need to be off of such and such by this date.

But it's worse than that because they'll bring up whole new datacenters without ever bringing the deprecated service up, and they also retire datacenters with some regularity. So if you run a service that depends on deprecated services you could quickly find yourself in a situation where you have to migrate to maintain N+2 redundancy but there's hardly any datacenter with capacity available in the deprecated service you depend on.

Also, how many man-years of engineering do you want to spend on keeping goo.gl running? If you were an engineer, would you want to be assigned this project? What are you going to put in your perf packet? "Spent 6 months of my time and also bothered engineers in other teams to keep this service that makes us no money running"?


> If you were an engineer would you want to be assigned this project?

If you're a high flyer, trying to be the next Urs or Jeff Dean or Ian Goodfellow, you wouldn't, but I'm sure there are many thousands of people able to do the job who would just love to work for Google, collect a paycheck on a $150k/yr job, and do that for the rest of their lives.


I'd like to encourage you to consider the following two perspectives --

1. A senior Google leader telling the shareholders "we've asked 1% of our engineers, that's 270 people, costing $80M/year, to work on services that produce no revenue whatsoever." I don't think that would go over well.

2. A Google middle manager trying to figure out whether an engineer working exclusively on non-revenue projects is actually being useful; this is made more complex by about 30% of the workforce angling for the rest-and-vest option these projects provide.


> A senior Google leader telling the shareholders "we've asked 1% of our engineers, that's 270 people, costing $80M/year, to work on services that produce no revenue whatsoever." I don't think that would go over well.

The business case for this is that Google loses a bunch of money in B2B (cloud mostly, potentially AI in the future) because professional users (developers etc.) don't believe that products will be supported. Every time Google shuts down a service like this, that perception is reinforced. We're investing this money into these services to change our brand perception and help us make more money in the future.

As a bonus, this kind of cultural change would also force them to rebuild their engineering systems (and promotional systems) to make this easier. This may not have mattered for Search/Ads but it will matter if they actually care about winning in cloud and AI.


A Google shareholder that shortsighted might as well ask why they have an HR department or custodians to maintain the offices; after all, those don't generate income either.

The manager in the trenches can tell if there's actual work happening: moving goo.gl from the internal legacy system to the new supported one doesn't magically happen; code needs to change for it to keep working after the old system gets shut off.


Because it costs money to run things, and no one wants to pay for something that they aren't getting career value for.


A lot of Google infra services are built around the understanding that clients will be re-built to pick up library changes pretty often, and that you can make breaking API changes from time to time (with lots of notice).


But if you don't turn down the old, then you're supporting those systems endlessly, forever. At some point, it does become cheaper to migrate everything to the new.


And you could assign somebody to do that work, but who wants to be employed as the maintainer of a dead product? It’s a career dead-end.


> If a product has no team assigned there's nobody to do that work.

This seems like a good eval case for autonomous coding agents.


> How much of a burden could this really be?

You know how Google deprecating stuff externally is a (deserved) meme? Things get deprecated internally even more frequently and someone has to migrate to the new thing. It's a huge pain in the ass to keep up with for teams that are fully funded. If something doesn't have a team dedicated to it eventually someone will decide it's no longer worth that burden and shut it down instead.


I assume the general problem is people using these links for bad purposes, and Google having to deal with moderating them.


I think the concern is someone might scan all the inactive links and find that some of them link to secret URLs, leak design details about how things are built, link to documents shared with 'anyone with the link' permission, etc.


> I think the concern is someone might scan all the inactive links

How? Barring a database leak, I don't see a way for someone to simply scan all the links. Putting something like Cloudflare in front of the shortener with a rate limit would prevent brute-force scanning. I assume Google semi-competently built the shortener (using a random number generator), which would make it pretty hard to find links in the first place.

Removing inactive links also doesn't solve this problem. You can still have active links to secret docs.
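
For illustration, a minimal sketch of the kind of generation I mean, in Python (the six-character length and base62 alphabet are my assumptions, not anything Google published):

    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits  # base62

    def new_code(length: int = 6) -> str:
        # Cryptographically random code: with 62**6 (~5.7e10) possibilities
        # and only a small fraction ever allocated, guessing a specific
        # live link by brute force is impractical.
        return "".join(secrets.choice(ALPHABET) for _ in range(length))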


To make the URLs actually short, you need to use most/all of the keyspace.

Back when it was made, shorteners were competing to see who could make the shortest URL, so I bet a brute force scan would find everything.
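
Quick back-of-the-envelope in Python (illustrative numbers only; the real allocation counts were never public):

    # Keyspace density for base62 codes of different lengths,
    # assuming ~1 billion links were ever created (an assumption).
    allocated = 1_000_000_000
    for length in (5, 6, 7):
        keyspace = 62 ** length
        print(f"{length} chars: {allocated / keyspace:.2%} of probes hit a live link")
    # 5 chars: ~109% (1B links don't even fit); 6 chars: ~1.76%; 7 chars: ~0.03%

At five characters a scan hits constantly, and even at six it's roughly one live link per 57 random probes, which is very scannable without a rate limit.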


> You can still have active links to secret docs.

If they're reachable via a (passwordless) URL, they're not secret.


> My guess would be that it was a security and maintenance burden that nobody wanted.

Cloudflare offered to run it and Google turned them down:

https://x.com/elithrar/status/1948451254780526609


> I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.

Yeah, I can't imagine it being a huge cost saver. But I'm guessing the people who developed it have long since moved on, and it stopped being a cool project. And depending on the culture inside Google, it just doesn't pay career-wise to maintain someone else's project.


>The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.

It's a strange thing to consider 'since 2018' "a long time". Only in tech circles is this so, not in normal life.


I really doubt it was about security/maintenance burdens. Under the hood, goo.gl just uses Firebase Dynamic Links which is still supported by Google.

Edit: nevermind, I had no idea Dynamic Links is deprecated and will be shutting down.


Firebase Dynamic Links is shutting down at the end of August 2025.


I had no idea. It's too late to delete my comment now.

It's a really ridiculous decision though. There's not a lot that goes into a link redirection service.


Documents from 2018 haven't decayed or somehow become irrelevant.


I think the problem with URL shorteners like Google's, which include the company name, is that to the layperson there is possibly an implied level of safety.

Here is a service that makes Google basically $0 and confuses a non-zero number of non-technical users when it sends them to a scam website.

Also, in the age of OCR on every device, they make basically no sense. You can take a picture of a long URL on a piece of paper and just copy and paste the text instantly. The URL shortener no longer serves a discernible purpose.


Shorter URLs mean fewer characters to encode in a QR code.


And how does that matter? The QR gets read either way.


No, the code is smaller and more readable; also, a shortener means an additional tracking layer.


Less complex QR codes are easier to scan, especially at a distance
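
A quick sketch with the Python `qrcode` package shows the effect (the URLs are made-up examples; exact versions depend on the error-correction level):

    import qrcode

    def qr_version(url: str) -> int:
        qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_M)
        qr.add_data(url)
        qr.make(fit=True)   # pick the smallest QR version that fits
        return qr.version   # version 1 = 21x21 modules, +4 modules per version

    print(qr_version("https://goo.gl/aB3dE9"))  # short link: ~version 2
    print(qr_version("https://example.com/long/campaign-path?utm_source=poster&utm_medium=print"))  # ~version 5

Fewer modules means each module is physically larger at the same print size, which is what makes the code easier to scan at a distance.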


How much does it really cost Google to answer some quick HTTP requests and redirect, vs. serving all their YouTube videos etc.?


"security and maintenance burden" == "cost" == "cost-driven decision"


Capital inputs are one part of the equation. The human cost of mental and contextual overhead cannot be reduced to dollars and cents.


Sure it can. It takes X people Y hours a day/week/month to perform tasks related to this service, including planning and digging up the context behind it. Those X people make Z dollars per year. It's an extremely simple math equation.


Emotional labor doesn’t show up on a balance sheet.


Goo.gl didn't have customers, it had users. Customers pay, either with money or their personal data, now or in the future. Goo.gl did not make any money or have a plan to do so.


One wonders why they don't, instead of shutting down, display a 15s unskippable YouTube-style interstitial ad prior to redirecting.

That way they'd make money, the service could be funded instead of shut down, and there wouldn't be any linkrot.


This is such an evil idea.


Why is it evil? Assume that a free URL shortener is a good thing and that shutting one down is a bad thing, and note that every link shortener has costs (not just the servers -- constant moderation needs, as scammers and worse use them) and no revenue. The only possible outcome is for them all to eventually shut down, causing unrecoverable linkrot.

Given those options, an ad seems like a trivial annoyance to anyone who very much needs a very old link to work. Anyone who still has the ability to update their pages can always update their links.


This is how every URL shortener on the internet used to work.


Well, it's either that or a paywall. Pick your poison.


The monetary value of the goodwill and mindshare generated by such a free service is hard to calculate, but definitely significant. I wouldn't be surprised if it was more than it costs to run.


And also the ongoing demonstration of why you should never trust Google.

"Here's a permanent (*) link".

[*] Definitions of permanent may vary wildly.


Which raises the obvious question -- why make a service that you know will eventually be shut down because of said economics? Especially one that (by design) will render many documents unusable when it is shut down.

While I generally find the "killed by Google" thing insanely short-sighted, this borders on straight-up negligence.


There was a time in Google where anything seemed possible and they loved doing experimental or fun things just for doing them. The ad machine printed money and nobody asked questions. Over time they started turning into a normal corporation with bean counting.


I always figured most of the real value of these URL-hashing services was as a marketing tracking mechanism. That is, sort of equivalent to the "share with" widgets sites provide, which conveniently also dump tons of analytics to the services.

I'll be honest: I was never in an environment that would benefit from link shortening, so I don't really know if any end users actually wanted them (my guess: Twitter, mainly), and I always viewed these hashed links with extreme suspicion.


One of the complaints about Google is that it's difficult to launch products due to bureaucracy. I'm starting to think that's not a bad thing. If they'd done a careful analysis of the cost of jumping on the URL-shortener bandwagon, we wouldn't be here. Maybe it's not a bad thing that they move slower now.


I would bet that the salaries paid to the product managers behind shutting this down, during the time they worked on shutting it down, outweigh the annual cost of running the service by an order of magnitude.


At this point, anyone depending on Google for anything deserves to get burned. I don't know how much more clearly Google could tell its users that it has absolutely no respect for them, short of drone-shipping boxes of excrement.


If companies can spend billions on AI with nothing in return and be okay with that, in the way of giving away free stuff (okay, I'll admit it's not completely free since you are the product, but still free),

then they should also be okay with keeping the goo.gl links, honestly.

Sounds kinda bad for goodwill, but this is literally Google; the one thing Google is notorious for is killing its products.


This is basically modern SV business. This old data is costing us about a million a year to hold onto. KILL IT NOW WITH FIRE.

Hey, let's also dump 100 billion dollars into this AI thing this year without any business plan or ideas to back it up. HOW FAST CAN YOU ACCEPT MY CHECK!


Nobody wants to catch a falling knife, everybody wants to attach a lanyard to the moon rocket.


Hard to imagine costs were ever a factor.

For a company running GCP and giving away things like Colab TPUs for free, the cost of running a URL service would be a trivial rounding error at best.


Outside of bandwidth, I could run this entire service on a Raspberry Pi. No, I'm not exaggerating. It's just a lookup from the short code at the end of the URL to the full URL.

I've handled far more traffic on single machines 20 years ago.
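
To make that concrete, here's a minimal sketch in Python, stdlib only (the table and port are made up; a real deployment would load the mapping from disk):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical short-code -> destination table; the real service is
    # just a vastly larger version of this mapping.
    LINKS = {
        "/abc123": "https://example.com/some/very/long/path",
    }

    class Redirector(BaseHTTPRequestHandler):
        def do_GET(self):
            target = LINKS.get(self.path)
            if target:
                self.send_response(301)  # permanent redirect
                self.send_header("Location", target)
                self.end_headers()
            else:
                self.send_error(404, "Unknown short link")

    if __name__ == "__main__":
        HTTPServer(("", 8080), Redirector).serve_forever()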


Arguably, this is them collecting the wrong types of data to inform decisions, if that isn't represented in the data.


All while data and visibility are part of the business.

Like other things they've spun down, there must not be value in the links.


For all HN commenters: if you are not paying for it, you are not a customer and thus you should not complain.


> How did they think the links were being used?

Can't dig this document up right now, but in their Chrome dev process they say something along these lines: "even if a ferie is used by 0.01% of users, at scale that's a lot of users . Don't remove until you've made solely due impost is negligible".

At Google scale I'm surprised [1] this is not applied everywhere.

[1] Well, not that surprised


Yup, 0.01% of users at scale is indeed a lot of users.

This is exactly why many big companies like Amazon, Google and Mozilla still support TLSv1.0, for example, whereas all the fancy websites return an error unless you're using TLSv1.3, as if their lives depended on it.

In fact, I just checked a few seconds ago with `lynx`, and Google Search even still works on plain old HTTP without the "S", too — no TLS required whatsoever to start with.

Most people are very surprised by this revelation, and many don't even believe it, because it's difficult to reproduce this with a normal desktop browser, apart from lynx.

But this also shows just how out of touch Walmart's digital presence really is: somehow they deem themselves important enough to mandate TLSv1.2 and the very latest browsers, unlike all the major ecommerce heavyweights, and deny service to anyone who doesn't have the latest device with all the latest updates installed, breaking even slightly outdated browsers that do support TLSv1.2.
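
If you want to check TLS support yourself, here's a rough probe sketch in Python (pass whatever hostname you want to test; note that OpenSSL 3 builds often disable TLSv1.0 at the default security level, hence the set_ciphers line):

    import socket
    import ssl

    def accepts_tls10(host: str) -> bool:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        ctx.minimum_version = ssl.TLSVersion.TLSv1  # force TLSv1.0 only
        ctx.maximum_version = ssl.TLSVersion.TLSv1
        ctx.set_ciphers("DEFAULT:@SECLEVEL=0")      # re-enable legacy protocols
        try:
            with socket.create_connection((host, 443), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=host):
                    return True
        except (ssl.SSLError, OSError):
            return False

    print(accepts_tls10("www.google.com"))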


I guess the number of people who use Chrome to access files via FTP must be below 0.01% then.

https://www.auslogics.com/en/articles/is-it-bad-that-google-...


You can always justify the removal of something despite the guidelines.


A “ferie”?


It makes solely due impost.


A feature :) iOS keyboard is unusable, but it also produces unreadable text


So bizarre. Embedded links, docs, social posts, stuff that could be years and years old, and they're expecting recent traffic to them? Why do they seem to think their link shortener is only being used for something like someone's social-profile linktree? Some marketing person's bizarre view of how the web is being used.


One that is operating in an environment where strict privacy laws exist. User data stuck in legacy systems is a liability.

Not only are things evolving internally within Google, laws are evolving externally and must be followed.


It's a redirect service. In archive mode, no private info is needed. No creator, no anything. Just a link-to-link mapping.


tail -f access.log maybe?



