1. If Google is paying Fastly, what prevents Google from paying Fastly some more to divulge the user information as well? As far as I understand OHTTP, the model assumes that the parties are not aligned, which is not true for the Fastly:Google connection.
2. Even if 1. is not an issue: if the encrypted body is not salted, it is effectively just a unique hash of the original URL, which means that a fourth party could make inferences about the content of the message by encrypting many candidate URLs of interest under Google's encryption key. Salt is not mentioned in the article.
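The concern in point 2 can be sketched as a dictionary attack against a hypothetical unsalted scheme. All names below are illustrative, and the deterministic "encryption" is a deliberate toy (OHTTP's actual HPKE encryption is randomized per message, which is precisely what rules this attack out):

```python
import hashlib

PUBLIC_KEY = b"gateway-public-key"   # hypothetical stand-in for Google's key

def deterministic_encrypt(url: bytes) -> bytes:
    # No salt/nonce: the "ciphertext" is effectively a keyed hash of the URL,
    # so the same URL always produces the same bytes.
    return hashlib.sha256(PUBLIC_KEY + url).digest()

# An observer captures one encrypted body in transit...
observed = deterministic_encrypt(b"https://example.com/sensitive-page")

# ...then encrypts a dictionary of candidate URLs and looks for a match.
candidates = [b"https://example.com/", b"https://example.com/sensitive-page"]
recovered = [u for u in candidates if deterministic_encrypt(u) == observed]
print(recovered)   # the visited URL is identified without any decryption
```

Anyone holding the public key can run this "guess and confirm" loop, which is why deterministic public-key encryption of low-entropy plaintexts like URLs is unsafe.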
I'm not a Chrome user, so this is just random curiosity. In fact I like Google and many of the things they've made affordable to the world, but I also dislike the tracking of people, for most reasons and in most of the ways it is done.
OHTTP does require that the parties don't collude, which is why Google has engaged Fastly to run the relay service (which knows end user identifying data) and are themselves running the gateway service (which knows the end user request body).
The contract terms include not delivering log data to Google for this service, among other things that help ensure that this separation of knowledge is upheld.
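The separation of knowledge described above can be modeled in a few lines. This is purely illustrative (not Fastly's or Google's code): the relay sees who is asking but not what, and the gateway sees what is asked but not by whom.

```python
# Toy model of the OHTTP split of knowledge; all names are illustrative.
request = {"source_ip": "203.0.113.7", "sealed_body": b"<hpke ciphertext>"}

def relay_view(req: dict) -> dict:
    # The relay (Fastly) sees the client address but cannot open the body.
    return {"source_ip": req["source_ip"], "sealed_body": "opaque"}

def forward_to_gateway(req: dict) -> dict:
    # The relay strips network-level identity before forwarding.
    return {"sealed_body": req["sealed_body"]}

def gateway_view(fwd: dict) -> dict:
    # The gateway (Google) can decrypt the body but gets no client address.
    return {"plaintext_body": "<decrypted request>", "source_ip": None}

forwarded = forward_to_gateway(request)
print("source_ip" in forwarded)   # the gateway never receives the IP
```

Breaking either property requires collusion (or a leaky contract), which is exactly the failure mode being debated in this thread.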
First, thanks for the answers - both this comment and the other in the thread. HN shines again when it comes to access to the sources of information.
Second, as I said in another comment, I'm not a Chrome user and I'm asking mostly for personal entertainment. However, I think I'm asking the questions that anyone not in the space would ask looking in from the outside. Hopefully your answers are of use to someone else.
Third, my personal, biased opinion is that this will not resolve any of the issues surrounding Google and online tracking. I lost my trust in Google many years ago and things haven't changed since. Even this initiative, which is supposed to underpin user privacy and choice, is delivered as a corporate project, with Google choosing who decides on the allowed URLs, the OHTTP provider, and every other parameter of the "deal". As I said, I cannot comment on the cryptography, but nothing else in the story gives me confidence that user choice has been upheld as a value. I doubt anyone will change their opinion because of all this.
Some measures that could have demonstrated transparency: Google not being the sole authority on the allow list, letting people choose their OHTTP provider, granting that authority to an NGO with transparent rules and decision-making processes, independent oversight...
Appreciate your questions and feedback. There's nothing wrong with some healthy skepticism. Ultimately this solution depends on the tech and implementation but it also requires a degree of user trust. I've been happy to see both Fastly and Google being pretty transparent about what's going on and how it works, in order to start establishing that trust.
I can't speak to your points about Google specifically, but I have appreciated in my interactions with the Privacy Sandbox team that they are putting a lot of energy into delivering these services while also respecting user privacy.
On the Fastly side, I see an opportunity to deliver OHTTP services for a bunch of additional use cases and to other customers. I think this could be a powerful tool to enable privacy for all sorts of things, like metrics and log collection and other kinds of API access. The spec right now needs the client to know various things which requires a tight coupling between client -> relay -> gateway -> target, but I think that there are ways that could be adjusted in future revisions. And not all of the opportunities that I'm exploring are for commercial entities, to your point about NGOs.
I'm also working on some other privacy enablement services, like Fastly Privacy Proxy (which is one of the underlying providers for Apple's iCloud Private Relay) and some unannounced things. Between these various technologies I think that Fastly can help raise the bar for end user privacy across the industry.
Ultimately we are a business and we like making money. I think we can do that in this space by delivering real value to our customers and their end users via these building block services that help them to build privacy enabled products. I'm hopeful that, as we explore more opportunities in this space and OHTTP adoption increases, user trust continues to be built in both the OHTTP technology and Fastly's privacy enablement services.
How far does this scenario go, though? It would be easier for Google to just deploy a modified client app (Chrome) that skips the blinding infrastructure, or simply sends the user information to them directly anyway. In the end, unless you're reading and building every line of code (and maybe even the hardware), you are delegating trust.
How far would a multi-billion-dollar company go to protect its main source of income? They have a history of privacy assurances that fell apart under serious inspection. The FLoC scheme was a few months back, the Location History scandal was a few years back, and I'm mentioning only those off the top of my head.
That's fair. I suppose my point is that there are easier ways; I don't think there's any way to technically achieve full privacy on the current design of the internet that couldn't be circumvented by operational changes. But perhaps the point is rather that technologies like this are used more to provide false assurance through obscurity. Although that does sound skeptical and bleak, given there are no real solutions to it.
Correct me if I'm wrong, but either the user's encryption key is unique, and hence the user is identifiable to Google, or a finite number of keys is used, which makes the encrypted text subject to identification, because only a finite number of inputs can produce a given output.
My cryptography education is rather superficial, so I might be missing something.
I know nothing about what has been implemented by Google/Fastly. All I can point you to is the details of Cloudflare's implementation and RFCs (which we co-authored): https://datatracker.ietf.org/doc/draft-ietf-ohai-ohttp/
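For what it's worth, the HPKE design in that draft resolves the dilemma above with a third option: one static gateway key for everyone, plus a fresh ephemeral key encapsulated per message, so identical requests still produce different ciphertexts. Below is a stdlib-only toy sketch of that shape. It is NOT real HPKE (RFC 9180 uses X25519, HKDF, and an AEAD; this uses a deliberately tiny, insecure DH group and a hash-based keystream), just an illustration of the mechanism:

```python
import hashlib
import secrets

# Toy DH "KEM": one static gateway key, fresh ephemeral key per message.
P = 2**64 - 59            # small prime group; insecure, illustration only
G = 5

gateway_secret = secrets.randbelow(P - 2) + 1
gateway_public = pow(G, gateway_secret, P)    # the single published key

def _keystream(shared: int, n: int) -> bytes:
    return hashlib.sha256(shared.to_bytes(8, "big")).digest()[:n]

def seal(msg: bytes) -> tuple[int, bytes]:
    assert len(msg) <= 32                     # one hash block, demo only
    eph = secrets.randbelow(P - 2) + 1        # fresh randomness per request
    enc = pow(G, eph, P)                      # "encapsulated key", sent along
    shared = pow(gateway_public, eph, P)
    ct = bytes(m ^ k for m, k in zip(msg, _keystream(shared, len(msg))))
    return enc, ct

def open_(enc: int, ct: bytes) -> bytes:
    shared = pow(enc, gateway_secret, P)      # gateway derives the same secret
    return bytes(c ^ k for c, k in zip(ct, _keystream(shared, len(ct))))

url = b"https://example.com/"
enc1, ct1 = seal(url)
enc2, ct2 = seal(url)
print(ct1 != ct2)                  # same plaintext, different ciphertexts
print(open_(enc1, ct1) == url)     # yet the gateway decrypts both
```

Because every ciphertext carries its own encapsulated randomness, the "encrypt many candidate URLs and compare" attack from earlier in the thread doesn't apply, and no per-user key is needed either.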