
Ted Chiang is a master of analogies. It’s absolutely delightful to read his work and wrestle with the philosophical questions he explores. I devour almost everything he puts out, and his stories give me a much-needed escape from my world of bits and registers.

“LLMs are a blurry JPEG of the web” has stuck with me since the piece was published in the early days of ChatGPT. Another good one is his piece on why AI can’t make art.

While I heavily use AI both for work and in my day-to-day life, I still see it as a tool for massive wealth accumulation by a certain group, and it seems like Ted Chiang thinks along the same lines:

> But why, for example, do large corporations behave so much worse than most of the people who work for them? I think most of the people who work for large corporations are, to varying degrees, unhappy with the effect those corporations have on the world. Why is that? And could that be fixed by solving a math problem? I don’t think so.

> But any attempt to encourage people to treat AI systems with respect should be understood as an attempt to make people defer to corporate interests. It might have value to corporations, but there is no value for you.

> My stance on this has probably shifted in a negative direction over time, primarily because of my growing awareness of how often technology is used for wealth accumulation. I don’t think capitalism will solve the problems that capitalism creates, so I’d be much more optimistic about technological development if we could prevent it from making a few people extremely rich.



> master of analogies

analogy, in other words, embeddings?


No, analogies and embeddings are not exactly the same. Analogies in language are not a math function.

As Ted Chiang points out in the article, this kind of reasoning ("the brain is like $CURRENT_TECH") is flawed.


Well, embeddings come with a certain amount of loss as well.

If you believe our brains use "language" to think, then I would assume analogies play an important part in reasoning.
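To make the "loss" point concrete, here is a minimal sketch using plain NumPy and made-up random vectors (not a trained embedding model): cramming a large vocabulary into a handful of dimensions forces unrelated words to crowd together, so some fine-grained distinctions get smeared.

  # Toy sketch, assuming nothing about any real model: give a 10,000-word
  # vocabulary random 16-dimensional vectors and check how similar the
  # nearest "neighbour" of word 0 ends up being. Unrelated words looking
  # quite similar is one concrete sense in which embeddings are lossy.
  import numpy as np

  rng = np.random.default_rng(0)
  vocab_size, dim = 10_000, 16                      # far more words than dimensions
  vectors = rng.standard_normal((vocab_size, dim))  # one made-up vector per word

  # Cosine similarity of word 0 against every other word.
  norms = np.linalg.norm(vectors, axis=1)
  sims = (vectors[1:] @ vectors[0]) / (norms[1:] * norms[0])

  # Even though all vectors were drawn independently, the crowding in
  # 16 dimensions makes some unrelated pair look fairly similar.
  closest = int(np.argmax(sims)) + 1
  print(f"word 0 vs word {closest}: cosine similarity {sims[closest - 1]:.2f}")

This is only a geometric illustration of information loss, not a claim about how trained embeddings distribute words.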


I don't know if brains use language to think, but I do believe analogies play an important part in reasoning.

I'm just saying embeddings and analogies aren't the same; equating them is precisely the kind of flawed reasoning Ted Chiang mentions in the article.

Or to answer more directly:

> analogy, in other words, embeddings?

No, analogies aren't embeddings "in other words".
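For what it's worth, the "analogy = embeddings" intuition usually comes from the vector-offset trick in word embeddings (king - man + woman ≈ queen). A toy sketch with hand-built vectors (made up here, not taken from any trained model) shows what that trick actually computes, and why it only covers analogies that happen to be encoded as consistent geometric offsets:

  # Toy sketch of the vector-offset "analogy" trick often cited for word
  # embeddings. The vectors are hand-built along two invented axes
  # (gender, royalty), so this shows the mechanism only, not a claim
  # about real models.
  import numpy as np

  vectors = {                        # [gender, royalty]
      "man":    np.array([ 1.0, 0.0]),
      "woman":  np.array([-1.0, 0.0]),
      "king":   np.array([ 1.0, 1.0]),
      "queen":  np.array([-1.0, 1.0]),
      "prince": np.array([ 1.0, 0.5]),
  }

  def nearest(target: np.ndarray, exclude: set[str]) -> str:
      """Return the word whose vector is closest (by cosine) to `target`."""
      def cosine(a, b):
          return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
      return max((w for w in vectors if w not in exclude),
                 key=lambda w: cosine(vectors[w], target))

  # king - man + woman lands on queen *because* the geometry was built to
  # encode exactly that regularity; analogies that aren't encoded as
  # consistent offsets (metaphor, irony, novel comparisons) get no such
  # guarantee.
  result = nearest(vectors["king"] - vectors["man"] + vectors["woman"],
                   exclude={"king", "man", "woman"})
  print(result)  # -> "queen"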




