Hacker News

What’s the incentive for websites to let Kagi and others index their content if LLMs in search surface the relevant information right away? Wouldn’t something like Perplexity AI make more sense then? Or perhaps a better application of LLMs to search?


I haven't used Kagi's Quick Answer very often yet, but when I do, it always cites its sources and I often end up clicking into at least one of the sources to look for more detail or context.


Bingo. I nearly always use the quick answer, and then use the cited sources to click through to the page, either to read more or to verify that the summary was accurate (and it always has been in the 300+ times I've used it).


I've seen it cite sources and then say the opposite of what the citations say.

If you're going to use this feature, always check the citations.


If you publish to make information known, then your incentive is that it helps spread that information to people who might not otherwise visit your site. If you are trying to make money off of search engine traffic, you might not like this much at all. But I think most people would rather not be pointed to those sites in the first place, so it’s a win-win if they block crawlers.


This is a bad take. I don't make money off search engine traffic (beyond the occasional donated dollar), and yet I would much rather[0] that AI didn't visit my site.

Imagine for a second a world where, instead of publishing directly to your own website (home), you publish into the LLM's knowledge base. Nobody comes to you for information; they come to the LLM. Nobody cares about you, or the work you put in. Your labour is just a means to their end. To some extent, you could argue that Wikipedia already works this way, but it really doesn't: there, the work you put in is reproduced verbatim, and the work is collaborative. You get the joy of seeing your writing help other people in a very direct way, as opposed to being aggregated, misinterpreted, and generally warped by a non-intelligent LLM.

In other words, you cannot possibly expect others to want to work in a sweatshop, toiling away to provide you with instant gratification. We must leave room for human expression.

[0]: https://boehs.org/llms.txt


Okay, I’ve imagined the world you described. I don’t care for it. I also don’t think that’s a likely outcome of LLMs. Why would somebody continue “toiling away to provide you with instant gratification” if there’s nothing in it for them?


If you are selling something on your website, you are just as happy for people to find your information through an LLM as through a search engine.

If you are publishing information for free, you don't care how people access it.

If you are publishing information without selling anything and want to make a living from it, you should paywall it or sign with a publisher.

