
Yes. If you get 120M+ in funding, you no longer qualify to be called indie.

So what would the limit be in your mind? Does it include marketing?

Regardless of the definition of the word, I don't think anybody would call a game indie when it has a production budget in the tens of millions, over 300 developers working on it, and a movie deal before it was even released.

So under ten million max then?

What's the maximum developer count? Do outsourced assets count, and if so, how? By the number of people at the outsourcing company who directly worked on the assets, or by its whole headcount?


Just admit you have no idea about indie games. For many years now it has been clear what is NOT an indie game, and a 7-figure production budget (marketing not included; most indie games don't even have a marketing budget outside of social media) with hundreds of people working on the game is exactly that. Just compare it to any other indie game if you want to educate yourself about something before posting.

They had an orchestra the size of entire indie studios for the music alone; does that seem like indie?


You're feeding a sea lion.

I thought only organs from people dying in accidents are donated, not from someone dying of an illness.


I think the criteria for donation are most easily met by people who die as a result of something like a vehicle collision, but an otherwise healthy person who experiences sudden heart failure may have viable organs... From the reporting, this donor was not otherwise healthy, but maybe the symptoms were not known at the time or dismissed for some reason.


Not really. Organs from people who have already died are almost always nonviable. So when it comes to vehicle collision victims, only people who are slowly dying of internal hemorrhaging are used. Sudden heart failure is bad for organs for two reasons: first, if the heart actually fails, you have minutes before the organs become nonviable. Second, the medication used to try to keep the heart beating will actually accelerate death, including organ death, if it doesn't work.

Most organs come from people, usually braindead, who are definitely going to die but whose bodies won't lose the fight for days, or at least hours. And even then the extraction process needs to be started quickly, because in the process of dying the body, as it loses blood (i.e. power and oxygen), will cut off blood flow to organs one by one to try to keep the heart, lungs and brain alive. Most organs that have had their blood flow cut off by the body can't be transplanted, so extraction needs to happen before that point.

So that was probably the case here.


(Wait until you figure out the answer to the other question: "if organs after death are nonviable and not used, then what happens if I donate my body to science?")


I had a WTF moment last week: I was writing SQL, and there was no autocomplete at all. Then a chunk of autocompleted code appeared that looked like an SQL injection attack, with some "drop table" mixed in. The code would not have worked, it was syntactically rubbish, but it still looked spooky. I should have taken a screenshot of it.


This is the most annoying thing, and it has even happened to JetBrains' Rider too.

Some stuff that used to work well with smart autocomplete / intellisense got worse with AI based autocomplete instead, and there isn't always an easy way to switch back to the old heuristic based stuff.

You can disable it entirely and get dumb autocomplete, or get the "AI powered" rubbish, but they had a very successful heuristic / statistics based approach that worked well without suggesting outright rubbish.

In .NET we've had IntelliSense for 25 years that would only suggest properties that could exist, and then a while ago I found that VS Code was auto-completing properties that don't exist.

It's maddening! The least they could have done is put in a Roslyn pass to filter out the impossible.


Loosely related: voice control on Android with Gemini is complete rubbish compared to the old assistant. I used to be able to have texts read out and dictate replies whilst driving. Now it's all nondeterministic which adds cognitive load on me and is unsafe in the same way touch screens in cars are worse than tactile controls.


I've been immensely frustrated by no longer being able to set reminders by voice. I got so used to saying "remind me in an hour to do x" and now that's just entirely not an option.

I'm a very forgetful person and easily distracted. This feature was incredibly valuable to me.


I got Gemini Pro (or whatever it's called) for free for a year on my new Pixel phone, but there's an option to keep Assistant, which I'm using.

Gotta love the enshittification: "new and better" being more CPU cycles being burned for a worse experience.

I just have a shortcut to the Gemini webpage on my home screen if I want to use it. For some reason I can't place the shortcut directly (maybe it's my ancient launcher, which isn't even in the Play Store anymore), so I have to make a Tasker task that opens the webpage when run.


This is my biggest frustration. Why not check with the compiler and generate code that would actually compile? I've had this with Go and .NET in the JetBrains IDEs. I had to turn ML auto-completion off; it was getting in the way.


The regular JetBrains IDEs have a setting to disable the AI-based inline completion; you can then just assign it to a hotkey and call it when needed.

I found that it makes the AI experience so much better.


There is no setting to revert to the old, very reliable, high-quality "AI" autocomplete that did not recommend class methods that don't exist, and that reliably figured out the pattern in the 20 lines I was writing without randomly suggesting 100 lines of new code that only disrupt my view of the code I'm trying to work on.

I even ticked the "Don't do multiline suggestions" checkbox because the above was so absurdly anti-productive, but it was ignored.


Try disabling "Enable the next edit suggestions" in the AI settings.


The most WTF moment for me was that recent Visual Studio versions hooked up the “add missing import” quick fix suggestion to AI. The AI would spin for 5s, then delete the entire file and only leave the new import statement.

I’m sure someone on the VS team got a pat on the back for increasing AI usage but it’s infuriating that they broke a feature that worked perfectly for a decade+ without AI. Luckily there was a switch buried in settings to disable the AI integration.


You can still use the older ML-model (and non-LLM-based!) IntelliCode completion suggestions - it’s buried in the VS Installer as an optional feature entirely separate from anything branded CoPilot.


The last time I asked Gemini to assist me with some SQL I got (inside my postgres query form):

  This task cannot be accomplished
  USING
    standard SQL queries against the provided database schema. Replication slots
    managed through PostgreSQL system views AND functions,
    NOT through user-defined tables. Therefore,
    I must return
It feels almost haiku-like.


Gemini weirdly messes things up even though it seems to have the right information, something I've started noticing more often recently. I'd ask it to generate a curl command to call some API, and it would describe (correctly) how to do it and then generate the code/command, but the command would have obvious things missing: the 'https://' prefix in some cases, sometimes the API path, sometimes the auth header/token, even though it mentioned all of those things correctly in the text summary it gave above the code.

I feel like this problem was far less prevalent a few months/weeks ago (before gemini-3?).

Using it for research/learning purposes has been pretty amazing though, while claude code is still best for coding based on my experience.


Now this is prime software gore


The problem with scraping the web to teach AI is that the web is full of 'little Bobby Tables' jokes.


Same thing happened to me today in VS Code. A simple Helm template:

```{{ .default .Values.whatever 10 }}``` instead of the correct ```{{ default 10 .Values.whatever }}```.

Pure garbage which should be solved by now. I don't understand how it can make such a mistake.


This is a great post. Next time you see it, grab a screenshot, put it on GitHub Pages, and post it here on HN. It will generate lots of interesting discussion about rubbish suggestions from poor LLM models.


> rubbish suggestions from poor LLM models.

We get rubbish suggestions from SOTA(tm) LLM models too, y’know.


I read somewhere that it might not have been a process but a unique event. Dogs are not just gradually tamed wolves; domestication might have started with a genetic defect that made them tame.


That would create a genetic bottleneck of one, which should shine like a beacon in the DNA studies. We already know Homo sapiens had a bottleneck of thousands at one point.

It also makes me wonder about the long-standing question of speciation. If it happens suddenly, shouldn't that indicate a singular (or near-singular) mutation event?


>strikingly high probabilities to very high prices, along with lower probabilities for a wide range of lower prices

Isn't this describing the strategy of keeping prices permanently high and then doing occasional temporary price cuts/sales/deals?


> see streaming companies increasing prices every few months

They can do that because they are practically monopolies.


They can do it because people are hopelessly addicted to screens.

You won't die if you stop watching Netflix. We aren't talking food or medicine here. In fact your life would probably improve. But addiction is a real animal.


I wish there were some term other than addiction here: addicts routinely steal from friends and family to feed their addiction; addicts who are parents sometimes threaten to stop allowing their children to visit with a grandparent unless the grandparent helps the addict pay for the addiction; drug addicts living in violent neighborhoods sometimes agree to murder somebody in exchange for drugs.

Screen addicts almost never stoop that low and the ones that do are addicted to a cam girl (e.g., Grant Amato), porn or gambling, not Netflix (or social media).


Battlefield peaked with BF3


There are libraries, like fastutil, that provide collections for primitive types.
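For instance, a minimal sketch with fastutil (assuming the it.unimi.dsi:fastutil dependency is on the classpath; the class and variable names are just made up for illustration):

  // fastutil's primitive collections avoid boxing ints into Integer objects.
  import it.unimi.dsi.fastutil.ints.Int2IntOpenHashMap;
  import it.unimi.dsi.fastutil.ints.IntArrayList;
  import it.unimi.dsi.fastutil.ints.IntList;

  public class FastutilSketch {
    public static void main(String[] args) {
      // int -> int hash map, no Integer boxing on put/get
      Int2IntOpenHashMap counts = new Int2IntOpenHashMap();
      counts.put(42, 1);
      counts.addTo(42, 1); // increment the mapped value in place

      // primitive int list instead of ArrayList<Integer>
      IntList ids = new IntArrayList(new int[] {1, 2, 3});
      System.out.println(counts.get(42) + " " + ids.getInt(2)); // prints: 2 3
    }
  }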


-10 for modules is fair; only 4 for lambdas is not. My programming style changed after using lambdas in Java, even when I later used a different programming language that doesn't have lambdas as such.


Lambdas + streams are fantastic. I think that without lambdas, streams would just be a total mess to use.
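A rough toy sketch of what they buy you (assuming Java 16+ for the record syntax; the Order type and the data are made up for illustration):

  import java.util.List;
  import java.util.Map;
  import java.util.stream.Collectors;

  public class StreamsSketch {
    // toy data type for the example
    record Order(String customer, double total) {}

    public static void main(String[] args) {
      List<Order> orders = List.of(
          new Order("alice", 10.0),
          new Order("bob", 5.0),
          new Order("alice", 7.5));

      // Lambdas and method references make the pipeline read like the intent:
      // sum order totals per customer, skipping tiny orders.
      Map<String, Double> totalByCustomer = orders.stream()
          .filter(o -> o.total() > 1.0)
          .collect(Collectors.groupingBy(Order::customer,
                   Collectors.summingDouble(Order::total)));

      System.out.println(totalByCustomer); // e.g. {alice=17.5, bob=5.0}
    }
  }

Without lambdas, every filter and collector above would be a named or anonymous class, which is exactly the mess streams would otherwise be.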


Substitute orangutans for Australopithecus. That is (one of) the branches that did evolve more intelligence but didn't survive. I suppose there were lots of such branches that either merged back into humanity (like the Neanderthals) or died out.


Australopithecus is essentially on the human branch, and likely was still several million years before the development of advanced intelligence. Our common ancestor with Australopithecus was not any more intelligent than a typical Australopithecus, as far as we can tell.

As far as we can tell, no branch developed significantly increased intelligence after splitting off from our own lineage. That's not to say it definitely didn't happen or that our lineage was always the smartest, just that there isn't any evidence demonstrating a qualitative difference which has survived to the present. But it's weird that no such evidence exists.

Conversely, different primate groups did independently evolve similar levels of intelligence: Capuchin monkeys (which are New World primates) developed their intelligence after splitting off from the Old World primates some 40 million years ago. Baboons and macaques likewise each evolved intelligence independently of the great apes. Similar levels (if different specializations) of intelligence have also evolved independently outside the primates, such as in cetaceans, elephants, and corvids. For cephalopods, which are likewise highly intelligent, their common ancestor with us didn't even have a brain.

