Hacker News | taylorius's comments

Pretty cool. I wonder if they could add a zoom-lens feature: if you squint a bit, the lens could form into a magnifier.

100% yes. As you say, "psychosis" is an exaggeration, but "frazzled" captures it. And it's definitely a feeling of "this is such an opportunity, but the window will close - got to make the most of it somehow". Very stressful, in a diffuse kind of way.


I think this one is winning the inverse beauty contest.

It looks like it really wants to scoop up a large amount of plankton mid-cruise.


See also the Caproni Transaero, which isn't totally ugly but is messy in a "maybe more wings is better? some pushing engines at the back?" kind of way.

https://en.wikipedia.org/wiki/Caproni_Ca.60


> pushing engines at the back

Weird aircraft with a pusher engine? Curtiss-Wright XP-55 Ascender, right this way:

https://en.wikipedia.org/wiki/Curtiss-Wright_XP-55_Ascender

(and check out the list of similar aircraft)


I had a bloody die-cast toy of that as a kid, for some reason. I thought it was just a fake plane they'd invented to justify a toy!

Forte of the gaps :-) I'm exactly the same.


Heredity is only one of many flavours of cronyism.


I'm working on a native-code backtester that compiles Pine Script strategies and (hopefully) runs them super fast. Also a parameter optimiser with different scoring methods.
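The parameter-optimiser part could work roughly like this: an exhaustive grid search over strategy parameters with a pluggable scoring function. This is a hypothetical Python sketch, not the actual project; the function names, the toy backtest, and the scoring methods are all illustrative assumptions.

```python
import itertools

# Hypothetical scoring methods: each maps a list of per-trade returns
# from a backtest run to a single fitness number.
def total_return(returns):
    """Sum of per-trade returns."""
    return sum(returns)

def win_rate(returns):
    """Fraction of trades with a positive return."""
    return sum(r > 0 for r in returns) / len(returns) if returns else 0.0

def optimise(run_backtest, param_grid, score):
    """Exhaustive grid search: run the backtest for every parameter
    combination and keep the combination with the best score."""
    names = list(param_grid)
    best_params, best_score = None, float("-inf")
    for combo in itertools.product(*(param_grid[n] for n in names)):
        params = dict(zip(names, combo))
        s = score(run_backtest(params))
        if s > best_score:
            best_params, best_score = params, s
    return best_params, best_score

# Toy stand-in for a compiled strategy: five identical trades whose
# return depends (arbitrarily) on the parameters.
def fake_backtest(params):
    return [params["fast"] * 0.1 - params["slow"] * 0.02 for _ in range(5)]

grid = {"fast": [5, 10, 20], "slow": [50, 100]}
best, score = optimise(fake_backtest, grid, total_return)
```

Swapping `total_return` for `win_rate` (or any other callable with the same shape) changes the scoring method without touching the search loop.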


I thought Claude Monet - Impressionist techniques applied to coding.


"the dumb money"


maybe different preparatory "system" prompts?


Why can LLMs not be responsible for things? (genuine question - I'm not certain myself).


Because an LLM doesn't have any skin in the game: it can't be punished, and it can't be rewarded for succeeding. Its reputation, career, and dignity are nonexistent.


On the contrary - the LLM has had its own version of "skin in the game" through the whole of its training. Reinforcement learning is nothing but that. Why is that any less real than putting a person in prison? Is it because of the LLM itself, or because you don't trust the people selling it to you?


Are you claiming that LLMs are... sentient? Bold claim, Taylor.


This doesn't seem to have stopped anyone before.


Stopped anyone from doing what? Assigning responsibility to someone with nothing to lose, no dignity or pride, and immune from financial or social injury?

