
My experience with coding by chat bot is about the same as my experiences welding by ear and driving by touch.

I'm sure there's people out there doing it but I don't feel like it's a particularly good use of my time.


Oh good. Thank fuck that one of the dozens if not hundreds of terminal emulators isn't going to be a "rug pull".

It's a completely fucked situation when it happens to fairly unique/obscure software like say Terraform or Packer or Vagrant.

But if it happened to some software that's so common it's literally competing against built-in apps on every desktop OS, I just don't know what I'd ever do.

/s for anyone who needs it.


Honestly this article seems like a really good ad for not trying/using jj.

I cannot believe how ironically confusing it sounds to use, given that they claim it is "designed from the ground up to be easy to use".


I don't remember my first steps with subversion, mercurial, or git too well, but I don't think any of them were any more intuitive. My understanding is that jj is supposed to be easier to use for day-to-day version control. I'm very comfortable with git, but that was hard earned and I don't see that level of confidence with most devs that I've worked with over the years. Hoping jj can be more accessible to the average working software engineer than git has been.

Yeah, when you're new to a tool it's expected that you'll have a learning curve.

But some stuff is just ridiculous.

Ok you use the mercurial model and have bookmarks not branches... oh wait you also didn't create a default bookmark when you initialised the repo.

Ok you also have the mercurial model where there is no staging of files... but now you've added in a whole new command to solve the problem that mercurial solves by just letting you name files on the command line.

Everything about it just feels like it's being different from something for the sake of being different, not for any actual benefit.


> I would prefer things like /map/<lat>/<long>/, for example.

PathInfo is a thing you can absolutely use.


Most web application servers have been able to easily parse parameters out of the URL path for many years; it's definitely nothing new. It's just that historically, people reached for URL query parameters for this sort of thing. After all, making a request with query parameters is basically built into the browser; you can do it with <form> and anchor links with no JS needed.

Presumably, because of that, many pages will continue to use query parameters for the foreseeable future. I think that's fine, but at least for APIs, the QUERY method could eventually be a very nice thing to have.
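
For anyone curious what that looks like in practice, here's a minimal sketch of pulling /map/<lat>/<long>/ segments straight out of PATH_INFO in a bare Python WSGI app. The route shape and names are just illustrative assumptions, not any particular framework's API:

    # Minimal WSGI app that parses lat/long out of the URL path (PATH_INFO)
    # instead of using query parameters. The /map/<lat>/<long>/ shape is a
    # made-up example.
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        parts = [p for p in environ.get("PATH_INFO", "").split("/") if p]
        if len(parts) == 3 and parts[0] == "map":
            try:
                lat, lng = float(parts[1]), float(parts[2])
            except ValueError:
                start_response("400 Bad Request", [("Content-Type", "text/plain")])
                return [b"lat/long must be numeric\n"]
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [f"map at {lat}, {lng}\n".encode()]
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found\n"]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()

Frameworks just wrap this kind of path matching in a nicer routing syntax; the underlying PATH_INFO mechanism is the same.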


Are the two sentences meant to be related somehow?

I fail to see how choosing to pay for a Spicy Autocomplete service relates to using open source software?


That would make perfect sense if all branches had to be made from the default branch.

But they don't.

At `$CLIENT` we use `stable` as the default branch.

Use whatever works for you. Getting upset about a default that you can change is like getting upset about the default wallpaper of your OS.
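
For what it's worth, assuming Git 2.28 or newer, the default is a single config knob; here's a rough sketch of flipping it from Python, equivalent to running `git config --global init.defaultBranch stable` in a shell:

    # Change the branch name that `git init` uses by default (Git >= 2.28).
    # Equivalent to: git config --global init.defaultBranch stable
    import subprocess

    subprocess.run(
        ["git", "config", "--global", "init.defaultBranch", "stable"],
        check=True,
    )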

And before you get all persnickety about that argument working both ways: the developers of git get to decide the defaults, and they did.

If you're so upset, fork it, revert the default branch name and maintain it yourself infinitely. That's definitely worth it just to keep a default branch name you like, right?


> If you're so upset, fork it, revert the default branch name and maintain it yourself infinitely. That's definitely worth it just to keep a default branch name you like, right?

No idea how you got that impression from my comment. It sounds like you're the one that's upset.

I don't care what you name your branches. I do think it's dumb to tell other people what (not) to name their branch though. But definitely not something I feel compelled to rearrange my life over.


Nobody is telling you what not to name your branches.

The people who wrote the software you're using for free decided to change the default name.

That's it. Nobody has said you can't use whatever name you want.


> Nobody is telling you what not to name your branches.

> Nobody has said you can't use whatever name you want.

This is reductionist. The git people didn't pull this idea out of their butt. It came about because a lot of people were saying that we should not name our branches master.

I have no problem with what the git people did. Easy enough for them to change it, and it puts a dumb issue to bed (for them).

But I think it's fair for anyone to point out that the motivation was dumb, and to explain why it's dumb and how the word "master" is actually not an unreasonable choice in this context.

> Nobody has said you can't use whatever name you want.

Sure, until somebody makes the mistake of not renaming all of their old "master" branches and gets shamed by the word police over it.

Of course you're welcome to disagree.


> how the word "master" is actually not an unreasonable choice in this context.

It doesn't even make sense in this context though. The name just got copied from BitKeeper which had master and slave branches.

Git doesn't have that concept.

> Sure, until somebody makes the mistake of not renaming all of their old "master" branches and gets shamed by the word police over it.

How are you going to be shamed? I thought there's nothing wrong with it?


> How are you going to be shamed? I thought there's nothing wrong with it?

If you re-read my comments you will understand that I don't believe there's anything wrong with using the word "master" to name a branch. But other people do, which is why there was an uproar and the default name was ultimately changed to "main".

Not sure how you were able to misinterpret this.


So what's your point?

If you don't think there's anything wrong with it, why would you care if someone else says "hey change this".

Do the same thing you'd do if someone says "hey you should use mongodb it's web scale": tell them you disagree and won't be doing that.

If you don't think there's anything wrong with it, how can you be "shamed" into doing something you disagree with?


I am doing exactly as you say.

I made a comment saying I disagree with the word police and I think it's dumb to cast people as being insensitive for using a longstanding word that makes sense to many people in the context it's used in.


> copied all the Darwin libraries from the Darling project and used LLVM to generate all the appropriate dylibs

I'm just starting for the day and misread that as "...used LLM to generate...", and I wondered what kind of crack you were smoking.


In future, your OS will be an agentic LLM which runs software by YOLOing the binaries, and then continuously fixing and refining the environment until it runs without crashing.


Can confirm. I use a Dell 6K 32", and it's frankly amazing. I still use an older Dell 4K 24" (rotated 90º) off to one side for email/slack/music but I just use the single 32" for ~90% of what I do.


Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.

The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.
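
For reference, those DPI figures fall straight out of resolution and panel size; a quick sketch (the 27" panels below are just illustrative assumptions):

    # Pixels per inch from resolution and diagonal size.
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2560, 1440, 27)))  # ~109, the "~110 DPI" class of display
    print(round(ppi(5120, 2880, 27)))  # ~218, the "~220 DPI" (Retina-class) display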


Yeah sure, as long as you have a lot of resources for testing widely.

Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.


> if you were to make an analogy, you should target a few devices that represent the "average"

For Macs, 220DPI absolutely is the average.


Sure, but Macs are around 10% of general desktop computing. To a first approximation, they don't count. User communities vary widely. If you target Macs, then a high DPI screen is a must for testing. Otherwise, I dunno; ~100 DPI screens are way less expensive than ~200 DPI screens, so I'd expect the installed base to be significantly higher for standard DPI. But there are probably enough high DPI users that it's worth giving it a look.

To address a question elsewhere, personally, I don't see the benefit to pushing 4x the pixels when ~ 100 DPI works fine for me. My eyes aren't what they were 20 years ago, and it's just extra expense at every level.


I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from - my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.

One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world, and >2K displays have been cheap on desktop for a really long time.


I believe there are a lot of people using 1080p monitors because they bought them a while ago and they're still working fine. There are also a lot of lower-end 1080p monitors still being sold today.

> One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world

I personally see a lot of 1080p screens on new gaming laptops too. Lots of people get those for work from what I see with my peers. When I sold my RTX 3060 laptop with a 1080p screen, most buyers wanted it for professional work, according to them.

> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older Thinkpad. :)


If you look at the Steam hardware survey, most users (as in, > 50%) are still using 1080p or below.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


In part, though, that's not because all those users can't afford >1080p (some of them can); it's that insanely high refresh rate monitors exist, and esports players often run 1080p at >300Hz. Even the ones without such monitors still use 1080p, because driving up the frame rate drives down the input latency.

Whether it matters is a bigger question: from 30 to 60Hz I notice a huge difference; from 60 to 144Hz at 4K I can't tell, but I'm old and don't play esports games.


I don't think this is contrary to my original point. Nearly 50% of all users are running at greater-than-1080p resolutions, and presumably power users are overrepresented in that group (and certainly, it's not just the ~2.5% of Mac users pushing the average up).


FWIW, I didn't mean to reply to you in an argumentative way. Just proposing an answer to this:

> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

I still see 1080p fairly often on new setups/laptops, basically, although 1440p and 4K are becoming more common on higher-end desktops. Then again, 1440p at 27" or 32" isn't really high dpi.


Writing this on 1280x1024 because it still works fine

The 5:4 aspect ratio is weird, especially in this era of everything 16:9, but it's a second monitor so usually only has one thing open


If you have 20/20 vision, a 27" display at 1440p (~110 DPI) has a visual acuity distance of 79cm - i.e., if you sit 79cm or further away from the screen, you can't resolve any extra detail from a higher resolution. High refresh rate 1440p IPS screens are very widely available at good prices, so it isn't that crazy that people choose them.

Phone and laptop have higher DPI screens of course, but I'm not close enough to my desktop monitor for a higher DPI to matter.
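
As a back-of-the-envelope check of that figure, assuming the usual 1-arcminute-per-pixel definition of 20/20 acuity and a 27" 2560x1440 panel, the distance at which one pixel subtends one arcminute works out to roughly 80cm:

    # Distance at which one pixel of a 27" 2560x1440 panel subtends one
    # arcminute (a common 20/20 visual acuity threshold).
    import math

    ppi = math.hypot(2560, 1440) / 27        # ~109 pixels per inch
    pixel_pitch_mm = 25.4 / ppi              # ~0.23 mm per pixel
    one_arcmin = math.radians(1 / 60)
    distance_cm = pixel_pitch_mm / math.tan(one_arcmin) / 10
    print(f"{distance_cm:.0f} cm")           # ~80 cm

Same ballpark as the 79cm quoted above; the exact number depends on how you round the panel's PPI.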


That's a common misconception. That acuity calculation is based on knowing nothing about the image; imagine applying it to arbitrary noise. When you have things like lines and edges your eyes can pick out differences an order of magnitude finer.

https://en.wikipedia.org/wiki/Hyperacuity


Thanks for pointing this out.

It's always been a weird topic - science appears to say one thing (due to the aforementioned misconception), and yet I have eyes and see a difference that the science says I shouldn't see.


Have you tested it in practice? High-DPI monitors make a very noticeable difference for text and user interface. That's the truth, even if the theory doesn't agree.


I'm running a 32" display at 4k, which works out to about the same at 79cm. Apparently a bunch of people sit really close to their monitors :)


Absolutely everyone in my company uses 1080p monitors unless they got their own. That’s just “normal”.

It’s horrible.


Retina still isn't available for large monitors like 38" and above.


Retina is available for only a handful of 5k 27" monitors, most of which aren't great, and all of which are only 60Hz.

It's really hard to buy one given how expensive / badly specced they are compared to 4k monitors, even as someone who values the vertical pixels.


I can’t tell you how often I see this. Brand new designs or logos in 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.

Stands out like a sore thumb.


Don't a bunch of the newer tools that wrap Virtualization.framework (which itself wraps/builds on the lower-level Hypervisor.framework) already support this?

There's even an example project to do this in code: https://developer.apple.com/documentation/virtualization/run...

