
Why do people run their server software on their desktop systems instead of, say, a proper server? Back in the day, our desktop systems were VT102s, NCD X terminals, or underpowered laptops, and developing on the server through rsh/ssh or even VNC was natural. Does Ruby somehow require locally running GUI processes?


There's certainly nothing stopping you from doing Ruby or JS dev work on a remote server. I run Windows on my laptop, so when I wanted to start contributing to Discourse on the side a few months ago, I ordered a VPS at DigitalOcean and did my work there over SSH. I used SSH tunneling to access the dev HTTP server.
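
If anyone wants specifics, the tunnel is a one-liner. A minimal sketch, assuming the dev server listens on port 3000 (use whatever port yours actually binds):

    # forward local port 3000 to port 3000 on the VPS
    ssh -L 3000:localhost:3000 user@my-vps

After that, http://localhost:3000 in the local browser talks to the remote dev server.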

Edit: It helps that I use vim as my editor.


I’ve done similar things with PyCharm. I can run PyCharm locally but run a remote Python process; .py files are automatically uploaded, and I can debug over the network just fine. I know Eclipse, and presumably others, can do the same.
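
PyCharm's remote interpreter setup handles this transparently, but if you're curious what's underneath, the manual equivalent uses the pydevd-pycharm package; a rough sketch (host and port are placeholders for wherever the IDE's debug server listens):

    # on the remote box first: pip install pydevd-pycharm
    import pydevd_pycharm

    # connect back to the PyCharm debug server on the workstation
    pydevd_pycharm.settrace('my-workstation', port=12345,
                            stdoutToServer=True, stderrToServer=True)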


At my current company we do exactly this. It’s nice because it’s someone else’s problem to make sure my dev environment works, I can recreate my dev environment with a single command, and I have a shareable URL to my Rails instance that anyone on the eng VPN can use (makes it easy to show works in progress to teammates or to the product team).

It does kind of suck in some ways though. I can’t develop without internet, and debugging is worse than debugging locally. The debugging issue could probably be alleviated with some more investment in developer tooling/editor integrations.
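
For the curious: the "single command" above wraps internal tooling I can't share, but the idea is roughly this (host name and layout are made up):

    # recreate the remote dev environment from scratch
    ssh rails-dev 'cd app && docker compose down && docker compose up -d --build'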


Because you can, convenience, and compatibility.

Since Linux became the OS of the internet, people have been able to run the same software on their own computers. It's more convenient to just run the "server software" on your own machine, and you have full control over it.


> Because you can, convenience, and compatibility.

You shouldn't shoot yourself in the foot just because you can: it doesn't seem particularly convenient to wait more than 10 minutes for tests because the laptop CPU is overheating, when a proper development server would be twice as fast even with (clumsy) single-core tests.



