
> The language of the web is javascript. Until web browsers allow Go as the embeddable language, node will remain useful.

Do people really get that much code re-use between client and server that using Javascript on the server provides a large benefit?



I'm currently working on the next major version of my JavaScript game engine[1]. The main new feature will be real time multiplayer. Being able to run the same game code server and client side is invaluable.

[1] http://impactjs.com/


For small apps, no.

For complex ones, yes. The actual amount of shared code may not be much, but consider the advantage of sharing validation code between the client and server: you never accidentally forget a requirement on the server side, leaving yourself vulnerable to an attack.


> consider the advantages in sharing validation code between the client and server

Is there a mature implementation of this yet?

One that is not tied to an entire immature framework (meteor) or programming pattern (nowjs)?

Code-reuse was the big promise of node. Yet in reality I've never seen it executed beyond brittle experiments.

Where is the form_for_model() function that emits code for both the client and the server?


Being able to compile templates on either the client or the server is a huge win for anyone writing webapps, and it's trivial to write view code that compiles in both Node and browsers even without a framework.

In other languages, even if you write completely logicless views in a templating language that can compile in the browser like Mustache, you'll end up duplicating your presenter code if you don't use a language that browsers can understand.


> and it's trivial to write view code that can be compiled in both Node and browsers even without a framework.

Nonsense. It's far from trivial to have the same code get binding and validation right on both sides. That's why it needs to be wrapped in a thin abstraction, which apparently nobody in the Node community has been capable of writing.

Instead we see dozens of rails-clones (oh, exciting..), and a handful of very half-baked full-stack universes that are nowhere near production ready.


Alternatively, there's an argument that one should not validate on the client to avoid this exact scenario (split-brain validation).


Why not validate on both? Validate on the client to give a nicer and quicker notice if something is wrong (good UX) and then validate on the server for actual security.


To avoid a split-brain validation scenario.

Update your model? Don't forget to update your validation in both the client and server. Otherwise, you'll validate something on the client but not the server (bad UX), invalidate something on the client but not the server (bad UX), or corrupt your data slowly when validation entirely fails (bad UX).

Since it's a pain in the ass to remember to update two places when you make a change, you're seeing people make ridiculous leaps of programming ingenuity: choosing a server-side language just so they don't have to update two places at once, even though Javascript absolutely sucks for server programming. Every time I have to do any client-side Javascript I quite literally hate my life. People who love Javascript and want to apply it to everything argue about the stupidest things, like using semicolons, which is telling about how awful a programming environment it can be.

Easier: Just don't validate on the client, and make a round trip. If you're doing mobile or on a high-latency link, there's an argument for doing client-side validation but then you just need the discipline to update both sides at once, which hopefully integration tests should help with.


That's a straw man if I ever saw one. This isn't a debate as to whether JS should be a server-side language or not. The fact is that if you're using Node and you're validating data, you can reuse the code to validate both on the front-end and the back-end. Front-end validation will likely improve the way your users perceive the app.


Proper unit testing makes split-brain validation a non-issue. Client-side validation does feel more responsive; it also saves bandwidth and prevents impatient users from spamming the server with invalid requests.


That is why the guy is using the same code to do both validations.


In a medium sized web application, 50-70% of server-side code can be re-used verbatim between server and client.


Well, whatever "medium sized" means, I work on a high-traffic Web application that reuses 0% of code from the client; in almost everything I've worked on, I've found reuse between client and server nearly impossible, enough that they're almost always segregated repositories. There's an argument for models existing on both sides (which is reiterated below), but I have a hard time seeing models as 50-70% of any application.

Just not my experience that this is the case. Though I might be old-fashioned.


For what it's worth, here's one counter-example:

I've recently been working with a team that is developing a content management system for which the display side of things (responsible for routing requests to models and views, fetching model content, rendering, etc.) is about 500 lines of non-comment CoffeeScript code.

Nearly 100% of the server-side code is re-used on the client, and perhaps 80% of the (non-library) client-side code is shared with the server.

In this case the exceptions are:

- Things like HTTP-level request handling and reading from local files rather than over HTTP are used on the server-side only.

- Things like DOM-manipulation and interacting with browser events are used on the client-side only.

I wonder if a significant factor may be how much functionality you're actually replicating on both the client and server. In our case, browser-permitting, the entire display engine runs equally well on the client or on the server. If we had a lot of jQuery-style stuff happening in the client-side JavaScript the ratios might be a little different, but "also-run-the-app-on-the-client" is a good example of a use case that leads to a lot of reuse of the server-side code.


The amount of traffic has nothing to do with it.

A CRUD app would probably share relatively little code between the server and the client. A multiplayer game would share much, much more.


And which do you think is more prevalent in the world, a CRUD app or a multiplayer game?


"medium" means I don't have personal experience of how the percentage scales to large applications.

As for sharing: data sources, domain models, utilities, and templates can be shared. It's the IO handling (HTTP and DOM) that can't be shared.


Check out stuff like Meteor or Firebase. They barely even differentiate between client and server code.

Rolling your own framework based on the same ideas is not hard.



