I'm currently working on the next major version of my JavaScript game engine[1]. The main new feature will be real-time multiplayer. Being able to run the same game code on both the server and the client is invaluable.
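To make that concrete, here's a minimal sketch (all names are made up): keep the simulation logic in a plain module with no DOM or server-only APIs, and both the Node server and the browser client can step the same world.

    // step.js -- hypothetical shared simulation step, usable from Node or a browser bundle
    // Advances one entity by its velocity over dt seconds; touches no DOM or server APIs.
    function step(entity, dt) {
      return {
        x: entity.x + entity.vx * dt,
        y: entity.y + entity.vy * dt,
        vx: entity.vx,
        vy: entity.vy
      };
    }

    // Export for Node (the server); a bundler or a window global covers the browser.
    if (typeof module !== 'undefined' && module.exports) {
      module.exports = { step: step };
    } else {
      window.GameStep = { step: step };
    }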
For complex ones, yes. The actual amount of shared code may not be that much, but consider the advantage of sharing validation code between the client and server -- you never accidentally forget a requirement on the server side, leaving you vulnerable to an attack.
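As a rough illustration (the module and its rules are invented here, not from any particular framework), a single validation module can be require()'d by the Node server and shipped to the browser in the page bundle, so both sides enforce exactly the same rules:

    // validate-user.js -- hypothetical shared validation rules
    function validateUser(user) {
      var errors = [];
      if (!user.name || user.name.length < 2) {
        errors.push('name must be at least 2 characters');
      }
      if (!/^[^@\s]+@[^@\s]+$/.test(user.email || '')) {
        errors.push('email looks invalid');
      }
      return errors; // an empty array means the user is valid
    }

    if (typeof module !== 'undefined' && module.exports) {
      module.exports = validateUser; // server: require('./validate-user')
    } else {
      window.validateUser = validateUser; // browser: loaded via script tag or bundler
    }

The server's call stays authoritative; the client's copy just fails fast for the user.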
Being able to compile templates on either the client or the server is a huge win for anyone writing webapps, and it's trivial to write view code that can be compiled in both Node and browsers even without a framework.
In other languages, even if you write completely logic-less views in a templating language like Mustache that can compile in the browser, you'll still end up duplicating your presenter code, because the browser can't run the language the rest of your app is written in.
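For instance (a sketch, assuming the mustache npm package is installed on the server and mustache.js is loaded via a script tag in the browser), the very same template and view data render identically in Node and in the browser:

    // Works in Node (require) and in the browser (the mustache.js script exposes a global).
    var Mustache = typeof require === 'function' ? require('mustache') : window.Mustache;

    var template = 'Hello, {{name}}! You have {{count}} new messages.';
    var html = Mustache.render(template, { name: 'Ada', count: 3 });
    // => "Hello, Ada! You have 3 new messages."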
> and it's trivial to write view code that can be compiled in both Node and browsers even without a framework.
Nonsense. It's far from trivial to have the same code get binding and validation right on both sides. That's why it needs to be wrapped in a thin abstraction, which apparently nobody in the Node community has been capable of writing.
Instead we see dozens of Rails clones (oh, exciting...) and a handful of very half-baked full-stack universes that are nowhere near production-ready.
Why not validate on both? Validate on the client to give a nicer and quicker notice if something is wrong (good UX) and then validate on the server for actual security.
Update your model? Don't forget to update your validation in both the client and server. Otherwise, you'll validate something on the client but not the server (bad UX), invalidate something on the client but not the server (bad UX), or corrupt your data slowly when validation entirely fails (bad UX).
Since it's a pain in the ass to remember to update two places when you make a change, you're seeing people make ridiculous leaps of programming ingenuity: choosing a server-side language solely so they don't have to update two places. And JavaScript absolutely sucks for server programming. Every time I have to do any client-side JavaScript I quite literally hate my life. People who love JavaScript and want to apply it to everything have arguments about the stupidest things, like whether to use semicolons, which says a lot about how awful a programming environment it can be.
Easier: just don't validate on the client, and make a round trip. If you're targeting mobile or a high-latency link, there's an argument for client-side validation, but then you just need the discipline to update both sides together, which integration tests should help with.
That's a straw man if I ever saw one. This isn't a debate about whether JS should be a server-side language or not. The fact is that if you're using Node and you're validating data, you can reuse the same validation code on both the front end and the back end. Front-end validation will likely improve the way your users perceive the app.
Proper unit testing makes split-brain validation a non-issue. Client-side validation does feel more responsive; it also saves bandwidth and prevents impatient users from spamming the server with invalid requests.
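One way to make that testing concrete (a sketch with hypothetical file paths, assuming the validators are duplicated on each side rather than shared): feed the same fixtures through both implementations and fail the build the moment they disagree.

    // validators-agree.test.js -- hypothetical drift check, run with `node validators-agree.test.js`
    var assert = require('assert');
    var serverValidate = require('./server/validate-user'); // hypothetical paths
    var clientValidate = require('./client/validate-user');

    var fixtures = [
      { name: 'Ada', email: 'ada@example.com' }, // should pass
      { name: '', email: 'not-an-email' }        // should fail
    ];

    fixtures.forEach(function (user) {
      var serverOk = serverValidate(user).length === 0;
      var clientOk = clientValidate(user).length === 0;
      assert.strictEqual(serverOk, clientOk,
        'client and server disagree on: ' + JSON.stringify(user));
    });

    console.log('client and server validators agree on all fixtures');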
Well, whatever "medium sized" means, I work on a high-traffic Web application that reuses 0% of code from the client; in almost everything I've worked on, I've found reuse between client and server nearly impossible, enough that they're almost always segregated repositories. There's an argument for models existing on both sides (which is reiterated below), but I have a hard time seeing models as 50-70% of any application.
Just not my experience that this is the case. Though I might be old-fashioned.
I've recently been working with a team that is developing a content management system for which the display side of things (responsible for routing requests to models and views, fetching model content, rendering, etc.) is about 500 lines of non-comment CoffeeScript code.
Nearly 100% of the server-side code is re-used on the client, and perhaps 80% of the (non-library) client-side code is shared with the server.
In this case the exceptions are:
- Things like HTTP-level request handling and reading from local files rather than over HTTP are used on the server side only.
- Things like DOM manipulation and interacting with browser events are used on the client side only.
I wonder if a significant factor may be how much functionality you're actually replicating on both the client and the server. In our case, browser permitting, the entire display engine runs equally well on the client or on the server. If we had a lot of jQuery-style stuff happening in the client-side JavaScript the ratios might be a little different, but "also-run-the-app-on-the-client" is a good example of a use case that leads to a lot of reuse of the server-side code.
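A rough sketch of how that kind of split can look (every name below is invented, not the team's actual code): the shared engine takes "fetch the model" and "emit the output" as injected functions, so the only per-environment code is the adapter at the edge.

    // display.js -- hypothetical shared display engine (runs in Node or the browser)
    function renderPage(route, fetchContent, emit) {
      fetchContent(route, function (err, model) {
        if (err) { return emit('<p>failed to load ' + route + '</p>'); }
        emit('<h1>' + model.title + '</h1><div>' + model.body + '</div>');
      });
    }

    // Server-only adapter: read the model from disk, write an HTTP response.
    // (A browser adapter would fetch over XHR and assign into a DOM node instead.)
    var fs = require('fs');
    var http = require('http');

    http.createServer(function (req, res) {
      renderPage(req.url, function (route, cb) {
        fs.readFile('.' + route + '.json', 'utf8', function (err, data) {
          cb(err, err ? null : JSON.parse(data));
        });
      }, function (html) {
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end(html);
      });
    }).listen(3000);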
Do people really get so much code reuse between client and server that using JavaScript on the server provides a large benefit?