Hacker News

While null may be the simplest way to represent nothing across a large number of data types, I think the biggest issue (beyond how far null can propagate from its source in error scenarios) is that it's overloaded with too many possible meanings. Does it mean nothing was found? Was an error thrown? Is the value itself null? At least with a well-defined API, you can distinguish these scenarios at a glance. I think that handling types like None, Err("reason"), or Some(None) (using Rust types) is far clearer than looking at a null pointer and an int error code (which one may ignore).
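To make that concrete, here's a minimal Rust sketch where the return type itself separates the three cases a bare null conflates. The function and data names (`find_nickname`, the user map) are illustrative, not from any real API:

```rust
use std::collections::HashMap;

// Ok(Some(nick)) -> user found, nickname set
// Ok(None)       -> user found, nickname explicitly absent ("the value itself is null")
// Err(reason)    -> lookup failed ("nothing was found" / an error)
fn find_nickname(
    users: &HashMap<&str, Option<String>>,
    id: &str,
) -> Result<Option<String>, String> {
    match users.get(id) {
        Some(nick) => Ok(nick.clone()),
        None => Err(format!("no such user: {}", id)),
    }
}

fn main() {
    let mut users = HashMap::new();
    users.insert("alice", Some("ali".to_string()));
    users.insert("bob", None);

    assert_eq!(find_nickname(&users, "alice"), Ok(Some("ali".to_string())));
    assert_eq!(find_nickname(&users, "bob"), Ok(None));
    assert!(find_nickname(&users, "carol").is_err());
}
```

A caller can't confuse the cases or silently drop the error: the compiler forces a match (or an explicit `?`/`unwrap`) before the value can be used.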


I do like the Rust way because it’s standard (so far anyway). The situation in C++ is more like “a thousand blooms of nullishness”, where each library and API is frustratingly different in terms of something so fundamental. Objective-C with its messageable nil is so much better in this respect.


Messageable nil has a special place in my heart, but as a default mode of operation, it's certainly a fun way to hide bugs.


Messages to nil ... sure, OK, fine. Allow that, but CRASH because someone sent an unhandled message? That’s just mean.

Either crash when messaging nil, or don’t crash when improperly messaging an object.


If you're talking about Objective-C: it doesn't crash. It sends the message -forwardInvocation: to the receiver. (Nowadays it might try a couple of other things first).

Now the default implementation of -forwardInvocation: raises an exception, which in turn is by default unhandled and therefore goes to the default exception handler which logs and terminates.

However, you can implement your own behavior at every level:

1. Override -forwardInvocation: for your own objects (or one of the other mechanisms)

2. Override -forwardInvocation: in NSObject

3. Handle the exception

4. Install your own default exception handler


It doesn't matter how a language “handles nil or ‘not understood’ [sic] messages” - burning the computer would be okay. As with any other undesired behavior, it's your responsibility to prove it doesn't happen.


I think that everyone here knows that programming without bugs is the ideal, but having the language work against you doesn't really help in that regard, does it?


I wouldn't count “burning the computer under impossible conditions” as “working against”. It's just the principle of explosion at work, and pretty much any optimizing compiler that takes advantage of undefined behavior uses it.


No. In this case there is no undefined behaviour: the way this is handled is defined, but to the parent it seems inconsistent.

Undesired behaviour cannot really be leveraged for optimisation, and in any case, higher-level languages should not have UB in their design.


If your program's correctness depends on `nil` being handled this or that way, your design is broken. Fortunately, most programs aren't designed this way: messages to `nil` are seldom actually intended, whether the language's semantics considers them a legitimate operation or not, and whether they are “helpfully” trapped at runtime or not.
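The claim that messages to nil are seldom intended can be illustrated in Rust terms: when absence is a distinct type, the rare intentional "nil receiver" case has to be spelled out, and the unintentional one is a compile error rather than a silent no-op. A small sketch with made-up names:

```rust
// Sketch: the absent case must be handled explicitly, so an
// unintended "message to nil" cannot be swallowed silently.
fn shout(name: Option<&str>) -> String {
    match name {
        Some(n) => n.to_uppercase(),
        // The intentional nil-receiver behaviour is written out, not implied.
        None => String::from("(nobody)"),
    }
}

fn main() {
    assert_eq!(shout(Some("ada")), "ADA");
    assert_eq!(shout(None), "(nobody)");
    // Writing `name.to_uppercase()` directly would not compile:
    // Option<&str> has no such method, so the None case can't be ignored.
}
```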



