
I have to say, the way indentation and brackets were done here looks like it's just inviting subtle bugs. Take this for example:

    if (peekc) {
      c = peekc;
      peekc = 0;
     } else
      if (eof)
       return(0); else
       c = getchar();


If you judge the coding style of the early 1970s with modern standards, it isn't going to look great. 50 years is a loooong time.

dmr was one of the first C programmers, if not the first.


And he was almost certainly using ed(1) as his editor and a mechanical teletype at 7.5 or 10.0 characters per second as his terminal...

The C language (and all of Unix) was designed to be very terse as a consequence.


I try ed(1) once in a while and I find it pretty usable (and even rather efficient if you know what you are doing). cat(1) also works if you just want to type a new text.

Speaking of terseness, I love the fact that C does not have 'fn'.


We used to speak of a great Unix systems programmer as someone who could write device drivers with cat and have them compile and run the first time.


Before I look up `man cat`, what can you do with `cat` other than just see what's in a file?


When not given a file, cat will just read from stdin, so you can use "cat > file.c", write some text, and send EOF with ^D when you're done.

Obviously, there's no way to go back and edit anything mid-stream, you have to write the whole thing out in one shot.


The backspace does work within the line.


If your terminal is in line-buffered mode.


You can join files together.

    $ cat foo bar > baz
will join the files foo and bar together into a single file called baz


That period lasted what, 10 years after Unix was created? And we'll be stuck with those decisions for decades, if not centuries.

Similar story with the design of QDOS / MS-DOS / Windows, and nowadays with Android. Both were designed for super underpowered machines that basically went away less than a decade after launch, yet the systems will be hobbled by those early decisions for a long, long time.


We will be hobbled by these decisions for a long time precisely because the complete package of trade-offs the designers made was so successful.

If they had gone for wart-free on properly powered hardware, they would be stuck back in Multics land or living with the GNU Hurd—cancelled for being over budget, or moving so slowly that projects that actually accomplish what the users need overtake them.

Do I wish that C had fixed its operator precedence problem? Sure. But the trade-offs as a total package make a lot of sense.


Some would say,

https://multicians.org/history.html

Instead we pile mitigations on top of mitigations, with hardware memory tagging being the last hope to fix it.


Is there an explanation on why C’s operator precedence is weird? Such as: why does the bitwise AND have higher precedence than logical AND?


There is, and it is amusing

“In retrospect it would have been better to go ahead and change the precedence of & to higher than ==, but it seemed safer just to split & and && without moving & past an existing operator. (After all, we had several hundred kilobytes of source code, and maybe 3 installations....)“

https://www.lysator.liu.se/c/dmr-on-or.html


> why does the bitwise AND have higher precedence than logical AND?

Why is this precedence weird? Bitwise AND tends to be used to transform data while a logical AND tends to be used for control flow.


I meant equals having a higher precedence than bitwise AND.

As in:

    if (x & 2 == 2)
...is actually parsed as:

    if (x & (2 == 2))
...which isn’t intuitive.


See the above example from dmr himself


> that will be hobbled because of those early decisions for a long, long time.

Perhaps this is why a programmer would want to rewrite a system & tout "funny success stories" about the effort & results?

https://news.ycombinator.com/item?id=25844428

> Why couldn't you just upgrade the dependencies once then set up the same CI/CD you're presumably using for Svelte so that you can them upgrade versions easily?

Because the existing system was painful & time/energy intensive to upgrade. It happens with tight coupling, dependency hell, unchecked incidental complexity, architecture churn, leaky abstractions, etc...

Maintenance hell & > 1 month major version upgrades tend to occur with large, encompassing, first-mover frameworks, often built on a brittle set of abstractions, as they "mature" & "adapt" to the competition. e.g. Rails, Angular...


Was it different for the designers of ALGOL/SIMULA/Pascal?


Yeah, those languages were IIRC designed to be edited offline (as a deck of punch cards) and submitted to a mainframe via high-speed card reader as a batch job.


Very interesting when you think about it. A language created in 2009 (Go) owes its syntax to a language from 1969 (B), and the latter looks like it does because it was designed during a short transition period between offline editing (1960s) and electronic terminals (1970s).

And there are people claiming that computer scientists are not conservative :)

To what extent this explanation is correct is another question... The article by Dennis Ritchie says "Other fiddles in the transition from BCPL to B were introduced as a matter of taste, and some remain controversial, for example the decision to use the single character = for assignment instead of :=".

It's a kind of butterfly effect :) Mr. Ritchie preferred "=" over ":=" and fifty years later a server crashes somewhere because somebody wrote a=b=c instead of a=b==c.


Actually the transition from "offline editing" to "electronic terminals" was not short at all. Teletypes (aka "typewriters which can receive content encoded in electricity, next to the user keyboard") date back way beyond computers, and were still in use in the 1980s (but eventually superseded by fax). Teletypes were cheaper, faster and more convenient than video terminals. Don't underestimate the value of having a printout of your session, especially when being online (i.e. connected to the mainframe or mini computer) is something valuable and your terminal is "dumb" and has no storage (except paper).


My first usage of a computer was on a printed teletype. My last such use was probably around 1985. They were around for a long time.


And for a lot of people, the lightbulb goes off once they realize what 'tty' stands for...


B was a descendant of Bootstrap CPL, a language never intended to be used for anything other than making a CPL compiler. Really a butterfly effect.


If you can only read it line by line:

    return(0); else
makes a bit of sense.


I don't think it would look that out of place if he was using the ternary operator, which is the same thing after all.

E.g.:

  if (peekc) {
    c = peekc;
    peekc = 0;
  } else
    eof ?
      return(0) :
      c = getchar();
The first else clause still looks weird, but the final part isn't nearly as out of place (well, I guess assigning in a ternary would be weird, but in terms of indentation), and it's not like we actually changed anything.


Can't return in a ternary.


Correct, return is not an expression. But then again, he could if he wanted to, as the language designer ;)


Wouldn't it be wonderful if we could write

    a = return b;


What would that even do? Return b, but then set b to a right before “deleting” a? That would serve no purpose.


Maybe it was not worth the extra work.


It would have made code review a nightmarish activity for the team.


Why nightmarish? A reviewer may explain that it is prone to making another human overlook the end of the conditional, and go to sleep as usual. I never understood the emotional component of blaming someone's personal code style, as if it were a religion with sacrifices and blasphemy instead of just a practical way of coding in a heterogeneous team.

This triggers me because many people jump on "bad c0de" in the forums, but then you read some practical code on GitHub and it is (a) not as perfectly beautiful as they imagine at all, (b) still readable without the nightmares they promised, and (c) in its algorithms and structure demands a programmer's perception and understanding far beyond "it's Monday morning so I feel like missing the end of a statement in a hand-written scanner" anyway.


That team had Bell Labs researchers, Ken Thompson and Doug McIlroy being among them. Their brains could handle much harder things!


The difference between Dennis Ritchie and the average programmer of today is, Dennis Ritchie did not write bugs in the first place.


Well, some say (I disagree) that C is a bug.



