Hacker News

This is probably the most overly pedantic, nitpicky reason for not using a program I've ever heard.

I'm aware that there are two different conventions on this issue, so I just use parentheses to get the behavior I want.

Growing up as the top math student in my class, it never occurred to me that somebody out there wants -3^2 to equal -9; I thought it was just a weird quirk in some calculators and programs. How would you read that expression aloud? I think of it as "negative three squared," which is why (-3)^2 makes sense to me. Do you say "the negative of three squared"?

In 8th grade, I remember being instructed to type such an expression into the calculator to observe how it does something contrary to what we expect. From that moment on, I thought, "Huh, guess you have to use parentheses." It certainly wasn't cause enough to throw out my calculator, let alone tell others not to use it, just because I prefer a slightly different precedence convention.
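For what it's worth, Python sides with the mathematicians here: unary minus binds more loosely than exponentiation, so you get the opposite of Excel's answer. A quick check:

```python
# In Python, exponentiation (**) has higher precedence than unary minus,
# matching the convention used in written mathematics.
print(-3 ** 2)    # -9: parsed as -(3 ** 2)
print((-3) ** 2)  # 9: parentheses force the base to be -3
```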



> This is probably the most overly pedantic, nitpicky reason for not using a program I've ever heard.

If you found learning math easy, you're fortunate. But lots of people find learning math difficult and frustrating, and things which might not have bothered you can be big deals for those folks. If I used a program in teaching whose convention for a basic arithmetic operation is the opposite of the one mathematicians use, it would be one more source of confusion and frustration for people.

Student: "You said that -3² was -9, but Excel says it's 9."

Me: "Well, mathematicians use a different convention than spreadsheets."

Student: "So which one should I use on a test? Can we use both?"

Me: "Since this is a math class, you should use -9, not 9."

Student: "How am I supposed to remember that? This is why I hate math ..."

Everyone will weigh costs and benefits differently. There is plenty of good math software out there like Mathematica, R, Geogebra, or maxima. Spreadsheets didn't seem to offer much, and there was this arithmetic convention thing that I knew would be an issue.

I'm sorry if you find it pedantic and nitpicky. I always tried to minimize unnecessary causes for upset, because there were difficulties enough learning math without my adding to them. If you saw people getting extremely angry or in tears because they "didn't get it", I think you'd understand. Math is really hard for some people.


I think we should agree that standard notation is too ambiguous and switch to reverse Polish notation:

  3 2 ^ -   →  -9
  3 - 2 ^   →   9
No way to misinterpret that!


Except once - can be unary, an expression like 3 2 - ^ (intended as 3^(-2)) gets read as a subtraction instead, so unary negation needs to be a different symbol!
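A toy evaluator makes the point concrete. In this sketch `-` is strictly binary subtraction and `~` (a symbol chosen arbitrarily for illustration; HP calculators use a dedicated +/- key) is strictly unary negation, so every token has exactly one meaning:

```python
import operator

# Each of these symbols is strictly binary.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul,
       "/": operator.truediv, "^": operator.pow}

def rpn(expr: str) -> float:
    """Evaluate a space-separated RPN expression with unambiguous tokens."""
    stack = []
    for tok in expr.split():
        if tok == "~":                       # unary negation: its own symbol
            stack.append(-stack.pop())
        elif tok in OPS:                     # binary operators pop two values
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

print(rpn("3 2 ^ ~"))  # -9.0: negate (3^2)
print(rpn("3 ~ 2 ^"))  #  9.0: (-3)^2
```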


I dug out my HP 50g over the weekend to play with, and conveniently…

Sigh. I understand why we commonly enter math on basically a teletype-with-ASCII, and I don’t have an urge to go all APL, but for a while we were so close to a future where we could’ve had separate negation, multiplication, and exponentiation symbols that might’ve removed so much room for error. I mean, that little calculator and its predecessors were popular and widely used by the same people who brought us things like Unicode and the space cadet keyboard. If only one of them had said, gee, it sure would be handy to have a +/- key on the keyboard the person in the next cubicle is designing, like the one on the calculator on my desk!

But nope, Everything Is ASCII won and here we are. At least programming languages are starting to support Unicode identifier names, which has its own issues but is excellent for non-Latin alphabet users who want to write code in their own tongue. It seems like a reasonably short hop from there to giving operators their own unambiguous symbols. I can imagine a not-so-distant future where a linter says “you typed -3. Did you mean ⁻3?”, to the chagrin of programmers still entering their code on teletypes.


It would be nice if OSs defaulted the numeric keypad's /, *, and - to Unicode ÷, ×, and −. I never use those keys even when I do use the digits. That would solve the more glaring typewriter legacies. Then you'd just have the apostrophe/single-quote as the last remaining conflation.


I'm not convinced. I was brought up with the middle dot for multiplication (and × reserved for cross products, I suppose?) and according to Wikipedia, the

> ISO 80000-2 standard for mathematical notation recommends only the solidus / or "fraction bar" for division, or the "colon" : for ratios; it says that the ÷ sign "should not be used" for division

I think these things are way less standardised even on paper than you believe.


That was kind of my point. We somewhat settled on an ASCII representation of many symbols, and that space is so small that lots of them have multiple meanings you have to infer. There was a brief window where we could’ve taken a different path and used different symbols for different things. Alas, we didn’t.

I don’t contend we should change things today. I do think if I were personally writing a new programming language from scratch today, I’d likely use different Unicode symbols for different operators, and the accompanying language server would nudge people to use them.


Everyone learned -3^2 = -9 in middle school… it is clearly an implementation mistake.


"minus (pause) three squared"



