Hacker News | _r9fo's comments

I'd be interested in knowing more about the limits you hit in Singapore, if you're willing to share! Singapore is a really interesting city and as someone who moved to Hong Kong a few years ago, I've always wondered what life in Singapore is like.


Well, they aren't related to Singapore specifically; it's more that Asian culture is more conservative than American culture. It can be really hard to meet locals here, as many stick to themselves and find talking to a stranger somewhat weird, much less making friends for no reason. This is especially visible at clubs, where you go and hang out with your own group of friends rather than meeting new people. I was just in HK for a week and really liked it; it seems more vibrant than Singapore in terms of the energy of the city.

Many limits I ran into were personal, including health conditions, and it turns out humidity really makes me sweat, which I never realized. Also, in India, eating with my hands for some reason made me really uncomfortable, whereas eating with chopsticks was fine and I learned it quickly. The language barrier wasn't too bad since Singapore is English-speaking, although the accent can be really difficult at times, so speaking slowly is sometimes needed. I knew Asian culture was more conservative, as I've been to South Korea before and have a lot of friends there from research at my home university, but I really thought I could make local friends anywhere despite a quieter culture, and that turned out not to be the case.


From: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

  ['1', '2', '3'].map(parseInt);
  // While one could expect [1, 2, 3]
  // The actual result is [1, NaN, NaN]

parseInt is often used with one argument, but takes two. The first is an expression and the second is the radix. To the callback function, Array.prototype.map passes 3 arguments: the element, the index, the array. The third argument is ignored by parseInt, but not the second one, hence the possible confusion.
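Two common ways to sidestep the trap, for anyone skimming:

```javascript
// Passing parseInt directly makes map forward the index as the radix:
['1', '2', '3'].map(parseInt); // [1, NaN, NaN]

// Fix 1: wrap the callback so only the element is passed, with an explicit radix.
['1', '2', '3'].map(s => parseInt(s, 10)); // [1, 2, 3]

// Fix 2: use Number, which takes exactly one argument.
['1', '2', '3'].map(Number); // [1, 2, 3]
```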


This shows how badly javascript was designed. Functions, when used improperly, should throw an exception, not "sort of work."


To take up the mantle of Devil's Advocate:

Neither of these is doing anything wrong on its own[1]. When you go to map something, you sometimes want to be able to access the index. When you parse an int, you sometimes want to be able to specify a base. In a strongly typed language you might use an enum of valid bases instead of an int, but that doesn't make sense in a dynamically typed language[2]. Maybe you use different function names if you want to parse an int in decimal vs. an arbitrary base; I can't argue with that, but overloaded functions/methods are the norm and not the exception in most languages I've used.

[1]: Except parseInt's default radix which is ambiguous and the cause of countless bugs. But this bug is because we ARE specifying it.

[2]: And no, being dynamically typed is not inherently bad design.


In terms of design, I'm talking about the exception, not the arity of the function. A hidden NaN in an array of numbers is like a javascript promise for an error in the future. This isn't IO, so it's better to kill it now rather than sometime in the future.

Actually, to be honest, will NaN even trigger an exception? Does 1 + NaN throw, or does it propagate more NaNs all over the code?

Imagine you are debugging a 10000-line function that's supposed to return an int, but instead returns a NaN. Where did this error occur in the 10000 lines? Sure, I can say all kinds of stuff to make finding this error sound simple, but if exceptions were thrown instead of assigning NaNs to some variable, life would be easier. The exception would tell me the exact line of the illegal operation, while a NaN tells me nothing.

Javascript treats NaN as if it's a number... in the sense that when you do a math operation on a NaN you get another NaN... The reality is, the abbreviation "NaN" stands for "Not a Number," so the addition of a number and something that is not a number should not even occur. Seriously? They should have called it "LoL" or "WtF" because those are more appropriate abbreviations.
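For the record, a quick console check confirms it: nothing throws, the NaN just spreads.

```javascript
1 + NaN;               // NaN — no exception, the NaN propagates
NaN * 0;               // NaN
NaN === NaN;           // false — NaN is not even equal to itself
Number.isNaN(1 + NaN); // true — the reliable way to detect it
```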


> Imagine you are debugging a 10000-line function that's supposed to return an int, but instead returns a NaN.

I'm imagining that. It's my first day at a new job cleaning up a legacy codebase, but upon opening up a single 10kloc function written in JS, I realize that I made a terrible mistake and immediately hand in my resignation.

In the meantime, I set a breakpoint at the end of the function, inspect the inline value annotations for NaNs, and trace them back to the start.

Relative to any other debugging task in a 10kloc function, that doesn't sound bad at all. If it's consistently returning NaN, that should take maybe half an hour.

Also, I'd be really surprised if the function was made to return an int, because JS doesn't have integer datatypes.


Read what I wrote again:

> Sure, I can say all kinds of stuff to make finding this error sound simple, but if exceptions were thrown instead of assigning NaNs to some variable, life would be easier.

I said this in the parent post, and sure enough the reply is exactly in line with what I said would happen. I don't hate javascript, but it has its horrible, horrible warts.

>Relative to any other debugging task in a 10kloc function, that doesn't sound bad at all. If it's consistently returning NaN, that should take maybe half an hour.

An exception would take a second. An exception gives you the exact line of every illegal operation, while a NaN propagates itself like a virus through every arithmetic operation.

Some people find it easy to run 20 miles; sure, but that still doesn't change the fact that cars are much better. Get the analogy? You are running 20 miles and telling me how you don't mind... I'm saying sure, but driving still has a purpose in life.


> I said this in the parent post and sure enough someone who is a fan of javascript had to reply with exactly what I said was going to happen.

Breakpoints are off the table? Seriously? I didn't realize using a debugger made me a JS fanboy.

Wrapping all of your basic arithmetic in try/catch blocks is completely unreasonable. If you need to check for NaN, check for NaN. Making arithmetic substantially slower in the general case to catch the uncommon cases is exactly the problem NaN was introduced to solve.


No dude. I'm not saying it's off the table. I'm saying it shouldn't be necessary for an illegal operation like 1/0.

I'm not even saying to wrap your functions in try/catches. If your program has a 1/0 operation, then obviously you made a mistake somewhere. With javascript you have to hunt that mistake down; with a language like python you get the line number of where the mistake occurred. The exception is there for the entire program to fail so that you can correct your mistake. The language should not allow operations that create these NaNs, which automatically removes the need to check for NaNs too. That's all I'm saying.

BTW, unless you explicitly instantiated a NaN, you should rarely ever be checking for NaNs in your code. NaNs represent illegal operations like 1/0, and thus any code producing a NaN should be killed. If you check for NaNs in your code, that means you are guarding your code from your own bugs.

>I didn't realize using a debugger made me a JS fanboy.

Yeah that was inappropriate. I apologize for calling you that, unfortunately you still saw it before I edited it out.


Look, misattributing the introduction of NaN to JS is not going to help make the case that JS is poorly designed[1].

JS uses double-precision floating point numbers, based on the IEEE 754 standard first published in 1985.

[1]: Though there is a case that using doubles for all numbers is poor design, that's a different topic altogether.


I don't know where NaNs were introduced, but I never attributed the introduction of NaNs to javascript. If JS is just propagating a bad design, it still fits the description of being poorly designed.

Not to mention null: null + 1 = 1. This is even worse. Your 1000loc function returns a legit number, but it's a little bit off... and you have to find it. Imagine never having to do that: with a language like python, 1 + None and 1/0 both raise named exceptions with line numbers.
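A quick console session showing the coercion rules involved (null becomes 0 in arithmetic, while undefined becomes NaN):

```javascript
null + 1;      // 1 — null is coerced to 0
undefined + 1; // NaN — undefined is coerced to NaN
1 / 0;         // Infinity — still no exception
```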


No, not following the globally agreed upon standard for the sole purpose of making arithmetic slower in the general case would be bad design.


https://dzone.com/articles/the-worst-mistake-of-computer-sci...

Keep in mind this guy didn't even mention how javascript handles nulls. While nulls bypass typecheckers, typically null + 1 throws an exception. Not so in javascript: null + 1 returns 1.


Null and NaN are different concepts with different trade-offs and different behavior when working with them. Nulls don't have special processor support[1] and are not part of a multilanguage, multiprocessor, international standard.

[1]: So far as I'm aware, though I'd love to read about it if I'm wrong.


What are you talking about? When did I say I want to make arithmetic slower? And how is that globally agreed upon as a standard? In math, when you get a NaN, do mathematicians continue to use it as a variable, or do they realize they made a mistake? Think about it. In all math and science it's actually globally agreed upon that a NaN is not a variable you can reuse in some other function.


> And how is that globally agreed upon as a standard?

Because all modern processors and mainstream languages implement their floating point calculations as IEEE 754.

> when did i say I want to make arithmetic slower?

When you said you wanted to introduce exceptions whenever NaN is encountered.

Your processor has instructions that add/subtract/divide/multiply floating point numbers according to the IEEE 754 standard. What you propose is to then check in the implementation of JS after each instruction to see if the result is NaN, which is going to be a speed reduction of at least 2-3x, though I would expect it to actually be higher than that because branching can get expensive. (Making a note to go try it and benchmark)

Then, after doing this check, you want to have JS throw an exception. This only slows down the uncommon case, so that's not much of an issue, but in situations where NaNs are able to be treated as valid values, they then have to catch these exceptions and resume normal execution flow.

The result is that there is a happy path slowdown of arithmetic operations by at least 2-3x, to gain an arguable advantage in debugging time in the unhappy path.

Throwing away compatibility with other languages and working against the processor are not goals I'd shoot for, and wrapping basic arithmetic in try/catch blocks doesn't sound like a very good payout for doing so.
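To make the trade-off concrete, here's a minimal userland sketch of the per-operation checking being described; the helper name checkedDiv is made up for illustration:

```javascript
// Hypothetical wrapper: check every result and throw at the point where
// the NaN first appears, instead of letting it propagate silently.
function checkedDiv(a, b) {
  const result = a / b;
  if (Number.isNaN(result)) {
    throw new RangeError(`illegal operation: ${a} / ${b}`);
  }
  return result;
}

checkedDiv(6, 2); // 3
// checkedDiv(0, 0) would throw a RangeError whose stack trace points here.
```

The extra branch after every operation is exactly the happy-path cost described above.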


>Throwing away compatibility with other languages and working against the processor are not goals I'd shoot for, and wrapping basic arithmetic in try/catch blocks doesn't sound like a very good payout for doing so.

Almost every popular language throws an exception when you do a divide by zero.

>Your processor has instructions that add/subtract/divide/multiply floating point numbers according to the IEEE 754 standard. What you propose is to then check in the implementation of JS after each instruction to see if the result is NaN, which is going to be a speed reduction of at least 2-3x, though I would expect it to actually be higher than that because branching can get expensive. (Making a note to go try it and benchmark)

Try it, I'd like to see the results. I'm not too familiar with the processor implementation, but something tells me that if every system level language implements exceptions for division by zero then the cost must be minimal. Also, how would it be a 3x slowdown? At most it's just one instruction.

Interestingly in some languages integer division by zero throws an exception and floating point division by zero returns some kind of infinite or NaN value.

If this behavior was propagated down from processor design, I'd say the processors made a bad choice. Why is this behavior specific to floating point? Why is the behavior for integers different? Either way, when I say poorly designed, I mean poorly designed in terms of usability, not speed. Clearly, javascript wasn't initially designed for speed.
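(As an aside, modern JavaScript itself now shows this exact split: BigInt integer division by zero throws, while float division does not.)

```javascript
1 / 0; // Infinity — float division follows IEEE 754
0 / 0; // NaN

try {
  1n / 0n; // BigInt (integer) division by zero...
} catch (e) {
  console.log(e.name); // ...throws a RangeError instead
}
```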

>Throwing away compatibility with other languages and working against the processor are not goals I'd shoot for, and wrapping basic arithmetic in try/catch blocks doesn't sound like a very good payout for doing so.

Like I said, the purpose of a division by zero exception is not to be caught. It's to prevent a buggy program from continuing to run. The only place a division by zero operation can legally occur is if the zero arrives via IO. In that case it's better to sanitize the input or detect the zero rather than catch an exception.


> Almost every popular language throws an exception when you do a divide by zero.

1.) Divide by zero returns Inf, not NaN.

2.) I just tried C++, Java, C#, Go[1], and the champion of safety: Rust. All of them evaluate to Infinity when dividing a double by 0.0. No exceptions to be found.

> Try it, I'd like to see the results. I'm not too familiar with the processor implementation, but something tells me that if every system level language implements exceptions for division by zero then the cost must be minimal.

But they don't. Not every system level language even has exceptions and the ones that do return Inf as stated above.

> At most it's just one instruction.

Branches can be expensive if the branch predictor chooses the wrong path and forces a pipeline flush. Should be able to work with the predictor so that happy paths are true and that should make it less likely, but I'm not an expert there.

> The only place a division by zero operation can legally occur is if the zero arrives via IO. In that case it's better to sanitize the input or detect the zero rather than catch an exception.

In many applications this is true. Not all.

[1]: You do need to make sure Go treats both operands as doubles, easiest way to be sure is to declare x and y as float64 and then divide them afterward. But when actually dividing a float64 by another float64, it does evaluate to Inf as well.


> All of them evaluate to Infinity when dividing a double by 0.0. No exceptions to be found.

Well, Rust doesn't have exceptions, so it wouldn't happen, but yeah. To your parent, these languages follow IEEE 754, as far as I know, which specifies this behavior.


Go rejects 1.0/0.0 with a compile-time error, unless you specify the type as float64.

Ok overall I'm wrong.

But either way it's poor design from a usability standpoint, and from a consistency standpoint, because ints don't do this.


Yeah, floats and ints are just fundamentally different, especially when you're at the level of caring about how hardware deals with things.

TIL about Go:

Numeric constants represent exact values of arbitrary precision and do not overflow. Consequently, there are no constants denoting the IEEE-754 negative zero, infinity, and not-a-number values.

https://golang.org/ref/spec


Yeah that's one of those things that I learn, and then forget, and then relearn, and forget...

Would probably retain it better if I used Go as my primary language for a year or two.


It makes sense to have a function that is like the common functional "map" but that passes the index as well. It's probably not great design to call it "map", though, and have the basic, no-index behavior rely on the passed function taking only one arg, especially in a language where functions that usually take one arg but also have optional args are common.


I'm always interested in reading about the inner workings of gangs. It seems like these biker gangs are not particularly sophisticated or well organized, though. The main one in the story immediately lost cohesion when its leader fled the country and was absorbed by the Yakuza. In many ways, they seem less robust and effective than biker/street gangs in America.


There is a tension in how organized you make your gang. Too organized, and power concentrates in one person who is easy to arrest or assassinate, leaving a power vacuum that pulls the organization apart.

Basically, you trade off being more effective for being more robust. When any member can be arrested at any time, it makes sense to retain only loose organization, even though you won't get as much done.


I wonder if having all of your members be independently mobile on their motorcycles allowed them to thrive in North America. It's so much easier to commit crime when you can just hop on and off a bike and access paths unavailable to cars.

Recently I've been reading about Vancouver's gang wars and it strikes me just how persistent Hells Angels seem to be compared to other groups that have all but disappeared off the radar or become friendly to Hells Angels.

Perhaps decentralization is the new organized crime paradigm...we won't be seeing supercartels like Cali anymore


"Leaderless resistance" has a long history within groups that seek to remain covert (for both legal and illegal reasons).

https://en.m.wikipedia.org/wiki/Leaderless_resistance


Motorcycles are hyper powered horses. The war tactics of the preindustrialized world still apply.


American biker gangs in particular benefit from deep cultural roots. They originated post-WWII as associations of disaffected veterans, and continue to have a well of support in groups that follow that culture without any illegal activity. These Japanese biker gangs seem to have a much lighter base of support.

(Interesting side note, gay leatherman culture grew out of specific gay biker gangs, and mainstream biker groups like Hell's Angels have a positive view of leather culture despite their macho image.)


> American biker gangs ... originated post-WWII as associations of disaffected veterans

Based on what I've read, in any country demobilized soldiers are a challenge; they are trained in violence, are often traumatized, have trouble fitting in with civilians, and lack valuable civilian skills. In some places they become violent insurgents (a recent example is Iraq), in some the government finds a war for them (I've read that that was a motivation for the Crusades), and in the U.S. post-WWII it appears that some became biker gangs, a sort of vigilante for a peaceful society.


And even when they did not become involved in violence, they gave demobilized soldiers a clear hierarchy, with insignia and chains of command.


That's really interesting! Do you happen to remember where you read about demobilized soldiers in countries, or about the possible motivation for the crusades?


> Do you happen to remember where you read about demobilized soldiers in countries

I've read it many times about many countries, current and historical. Find resources serious and detailed enough to deal with that kind of policy issue. There is abundant discussion in the U.S. press (e.g., the NY Times) of the demobilization of the Iraqi army after the Iraq War.

> or about the possible motivation for the crusades

Sorry, not off the top of my head, but it's a relatively well-known theory and I read it more than once. As I wrote, I've read that theory; I didn't say such things happened. That's because I don't recall whether the source was based on serious research and expertise or was repeating a semi-popular 'wisdom'.


You can! And usually with only a slight pay cut. There are a lot more remote-ok and remote-only companies now.


The pay cut, from my experience, can be pretty dramatic. Whenever I have replied to a remote job ad and asked about salary, it would often mean a 40%+ pay cut for me.


Let's say I make $60k a year now in an unrelated field and I join your school, but after graduating, I was unable to find a job in software.

Would I still be on the hook to pay you 17% of my income despite not landing a programming job?


No, you'd never pay us a dime.


This is what I don't understand about offshoring. People keep going to India and China to find devs when they can get very high quality English speaking devs from the UK, Australia, and Canada for about the same price (factoring in all costs).


> very high quality English speaking devs

That's exactly what OP is saying you'll get in India for that price. Unless your assumption is that everyone in India/China is low quality at any price.


I think they were pointing out that you can avoid some of the timezone issues and other annoyances distance creates, for the same cost.


What constitutes a startup?

Because they did have customers and revenue for the service they offered. They had everything that is part of a business (albeit a very slow and unprofitable one). They were missing the ability to scale, since they had no tech stack to speak of, but their MVP didn't really need one.


It says in the article that she has >50% equity.


I'm going to guess that these big-name investors have liquidation preferences of at least 2x, meaning that her stake will probably be worth nothing once this all blows up, even if the company retains any value (and it probably won't anyway).


4sq has struggled mightily with traction/retention since the big change. Yelp is ubiquitous in the US and they are making strides into other countries.


They said Groupon is one of the "examples of successful Chicago based startups," so it's no longer a startup. And it IPO'ed recently, whereas Orbitz IPO'ed in 2003, so it's not a recent startup.


Good lord. No wonder this city's scene sucks.

