Hacker News | vegabook's comments

It doesn't have a REPL, which is strange for a BEAM VM language.


See: intel / iphone


Configuration is a hard problem, as evidenced by the continuous reinvention of configuration formats (YAML, TOML, XML, etc.). They're all "simplified", and that's why they fail at more advanced use cases. Once you get past a few syntax quirks, and the tendency of advanced users to lean on a bunch of shorthand that confuses newcomers, Nix is actually a very terse but conceptually quite simple functional language with exactly the right amount of complexity to map to its problem space.


Like a mac?


Just tried usearch against ol’ faithful np.dot, and found the latter to be 8x faster than usearch on a 10M-vector brute-force scan, as described in their readme [1], for the top 50 matches. Identical output. 1.74 seconds for numpy and around 12 seconds for usearch on an M2 Max with enough RAM to hold the vectors without swapping.

[1] https://github.com/unum-cloud/usearch?tab=readme-ov-file#exa...
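For reference, the numpy side of a scan like this can be sketched roughly as follows (my own reconstruction, not the exact benchmark script; sizes are shrunk so it runs anywhere, with the 10M x 1k shapes and top-50 cutoff mentioned above swapped for small placeholders):

```python
import numpy as np

# Shrunk-down sketch of a brute-force similarity scan: score every
# database vector against a query with one matvec, then keep the top-k.
# (The comment above used ~10M x 1000 vectors and k = 50.)
n, d, k = 100_000, 128, 50

rng = np.random.default_rng(0)
db = rng.random((n, d), dtype=np.float32)   # database vectors
query = rng.random(d, dtype=np.float32)     # single query vector

scores = db @ query                              # one BLAS-backed pass
top_k = np.argpartition(scores, -k)[-k:]         # k best indices, unordered
top_k = top_k[np.argsort(scores[top_k])[::-1]]   # order best-first
```

`argpartition` avoids a full sort of all n scores; only the final k entries get sorted.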


Author here :)

This might not be an apples-to-apples comparison. NumPy uses BLAS for matrix multiplication, which benefits from tiling to make better use of CPU caches.

USearch, on the other hand, computes L2 distance directly (not the dot product) and supports a variety of metrics. It doesn't use tiling, so it's expected to be slower than BLAS GEMM routines for single or double-precision vectors.

Things might get more interesting with half-precision, brain-float16, or integer representations, where the trade-offs are less straightforward. Let me know if you decide to try it with those — I'd love to hear how it performs.

PS: You may find related benchmarks here: https://github.com/ashvardanian/SimSIMD
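One way to see the metric difference mentioned above (my own illustration, not from USearch's docs): for unit-normalised vectors, squared L2 distance is a monotone transform of the dot product, so the two metrics agree on ranking even though the computed values differ:

```python
import numpy as np

# For unit vectors a, b: ||a - b||^2 = 2 - 2 * (a . b),
# so smallest-L2 ordering equals largest-dot ordering.
rng = np.random.default_rng(1)
db = rng.normal(size=(1000, 64))
db /= np.linalg.norm(db, axis=1, keepdims=True)  # unit-normalise rows
q = rng.normal(size=64)
q /= np.linalg.norm(q)

dots = db @ q
l2sq = ((db - q) ** 2).sum(axis=1)

assert np.allclose(l2sq, 2 - 2 * dots)                      # the identity holds
assert np.array_equal(np.argsort(-dots), np.argsort(l2sq))  # same ranking
```

For unnormalised vectors the two metrics can genuinely disagree, which is part of why supporting both matters.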


It turns out (my bad, and I apologise) that although 10e6 x 1e3 FP32 fits well within 96GB of RAM, intermediate allocations during the np.random.rand initialization phase push us about 32GB into swap. Those pages only get reclaimed when more RAM is demanded, which happens on the first bench run, so whichever library runs first, np or usearch, gets penalised bigtime. I have now re-run with sizes that never hit the swap threshold, and the results are MUCH more impressive for usearch: it's basically twice as fast. A 7e6 x 1e3 scan for 1e3 top 50 is 1.32 seconds for numpy and 0.633 seconds for usearch. I swapped the order of the benchmarks and re-ran, and the results check out. Nice work. usearch is now in my toolkit, and I apologise again for the misleading comment.

As an aside, it's kind of amazing how it takes essentially just over half a second to scan 7m 1032-size vectors for semantic similarity, on a (beefy but not extraordinary) desktop computer. Modern hardware is so awesome. And I'm guessing I could get another order of magnitude or two speedup if I got Metal involved.

EDIT: Linux on a tiny el-cheapo 100 dollar Intel N95 mini PC with 32 GB of (single-channel) RAM, dropping size to 3M x 1024: usearch 0.65 seconds, numpy 0.99 seconds. Amazing.
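The swap story checks out on the back of an envelope (my arithmetic, not from the thread): np.random.rand always materialises a float64 array, so casting it down to FP32 briefly needs both copies alive at once, which is what blows past 96GB:

```python
# Back-of-the-envelope memory check for the 10e6 x 1e3 FP32 benchmark.
n, d = 10_000_000, 1_000

f32 = n * d * 4 / 1e9   # final float32 matrix, GB
f64 = n * d * 8 / 1e9   # float64 intermediate from np.random.rand, GB
peak = f32 + f64        # both copies coexist during .astype(np.float32)

print(f"float32 matrix:       {f32:.0f} GB")   # fits in 96 GB on its own
print(f"float64 intermediate: {f64:.0f} GB")
print(f"peak during cast:     {peak:.0f} GB")  # spills well past 96 GB
```

Generating float32 directly (e.g. `np.random.default_rng().random((n, d), dtype=np.float32)`) avoids the float64 intermediate entirely.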


Oh, epic! Thanks for taking the time to double check :)


> poetry: 0.99 seconds

Good enough for me and sans VC risk.


You were anything but “randomly born in your country”. You are most likely the latest branch in a sedentary tree rooted both geographically and culturally for centuries, and without that tree you certainly wouldn’t exist. This doesn’t mean you shouldn’t travel, perhaps plant a new tree, but the idea that you were born randomly is obviously false.


From your point of view, you were born randomly, in a random universe, in a random country, to random parents. Yes, there is a story once you were born. But as far as you are concerned you could have been born as anything, in any universe. There would always be a story backing it up.


But you are not a tabula rasa. You are made of a genetic code inherited from your parents, which strongly determines how you look and what your body can do. You were clothed, fed, raised, acculturated, and taught by your family and others in their geographic area.

The self has a heritage even if the self refuses to accept it. You are your parents' child and not someone else, for the same reason you are also a human, a hominid, a mammal, not a fish or a tree.


If I am made of genetic code, why do they say I die and no longer exist even though the body remains?


Your genetic code shapes who you are and how you appear to others when you're alive.

If you're alluding to the existence of some transcendental self, can you at least accept it is inseparable from your corporeal self?


I think the corporeal self is an illusion.


Perhaps you are right, Neo. Perhaps we are in the Matrix. But you lack evidence for your position. How could you tell if you're a transcendental spirit, and not a brain in a jar suffering from last Thursdayism?

I have plenty of evidence that we're bags of chemicals wandering around a rock in space, and that our consciousness is the result of electrical signals passing through our synapses. If you wish to reduce the majesty of nature to "unknowable", you reduce your own position to unknowable too. If you're fine with that, I'm fine too, but I still like my position better than yours.


The only thing you know is that you are aware. Everything else is mere appearance within that awareness. And that includes all the what-ifs, the idea that you have evidence, and the idea that you are sure.


If CPUs are made of transistors, then why do we call them broken when they stop functioning even though the chip still remains?


Are you saying CPUs are alive in the way humans and birds are alive, and have conscious experiences? What is that experience? Is it the body? Or something else?


The problem here is that you're defined by the history of environments you've been in, which provides a reason to stay in them (and a reason why the phenomenon of culture shock exists). I could not randomly have been, to take a random person, Carmen Miranda, because I am not her to any extent, so for me to "be" her, in some alternate reality, wouldn't mean anything.


Who are you?


IDK, a strange loop? But a specific one.

Oh wait, you changed what to who, didn't you?


Does the specificity come from beliefs that you hold about yourself?


Kinda, but like they say, if you open your mind too far, your brain falls out.


> "if you open your mind too far, your brain falls out"

Boom, thanks, and ouch... I guess.


Is that also something you believe, or something you know?


It's a maxim that reflects objective reality, according to me.


Do you know it or is that also a belief?


It's knowledge, and it may be false.


I grew up near a border; a border which changed a lot throughout history. Not to mention that the country I was born into didn't exist as such even a few generations ago. So yeah, I'd go with “randomly born in your country”.


Which is what we also call a "random forest".


You could have said "two people decided to have sex; you were not born randomly", and it would have had the same relevance. The fact is that you had zero input on where you would be born, so from your perspective it was random. Everything else is just trying to play up a belief, religion, or some kind of woo as ground truth.


Absurd.

Any decision I did not take is random to me, and I am really not responsible for it.


Soon you’ll have to pay extra for that. “Introducing Faraday Cage Class…”


NixOS really makes Ubuntu (and all the other distros) feel old though. I mean, I _love_ Ubuntu, and I’ve used it faithfully for 12 years. But once you get used to Nix (and granted, that’s tough), it’s just an absolute revelation in terms of confidence in one’s operating system, freedom to use so much more software, and not worrying about even very advanced configs. I could never go back.


I'd always thought NixOS was a new distro because of the recent hype, but apparently it's older (by about a year) than Ubuntu!


Interchangeable, cookie-cutter coder availability is finance's number one priority. There's definitely no room for critical pricers or infra written in languages they can't frictionlessly slot someone else into as people leave. So: Python.

Also, arguably, Julia, while fantastic, just didn't do much that Python didn't do already. Its main argument, outside of tidier ergonomics, was basically "speed without leaving Julia", but with NumPy and Pandas being essentially stdlib, that just wasn't a very powerful argument. Julia was too incremental to be worth switching to. It seems to have found its niche elsewhere though, with the optimization people?


One of the main things I like about Julia is that all the libraries are built on the core Array type. In Python, there are torch tensors, numpy arrays, and I think JAX has its own array type too. Also, Julia has a really beautiful distributed computing paradigm, whereas Python needs libraries like Ray, which have their own quirks, documentation, and community.


General scientific computing is pretty good across the Julia ecosystem, from optimization, to ODE and now PDE solver libraries, to various statistics and inference packages, etc. It lacks the deep NN tooling or breadth of ML libraries of Python, and nothing matches R for breadth of stats libraries, but for most other scientific computing it is really great at this point.


I am by no means an expert, but I used Flux.jl for a convolutional neural net in electromagnetics for my latest paper and it was such a breath of fresh air compared to Python and PyTorch. (I'm an EE and not great at programming, so I found a lot of frustration in PyTorch). Even though the Keras library in Python is pretty nice, even then I got myself into some odd pickles when trying to do some custom layers which used FFT processing as it relates to gradient computation. Things are much smoother in Julia, and that doesn't even count how much easier the Plots library is! I'm ashamed to admit that I have no idea how to manipulate the figures and axes in Matplotlib without extensive googling.


I am really waiting for the day the Julia ML ecosystem improves enough that I can jump ship to it.


Scientific/numeric/data Python is essentially a DSL around a C API, which creates friction (try, for example, mapping a custom function over a Pandas column). Whereas in Julia, it's just Julia. It's liberating to extend and use a library written in the same language, and it leads to surprising synergies.
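The friction is easy to see in a micro-example (my illustration; timings will vary by machine): a Python-level function mapped over a column crosses the interpreter boundary once per element, while the vectorised form does one pass in C:

```python
import numpy as np
import pandas as pd

s = pd.Series(np.random.rand(100_000))

def f(x):
    return x * 2.0 + 1.0

slow = s.map(f)       # 100k Python-level calls, one per element
fast = s * 2.0 + 1.0  # a single vectorised pass that stays in C

assert np.allclose(slow.to_numpy(), fast.to_numpy())
```

Same result, but the mapped version is typically one to two orders of magnitude slower once the function is nontrivial.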


numpy is only fast if the computation does not escape it. There are plenty of cases where execution ping-pongs (if that's a verb) between Python and the C(++) wrapper that numpy actually is. Then everything becomes quite slow.
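A minimal illustration of that ping-pong (mine, not the commenter's): the moment a Python-level loop sits over the array, every element makes a round trip across the Python/C boundary:

```python
import numpy as np

a = np.random.rand(100_000)

# Stays inside numpy's C code: one pass, no interpreter round trips.
fast = np.sin(a) + 1.0

# Escapes to Python on every element: same result, dramatically slower.
slow = np.array([np.sin(x) + 1.0 for x in a])

assert np.allclose(fast, slow)
```

The second form pays per-element boxing, attribute lookup, and function-call overhead, which is exactly the friction that disappears when the whole stack is one language.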

Anyway, I see data scientists and statisticians (at least 100% of the ones I know) completely ignoring Julia, just because they only have been exposed to Python and R in their education. The quality of the programming language/ecosystem seems to be irrelevant.

