There’s a basic approach to this using Markov chains which works surprisingly well. Scott Aaronson once challenged some students to beat his algorithm — only one student managed to, and he claimed he just “used his free will”. Human randomness isn’t so random. There’s a neat little writeup about it here: https://planetbanatt.net/articles/freewill.html
I like to think that this is a measurement of free will in the literal, naïve sense. It makes just as much sense as other definitions (the ability to take action independent of external cause) and it has the bonus of being quantifiable.
The only downside? A LOT of people get very mad at the implications.
Double-entry accounting is, in fact, what gave “transactional” databases their name: they were meant for financial transactions!
Nowadays TigerBeetle is a custom-built financial database just for double-entry accounting. The implementation is fascinating.
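The core invariant is easy to sketch: every transfer debits one account and credits another by the same amount, so the sum across all accounts never changes. A minimal toy (nothing like TigerBeetle's actual implementation, which adds durability, batching, and strict serializability on top):

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    """Toy double-entry ledger: balances start implicitly at zero."""
    balances: dict = field(default_factory=dict)

    def transfer(self, debit: str, credit: str, amount: int) -> None:
        if amount <= 0:
            raise ValueError("amount must be positive")
        # One debit and one matching credit per transaction, so the
        # sum over all accounts is invariant.
        self.balances[debit] = self.balances.get(debit, 0) - amount
        self.balances[credit] = self.balances.get(credit, 0) + amount

ledger = Ledger()
ledger.transfer(debit="operator_cash", credit="alice", amount=100)
ledger.transfer(debit="alice", credit="bob", amount=30)
assert sum(ledger.balances.values()) == 0  # the books always balance
```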
Especially with Apple, I often see people scared that if they open up their ecosystem, then users will lose one of the most consumer-friendly tech companies out there. It’s not just “if Apple allows alternative browsers then Chrome will win”, which is (probably) true. It’s:
* If Apple allows alternative app stores, then the whole iOS ecosystem will rot and be flooded with malware (an argument brought up during the Apple vs. Epic case)
* If Apple can’t control the data on their users’ phones, then privacy rights will disappear, a common talking point during the Apple vs. Facebook fight over opt-in data collection.
And like, these points are correct — Apple kind of acts like a “benevolent dictator” when it comes to their ecosystem. But shouldn’t there be alternatives between “Apple can control all software on the hardware they sell” and “the moment Apple doesn’t control their users’ experience, it’ll be far worse”? We should have more tech companies, more options to pick from between these two extremes. The market needs to be more competitive, and if that isn’t possible, shouldn’t there be regulation to protect users and devs better? This constantly feels like a “pick your poison” kind of deal, where we can only pick between a company locking down its hardware or abusing users via software. If Microsoft banned alternative browser engines there’d be riots in these comments, but Apple is just assumed to be better to its users.
Giving companies the power to lock down hardware they sell isn’t a solution that will work when Apple inevitably turns against its users, and it sets a horrible legal precedent. Lord knows John Deere and a million other predatory hardware companies are salivating at the idea of users not having control over hardware they bought, and Meta and Microsoft love the idea of users not having control of the software they run and the data it collects. We can’t just keep picking the less-bad of two companies.
It's weird that people never distill those arguments to their most basic logic.
Apple directly dictates the shape, speed, and existence of any innovation on iOS, and by extension, any innovation involving mobile phones or meant to run on mobile phones. They don't simply have "power" over it, in the sense that they get to say "Yes" or "No". iOS is locked down in such a fundamental way that any innovation will not come about unless Apple specifically envisions it and designs the OS to support it.
Browsers didn't exist when Windows 1.0 came out. But they happened. If it had been iOS, there would have been no networking, no JIT (I know that came later, bear with me), Firefox/Gecko could never have existed and been able to fix the web. Apple alone would have controlled the evolution of the most important tech of the past few decades. It couldn't have existed in the first place unless Apple, and no one else, invented it and put it in iOS themselves. Basic OS features: files and the filesystem, sharing, casting your screen, communicating with other devices. It doesn't exist until Apple makes it. It doesn't change until Apple changes it.
Even something as simple as file syncing. They forced Dropbox, GDrive, OneDrive to adopt their shitty, buggy backend. Those services all had to drop basic features to adapt. Those features can't ever come back unless Apple allows them. Any hypothetical new features won't exist unless Apple, and no one else, thinks of them and adds them.
> iOS is locked down in such a fundamental way that any innovation will not come about unless Apple specifically envisions it and designs the OS to support it.
No platform highlights the issue you hit on here like VisionOS.
It is barren. Not just because of the lack of a customer base for paid apps (that hurts too), but because the APIs aren't there, and because you can't hack on the private APIs or the hardware directly, they never will be. The app store on Vision Pro is filled with half-assed "spatial computing" consumption apps (Wow, I can put the stock tickers on the wall! That I can only see with these huge goggles on! Neat!), "showroom" apps that are just pure consumption, mostly 3D models of products, and media consumption apps. The games that exist are all pretty lame, and you can't enjoy any of the back catalog of games written for VR because A. they'd never pass app review, and B. you can only use the PSVR controllers with it, so my Index controllers that I already have are useless.
The Vision Pro demands being as open as the Mac. The problem space is too ill defined and the hardware too packed with interesting use cases to gate behind the restrictive App Store rules. The iPad model worked because it was 2010 and had all the upward momentum of the iPhone to ride. Here and now, on a stagnant, occupied app market where room for innovation is small, on a device with far less promise, the App Store restrictions take all the air out of the room. The entire device is suffocated by Apple's iron grip and belief that they are entitled to own any good ideas that happen on the device, and that they are entitled to 15-30% of any economic exchange happening on the device. Just an utterly kneecapped platform right out of the gate, pricing, specs, weight, and everything else aside. There are no good apps because you just can't write the sort of apps your imagination is likely to want to make. Hell, accessing the main camera wasn't allowed until visionOS 2.0, and you have to use the "enterprise apps" API/entitlement to access it.
Apple's grip has killed it. It is a glorified TV you can wear on your face. It's a very good TV. It's even alright as an external monitor for a *real* computer.
Wind gusts were reaching 125 MPH in Boulder county, if anyone’s curious. A lot of power was shut off preemptively to prevent downed power lines from starting wildfires. Energy providers gave warning to locals in advance. Shame that NIST’s backup generator failed, though.
Notably, we had the Marshall Fire here 4 years ago, and Xcel recently settled for $680M for their role in it. So they're probably pretty keen not to be on the hook again.
I guess that explains why they had no qualms shutting down half of Boulder's power with a vague time horizon. After losing everything in my fridge, though, they finally turned it back on today.
Indeed. Losing the contents of (lots of) fridges is cheaper, as a whole, than incidentally burning the countryside. We all ultimately pay for the result no matter what, so that seems like a reasonably-sensible bet.
On the fridge itself: You may find that the contents are insured against power outages.
As an anecdote, my (completely not-special) homeowner's insurance didn't protest at all about writing a check for the contents of my fridge and freezer when I asked about that, after my house was without power for a couple of weeks following the 2008 derecho. This rather small claim didn't affect my rate in any way that I could perceive.
And to digress a bit: I have a chest freezer. These days I fill up the extra space in the freezer with water -- with "single-use" plastic containers (water bottles, milk jugs) that would normally be landfilled or recycled.
This does a couple of things: On normal days, it increases the thermal mass of the freezer, which improves the compressor's cycle times in ways that tend to make it happier over time. In the abnormal event of a long power outage, it also provides a source of ice chilled to 0°F/-18°C that I can relocate into the fridge (or into a cooler, perhaps for transport) to keep cold stuff cold.
It's not a long-term solution, but it'll help ensure that I've got a fairly normal supply of fresh food to eat for a couple of days if the power dips. And it's pretty low-effort on my part. I've probably spent nearly as much effort writing about this system here just now as I have on implementing it.
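A back-of-envelope check on how much outage time the frozen water buys (all figures assumed, rough orders of magnitude only):

```python
# Latent heat of fusion of ice: ~334 kJ/kg.
ice_kg = 10                      # assumed: ~10 kg of frozen jugs/bottles
latent_heat_kj_per_kg = 334
melt_energy_kj = ice_kg * latent_heat_kj_per_kg  # heat absorbed while melting

# Assumed heat leak into a closed, well-insulated fridge: ~1 MJ/day.
leak_kj_per_day = 1000
days_of_cooling = melt_energy_kj / leak_kj_per_day
print(round(days_of_cooling, 1))  # roughly 3 days from the melt alone
```

That lines up with the "couple of days" figure: the ice holds things near freezing until it has fully melted, and only then does the temperature start to drift.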
It's probably not worth it to go through your insurance for the loss of food and perishables in the fridge/freezer. It counts as a claim on your home insurance and can result in increased rates or even your insurer dropping coverage at the next renewal.
As previously-stated: There was no rate increase that I could discern.
I did neglect to mention that there was also no issue with renewal, but perhaps I should be more careful to always use absolute rote specificity and leave nothing to implication.
They also sent over some folks with a tall ladder to have a look at the roof of this property that they insured, which was good since we had no means to visually inspect it from the ground. (The roof was fine.)
Anecdotally, that phone call to the insurance company had no downside at all.
It provided a roof inspection that I did not have ready means to perform on my own, and a relatively small amount of money (a couple of hundred bucks) that became very useful not just because of lost food, but also due to all of the other storm-related issues that we were not insured against.
tl;dr - the fire destroyed over 1,000 homes and killed two people. The local electrical utility, Xcel, was found to be a contributing cause: its power lines sparked during a strong wind storm. As a result, electrical utilities now cut power to affected areas during strong winds.
The disaster plan is to have a few dozen stratum 1 servers spread around the world, each connected to a distinct primary atomic clock, so that a catastrophic disaster would need to take down the global internet itself for all the servers to become unreachable.
The failure of a single such server is far from a disaster.
And the disaster plan for the disaster plan is to realize that it isn't that important at the human-level to have a clock meticulously set to correspond to other meticulously-set clocks, and that every attempt to force rigid timekeeping on humans is to try to make humans work more like machines rather than to make machines work more like humans.
I really, really can't get behind this sentiment. Having a reliable, accurate time keeping mechanism doesn't seem like an outlandish issue to want to maintain. Timekeeping has been an important mechanism for humans for as long as recorded history. I don't understand the wisdom of shooting down establishing systems to make that better, even if the direct applicability to a single human's life is remote. We are all part of a huge, interconnected system whether we like it or not, and accurate, synchronized timekeeping across the world does not sound nefarious to me.
> Timekeeping has been an important mechanism for humans for as long as recorded history.
And for 99% of that history, Noon was when the sun was half-way through its daily arc at whatever point on Earth one happened to inhabit. The ownership class are the ones who invented things like time zones to stop their trains from running into each other, and NTP is just the latest and most-pervasive-and-invasive evolution of that same inhuman mindset.
From a privacy point of view, constant NTP requests are right up there alongside weather apps and software telemetry for “things which announce everyone's computers to the global spy apparatus”, feeding the Palantirs of the world to be able to directly locate you as an individual if need be.
> The ownership class are the ones who invented things like time zones to stop their trains from running into each other
In a world where this didn't happen, your comment would most likely read:
> The ownership class are the ones who had such indifference toward the lives of the lower class passengers that they didn't bother stopping their trains from running into each other.
Far more things rely on reliable and accurate time-keeping than just being on time to work. Timekeeping is vitally important (even if it's not readily visible) to lots of critical infrastructure worldwide.
Actually, it's really important to me to have a network of atomic clocks available to verify the times I clock in and out, I want to make sure I get paid for an accurate duration of time down to the nanosecond
oh....no, not really, no, the world needs GPS, so, yeah. this is not like scrooge mcduck telling you to be at work on time. scrooge still has a windup watch
If access to the site is unsafe and thus the site is closed; not having access seems reasonable.
Time services are available from other locations. That's the disaster plan. I'm sure there will be some negative consequences from this downtime, especially if all the Boulder reference time sources lose power, but disaster plans mitigate negative consequences, they can't eliminate them.
Utility power fails, automatic transfer switches fail, backup generators fail, building fires happen, etc. Sometimes the system has to be shut down.
Maybe this is the disaster plan: There's not a smouldering hole where NIST's Boulder facility used to be, and it will be operational again soon enough.
There's no present need for important hard-to-replace sciencey-dudes to go into the shop (which is probably both cold and dark, and may have other problems that make it unsafe: it's deliberately closed) to futz around with the time machines.
We still have other NTP clocks. Spooky-accurate clocks that the public can get to, even, like just up the road at NIST in Fort Collins (where WWVB lives, and which is currently up), and in Maryland.
This is just one set.
And beyond that, we've also got clocks in GPS satellites orbiting, and a whole world of low-stratum NTP servers that distribute that time on the network. (I have one such GPS-backed NTP server on the shelf behind me; there's not much to it.)
And the orbital GPS clocks are controlled by the US Navy, not NIST.
So there's redundancy in distribution, and also control, and some of the clocks aren't even on the Earth.
Some people may be bit by this if their systems rely on only one NTP server, or only on the subset of them that are down.
And if we're following section 3.2 of RFC 8633 and using multiple diverse NTP sources for our important stuff, then this event (while certainly interesting!) is not presently an issue at all.
There are many backup clocks/clusters that NIST uses as redundancies all around Boulder too, no need to even go up to Fort Collins. As in, NIST has fiber to a few at CU and a few commercial companies, last I checked. They're used in cases just like this one.
Fun facts about The clock:
You can't put anything in the room or take anything out. That's how sensitive the clock is.
The room is just filled with asbestos.
The actual port for the actual clock, the little metal thingy that is going buzz, buzz, buzz with voltage every second on the dot? Yeah, that little port isn't actually hooked up to anything, as again, it's so sensitive (impedance matching). So they use the other ports on the card for actual data transfer to the rest of the world. They do the adjustments so it's all fine in the end. But you have to define something as the second, and that little unused port is it.
You can take a few pictures in the cramped little room, but you can't linger, as again, just your extra mass and gravity affects things fairly quickly.
If there are more questions about time and timekeeping in general, go ahead and ask, though I'll probably get back to them a bit later today.
I'm the Manager of the Computing group at JILA at CU, where utcnist*.colorado.edu used to be housed. Those machines were, for years, consistently the highest bandwidth usage computers on campus.
Unfortunately, the HP cesium clock that backed the utcnist systems failed a few weeks ago, so they're offline. I believe the plan is to decommission those servers anyway - NIST doesn't even list them on the NTP status page anymore, and Judah Levine has retired (though he still comes in frequently). Judah told me in the past that the typical plan in this situation is that you reference a spare HP clock with the clock at NIST, then drive it over to JILA backed by some sort of battery and put it in the rack, then send in the broken one for refurb (~$20k-$40k; new box is closer to $75k). The same is true for the WWVB station, should its clocks fail.
There is fiber that connects NIST to CU (it's part of the BRAN - Boulder Research and Administration Network). Typically that's used when comparing some of the new clocks at JILA (like Jun Ye's strontium clock) to NIST's reference. Fun fact: Some years back the group was noticing loss due to the fiber couplers in various closets between JILA & NIST... so they went to the closets and directly spliced the fibers to each other. It's now one single strand of fiber between JILA & NIST Boulder.
That fiber wasn't connected to the clock that backed utcnist though. utcnist's clock was a commercial cesium clock box from HP that was also fed by GPS. This setup was not particularly sensitive to people being in the room or anything.
Another fun fact: utcnist3 was an FPGA developed in-house to respond to NTP traffic. Super cool project, though I didn't have anything to do with it, haha.
Now if the (otherwise very kind) guy in charge of the Bureau international des poids et mesures at Sèvres, who did not let me have a look at the reference for the kilogram and meter, could change his mind, I would appreciate it. For a physicist this is kind of like a cathedral.
If you are ever in Paris, I can't recommend the Musée des Arts et Métiers enough.
I believe they have the several reference platinum kilograms that are now out of spec. [1]
They also have the original, actual Foucault pendulum that was used to demonstrate Earth's rotation (and a replica doing a live demo, of course).
They have so many incredible artifacts (for weights and measures but also so much more: engineering, physics, civil engineering, machining,...)
I don't know if you will be reading this, but I am just back from that museum. Thank you very much for the information.
I spent 4 hours there and was surprised to see so many tourists; this is not a place I expected people visiting Paris to go to. There were no crowds, though.
The top part is really great, you get to see how much people did with so little. So is the chemistry part.
I found the steel replica of the kilogramme and the meter, and of course the Foucault pendulum (in the neighboring refurbished church).
This is truly an interesting museum, on par with the museum of discoveries (musée de la Découverte), which is unfortunately closed now for a few years of renovations (or at least was recently planned to be closed). Much better than La Villette.
Ahhh, thank you for the tip. I live in Versailles and usually go to museums for art, but this would be wonderful as well.
The Musée de Sèvres (or Bureau des Mesures as it is called now) has the original kilogramme and meter iridium reference, hidden in the basement ;( So if the director has a change of heart, I am all in!
>The actual port for the actual clock, the little metal thingy that is going buzz, buzz, buzz with voltage every second on the dot? Yeah, that little port isn't actually hooked up to anything, as again, it's so sensitive (impedance matching). So they use the other ports on the card for actual data transfer to the rest of the world.
Can you restate this part in full technical jargon along with more detail? I'm having a hard time following it
As you can see, the room is clearly not filled with asbestos. Furthermore, the claim is absurd on its face. Asbestos was banned in the U.S. in March 2024 [1] and the clock was commissioned in May 2025.
The rest of the claims are equally questionable. For example:
> The actual port for the actual clock ... isn't actually hooked up to anything ... they use the other ports on the card for actual data transfer
It's hard to make heads or tails of this, but if you read the technical description of the clock you will see that by the time you get to anything in the system that could reasonably be described as a "card" with "ports" you are so far from the business end of the clock that nothing you do could plausibly have an impact on its operation.
> You can't put anything in the room or take anything out. That's how sensitive the clock is.
This claim is also easily debunked using the formula for gravitational time dilation [2]. The accuracy of the clock is ~10^-16. Calculating the mass of an object 1m away from the clock that would produce this effect is left as an exercise, but it's a lot more than the mass of a human. To get a rough idea, the relativistic time dilation on the surface of the earth is <100 μs/day [3]. That is huge by atomic clock standards, but that is the result of 10^24kg of mass. A human is 20 orders of magnitude lighter.
Agreed, the stated claims don't seem to make much sense. Using a point mass 1 meter away and (G*M)/(r*c^2), I get that you'd have to stand next to the clock for ~61 years before the accumulated gravitational time dilation exceeded 10^-16 seconds.
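Working that out numerically (assuming a 70 kg person 1 m away, treated as a point mass):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 299_792_458      # speed of light, m/s
M = 70.0             # assumed mass of a person, kg
r = 1.0              # distance from the clock, m

# Fractional frequency shift induced by the person's gravity.
fractional_shift = G * M / (r * c**2)   # ~5e-26

# Time needed to accumulate a 1e-16 s offset at that rate.
seconds = 1e-16 / fractional_shift
years = seconds / (365.25 * 24 * 3600)
print(round(years))  # ~61 years
```

A shift of ~5e-26 is ten orders of magnitude below the clock's ~1e-16 accuracy, which is the point: a visitor's mass is utterly negligible.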
Actually, it's even worse than that: the design of the clock makes it so that the cesium atoms doing the actual time keeping are in free-fall while they are being observed. So it is physically impossible for any gravitational influence to change the accuracy of the clock.
> technically if you have 3 or more sources that would be caught; NTP protocol was designed for that eventuality
Either go with one clock in your NTPd/Chrony configuration, or ≥4.
Yes, if you have 3 they can triangulate, but if one goes offline now you have 2 with no tie-breaker. If you have (at least) 4 servers, then one can go away and triangulation / sanity-checking can still occur with the 3 remaining.
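In chrony terms this is just four `server` lines (hostnames here are hypothetical; any four independent, well-separated sources will do):

```
# /etc/chrony/chrony.conf -- four sources, so one can fail and the
# remaining three can still outvote a single falseticker
server time-a.example.net iburst
server time-b.example.net iburst
server time-c.example.net iburst
server time-d.example.net iburst
```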
Sure, but not needing a failure to cascade to yet another failsafe is still a good idea. After all, all software has bugs, and all networks have configuration errors.
If your application is so critical that losing NTP causes disaster, and your holdover fails in less than a day, and you aren't generating your own time via GPS, you are incompetent, full stop.
The article explained it fairly well. Re-iterating the credit card churning example: people will spend a lot of time optimizing their credit card spend, to end up with maybe a few hundred dollars in savings per year. Working 10 hours of overtime a year nets more and takes less time/mental capacity, for example. But it is fine to do this anyway if you let go of the "I'm saving money" schtick and just embrace that you like maximizing points on spend.
I keep hearing about how the supply and demand cycles are “lagged” and that this price spike happens every 3-5 years for purely economical reasons. I feel like I’m being gaslit — I don’t recall any such price spikes in the past of this magnitude. You’re telling me that there’s a totally predictable price cycle and NO ONE has prepared for it? Or else prices are just high temporarily and no one can step in to increase supply? Either way, it seems that there’s not enough competition in this market.
I am always astonished by the range of people who claim their college degree was useless, citing rote memorization and bad classes. I had an entirely different experience and so did most people I know. University gave me the opportunity to talk to world-class researchers during office hours, to discuss ideas with my peers and have them either validated or critiqued by experts. Sure, all the information is available online (which is a miracle into itself) but without frequent contact with professors and mentors I wouldn’t have even known where to look or what existed in the field. University, for me, was a place where I was apprenticing full-time under highly experienced people, surrounded by people my age who also were doing the same. Years of self-teaching didn’t get me anywhere close to what a few semesters of expert mentorship got me. I never felt I had to memorize anything: exams consisted of system design or long programming projects or optimization challenges. I loved it, and I’m not sure if people went to different universities or just didn’t take advantage of the opportunities presented to them.
These people can’t possibly be at every university, let alone colleges, community colleges, or technical schools.
> … rote memorization and bad classes …
Not every school will be good. There are at least three post-secondary schools within driving distance of me that take the minimum required curricula as a script and offer nothing more than the bare minimum required to get certification, accreditation, and receive that sweet state and federal budget money.
I can’t imagine how someone with a good or great post-secondary education is confused that this would be the situation for millions of students.
Thank you; I feel similarly and wish I could go back as an adult to take even more advantage of all the incredible opportunities. Having four years to dedicate to learning new things in depth -- with zero pressure to take shortcuts so that the lessons become economically useful sooner -- was more of a privilege than I understood at the time.
And then of course, the people I met there have shaped my life and career in wonderful ways ever since. The sheer level of diversity among students and faculty is unlike anything I've experienced elsewhere. Many of them are still my lifelong friends (or in one case my wife :)) and others have opened professional doors to me 15 years later and counting.
But also, I went to a very well known and respected university with sufficient endowment and financial aid that it shouldn't be functioning as a "toll gate" regardless. I know things are not this rosy at a lot of universities.
Did you learn about selection bias in university? Maybe you went to a good one, but there are far, far more dogshit schools than good ones.
Just a few years ago my husband had all of his tuition refunded (and degree cancelled) because the school was so bad and so scammy that the government had to step in and force them to refund everyone.
The reality is that higher education in the USA is a for-profit venture, and like all for-profit ventures in the US, the number one explicit goal is to extract as much profit as possible by any means possible. Providing quality education and world-class faculty is completely disjoint and incompatible with that goal.
Most people in this country are not so privileged as you to attend one of our dwindling number of good schools. Everyone else has a predatory institution that technically meets the requirements to offer the degrees they claim. Usually, anyway.
> or just didn’t take advantage of the opportunities presented to them.
It's this. Most undergraduate students do not go to office hours, try to get to know their instructors, ask follow-up questions, pursue independent research, or do anything approaching "apprenticeship". Most American students matriculate into college/uni not even having ingrained behaviors that make any of these things obvious or approachable, so yes, it's understandable why many would consider higher ed the same as secondary ed: rote memorization and "bad" classes.
> … Most undergraduate students do not go to office hours, try to get to know their instructors, ask follow-up questions
This was actively discouraged by the instructors in the school I attended. Not by policy, but by behavior - passive-aggressively belittling students for not “getting” the subject matter, showing a complete lack of interest in reciprocating any amount of getting to know the instructor.
> … ask follow-up questions, pursue independent research, or do anything approaching "apprenticeship". Most American students matriculate into college/uni not even having ingrained behaviors that make any of these things obvious or approachable …
A failure of secondary education and students’ families.
I was talking with a professor yesterday who claimed his students don't ask questions any more on Piazza. They used to, but now they go to ChatGPT which is always perky and ready to answer. Plus, there's no shame in asking a dumb question as there can be in class or on Piazza.
He says it's only a matter of time before the students realize they don't need him. Or need to pay tuition.
And we’ve all been sold a bill of goods on the necessity of diplomas and degrees. Because businesses have been sold a bill of goods on the quality of employees with diplomas and degrees.
Within at least the last 15 years, the paper provided by a school is no guarantee of better pay - but that’s how high schoolers are convinced to go into excessive debt for attending post-secondary schools.
My experience has been there is no correlation between skill at teaching and skill at research; maybe the two are even anti-correlated. To some extent, this is an artifact of the selection process for professors, but I think it's partly because there's a real tradeoff between spending effort on research vs teaching.
In some cases, an excellent researcher even has cogent papers but is absolutely abysmal at lecturing and in person teaching skills.
Peers are very important, but from talking to others, it's harder to know where you will get good peers than you would think. Even 1st tier universities will have majors dominated by students whose primary interest is in maximum grades with minimum work and where cheating is rampant. You've got to either get lucky (I did) or put in some leg work to find smart students who are actually interested in learning and doing things right.
I think how much rote memorization is encouraged or required is strongly dependent on the field. Pre-med students will sometimes memorize their way through calculus; a professor I knew once described it as "grimly impressive".
Teaching is a skill like any other. While I don't think the two are anti-correlated, you're going to find good teachers and bad teachers, no matter how good they may be at their other duties.
And I would gather you find more bad teachers than good, but that's true of many spaces from IT to sports.
If you’re in an American University, then the professors will most likely be top researchers in some particular niche, and they will likely have office hours. I think the “college is a toll” argument specifically applies to American universities that are essentially pay to play.
That’s not true. Source: University of Warsaw. Poland. Not Illinois. I’ve had office hours with world-class mathematicians. Those office hours were required of every lecturer and TA.
I believe you’re referencing Patrick McKenzie’s takes on fraud, which I agree with — but when he (and others) talk about the optimal amount of fraud they’re usually referring to fraud from “losing money from the company to customers”. This is not PayPal’s case; because PayPal isn’t the victim of fraud on its platform but makes money off its use, their optimal amount of fraud is “as much as they can permit without losing customers or regulators getting up their ass”.
I love conferences and talks and wish I could go to them more — what stops me is the sudden drop in corporate sponsorship of them. It’s been nearly impossible to convince leadership to spend money to train their employees or to socialize within their own discipline, and while I’d love to go to conferences, taking time out of my own PTO and my own wallet to see them isn’t worth it.
> while I’d love to go to conferences, taking time out of my own PTO and my own wallet to see them isn’t worth it
I’ve been using PTO and my own dime for conferences this whole time! It’s been super worth it career-wise especially as a speaker. Lots of fun, meet great people, have a bunch of “sawdust” to show for it.
The biggest impact has been that people in the industry generally know of me and that lends itself to creating opportunities / opening doors that otherwise wouldn’t exist.
> what stops me is the sudden drop in corporate sponsorship of them.
That's true in two ways: not only are fewer companies paying to send their attendees to training, but fewer companies are paying to sponsor these events as well.
Even at AI research conferences the trend seems to be the same (also fewer industry exhibits), though I'm not perfectly up to date on this; it might have turned around very recently. The reason seems to be that companies are not hiring as much right now.
I think this is a shorter-term trend in the economy though, it doesn't necessarily hold as much inertia as other factors. Unless the AI job replacement really works out the way many companies hope.
In my experience it tends to be the opposite — I am not a quant (QD) but having worked with a few teams there’s a negative selection for technical expertise. QRs who are good at programming are usually pushed into maintaining infrastructure, datasets, or just tooling for less technical members of their team, who then get to use those tools to further their own alpha generation. Orgs incentivize the final step in making alpha — spend too much time helping others or building reusable research, and your coworkers steal the thunder.
That, or stop helping your coworkers/accommodating them… risky, as a career move. Only seen that work once.