Datapoint: I've used my Roborock S7 Max Ultra in three different households (2 houses, 1 apartment) and have had zero issues with it.
It was actually night and day compared to the equivalent $1,000 Roomba I had at the time. Lidar is the game changer in this space, and Roomba got complacent with their technology.
Yes, you can certainly use up your CPU allocation on an M-10 database (at which point we offer online resizing as large as you want to go, all the way up to 192 CPUs and 1.5TiB RAM). Even so, I've been able to coax more than 10,000 IOPS from an M-10. (Actually, out of dozens of M-10s colocated on the same hardware, all hammering away.)
You can get a lot more out of that CPU allocation with the fast I/O of a local NVMe drive than from the slow I/O of an EBS volume.
Is there a good guide for all of these concepts in Claude Code for someone coming from Cursor? The amount of configuration needed to accomplish the same things just feels overwhelming compared to Cursor.
Most guides to wringing productivity out of these higher-level Claude Code abstractions suffer from conceptual and wall-of-text overload. Maybe it's unavoidable, but it makes these things tough to really dig into.
One of the things that bugs me about AI-first software development is that it seems to have swung the pendulum from "software engineering is riddled with terrible documentation" to "software engineering is riddled with overly verbose, borderline prolix, documentation", and I've found that to be true of blog and Reddit posts about using Claude Code. Examples:
These are thoughtful posts; they're just too damn long, and I suspect that's _because_ of AI. And I say this as someone who is hungry to learn as much as I can about these Claude Code patterns. There is something weirdly inhumane about the way these wall-of-text posts or READMEs just pummel you with documentation.
Thanks for the new word re: prolix! Couldn't quite pin down why heavily AI generated posts/documentation felt off — aside from an amorphous _feeling_ — until today.
Technically you can accept incoming connections if the SOCKS5 server supports the BIND command and the client knows how to use it. It's rare, though.
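For the curious, here's a minimal sketch of what the BIND request looks like on the wire per RFC 1928; the address and port are made-up examples:

```python
import socket
import struct

def socks5_bind_request(dst_ip: str, dst_port: int) -> bytes:
    """Build a SOCKS5 BIND request (RFC 1928).

    VER=0x05, CMD=0x02 (BIND), RSV=0x00, ATYP=0x01 (IPv4),
    followed by the 4-byte address and 2-byte port in network order.
    """
    return (b"\x05\x02\x00\x01"
            + socket.inet_aton(dst_ip)
            + struct.pack("!H", dst_port))

# The server replies twice: first with the address/port it is now
# listening on for the expected inbound connection, and again once
# the remote peer actually connects.
req = socks5_bind_request("203.0.113.5", 21)
```

Classic use case was active-mode FTP through a proxy, which is part of why almost nothing uses it today.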
No one ever talks about the electricity demands for powering these things. Electric bills here in NJ via PSEG have spiked over 50% and they are blaming increased demand from datacenters, yet they don't seem to charge datacenters more?
A classic political games move, and it says more about how much anti-consumer nonsense is tolerated in New Jersey than it does about power generation and distribution pricing realities.
The data centers will naturally consolidate in areas with competitive electricity pricing.
This is called marginal pricing. Everyone pays the price of the marginal producer.
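A toy illustration of how uniform ("marginal") pricing clears; the producer mix and numbers are invented:

```python
def clearing_price(offers, demand_mwh):
    """Uniform-price auction sketch.

    offers: list of (price_per_mwh, quantity_mwh), one per producer.
    Offers are accepted cheapest-first until demand is met; every
    accepted producer is paid the price of the last (marginal) one.
    """
    supplied = 0.0
    for price, quantity in sorted(offers):
        supplied += quantity
        if supplied >= demand_mwh:
            return price
    raise ValueError("not enough supply to meet demand")

offers = [(20, 100), (35, 50), (90, 80)]  # e.g. solar, hydro, gas peaker
print(clearing_price(offers, 160))  # the gas peaker is marginal: everyone is paid 90
```

So when data-center demand pushes the marginal unit from cheap baseload to an expensive peaker, every consumer's price jumps, not just the data center's.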
In some cases they try to get the data centres to pay for the infrastructure costs up front, but the counterargument is that ordinary customers don't pay those costs up front either; they pay them gradually through usage fees.
AppleCare+ basically makes it so having a case or screen protector isn't a requirement for me. I did use a screen protector for about a year, but I found that they're actually more fragile than the phone's screen itself; I was going through one every 2-3 months. So about a year ago I stopped using screen protectors too, and I've since dropped the phone many times the same way I did with the protector on, and the screen has been fine.
Sure, I have some scratches on the screen, but so what? If the front or back glass shatters, it's $29 to fix.
Assuming a correct implementation of the NTP spec, including its era handling, NTP should be resistant to this failure in 2036.
The problem is that many microcontrollers and cheaply designed or non-serviceable computers/devices/machines might not follow the standard and could therefore be susceptible, although your iPhone, laptop, and fridge should all be fine.
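A quick back-of-the-envelope showing where the 2036 date comes from: NTP's 32-bit seconds field counts from 1900-01-01, so era 0 ends 2^32 seconds later.

```python
from datetime import datetime, timedelta, timezone

# NTP timestamps carry seconds since 1900-01-01 in an unsigned
# 32-bit field, so era 0 runs out 2**32 seconds after that epoch.
NTP_EPOCH = datetime(1900, 1, 1, tzinfo=timezone.utc)
era_rollover = NTP_EPOCH + timedelta(seconds=2**32)
print(era_rollover)  # 2036-02-07 06:28:16+00:00

# An era-aware implementation (RFC 5905) disambiguates by assuming
# the true time is within ~68 years of a rough local clock.
```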
You have 13 years to move to a 64-bit time_t. Lots of embedded stuff or unsupported closed-source stuff is going to need special attention or to be replaced.
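For reference, the 2038 deadline falls straight out of the arithmetic: a signed 32-bit time_t tops out at 2^31 - 1 seconds after the Unix epoch.

```python
from datetime import datetime, timedelta, timezone

# A signed 32-bit time_t counts seconds since 1970-01-01 UTC and
# maxes out at 2**31 - 1, so the last representable instant is:
UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
last_second = UNIX_EPOCH + timedelta(seconds=2**31 - 1)
print(last_second)  # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, which a naive
# implementation interprets as a date back in December 1901.
```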
I know the OpenFirmware in my old SunServer 600MP had the issue. Unfortunately I don’t have to worry about that.
Most 32-bit games written for some form of Unix will use the system time_t if they care about time. The ones written properly, anyway. Modern Unix systems have a 64-bit time_t, even on 32-bit hardware and OSes. If a game is on some other OS and uses the Unix epoch in a signed 32-bit integer, that's another design flaw.
You’ve got 13 years to update unless any of your code includes dates in the future. Just stay away from anything related to mortgages, insurance policies, eight year PhD programs, retirement accounts…
If you’re managing mortgages or retirement accounts on systems that weren’t ready for 2038 by 2008 you were already missing the biggest bucket of the market.
Another one bites the dust. I've used weather underground api, yahoo weather api, dark sky api, and all of them have gone from free to paid (or just not public anymore) over the years. Currently using pirate weather - https://pirateweather.net/en/latest/
Is there any way people could be incentivized to set up a little weather station, contribute data, and get paid? I wonder if there's a model where people could make money without gaming the system. It would have to be standardized/verified to be accurate somehow.
Blockchain people have been trying variations of this for a decade. Any time you create a system that pays people for data, it will be exploited to the extreme.
I don’t think you need to incentivize people to provide weather data. Just make it easy to set up a station and get a lot of people interested. There are already hobby stations out there and networks for them.
While that detail is true, the real problem is much more general: you have goal x, you use some proxy y for that goal, you pay people for y, they give you lots of y that may end up being the exact opposite of x.
Famously, the British had x = "fewer cobras" and y = "cobra tails"; the opposite of x turned out to be "the locals bred cobras to get money for cobra tails".
Make a citizen-science weather station network that's free and it's all fine. Make it paid, and someone's going to grab satellite pics and generate plausible but not necessarily accurate simulated weather-station data for everywhere to get that money.
Incentives do work in general. Sometimes they are abused; incentives with no checks and balances are always abused. I just don't think the generalized problem you describe above is as universal as you suggest.
Incentives can work, but most governments and businesses are still only mediocre at designing them, even when they have enough money to throw at the problem to get do-overs when they get it wrong.
Trying to do this with humans on a big scale combines the worst of software development in the days of punched cards, working without anyone having given you a formal language spec, and black-hat hackers on the modern internet.
It is very, very easy to pick your incentives badly; you only get feedback on a very slow cycle (in the punched-card days you might run the program overnight and only find the next morning that it crashed on line 32 because of a typo, but it's much slower than that in meatspace); and you also need constant fine-tuning as people interested in gaming the system share their methods for doing so.
Bitcoin wasn’t designed for high throughput. I’m referring to projects like the Helium network, which rewards people for running network nodes: https://www.helium.com/
It doesn’t work as well as they wanted and it has been subject to various exploits over the years from people figuring out how to fake the participation to extract rewards.
> Commerce on the Internet has come to rely almost exclusively on financial institutions serving as trusted third parties to process electronic payments. While the system works well enough for most transactions, it still suffers from the inherent weaknesses of the trust based model. Completely non-reversible transactions are not really possible, since financial institutions cannot avoid mediating disputes. The cost of mediation increases transaction costs, limiting the minimum practical transaction size and cutting off the possibility for small casual transactions
From the bitcoin white paper.
>I’m referring to projects like the Helium network, which rewards people for running network nodes
The TON blockchain has shown throughput above 100k transactions per second, about 3x that of Visa. There are other blockchains with similar throughput claims. L2 networks can offer even lower fees and higher throughput.
I don't think they were saying to use a blockchain for this. It's just an example showing that if you offer financial incentives in exchange for data, people will exploit and game it. The reason blockchain settlement is so slow is all the effort that goes into preventing this. You could remove PoW from Bitcoin and the network would be significantly faster; it would also be dominated by people exploiting it.
It's the same thing with ad networks: most of the effort goes into verifying that an ad click was legitimate and not a bot. Or that classic story of when the British government tried to eliminate cobras in India by paying a bounty for every dead cobra, which just led to people breeding more cobras.
This is primarily for air quality by default, but you can get temperature, humidity, etc. as well. For each station, someone paid for the hardware and is sharing the data gratis.
You’ve just described Ambient Weather. What I find kinda funny about that is they still try to upsell you to get more than 1 year of data retention.
Luckily, they allow you to configure additional arbitrary locations to pump data to. I wrote a little program to drop that data into an InfluxDB database (along with PurpleAir, AirGradient, AirThings, solar data, and IoTaWatt). The only practical use I've found is to look and see "When was the last time we had three days in a row that were this windy?" I suppose I could do fun stuff with Home Assistant too.
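Roughly what such a glue program boils down to, if anyone's curious; the measurement, tag, and station names here are made up, and the endpoint mentioned is InfluxDB 2.x's write API:

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Format one point in InfluxDB line protocol:
    measurement,tag=val field=val <nanosecond timestamp>"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "weather",
    {"station": "ambient1"},  # hypothetical station name
    {"temp_c": 21.5, "wind_kph": 33.0},
    1700000000000000000,
)
# POST this body to http://localhost:8086/api/v2/write?org=...&bucket=...
# with an "Authorization: Token ..." header (InfluxDB 2.x).
```

Once the data is in InfluxDB, "three windy days in a row" is just a windowed query.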
I'd be a little surprised if Google (or even Apple) haven't considered trying to use cell phone temp and pressure sensor data collected across the entire fleet of devices running their OS. Similar to the recent Android earthquake warning thing, or Google's traffic data.
Like others have pointed out though, gathering observation data is only part of the problem. Turning current and historical observations into usable and accurate forecasts is a big compute heavy task, and whoever is paying for that compute needs either government funding, which is not easy in the age of DOGE, or to charge for the forecasts.
I have a weather station that collects temp, pressure, wind speed and direction, and rainfall, and which has WiFi and a built-in capability to send its data to a bunch of web services. Sadly, it's still in the box it came in, because I haven't got around to installing it and the burst of enthusiasm that inspired me to buy it has long since died. (If anyone in Sydney, Australia wants it, reply here and we might be able to organise for you to come collect it.)
Is there a feasible way to turn noisy cell phone temperature data into reliable weather data? Cell phones can be indoors; they can be in someone's pocket next to their body heat; they can be in direct sunlight; and they can generate a lot of heat themselves under load. And it's not just outlier phones that aren't in a position to accurately measure outdoor temperature; it's probably the majority of phones at any given time.
But I barely understand how shit works when you operate at Google scale.
I wonder how many "Android-ish" devices, like in-car entertainment systems, are out there reporting all their telemetry data back to Google? I wonder how much "Android" is in vehicles with Android Auto, and whether that hardware typically has temperature and pressure sensors like phones do?
If I had, say, a billion cars sending me data that includes temp, location, and possibly some vehicle specific CANBUS type data - I'd guess there could possibly be a signal in there that could reliably report temps and pressures at locations, based on heuristics that identify cars parked outdoors.
Same with phones. At the scale of "every single Android phone on the planet", what's left over after excluding "the majority of phones that aren't in a position to accurately measure outdoor temperature" is still a huge number of devices. I suspect even something stupidly simple like "What's the p99 low temperature of all the Android temp reports in a suburb?" might be a really good indicator, when "all the Android phones in a suburb" might be 10,000 devices or more.
>It would have to be standardized/verified to be accurate somehow.
You could, for the same zip/county, aggregate the results and keep only readings within a certain percentage band. You could also down-weight a user based on how many times their readings fall outside this range (i.e., bad actors).
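One possible sketch of that scheme; the tolerance, the weight-halving rule, and the sample readings are all invented for illustration:

```python
from statistics import median

def robust_area_temp(readings, history, tolerance=3.0):
    """Aggregate one zip/county's readings, down-weighting habitual outliers.

    readings: {user_id: temp_c} for this round.
    history:  {user_id: times_previously_flagged_out_of_range}.
    Readings more than `tolerance` degrees from the median are dropped;
    each past flag halves a user's weight in the average.
    """
    m = median(readings.values())
    total = weight_sum = 0.0
    for user, temp in readings.items():
        if abs(temp - m) > tolerance:
            continue  # discard this round's outliers outright
        w = 0.5 ** history.get(user, 0)
        total += w * temp
        weight_sum += w
    return total / weight_sum

readings = {"a": 21.0, "b": 21.4, "c": 35.0, "d": 20.8}  # "c" looks bogus
print(robust_area_temp(readings, {"b": 2}))  # "c" dropped, "b" down-weighted
```

A real system would also need to persist the flag counts and probably decay them over time so an honest user with a briefly broken sensor isn't punished forever.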
I just got Zigbee working in my house (SONOFF Zigbee 3.0 USB Dongle Plus gateway). Are there any weather nuts out there who could recommend a weather device (that they like and is cool), in case someone wants to create a project and is looking for data providers?
There are a few models for community data collection/distribution that appear reasonably successful in ADS-B and bird tracking with commercial, non-commercial and academic examples. The challenge is that there are hard costs to collecting/persisting/distributing the data which are incompatible with free (which in reality just means someone else pays).
This feels like a great thing for the government to do which is why NOAA/NWS have traditionally maintained these services. The data these stations produce nationally is valuable but hard to quantify on an individual station level - should the station that detects vital data about a hurricane be given a large bonus for it? If so we'll end up with extremely lopsided coverage while the information from nearby weather systems can be invaluable.
It kinda was, but they did this same rugpull and closed their free api way back before covid.
They do still claim:
"250,000+ Weather Stations: Weather Underground is a global community of people connecting data from environmental sensors like weather stations and air quality monitors."
I think it's a bit like FlightRadar24: if you feed them your weather station (or ADS-B receiver) data, you get some level of free access, but with non-commercial use restrictions.
Same model as FlightAware et al. use for crowdsourced ADS-B air traffic monitoring: you set up an ADS-B receiving station, send your feed to FA, and in return you get a premium-level account.
I never considered government weather departments would provide APIs for their data, but after seeing your comment I went to see if Environment Canada provided one.
I am very impressed by how much data they provide free of charge.
Michael Lewis's book The Fifth Risk explicitly discusses how someone with a paid weather app was put in charge of the Commerce Department (by Trump) so he could try to hide public weather data while using it in his app and charging for it.
> The first 500K calls a month is free with a $99 a year[...]
They may be included with your $99/year subscription, but to call them "free" is like saying that the groceries I'm holding are free because I just gave the cashier money.
I would say it's more like saying that driving on the highway is free because you pay taxes. I doubt anyone is buying a developer account specifically for weather API calls.
Lots of people drive on highways who pay no highway taxes, foreign tourists for example.
It's free like riding the monorail at Disney World is free; included in the cost of your entry ticket, and utterly inaccessible to anyone who has not paid.
> Lots of people drive on highways who pay no highway taxes, foreign tourists for example.
Highways are mostly funded through gas taxes, and registration fees for EVs. Even foreign tourists buy gasoline, or drive an EV that paid its registration fee.
Also, the Disney World monorail is outside the ticket gates. You can ride it without a ticket.
Foreign tourists very much pay for driving on the highways on the way to Disney World, through higher sales taxes, tolls, hotel taxes, etc. There is a reason Florida has no state income tax.
Source: I live 30 minutes away from Disney and partially moved to Florida when I started working remotely to save money on taxes.
I feel it's not nearly as useful as the old Dark Sky API. The secret sauce of that software was that it combined typical weather data with local reports. AFAIK there is no way to submit a weather report in the Apple Weather app. They essentially bought it for the name and to kill a competing option, rather than attempting to use what made that app actually compelling compared to other weather apps.
If you scroll down to the bottom of Apple Weather, it has a "Report an Issue" button which allows you to report the current weather conditions at your location.
I have no idea what happens to that data and if it contributes to the report in any way.
Wow, I would have assumed that was for reporting a bug, not the weather. Dark Sky had a great UI for quickly reporting the weather and encouraged it as a primary feature of the app.
The charge cards don't even have to be paid in full anymore. They offer "Pay Over Time" for charges on Platinum/Gold/Green/etc., which effectively makes them credit cards.