The interesting abstract idea here is to use excess power capacity during the night or windy periods in a given region for computing applications (like mining or high-latency cloud computing) where geography is irrelevant. This is a cool strategy for arbitraging away one of the biggest problems with renewable energy sources (intermittency). And it should only get more effective as the ratio of power society devotes to computation vs. physical use (heating, cooling, etc.) increases. Not sure why I'd never heard of it before.
(The application of the earned money to funding climate research is basically unrelated.)
EDIT: Maybe not. I'm seeing articles with conflicting estimates for future growth, but this study
https://eta.lbl.gov/publications/united-states-data-center-e...
suggests that data center energy use growth has recently slowed dramatically, to just 4% per year. Very surprising to me.
A bunch of aligning trends have been reducing energy usage in recent times; it will be interesting to see if that keeps up. (More virtualization means fewer machines, there's a strong push towards efficiency in both computing and cooling, especially among the big DC operators, and the trend towards cloud computing means more computation happens in those high-efficiency DCs instead of in smaller, inefficient setups.)
The only big examples of "high-latency" computing I can think of are the volunteer distributed projects (BOINC, SETI), but maybe there's a commercial angle for that as well?
What makes the "cloud" data-centers energy efficient? Just oversubscription of VMs to hosts? Can't most datacenter-operating businesses do that with their vSphere installations?
Or is there something about the actual power distribution, machines, cooling, etc?
I was primarily thinking about the latter. Building a large DC from the ground up means they can better optimize cooling systems (there are even a bunch of experiments with ambient air cooling in cooler climates, schemes where the heat is transferred to nearby users instead of spending energy on dissipating it, ...). Since they are big enough to get their hardware custom made, they can optimize there too: leave off everything they don't need, do power conversion at the rack level. At least for services they control, they have the potential for clever cluster scheduling to improve things and to shut unneeded machines off.
Lots of potential for small optimizations that pay off on a large scale, but aren't feasible or worth the effort in smaller setups.
Both. First, the largest data centers have developed virtualization platforms that are significantly more efficient than vSphere. A major factor is that they provide fine-grained billing for compute, storage, etc. instead of selling VMs, which allows them to optimize more effectively. Amazon can provide high-latency Glacier storage or spot instances for spare compute at very low cost.
>First, the largest data centers have developed virtualization platforms that are significantly more efficient than vSphere.
This needs a citation. One of the issues with AWS/etc is that they don't know the usage pattern of a customer that allocates a VM. A well managed vSphere system can be heavily oversubscribed with mostly idle VMs that don't need high performance.
On a PaaS like Heroku, where dynos are stateless and short-lived, you don't need to allocate CPU/RAM/HDD for a continuously-running VM. And scaling running dynos can be done automatically. We did this based on work to be done (message queue length), with scaling decisions taken every 20 seconds. Many parts of the system had no dynos running as the default, only when needed.
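A minimal sketch of that kind of queue-driven loop in Python (the helper functions and the throughput constant here are made up, standing in for the queue and platform APIs):

```python
import math
import time

def get_queue_length() -> int:
    """Hypothetical stand-in for reading the backlog off the message queue."""
    return 0  # stub

def set_dyno_count(n: int) -> None:
    """Hypothetical stand-in for the platform's scaling API."""
    print(f"scaling to {n} dynos")  # stub

MSGS_PER_DYNO_PER_TICK = 50  # assumed per-dyno throughput between decisions
MAX_DYNOS = 20

def autoscale_forever() -> None:
    while True:
        backlog = get_queue_length()
        # Scale to the work to be done; default to zero dynos when idle.
        wanted = min(MAX_DYNOS, math.ceil(backlog / MSGS_PER_DYNO_PER_TICK))
        set_dyno_count(wanted)
        time.sleep(20)  # a scaling decision every 20 seconds, as described
```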
With a function-as-a-service type computational model like Lambda, it becomes even more fine-grained.
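For comparison, a Lambda-style Python handler: you're billed only for the time the function actually runs, so there's no continuously-running VM to oversubscribe at all (the event payload here is hypothetical):

```python
def lambda_handler(event, context):
    # Invoked per event and billed per invocation; no idle VM behind it.
    items = event.get("items", [])  # hypothetical event shape
    return {"processed": len(items)}
```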
Yes, for now. The trend is (and has been) towards more and more elasticity. And since there are economic incentives on both platform provider and buyer side, I believe this will continue.
It is somewhat predicated on tooling and programming models continuing to improve alongside. Because for many smaller projects/businesses, compute costs are dwarfed by engineering costs.
No matter how oversubscribed, VM systems have to supply enough servers to meet peak demand, which means inefficiency during the trough. Amazon's spot billing means that they can approach 100% utilization 24/7.
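A toy illustration of that gap, with made-up numbers: provisioning for the peak caps utilization at the average-to-peak demand ratio, and spot sales are what soak up the trough:

```python
# Hypothetical hourly demand over one day, in server-equivalents.
demand = [40, 35, 30, 30, 35, 45, 60, 80, 95, 100, 100, 95,
          90, 90, 85, 80, 75, 70, 65, 60, 55, 50, 45, 40]

fleet = max(demand)  # must provision enough servers for the peak
utilization = sum(demand) / (fleet * len(demand))
print(f"utilization without spot: {utilization:.0%}")  # ~65%

# Selling the spare capacity as spot instances fills the trough.
spot_sold = sum(fleet - d for d in demand)
utilization_with_spot = (sum(demand) + spot_sold) / (fleet * len(demand))
print(f"utilization with spot fully sold: {utilization_with_spot:.0%}")  # 100%
```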
It is worthwhile to note that by 2010 the GAO estimated over $100 billion in total federal dollars on climate spending[1]. This number doesn't include other nations' spending, or private spending.
By contrast, Skylab's entire program cost was about 10 billion in 2010 dollars.
It's naive to think bureaucratic inertia is not a problem in climate change. In other words, it is not free of one of the problems the War on Poverty and War on Drugs saw: that they became industries focused on preserving themselves rather than ever fixing the problems they were meant to.
Whenever we spend our own money to fix problems that affect us, we consider the opportunity costs carefully. But when the collection plate is passed to us and we want to demonstrate virtue, our money is given into the hands of people who have a reward system that is not ideally suited to using it altruistically.
Is more money going to solve the climate crisis? Is there a better way, such as political reform? If not, I'd just like to be honest about how many manned missions to Mars we are passing up on to instead pay for this.
I know people say that "people will die from climate change," but people are already dying. We have people even here in the USA without access to clean drinking water, basic healthcare, or safe neighborhoods to live in. Around the globe people die from malnutrition while the world produces a surplus of food. People don't die in 2017 because of a lack of tech or resources; they die because of politics and greed, or maybe even from one of the 26,000 bombs we dropped last year.
> Whenever we spend our own money to fix problems that affect us, we consider the opportunity costs carefully. But when the collection plate is passed to us and we want to demonstrate virtue, our money is given into the hands of people who have a reward system that is not ideally suited to using it altruistically.
Yep. As usual, Yes Minister explains the political process to a T:
- This money is voted solely to make sick people better.
- No, no, no, Minister. It is to make everybody better. Better for having shown the extent of their care and compassion. You see, when money is allocated to the Health or Social Services, Parliament and the country feel... cleansed. Purified. Absolved. It is a sacrifice. After the sacrifice, nobody asks the priest what happened to the ritual offering after the ceremony.
Climate change is a mind-bogglingly large-scale problem. We've known for decades that fossil fuels were energy expenditures accruing debt for short-term growth... it's not going to be cheap to fix.
Actual wars are a better place to look - the US has been spending an average of $300B/y on war in the Middle East for a cause that kills next to no one. Cutting back on war would more than pay for all our healthcare, space exploration, and climate change needs.
I agree that our wars need to be either cut back or shut down. If you're interested, Rand Paul is actually putting up a fight in the senate for that right now.
That article is incredibly selective. For instance, the GAO investigation concluded that that level of spending was far too LITTLE. From their 2017 high risk report[1]:
"For example, the Department of Defense's (DOD) 2010 and 2014 Quadrennial Defense Reviews state that climate change poses risks to defense infrastructure, particularly on the coasts. DOD's infrastructure consists of more than 555,000 defense facilities and 28 million acres of land, with a replacement value of close to $850 billion."
Climate change is not some experiment for the virtue of learned knowledge. It's a management effort. The immediate costs far outweigh our spending, much less the future costs.
This has always seemed to me like an obvious setup for anyone with enough solar capacity to routinely run a surplus - particularly in states where the local utilities have purchased legislative support for near-punitive connection charges or effectively zero rates for anyone interested in selling surplus power back into the grid.
Edit: Not an outdoor setup, but a setup that uses all electricity surplus to regular needs for mining instead.
How does that compare to selling the excess power then putting the money into a (let's say) index fund? I have no data right now, so I don't have the answer.
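In the same data-free spirit, here's the shape of the comparison as a back-of-envelope Python sketch; every number in it is an assumption, to be replaced with real local rates:

```python
# Back-of-envelope comparison, all numbers hypothetical: value after N years
# of (a) selling surplus power at a feed-in rate and investing the proceeds
# vs. (b) mining with it.
SURPLUS_KWH_PER_YEAR = 5000      # assumed annual surplus
FEED_IN_RATE = 0.04              # assumed $/kWh paid for exported power
INDEX_FUND_RETURN = 0.05         # assumed annual return
MINING_REVENUE_PER_KWH = 0.06    # assumed net mining revenue per kWh
YEARS = 15

def sell_and_invest() -> float:
    balance = 0.0
    for _ in range(YEARS):
        balance *= 1 + INDEX_FUND_RETURN          # grow existing balance
        balance += SURPLUS_KWH_PER_YEAR * FEED_IN_RATE  # deposit this year's sales
    return balance

def mine() -> float:
    # Simplest case: mining proceeds accumulate without reinvestment.
    return YEARS * SURPLUS_KWH_PER_YEAR * MINING_REVENUE_PER_KWH

print(f"sell + index fund: ${sell_and_invest():,.0f}")
print(f"mining:            ${mine():,.0f}")
```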
I'm thinking specifically about states like Arizona, though I'm not sure of the current situation there being >1000 miles away. I know they had attempts to add a $100/month fee for connecting solar systems to the grid, plus assorted other licensing and regulatory requirements.
Mining equipment can also be older models that are no longer cost-effective to mine with when you're paying for utility power.
If anyone's still checking a thread a few days old, I was talking about things like Florida's requirement that even if you're capable of being fully independent you be connected to the grid: http://www.wftv.com/news/local/want-solar-panels-you-still-h...
A side effect of that is that if the grid is down, you're also required to shut down your solar system (which makes complete sense, it's a safety thing to not have power sources connected while there's work being done) except that you're apparently not allowed to disconnect.
Discussion on [1] notes that disconnectable systems can be quite a bit more expensive, but it sounds like that's not even a legal option - no connection, no occupancy.
Seems inefficient, as your mining equipment depreciates if not in use.
A more reasonable solution is to just sell that electricity and buy more renewables. If you pull, say, a net 5% ROI, you double energy produced every 15 years. With the added bonus of creating an income stream for retirement etc.
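Quick sanity check on the compounding:

\[ 1.05^{15} \approx 2.08 \]

so reinvesting a net 5% return does roughly double capacity in 15 years (rule of 72: $72/5 \approx 14.4$ years).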
The siting of the turbine is indeed an art. It will produce close to nothing at that height, as air slows down near the ground due to friction. That's why turbines are located as high as possible (usually in a trade-off between cost of the tower and revenue from more wind).
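One common way to put numbers on this is the wind-profile power law (an approximation, with $\alpha \approx 1/7$ over open terrain; real near-ground losses from obstacles are typically worse):

\[ v(h) \approx v_{\mathrm{ref}} \left( \frac{h}{h_{\mathrm{ref}}} \right)^{\alpha}, \qquad P \propto v^3 \]

By that estimate, a rotor at 10 m captures only about $(10/80)^{3/7} \approx 0.41$ of the power available at an 80 m hub height.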
>There's no net-value created by doing the 'mining part' - in fact, it's a value-destroying activity
You don't understand where cryptocurrencies derive their value from if you're saying this. Mining is what gives cryptocurrencies their legitimacy: first, it proves that a digital file is scarce (important for information used to account for wealth/scarcity), and second, it proves that a payment hasn't been double-spent.
In the traditional financial/political system, these processes are obfuscated or inefficient because you need lawyers, judges, armed guards, vaults, auditors, etc. to perform the same task: to ensure that FedWire isn't inflating the supply or rolling back payments.
'Mining' does not give them 'legitimacy', it's just an odd by-product of how some of them are structured.
And frankly - that 'mining' consumes power and resources is a total waste. There could very well be other, less wasteful ways to accomplish the same thing.
Certainly 'Kin' is not 'mined' - they just arbitrarily create and dump them as they please.
>'mining' consumes power and resources is a total waste
Driving a cash truck to merchants, to pick up cash, to take back to a bank, to be counted by a physical machine, to update a database in a bank server is also a 'total waste'.
Just because the waste is hidden in the old financial system doesn't mean it's 'less wasteful'
Another angle of criticism would be that solving the Byzantine generals problem with the threat model that cryptocurrencies target (i.e. anyone can join and leave the network somewhat anonymously at any time) is simply not necessary in the real world unless you are a die hard anarchist. Therefore, for any application that doesn't have those requirements it is a huge waste of resources.
I'm glad you have outed this rather complex multi billion dollar industry with thousands of extremely competent programmers.
> 'Mining' is an arbitrary answer to handing out block-chains.
No it's not; there are many ways to mine a coin and reach consensus, and Proof of Work has been the most sensible one, as staking electricity is fundamental and universal.
> They could give out Bitcoins to whoever solved specific math problems for which 'computing power' doesn't matter.
They?
> The 'mining' paradigm is utterly wasteful because it wastes resources for no reason.
Ethereum uses a mining algorithm to process smart contracts. Other currencies, like Sia, use mining to process decentralised storage. "Utterly wasteful" is an overstatement. There are plans to move Ethereum to Proof of Stake, where mining coins will become incredibly difficult and uneconomic, which will reduce the power consumption problem.
> The level of religious support for this stuff is hilarious.
Your poor understanding of the topic made me chuckle
Alexasmyths isn't really making particularly crazy statements there. Ripple is a successful (third in market cap behind only Bitcoin and Ethereum) coin that already works by the "just randomly give them out to people" system without Proof Of Work. Yeah it's not great but neither is using huge amounts of processing power.
The purpose of mining is not to hand out new coins; that is only done to incentivise mining. The real reason behind mining is to secure the transactions: mining prevents people from double-spending their coins.
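To make that concrete, a toy proof-of-work in Python: producing a block is expensive (many hashes), checking it is one hash, and an attacker who wants to rewrite a spent transaction has to redo all that work:

```python
import hashlib

DIFFICULTY = 20  # require this many leading zero bits (toy value)

def pow_hash(block_header: bytes, nonce: int) -> int:
    h = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big")

def mine(block_header: bytes) -> int:
    """Expensive: try nonces until the hash clears the target."""
    target = 1 << (256 - DIFFICULTY)
    nonce = 0
    while pow_hash(block_header, nonce) >= target:
        nonce += 1
    return nonce

def verify(block_header: bytes, nonce: int) -> bool:
    """Cheap: a single hash checks the claimed work."""
    return pow_hash(block_header, nonce) < (1 << (256 - DIFFICULTY))

header = b"toy block: alice pays bob 1 coin"
nonce = mine(header)          # ~2**20 hashes on average at this difficulty
print(nonce, verify(header, nonce))
```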
You can completely agree with crypto-currencies having value and also believe "There's no net-value created by doing the 'mining part'". The reason is that whatever you gained in BitCoin value you lost in the ability to sell your electricity.
The net value of doing additional mining is that the payment that you accept as a merchant is that much harder to undo by an attacker.
If you accept that a bank vault that has one guard is made more secure by adding another guard you must accept that adding miners to a PoW chain makes it more robust against attackers -- this is the value.
Mining is literally a continuous, measured attempt to break the security of the coin. The whole point is to incentivize mining instead of hacking, because both would benefit equally from the same discovered flaws in the crypto. That ensures the crypto (which can be swapped out if a weakness is discovered) is essentially impenetrable.
Mining also serves as THE first-known solution to the byzantine generals problem, which is a problem in every decentralized system that needs to agree on the canonical truth. What less-energy-consuming system to solve this problem would you suggest instead?
Yeah, this costs energy, but it's sort of like expending energy to shoot ammunition at a wall constantly in order to know right away as soon as any weakness is discovered.
What is the price of being unable to be hacked (see: Equifax)? Right now, that price is "mining energy expenditure."
Reducing the friction of financial transactions definitely creates value. Credit cards are able to charge ~2% on a large fraction of consumer purchases for exactly this reason. The current return to mining might be disconnected from the value created, but the latter is very far from nothing.
"Reducing the friction of financial transactions definitely creates value."
Crypto-currencies are entirely unnecessary for this.
I pay most of my bills 'digitally' with 0% transaction charge.
'Friction' is ultimately derived from 'regulation' - which is a necessary condition when dealing with real money across an economy.
One could argue there is 'too much regulation' - that said, without it, the economy would crash instantly due to mass fraud, corruption, embezzlement, tax avoidance etc. etc..
Irony: the moment 'Bitcoin' is important enough to be considered a 'real currency' by governments - 'poof' - it's under regulation and loses a great deal of its value in this area.
The 'solution' to e-currency is not Bitcoin, it has too many downsides for practical purposes. It's a great intellectual exercise, and something good may come of it, but 'it' is not the solution.
Even if we ignore all the uses that you don't personally find useful (free international transfers, domestic transfers that clear in an hour rather than days), you've retreated from "crypto currencies don't create value" to "crypto currencies create value for regulation evaders, but the net impact for society is negative", which is a completely different claim.
On the contrary, mining does create value, by creating a demand for computing power therefore financing the development of more advanced lithography, one of the very few areas in which our species is still making substantial progress.
Now if they were getting the electricity by burning coal, a case could be made that the environmental harm outweighs the benefit. But getting it from a renewable source also expands the economies of scale on that renewable source just a little more, so it's a win across the board.
Or Frederic Bastiat's "Candlemaker's Petition", which was a satirical request that officials block out the sun in order "to encourage industry and stimulate employment".
This is an "apart from that, Mrs Lincoln" - there is no analogy, because you're leaving out the destruction caused by hurricanes, which is their overwhelmingly most important result.
Sure, if you spent a million dollars buying CPUs and destroying them, that would create value. In practice, there are always ways to spend a million dollars to create more value (e.g. buying CPUs and using them to do useful computation!), so nobody does that.
If you have something more useful to do with the wind-generated electricity, the same thing would apply. But thanks to the limitations of the grid and the intermittency of renewable energy, that may not be the case. In that case, mining cryptocurrency may be the highest value thing you can do with it.
> In that case, mining cryptocurrency may be the highest value thing you can do with it.
Wow, how convenient that the thing you like is the single "highest value thing" that anyone can do with energy.
You are in a filter bubble of people patting each other on the back for wasting energy. You've been rewarded for cryptocurrency speculation for so long that you've forgotten what productive work is.
As it happens, this guess is incorrect. I've never been involved in cryptocurrency other than as a curious spectator. I have no strong feelings about it one way or the other.
Do you agree with the general principle that when the heuristics you are using generate erroneous conclusions, it's a good idea to re-evaluate those heuristics?
> Sure, if you spent a million dollars buying CPUs and destroying them, that would create value.
Can you read what you've just written and see how mad it's become?
The converse of what you've said is that making CPUs destroys value, and by implication Moore's law and the whole process of technological improvement is destroying value.
But I was waiting for someone to bring up the broken window fallacy. The answer is that the unspoken assumption on which it is a fallacy is that resources in the economy are otherwise optimally employed. Back when it was coined, that might have been close enough to the truth to be a useful approximation. Today, it's very much not.
I don't think this argument has become particularly constructive, so I'll let you have the last word, but I suggest instead of just pattern-matching what you think I said to something you've heard somewhere else, try stepping back and just looking at the facts as they are.
You're confusing 'economic stimulation' with 'net value creation'.
A lot of money can change hands, and a lot of people can 'do stuff' with nobody being better off.
Take for example how the GDP is measured: the total value of 'everything produced and sold' PLUS government spending.
Get that: it does not matter how the government spends - it counts towards GDP.
A government could borrow $1 billion - and literally burn it - but the 'GDP' would be +$1B that year. That doesn't mean $1B in economic value was created... it's just a number on a page.
> Take for example how the GDP is measured: the total value of 'everything produced and sold' PLUS government spending.
That's incorrect; GDP is the total cost of everything produced and sold. Government spending isn't added (if it was, there'd be a substantial double counting, because a lot of things produced and sold are bought by government.)
(More precisely, it's private consumption + private investment + government spending + government investment + exports - imports.)
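That is, in the standard expenditure-approach identity

\[ \mathrm{GDP} = C + I + G + (X - M) \]

government spending $G$ is one component of total expenditure, not an extra term added on top of "everything produced and sold".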
No, I'm pointing out how this does in fact cause net value creation. Sure, when somebody mines Bitcoin, the dollar value of the Bitcoin is purely on paper. When they send dollars to chip manufacturers to buy mining hardware, the movement of dollars from one bank account to another is also purely on paper. But when people respond to the movement of those dollars by building more advanced chip factories, that is indeed net value creation.
That is not 'value creation' - that is 'digging holes and filling them'.
The 'innovation' created from 'more advanced tech' to do something entirely useless is not really 'value creation'.
Now - if those 'advanced shovels' end up being used for other purposes, you might argue there is 'value creation' - but no - this is not right.
Why? Because of the distortionary effects of the impetus of the innovation in the first place.
If there is a 'real reason' to make 'better shovels' - then they will get created for that reason. But because they are being created for a non-reason - i.e. digging holes and filling them, well, it's a misallocation of resources.
If the 'false impetus' did not exist, that investment would go elsewhere, and create value for society in more efficient ways.
It's pointless to invest in things that have no value just because there might be indirect innovation elsewhere - just use the money for rational, value-creating things - it's much more efficient.
> If the 'false impetus' did not exist, that investment would go elsewhere, and create value for society in more efficient ways.
Actually no, it probably wouldn't. Most likely, the money would be hoarded and the people would be eking out an existence doing some miserable make-work. The unspoken assumption behind your position is that all resources are currently being used fully and optimally. Unfortunately, this is very far from true.
This seems a very roundabout way to do things - climate modelling is CPU-intensive enough already, why not use the wind energy to power those computations instead of this extra step in the middle?