A tokamak is a kind of fusion reactor, and currently the most prevalent design.
"100M degrees" (Kelvin) corresponds to 10 KeV (kilo electron volts), which is an important figure to exceed for D-T fusion. D-T fusion which is the kind of fusion the ITER Tokamak (a forthcoming fusion reactor and international megaproject) intends to demonstrate.
An older fusion experiment, JET (Joint European Torus), reached these levels, so this does not break new ground, but it is important if this Chinese tokamak is going to provide data useful for ITER.
I will note that it's rather unusual to refer to plasma temperature in kelvin rather than in keV. I edited this comment with a few more details to try to make it easier for laypeople to understand.
My Aussie friend, a scientist, said that after about 15 years in the US he still used C for temperature in the lab but F for weather, and it took him a while to realize he had flipped and the two things had basically no relationship in his head.
Kind of a tangent, but I still firmly believe that for Earth weather conditions, Fahrenheit is a more intuitive scale. 0 is a really, really, really cold day and 100 is a really, really, really hot day. 50 degrees is neither especially warm nor especially chilly, 70 degrees is warm, 30 degrees is chilly, 20 and below are truly cold, 80 and above are truly hot, and if you go into negative numbers or above 100 degrees then you're in the "extremes".
Cold and warm are relative. 0 for freezing/snowing and 100 for boiling give you a much more understandable range. Honestly, US units don't make any kind of sense anymore...
While it's true that 100C doesn't have any weather meaning, 0F being "very cold" isn't particularly objective. I just looked it up, and apparently 0F is only about -18C. Whether -18C is "very, very cold" depends a lot on where you're from and what you're used to. I'm Canadian and I certainly wouldn't characterize -18C as "very, very cold". -30C maybe. 0C has a fairly objective interpretation in terms of weather: it's the point at which puddles start to freeze into ice slicks.
Yes, they are relative. 0 for freezing and 100 for boiling works well for chemical reactions and cooking. 0 for extremely cold weather and 100 for extremely hot weather works well for knowing what to wear before going outside. Celsius requires smaller numbers with decimals for similar weather precision.
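(For reference, the conversion the two scales are being compared over; a trivial sketch:)

    # Fahrenheit <-> Celsius conversions, for comparing the scales.
    def f_to_c(f: float) -> float:
        return (f - 32) * 5 / 9

    def c_to_f(c: float) -> float:
        return c * 9 / 5 + 32

    print(f_to_c(0))    # -17.8 C: the "very cold" 0 F point
    print(c_to_f(-18))  # -0.4, i.e. ~0 F, the figure quoted above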
Some. Not the physicists or astronomers, but some Americans who work with data from non-scientists regularly stay with the units in the data. Some engineers, those who are dealing with older equipment, stick with the units used when the equipment was built. US pilots and aerospace generally still talk in knots and feet of altitude.
Mountain climbers in the US typically also use feet for elevation. Compasses sold in the US also have different markings for measuring the Universal Transverse Mercator grid (1:24,000 and 1:50,000 in the US) and rulers (inches, feet). 1:24,000 is used on USGS 7.5-minute maps, and I believe 1:50,000 originates from an older map series; both have topographic lines marked in feet instead of meters. Altimeters sold in the US also customarily use feet, although the digital ones can be switched to meters.
You're right, the degree was removed in 1967 because the kelvin is considered an absolute unit. However, one could argue that, since it has a scale, each "step" or "unit" can be considered a "degree", as on a literal scale. In informal conversation we should not be so pedantic. Even some modern scientific textbooks still include the degree.
I remember seeing an infographic a while ago where some very high temperature of like 10 million kelvin was displayed as "9999728 degrees". I thought it was mildly amusing.
It must've been, "99,999,999,726 C, the temperature inside a newly formed neutron star"! It's even more preposterous than I remembered. It's not even rounded correctly!
But then again, I just noticed that I was off by one in my parent comment, so I shouldn't throw any stones, I suppose.
You seem to know what you're talking about - do you know what the end goal of the work in this field is? Is it working towards nuclear fusion as an energy source?
The specific EAST reactor [1] mentioned in the article is a testbed that will enable new technologies to be used on the ITER project. The ITER project [2] is currently the largest fusion power research project underway (and the largest reactor under construction). ITER's goal is to provide research that enables new technologies to be used on the DEMO project. The DEMO project's goal [3] is to provide commercially available power plants utilizing nuclear fusion.
From ITER's wikipedia page:
>The goal of ITER is to demonstrate the scientific and technological feasibility of fusion energy for peaceful use. It is the latest and largest of more than 100 fusion reactors built since the 1950s. ITER's planned successor, DEMO, is expected to be the first fusion reactor to produce electricity in an experimental environment. DEMO's anticipated success is expected to lead to full-scale electricity-producing fusion power stations and future commercial reactors.
And from DEMO's wikipedia page:
>As a prototype commercial fusion reactor, DEMO could make fusion energy available by 2033.
Any idea on how much money we're spending on this?
Also, I imagine it's a joint project only partially because we can share the cost; another reason to work together is so that no one country gets this technology first.
Several billion euros, with Europe paying the largest share, but the reactor is in France so we get much of the secondary benefit (e.g. scientists and engineers spending their salaries there, construction work).
Initial budget was €5bn. Current budget is four times that, and with completion nowhere near, estimates of the final cost run as high as $60bn. Go figure.
If by "we" you mean US's share, that's 9% of total costs. China, India, Japan, Russia, South Korea, and the US are paying 9% each and EU is paying 46%.
With 'we' I meant every country that's involved, not just my own country.
$60bn is actually surprisingly little for research that could change the future of energy production and possibly society as we know it. To put it into context, the Apollo program cost $200bn in today's money and a high speed train between LA and SF is projected to cost $100bn.
$1 trillion is not a lot of money for the US. the Congressional Budget Office estimates that interest spending on US public debt will hit $915 billion in 2028[1].
That doesn't mean $1 trillion isn't a large amount of money. It means the US will be spending nearly a trillion dollars a year servicing its debt. As it so happens, the US will be spending a lot of money servicing debt.
That's a very misleading figure, as inflation both directly reduces the real burden of the debt payments and makes spending in 2003–2011 difficult to compare with spending in 2028.
Except inflation is directly related to the amount of new debt issued, because that's the only source of M1 money supply. If the US Treasury didn't borrow from the Federal Reserve and instead printed its own currency, this would be different.
It's interesting to note that on the ITER FAQ page, they mention that 90% of contributions are to be delivered "in-kind", in the form of components and buildings, rather than cash.
> The DEMO project's goal [3] is to provide commercially available power plants utilizing nuclear fusion
That's not true unfortunately. Yep it's planned to be attached to the grid, but it won't be a production-ready power station. That will be PROTO [1].
Basically the whole schedule slipped further: the US pulled out of ITER so they had to scale it down, then it was delayed, so as things stand now DEMO will still be a testbed. Some recent DEMO design notes can be found here [2]. A bit dated, but an optimistic (if anything) outlook can be found here [3] at page 8. Note that DEMO is thought to "resolve" some issues still.
I think calling 2033 highly unlikely would be an understatement.
https://en.wikipedia.org/wiki/ITER: ”Initial plasma experiments are scheduled to begin in 2025, with full deuterium–tritium fusion experiments starting in 2035.”
So, according to Wikipedia, DEMO will build on ITER’s results, but will produce energy before ITER’s first real fusion experiment starts.
If global warming turns out to be more significant than currently predicted, I believe they will suddenly invest 100x more into this technology. This and nuclear plants are the only reasonable ways to have energy available for 'CO2 removal' projects...
More money means you can buy better gear, hire more workers, complete projects faster and run multiple parallel sites to complete various goals at the same time.
Of course there is some point where adding more funding will not increase the pace as much anymore, but I doubt we're even close to that point at the moment.
Why is it so underfunded? The first country to build one will be taking a big step toward reducing its dependence on other countries for energy. I can understand big oil exporters not wanting to push it too much (although it's unlikely to come to fruition during the current generation of politicians), but for others isn't it a no-brainer?
Historically, fusion has been bad at predicting when it's going to be ready for prime time, and a lot of the very early fusion experiments turned out to be duds. Just think of the cold fusion hype (on a side note: cold fusion does work, in a sense; µCF, muon-catalyzed fusion, replaces the electrons in hydrogen with muons, which shrinks the atomic radius so much that fusion becomes possible at room temperature and far below, but generating the necessary muons is a fool's game) and the various other nuclear fusion failures.
Fission on the other hand could report a lot of results and success and, at the time, seemed to be infallibly safe.
Same reason preventative medicine is not practiced: the upfront costs are great and the outcome is not certain, leaving many hesitant to invest in something they may never benefit from. Especially if your country has more immediate, pressing matters that you could fix right now with that money.
Another (completely stupid) reason is that it's "nuclear" energy. The general public has a hard time understanding that fusion and fission have very different risk profiles. I recall that after Fukushima some countries cut their budget for fusion research because nuclear energy is clearly bad and unpopular.
- Oil/coal/natural gas lobbyists and interests which hinder taxpayer-funded research.
- The fact that there is no guarantee we will ever figure it out, and no idea whatsoever as to how much it will cost to figure it out. Investors like returns, in their lifetime, leaving largely taxpayer-funded research as the greatest source of funds... see above.
And then with government-funded research... if a government figures out fusion, what do you do with it? Do you license it to private industry? Do you make state-owned power plants?
If you give it to private industry, it's going to get to other nations. If it gets to other nations, you lose power (of the non-electrical kind) and create potential strategic issues, which means you are motivated NOT to share the technology.
It sucks.
I wish we could all just get along, fund stuff like this and space exploration, and get over petty politics before our species goes extinct.
I'd argue fusion is actually overfunded, if you go by its likelihood to actually deliver a competitive source of energy. A clean sheet energy R&D program would invest very little in nuclear fusion.
Given climate change, giving it to other nations is exactly the right thing to do, even from a purely self-interested perspective. Nobody wins if coal plants in China tip us into runaway warming.
And that's another reason ITER et al are very broad international projects: everyone wins when the project wins, and nobody can stall one project by poaching Von Braun for their own scheme.
Because we already have a functioning fusion reactor.
Utility-scale PV now costs only $43/MWh. Investing in developing fusion reactors makes very little economic sense compared with capturing the output of the fusion reactor we already have.
The research should still be done, of course. It can have benefits to a future interstellar civilization - but until we're interstellar, PV is far, far more compelling.
Cheap intermittent sources are sufficient to destroy the economic case for expensive baseload sources. The latter have to be able to sell their output most of the time or else their economic case collapses entirely.
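A toy illustration of why (all numbers hypothetical; "cost per MWh" here is just annualized fixed cost divided by energy actually sold):

    # Toy model: a baseload plant's cost per MWh blows up when cheap
    # intermittent sources take away part of its sales. Hypothetical
    # numbers, for illustration only.
    annual_fixed_cost = 400e6   # $/year (capex amortization + O&M)
    capacity_mw = 1000.0
    hours_per_year = 8760.0

    for sold_fraction in (1.0, 0.7, 0.4):
        mwh_sold = capacity_mw * hours_per_year * sold_fraction
        print(f"sells {sold_fraction:.0%} of output: "
              f"${annual_fixed_cost / mwh_sold:.0f}/MWh")

Selling only 40% of output roughly doubles the cost per MWh sold, which is the collapse described above.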
Nine women can't have one child faster, but on average they can have about nine times as many children in nine months as one woman. Whether that analogy holds really depends on what your goal is and what the bottleneck in reaching it is.
Sustainable high-temperature plasma is what we need to make fusion an energy source. Current fusion processes rely on pulsing a very short instant of high temperature. This creates a brief instance of fusion, but takes heaps more energy than that quick moment of fusion releases; thus it's a net energy loss. One way to do this, which the US uses for nuclear weapons research, is to zap a small bit of fuel with a tonne of high-energy lasers all at the same instant.
A high-temperature plasma represents a continuous supply of fusing atoms. The current research at ITER, this place, etc. are attempts to create a persistent environment for fusion. If they can do that, then research can focus on 1) reducing the energy required to hold it at that temperature (which includes limiting how much plasma leaks out, since leaking plasma drops the temperature), and 2) working out ways to extract the energy created by fusion.
As I understand (and I could be wrong, it's been years since I last read about it), ITER plans to generate a net-negative energy situation (i.e. it'll never produce energy, just consume it) but hopes to create a sustainable plasma field at temperatures that cause fusion.
ITER will never produce electric energy, but they plan to achieve Q=5, i.e., output five times more energy than is put in.
The more important work on ITER is work around enabling actual power stations using Q=5 and developing the tech to maintain operating fusion reactors (remote robots, etc.)
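To make the Q bookkeeping concrete, a minimal sketch (the 40% heat-to-electricity efficiency is an assumed typical steam-cycle figure, not an ITER or DEMO spec, and the inefficiency of the heating systems themselves is ignored):

    # Fusion gain bookkeeping: Q = fusion power out / heating power in.
    # Net electric output also depends on converting heat to electricity,
    # which ITER will not attempt; DEMO is meant to.
    def net_electric_mw(heating_mw: float, q: float,
                        heat_to_electric: float = 0.4) -> float:
        # 0.4 is an assumed steam-cycle efficiency, for illustration.
        return q * heating_mw * heat_to_electric - heating_mw

    print(net_electric_mw(50, 5))   # Q=5:  50 MW net electric
    print(net_electric_mw(50, 10))  # Q=10: 150 MW net electric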
At that point can't you just store the output current in supercapacitors for later use anyway? You can even feed it into the machine itself. Supercapacitors easily have better than 20% efficiency for each cycle.
No, ITER doesn't have any planned way to make electricity, the power it outputs will be largely thermal and radiative. Engineers will likely put up a cooling system to measure the thermal output, especially after putting up the various ways to turn the major neutron radiation into heat (which is a fun way to produce energy with rare materials as a byproduct).
DEMO, to my knowledge, will then include an actual electric generator to be hooked up to the fusion core.
As I understand it, this represents sufficient energy to overcome the Coulomb barrier [1], which naturally pushes nuclei apart. To cause fusion, you need to push particles together either hard enough or fast enough that they push through this repulsion and fuse. The repulsion is a product of the electrostatic repulsion of the positive charges of the nuclei (essentially like pushing the like poles of two magnets together).
The temperature of a gas is essentially a measure of the constituent particles' kinetic energy. Higher kinetic energy = higher temperature. 10 keV represents enough kinetic energy for the D-T atoms to collide fast enough that they overcome the Coulomb repulsion and fuse together.
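A rough numeric sketch of the barrier itself (the ~3 fm separation, roughly where the strong force takes over, is an assumed round number):

    # Coulomb barrier between two charge-1 nuclei (D and T), estimated
    # as the electrostatic potential energy at nuclear-scale separation.
    E_CHARGE = 1.602e-19   # elementary charge, C
    COULOMB_K = 8.988e9    # Coulomb constant, N m^2 / C^2
    r = 3e-15              # m, assumed nuclear-scale separation

    barrier_kev = COULOMB_K * E_CHARGE**2 / r / E_CHARGE / 1e3
    print(f"~{barrier_kev:.0f} keV")  # a few hundred keV vs ~10 keV thermal

So the average thermal energy sits well below the barrier; the distribution tail and tunneling, discussed below, make up the difference.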
A system of particles in thermal equilibrium will contain a small fraction whose kinetic energy exceeds the average by a factor of 10. Also, the energy available in the center-of-mass frame of two colliding particles is higher if they happen to be moving in opposite directions. That could give you another factor of up to 2. In any event, I don't think you want your fuel fusing all at once!
Expanding on this: the 10 keV temperature is the average over all particles, and there's a distribution around this average. Some will be higher, and thus more capable of colliding with high energy.
In addition to that, whether two nuclei fuse is also dependent on how squarely they collide. A glancing blow intuitively allows both nuclei to push each other away a lot easier than if they experience a head-on collision.
A graph of the fusion rate such as [1] shows that even at much lower average temperatures, fusion will occasionally happen when two higher-than-average nuclei collide head-on. As the temperature gets higher, this rate increases as more particles have enough energy and fewer require those head-on collisions. The peak rate for D-T according to this graph is at about 70 keV.
I can't speak to why the grandparent's link referenced 100 keV, as it's been a decade since I last studied this and I'm very rusty.
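For a feel for that tail, a small sketch using the Maxwell-Boltzmann energy distribution (fraction of particles above a given multiple of kT):

    # Fraction of particles in a Maxwell-Boltzmann gas with kinetic
    # energy above x*kT: erfc(sqrt(x)) + 2*sqrt(x/pi)*exp(-x).
    from math import erfc, exp, pi, sqrt

    def fraction_above(x: float) -> float:
        return erfc(sqrt(x)) + 2 * sqrt(x / pi) * exp(-x)

    for x in (1, 5, 10, 20):
        print(f"E > {x:>2} kT: {fraction_above(x):.1e}")

Even far out in the tail there is always some population, which is why fusion still happens (slowly) at lower average temperatures.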
How does one control the chain reaction of the fusion process? I understand fission reactors using control rods to absorb some of the neutrons to keep them from hitting other fissile nuclei, but this seems like a harder problem. Particles fusing at the "low" end of the equilibrium distribution release energy that pushes other particles past the higher temperature thresholds, causing additional fusion reactions, if my mental model is correct? And assuming the fuel is gaseous, the idea of a control/absorptive retarder seems like a much harder problem. Edit: Oh! Maybe they reduce the strength of the magnetic containment field? Which reduces pressure inside the reaction chamber and thus reduces temperature?
In addition to this, there is also a quantum effect called quantum tunneling [1], which gives particles with insufficient energy a very tiny probability of fusing upon collision anyway.
The Wiki entry also mentioned "two effects that lower the actual temperature needed", one being average kinetic energy and the other quantum tunneling "if [nuclei] have nearly enough energy." The term "nearly" isn't precisely defined though.
There isn’t a hard limit. But as you get farther away from the amount of energy that would be “enough” without tunneling, the probability of tunneling falls off exponentially.
Quantum tunneling effects are generally (maybe always?) exponentially unlikely, in the sense that the probability of tunneling will decay exponentially with the discrepancy between energy-available and energy-required.
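A sketch of that exponential falloff via the Gamow factor, P ≈ exp(−sqrt(E_G/E)); the E_G ≈ 1.18 MeV value for D-T and the omission of prefactors make this order-of-magnitude only:

    # Gamow tunneling probability P ~ exp(-sqrt(E_G / E)), with
    # E_G ~ 1.18 MeV for a D-T pair. Prefactors omitted, so this
    # is an order-of-magnitude illustration of the falloff.
    from math import exp, sqrt

    E_GAMOW_KEV = 1180.0

    def tunneling_probability(energy_kev: float) -> float:
        return exp(-sqrt(E_GAMOW_KEV / energy_kev))

    for e_kev in (1, 10, 100):
        print(f"E = {e_kev:>3} keV: P ~ {tunneling_probability(e_kev):.1e}")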
They may be trying to differentiate between high temperature as "fast", like in this case, and high pressure as "hard". Though both can lead to a particle kinetic energy above the Coulomb barrier, in the former case the time between collisions will be lower (AFAICT), which may be important. I'm not a plasma guy though.
Sorry, I should've been much clearer, that was really badly worded as I was distracted also doing my day job :) As a sibling comment pointed out, I wasn't using any specific terminology as I was trying to talk at a layman level. There's not really a difference. I should've said 'hard enough and fast enough'.
Conceptually, there are two main methods of fusion: inertial confinement and magnetic confinement.
In inertial confinement, lasers are shot at the plasma to squeeze it together so the pressure (and thus temperature) increases until fusion occurs. This is also what happens in stars, except they use gravity, not lasers. Conceptually, this is what I alluded to when I said pushing it 'harder'.
In magnetic confinement, the pressure/temperature is increased by shooting electrical currents through the plasma, among other techniques that I'm not as familiar with. The volume hasn't really decreased, but the kinetic energy and pressure of the plasma have increased, so it's the same principle; this is conceptually what I alluded to when I said pushing it 'faster'.
In either sense, the idea is to somehow increase the plasma temperature which increases the kinetic energy of the particles enough that they overcome the electrostatic barrier. It was just really badly worded by me and I apologise for that!
I guess hard is something inside the star where matter is compressed by gravity and fast is when you're accelerating matter. In the end it's the same, only you can't use gravity for small reactors to replicate star technology yet, so you must accelerate particles.
SciFi likes to talk about antigravitation. But supergravitation would be cool as well :)
GP isn't using standard terminology, but I would presume "hard" was in reference to more mass, which results in higher kinetic energy at lower temperatures.
I don't know about CRTs, but 10 keV is not particularly energetic. A CT unit may operate in the ~100-200 kV range (so a max energy of 200 keV), while radiation therapy units operate in the 6-18 MV range (so electrons accelerated up to 18x10^6 eV).
I don't know much about fusion, but my guess is the 10keV is impressive because it is a self-sustaining fusion reaction rather than being impressive because of the absolute energy of the reaction?
edit: someone down the page mentioned that containment is the issue. In a CRT you are just accelerating an electron across a few thousand volt potential and slamming it into the screen.
>> edit: someone down the page mentioned that containment is the issue.
Right, and that was my understanding. It's easy to accelerate a particle or a stream of them to high energy. It's another thing entirely to contain a gas/plasma at those energies. The distinction becomes more obvious when they flip back and forth between impressive-sounding temperatures and simple keV measures.
Considering that multiple particle colliders can accelerate particles past 1 TeV (the LHC topping 13 TeV), yes, 10 keV isn't a very high number taken out of context.
"100M degrees" (Kelvin) corresponds to 10 KeV (kilo electron volts), which is an important figure to exceed for D-T fusion. D-T fusion which is the kind of fusion the ITER Tokamak (a forthcoming fusion reactor and international megaproject) intends to demonstrate.
An older fusion experiment, JET (Joint European Torus) reached these levels, so this does not break new ground, but it is important if this Chinese Tokamak is going to provide data useful for ITER.
I will note that it's rather unusual to refer to plasma temperature in Kelvin rather than in KeV. I edited this comment with a few more details to try to make it easier for laypeople to understand.