If you have access to the raw data, then you should quite easily be able to provide a good estimate of what the article is trying to assess--the actual operational cost of bitcoin mining. Estimating the energy consumption is only a vehicle for getting there, because those of us without any raw data have no other way to do it.
What you should be able to do is:
a) Open up your electricity bill and look at the total energy consumption for a given period. Admittedly, I'm not familiar with how electricity is invoiced, but I'd be surprised if there wasn't something at least coarsely indicating total energy consumption for a billing period. (Or you can skip several intermediate steps and just take the total cost of the electric bill.)
b) Tot up all the bitcoin mining rewards you received in the same time period.
c) Count how many bitcoin blocks were mined in the same time period.
d) Divide b by the total reward paid out across those blocks (c times the per-block reward, subsidy plus fees); now you know what fraction of the total bitcoin mining network you are.
e) Divide a by d; now you have an estimate of the total worldwide energy consumption of bitcoin. (A back-of-envelope version of these steps in code follows below.)
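To make the arithmetic concrete, here's a minimal back-of-envelope sketch in Python. Every input value (the bill, the rewards, the block count, the per-block reward) is a hypothetical placeholder, not real data:

    # Back-of-envelope estimate of the whole network's energy use,
    # extrapolated from one miner's own electricity bill.
    # Every input value below is a hypothetical placeholder.

    my_energy_kwh = 250_000.0   # (a) energy on your bill for the period, in kWh
    my_rewards_btc = 3.2        # (b) BTC (subsidy + fees) you earned in the same period
    blocks_in_period = 4_320    # (c) blocks the whole network mined (~30 days' worth)
    reward_per_block_btc = 6.4  # assumed block subsidy plus average fees, in BTC

    # (d) your share of the network = your rewards / total rewards paid out
    network_rewards_btc = blocks_in_period * reward_per_block_btc
    my_share = my_rewards_btc / network_rewards_btc

    # (e) scale your own consumption up by your share of the network
    network_energy_kwh = my_energy_kwh / my_share

    # Annualize and convert kWh -> TWh for comparison with published figures
    period_days = 30
    network_twh_per_year = network_energy_kwh * (365 / period_days) / 1e9

    print(f"Share of network: {my_share:.5%}")
    print(f"Estimated network consumption: {network_twh_per_year:.0f} TWh/yr")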
The only reason I can see that might make this hard is if you mine more than just bitcoin, in which case replace "bitcoin" in the above steps with "the sum over all of the cryptocurrencies you mine"--which is closer to what people want to know anyway. Even then, you can use crude weighting metrics to estimate the consumption of just bitcoin, which would at least tell you whether the estimate is in the right ballpark. The exact number doesn't matter; what matters is whether the real number is closer to 14 TWh/yr, 140 TWh/yr or 1400 TWh/yr.
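If you do mine several coins off the same bill, one crude weighting is by revenue share. A toy sketch, with coin names and figures that are purely illustrative:

    # Crude apportionment of one electricity bill across several mined coins,
    # weighted by the revenue each coin brought in. All figures are made up.

    bill_kwh = 250_000.0
    revenue_usd = {"bitcoin": 9_000.0, "litecoin": 2_500.0, "monero": 500.0}

    total_revenue = sum(revenue_usd.values())
    energy_by_coin_kwh = {
        coin: bill_kwh * rev / total_revenue for coin, rev in revenue_usd.items()
    }

    # The "bitcoin" slice is what you would feed into steps (d) and (e) above.
    print(f"{energy_by_coin_kwh['bitcoin']:.0f} kWh attributed to bitcoin")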
But we really don't care about that factor of 2. It literally doesn't matter whether Bitcoin wastes half a Luxembourg or a full Luxembourg.
And, IIRC, they estimated total consumption from well-funded and very active mining pools, i.e. exactly the operators you would expect to run the most efficient mining equipment. Meaning all you end up being is overly optimistic, as in "we claim mining uses less energy than it actually does".
The statistic you're pooh-poohing puts error bars of a factor of 2× in either direction. Tolerating that degree of error isn't all that difficult, and if you had bothered to keep reading the article, you'd discover that the author themself discusses possible error bars and deflates the estimated operational cost of bitcoin miners by half anyway.
And, FWIW, the kind of methodology that was used to produce the Bitcoin energy consumption estimate is the same kind of methodology used to produce statistics like GDP, inflation, and employment--basically every macroeconomic indicator. If you're going to complain that you can't quantify Bitcoin's energy consumption, then to be intellectually honest you need to complain that the true inflation rate, or the true size of any country's economy, is impossible to quantify as well.
Again, though, that's just not true. The hardware is well-understood and easily measured. Sure, there is slop in the measurement (how much of the fleet is using platform A vs. platform B). But those are comparatively small numbers. Sure, like you say, we might have a factor of two lurking in there (seems high, but I'll grant it).
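For what it's worth, the hardware-based approach boils down to network hashrate times joules-per-hash. A hedged sketch with assumed numbers (neither input below is a measurement):

    # Rough bottom-up estimate: network hashrate times the energy cost per hash
    # of the hardware assumed to be running it. Both inputs are assumptions.

    network_hashrate_ths = 150_000_000   # total network hashrate in TH/s (assumed)
    fleet_efficiency_j_per_th = 45.0     # fleet-average efficiency in J/TH (assumed)

    power_watts = network_hashrate_ths * fleet_efficiency_j_per_th  # J/s == W
    twh_per_year = power_watts * 8760 / 1e12  # W * hours/year = Wh, then /1e12 -> TWh

    # Swapping "platform A" for "platform B" mostly moves fleet_efficiency_j_per_th,
    # which shifts the result by tens of percent, not by orders of magnitude.
    print(f"{twh_per_year:.0f} TWh/yr")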
But a factor of two wouldn't change any of the analysis! If the article said "60 TWh" instead, would that be any less horrifying? No, it wouldn't.
We don't know what hardware is on the network and we don't know how all of the power is generated. If it is 60 TWh of clean energy that isn't being used for anything else, who cares?