Serious question: what happens after? Will it literally be 'free energy'? And are we ready as a society to handle this properly? How do we ensure that this will benefit everyone and not create an enormous imbalance by itself?
I guess the existing economics will be broken at some point if the cost of everything will be driven down in such a way. Are there any articles/works that explore this issue?
No, the cost structure of fusion is going to be much like fission: the cost of building the reactor has to be spread over time (with interest). Even though the fuel is cheap, you are (in a financial sense) 'burning' the reactor, which is expensive and must be bought up front. We don't know for sure how it will work out, but none of the predictions I've seen show fusion producing dramatically lower cost of electricity.
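To make that concrete, here is a back-of-the-envelope levelized-cost sketch. All numbers below are made-up illustrative assumptions, not real reactor data; the point is only that the amortized capital term swamps the fuel term.

```python
def levelized_cost(capital, rate, years, annual_mwh, fuel_per_mwh, om_per_mwh):
    """Levelized cost of electricity in $/MWh, with capital amortized as an annuity."""
    # Spread the up-front capital over the plant's lifetime, with interest.
    annual_capital = capital * rate / (1 - (1 + rate) ** -years)
    return annual_capital / annual_mwh + fuel_per_mwh + om_per_mwh

# Hypothetical 1 GW plant at an 85% capacity factor -- illustrative numbers only.
annual_mwh = 1_000 * 8_760 * 0.85  # MWh generated per year
lcoe = levelized_cost(capital=8e9, rate=0.07, years=40,
                      annual_mwh=annual_mwh, fuel_per_mwh=0.5, om_per_mwh=10.0)
# Even with near-free fuel (0.5 $/MWh here), the capital term dominates the total.
```

With these assumed numbers the capital term alone is roughly $80/MWh, i.e. "burning the reactor" is most of the bill regardless of what the fuel costs.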
In theory one can say they use 25% of their bedroom or living room for work, and hope that the tax man is lenient enough to accept that. But then you also need to submit the floor plan or square footage measurements in order to get something deducted from your rent/utilities. And probably pay someone to do all this for you due to the huge number of caveats. German tax law truly is something else.
Completely favors the middle class, who can afford to keep one room for 'work'. One wonders if starving artists can get a tax deduction for their attic shoebox ateliers.
I've always preferred MATLAB to Python as the more engineer-friendly programming language. The interface was a plus as well. That being said, given the sheer amount of scientific libraries for Python and universities moving towards it as well, I'm wondering if the effort to maintain Octave is worth it.
One use for it could definitely be running older MATLAB scripts that have deprecated language features. Those were a real pain to make work again.
In my opinion it is absolutely 100% worth "developing" Octave (the word 'maintain' makes it sound like it's an otherwise static project).
The improvements from 3 to 4 were massive. The improvements from 4 to 5 equally so. I can't wait to try 6.
In any case, it is definitely not just a tool for running 'old matlab scripts'. It is a beautiful language and environment in its own right, and if academia manages to wake up at some point, it would invest in it as the open-source environment it is, much as with python, and help it get even more amazing.
I use both Python and Octave and I can clearly say that one never replaces the other.
For quick and dirty scientific PoC or obtaining ground truth, Octave/MATLAB is much better, since it handles all the quirks of numerical programming with generally slow but proven solutions.
During the development phase of my Ph.D., my professor would scribble something in MATLAB and send it to me with a note "for the given inputs, you should obtain this and that. Attached is the crude math". It was up to me to clean up the math, convert it to C++ and make it blazingly fast while solving any attached small-but-hard problems.
I still have a MATLAB license to test these ground truth scripts against my code.
There is still a lot of mathematics for which MATLAB scripts exist but Python ones don't. I think it depends on how deep you dive into a particular area, of course. But I think it's definitely worth it.
I spent the majority of my career porting MATLAB to C++. We charge about $70 an hour ($200 per billable man-hour) to do it. It's a slow and arduous process. Oftentimes the features running in MATLAB need some C++ library that prevents the sort of speed increase you hoped to gain from the C++ port, so a C++ reimplementation of the toolbox needs to be developed anyway, thus losing the benefit of the "verified" toolbox.
Python, on the other hand, can be improved for performance piecewise through tools like Cython and SWIG. Of course, both offer a C interface, but I prefer the Python one to MEX files for a variety of reasons. Also, you don't have to pay for a license to use Python. (The same library reimplementation problems apply.)
At a minimum, if the MATLAB folks would at least consider moving over to Octave, we could eliminate some of the licensing fees, but... once performance matters, you're still going to have to pay a full-time software engineer to port the Octave code to make it performant.
I would be curious to know more about your firm and what customers you are serving. I feel like I have a lot of experience translating weird things to and from MATLAB, but I'm not sure who values this outside academia.
A lot of companies in the (German) automobile industry rely heavily on Matlab and its toolboxes, so I'd assume that people like you are wanted there as well.
MATLAB seems ill-suited to structuring and organising code in a sane manner. From what I observe, it tends to give its users bad habits with respect to code structure.
It seems to lack data structures that are quite pervasive in modern programming practice.
Also, Python being a general-purpose system brings its own benefits: one can easily hook up one's code to fetch or push data to databases, and even easily scrape data off the web or extract/reparse/rectify/reformat poorly or complexly structured data before processing.
And I find MATLAB not feature-rich in terms of manipulating tabular data in a relational-DB-like manner, that is, querying/projecting/selecting rows/columns to find interesting facts.
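For what it's worth, the Python side of that relational-style workflow is mostly pandas. A minimal sketch (column names and values here are made up for illustration):

```python
import pandas as pd

# A tiny table of measurements -- stand-in for whatever tabular data you have.
df = pd.DataFrame({
    "sensor": ["a", "a", "b", "b"],
    "temp":   [21.5, 23.1, 19.8, 25.0],
    "ok":     [True, True, False, True],
})

# Select rows (WHERE), project columns (SELECT) and aggregate (GROUP BY) in a few lines.
hot = df.loc[df["ok"] & (df["temp"] > 20), ["sensor", "temp"]]
means = df.groupby("sensor")["temp"].mean()
```

The same chain scales to joins (`merge`) and pivots, which is exactly the querying/projecting/selecting style described above.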
You write a function, using nice linear algebra syntax. Already python is worse: you do a bit of import boilerplate and write the linear algebra in a gimped notation. You call the function. Not so in python, where you have to import it first. You change the function definition, and the next call uses the redefined function. In python you can try to do an interactive reload via third-party tooling, but chances are it won't work right, since Guido apparently never considered this something worth designing properly (apart from matlab, lots of "real" programming languages are much better at this, including erlang, common lisp and smalltalk).
Your function runs too slow. You press a button and you see a color coded version of the code in your editor window and see instantly where the bottleneck is. In python you break out one of several crappy profilers. You want to save your results from your interactive exploration "save results.mat" -- done. In python there are various ways of saving stuff, which are either not general or slow or don't work between different versions.
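For comparison, the stdlib Python versions of both workflows look something like the sketch below (cProfile for the bottleneck hunt, pickle as the closest "save results" analogue; neither is as seamless as the one-button MATLAB flow described above):

```python
import cProfile
import io
import pickle
import pstats

def slow(n):
    # Deliberately naive loop to give the profiler something to find.
    return sum(i * i for i in range(n))

# Profiling: stdlib, but the output is a text report, not a color-coded editor view.
profiler = cProfile.Profile()
profiler.enable()
result = slow(100_000)
profiler.disable()
report = io.StringIO()
pstats.Stats(profiler, stream=report).sort_stats("cumulative").print_stats(5)

# Saving interactive results: pickle round-trips most Python objects, but it is
# Python-version-sensitive and not a stable interchange format like a .mat file.
blob = pickle.dumps({"result": result})
restored = pickle.loads(blob)
```

(np.savez, HDF5, etc. are the other common options, each with the generality/portability trade-offs the parent complains about.)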
I haven't used matlab in years, but as an interactive environment for linear algebra it blew python out of the water and likely still does, even if it has a number of big shortcomings as a general purpose programming language (and probably a worse selection of libraries in quite a few numerical domains these days as well). This is particularly true if the people using it are not trained programmers.
> You call the function. Not so in python, where you have to import it first.
Er, no you don't, if you are using it in the same notebook, module, or REPL session where you defined it. And if you aren't doing the equivalent in MATLAB, you also would have to load the definition.
What on earth are you talking about? If I add `def bar...` to a file foo.py in my PYTHONPATH, it does not suddenly become available in my interactive session, I need to import it first. Further if I change foo.py even importing it again in the repl (or jupyter notebook) will not have any effect. You will need to use one of several reload hacks which in practice won't work reliably for all sorts of reasons. In contrast, unless my memory severely fails me (and I don't think so, since I just checked against octave), in matlab if you edit a file in the path, it will be auto-imported every time you save it as soon as the repl is non-busy.
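To make the comparison concrete, here is the explicit-reload dance in Python, sketched with a throwaway module (`foo_demo` is just an illustrative name; the mtime bump is there because the import system uses timestamps to decide whether the source changed):

```python
import importlib
import os
import sys
import tempfile

# Create a throwaway module on the path -- stand-in for foo.py in PYTHONPATH.
tmpdir = tempfile.mkdtemp()
sys.path.insert(0, tmpdir)
path = os.path.join(tmpdir, "foo_demo.py")

with open(path, "w") as f:
    f.write("def bar():\n    return 1\n")
import foo_demo
assert foo_demo.bar() == 1

# Edit the file, as you would in an editor...
with open(path, "w") as f:
    f.write("def bar():\n    return 2\n")
# ...and bump the mtime so the change is definitely visible to the importer.
st = os.stat(path)
os.utime(path, (st.st_atime, st.st_mtime + 10))

# The running session still sees the old definition until an explicit reload.
assert foo_demo.bar() == 1
importlib.reload(foo_demo)
assert foo_demo.bar() == 2
```

Note that `importlib.reload` only re-executes this one module; anything that did `from foo_demo import bar` elsewhere still holds the old function, which is one reason the reload hacks break down in practice.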
On the other hand, if all you need to do is to state and solve a few humongous linear systems, or a system of ordinary differential equations, or a partial differential equation, or compute the vibration modes of a solid shape, or perform an elasticity analysis, or simulate the absorption spectrum of a chemical compound, or find the numerical solution of a system of nonlinear equations, or optimize a function that has terms in both the spatial and the frequency domains, then Octave is definitely up to the task and will probably solve your problem out of the box in a handful of lines of code. In python/numpy, the sub-par linear solver will probably fail or be extremely slow on large linear systems (because the scipy developers are a bunch of anti-GPL fanatics who refuse to use the state-of-the-art linear solvers because of license issues). But yes, you have nice strings and dictionaries at your fingertips; too bad they are completely useless for numerical computation.
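For reference, stating and solving a large sparse system does fit in a handful of lines on the scipy side too (whether its default direct solver is competitive with the alternatives is exactly the parent's complaint). A sketch with the classic 1D Poisson tridiagonal matrix:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# A large tridiagonal system (the 1D Poisson matrix) -- the kind of
# "humongous linear system" under discussion.
n = 100_000
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)
x = spsolve(A, b)  # direct sparse solve via SuperLU
```

For genuinely hard systems people typically reach past this to iterative solvers or external packages, which is where the licensing argument above comes in.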
Still, doing many things in Python is a bit more laborious than in Matlab/Octave (no surprise, and not a drawback of Python; it is a general-purpose language). Plus, the learning curve is gentler for Matlab/Octave: after long years of not using Matlab, I took an ML course on Coursera that used Octave and I was able to be productive very quickly.
In the 90s there was lots of work on storing data in 3D holograms, using photorefractive materials. That works by interfering a reference wave and an image wave (which contains the information in bit form, for example) to create 2D maps, and multiple maps can be stored through translation. You can read out the information using the reference wave alone. IBM ran one of the big research labs in this space.
They achieved some impressive bit densities for the time. There were even some prototypes. However, the rapid increase in drive storage densities and the alignment issues of these systems (together with the lack of good error correction codes, I suspect) meant all this was eventually shelved.
There is now quite a bit of research on using laser writing for storage (Microsoft Research is working on this, for example). However, it's largely in 2D (3D is difficult because you want 2D read-outs for speed, and the other layers distort your image). I think they are also considering spinning-disc-type devices, so CD 2.0.
Meh, you only need a little imagination to scan in 3D.
Instead of a disc, use a glass cylinder, mounted eccentrically on an axis. (The actual glass cylinder doesn't even have to be eccentric for this to work, because the data can be stored eccentrically, but that's harder to explain. So assume it's a glass cylinder mounted eccentrically on its axis.)
Spin the cylinder on its axis. For each revolution, move the laser one step along the axis. When the laser has reached the end of the cylinder, move the laser one step closer to the spinning cylinder, then repeat the process in reverse.
The point of the eccentricity is to get the "autofocus mechanism for free".
Search speeds could be a dog, but data density and durability could be insane. Plus it would look like something straight out of Star Trek. "Put the data crystal in there."
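The raster pattern described above (one axial step per revolution, reversing direction after each radial step inward) can be sketched as a generator; the names here are just illustrative:

```python
def scan_order(axial_steps, radial_steps):
    """Yield (radial, axial) laser positions for a boustrophedon cylinder scan."""
    for r in range(radial_steps):
        # One pass along the axis per radial depth, reversing at each end.
        if r % 2 == 0:
            axial = range(axial_steps)
        else:
            axial = range(axial_steps - 1, -1, -1)
        for z in axial:
            yield (r, z)

positions = list(scan_order(axial_steps=3, radial_steps=2))
```

Each yielded position corresponds to one full revolution of the cylinder, which is why random access (seeking to an arbitrary depth and axial offset) would be the slow part.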
I think we misunderstand each other (and I've probably caused this confusion because I said CD 2.0). I think they are looking at reading whole 2D images at once, so not a point by point reading process.
Anecdotal: the only thing that worked for me was eating once per day, around 6 PM. I was very alert the whole day, but not in an unpleasant way. The hunger pangs come and go, but are manageable. I tried this for a week; I don't drink coffee at all, but this felt like I'd had a couple of cups.
A nutritionist friend, however, dissuaded me from going this route, as apparently it is very hard for our bodies to process that amount of food in one sitting, and one will end up with various nutrient deficiencies in the long run. I'm curious to know if someone got intermittent fasting to work long term, though.
I think there are lots of studies on intermittent fasting/single-meal-per-day eating schedules (though I haven't looked much at them). From what I can tell, periodic caloric restriction tends to be a net good. I've personally been doing it for ~1 year, and it's definitely more convenient since I just eat twice a day, but I think I may also not have been eating enough (alongside doing a fair bit of exercise).