How is this news? Apple has had the exact specs (27 in, 5K res) available for something like 5-7 years now, both as a stand-alone monitor and as an iMac.
Sure, it may be a middle ground between the existing Pro XDR and 5K LG Ultrafine displays, but it was just announced today, hence "news".
Of xargs, for, and while, I have limited myself to while. It's more typing every time, but it saves me from having to remember so many quirks of each command.
cat input.file | ... | while read -r unit; do <cmd> "${unit}"; done | ...
Between 'while read -r unit' and 'while IFS= read -r unit' I can probably handle 90% of the cases. (Maybe I should always use IFS= since I tend to forget the proper way to use it.)
That way will bite you when the tasks in question are cheaper than fork+exec. There was a thread just the other day in which folks were creating 8 million empty files with a bash loop over touch. But it's 60X faster (really, I measured) to use xargs, which will do batches (and parallelism if you tell it to).
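For instance, a rough sketch (names.txt here just stands in for whatever lists the files, assuming names without embedded whitespace):

  # slow: one fork+exec of touch per file
  while read -r f; do touch "$f"; done < names.txt

  # fast: xargs packs up to 10000 names into each touch invocation, 4 invocations in parallel
  xargs -n 10000 -P 4 touch < names.txt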
The example of "foo bar" didn't work with while, but inserting tr fixes it:
echo "foo bar" | tr ' ' '\n' | while read -r var; do echo ${var}; done
For examples in general, I guess something like "cat file.csv" could work. (The difference between using IFS= and not using it is essentially whether we want to preserve leading and trailing whitespace or not. If we want to preserve it, then we should use IFS=.)
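A quick way to see the difference:

  printf '  two leading spaces\n' | while read -r line; do printf '[%s]\n' "$line"; done
  # prints [two leading spaces]     -- read trims the leading whitespace

  printf '  two leading spaces\n' | while IFS= read -r line; do printf '[%s]\n' "$line"; done
  # prints [  two leading spaces]   -- IFS= preserves it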
Imagine you have this pipeline that already works for data.csv. But now you have data2.csv which has some difference (e.g., some values are null, while the original data.csv had no null values).
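Say the working pipeline is something like:

  cat data.csv | process-1 | process-2 | ... | process-n > output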
Monads are an approach to making the existing pipeline work (with minimal changes) while still being able to handle both data.csv and data2.csv. The minimal changes follow a strict rule (the result below is not a valid shell command anymore):
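  cat data2.csv | wrap ] process-1 ] process-2 ] ... ] process-n > output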
In other words, only two kinds of changes are allowed:
- You can bring in a wrap function that modifies the entries of the given csv data.
- You can bring in a new kind of pipe, ']', instead of '|'.
The idea being: the wrap function takes in the original data stream and, for each "unit" (a line in the csv file, called a value), produces a new kind of data-unit (called a monadic value). Then your new pipe ']' has some additional functionality that is aware of the new kind of data-unit and is able to, e.g., process the null values while leaving the non-null values unchanged.
Note, you didn't have to modify any of the process-1 through process-n commands.
BTW, the null-value-handling monad is called the 'maybe monad' (and of course there are other kinds of monads).
If you make the existing pipeline work in this way, you have essentially created a monad to solve your problem (the monad here being the whole new mechanism: the new kind of value plus the two changes, the wrap function and the new pipe).
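Just to make it concrete, a rough sketch of the mechanism in plain bash (wrap and mbind are made-up names here, and tagging lines with NOTHING/JUST is just one possible representation):

  # wrap: turn plain lines into "monadic" lines -- empty lines become NOTHING, everything else JUST <line>
  wrap() {
    while IFS= read -r line; do
      if [ -z "$line" ]; then echo "NOTHING"; else echo "JUST $line"; fi
    done
  }

  # mbind: plays the role of the ']' pipe -- runs the given command only on JUST lines,
  # passes NOTHING lines through untouched (it forks once per line, so purely illustrative)
  mbind() {
    while IFS= read -r line; do
      case "$line" in
        NOTHING)  echo "NOTHING" ;;
        JUST\ *)  printf '%s\n' "${line#JUST }" | "$@" | wrap ;;
      esac
    done
  }

  # so   cat data2.csv | wrap ] process-1 ] ... ] process-n > output   becomes
  # cat data2.csv | wrap | mbind process-1 | ... | mbind process-n > output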
edit: There may be a need to also modify the '>' mechanism. But I think that is not essential to the idea of a monad, since you could replace '>' with '] process-n+1 >' (i.e., you create a new outermost command 'process-n+1' that simply converts the monadic values back to regular values).
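In the sketch above, that process-n+1 could be as simple as this (unwrap is again a made-up name):

  # unwrap: strip the tags, turning monadic lines back into plain lines
  unwrap() {
    while IFS= read -r line; do
      case "$line" in
        NOTHING)  ;;                               # drop the nulls (or substitute a default, your choice)
        JUST\ *)  printf '%s\n' "${line#JUST }" ;;
      esac
    done
  }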
edit 2: If, instead of handling null values, the purpose is to "create side-effects" (e.g., at every pipe invocation, dump some or all of the data into a log file), then the kind of monad you end up creating would be something like an "I/O monad".
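Continuing the sketch, that could just be a logging flavour of the pipe (pipeline.log is a placeholder):

  # same as mbind, but also appends everything flowing through it to a log file
  mbind_logged() {
    tee -a pipeline.log | mbind "$@"
  }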
Maybe a culture clash? Academia is all about status and prestige - more often than not, scientific outcomes seem to be a means to get the former (why journals don't publish negative results, why studies fail to replicate, why stuff isn't open access, why people worry about getting scooped, etc.).
Tech (at its best) hates credentialism (sometimes I think to a point of over-correction).
That said, 80% of the devs in the Bay Area seem to have gone to Stanford or MIT, so...
AFAIK the XML AST dump was always just intended as a debugging tool for LLVM developers, not an API for external tools (and thus came with no stability promise). You were always supposed to just link against clang/llvm and use the APIs they specifically provide to access the AST.