I use history a fair amount, and I've noticed an issue that none of these articles mention.
For history to be useful, you need to be able to reuse previous commands without giving them much thought; in the time it takes to think in detail about a command, you usually could have typed a new one without using history. As a result, commands that are reused via the history mechanism need to be "safe", with no hidden pitfalls.
For example, I use "find" a lot. A normal thing to do with "find" is to get a list of files to delete. Now, I could do that with "find ... -exec rm" or "find ... | xargs rm", but that leaves me with an unsafe "find" command in my history, just waiting to bite me when I try to reuse it. If, some time later, I type "!fi" without thinking much about it, I might end up deleting some random files. Not good.
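To make the pitfall concrete, here's a rough sketch of what I mean (the path and pattern are just made-up examples):

    # The one-liner I'd rather not have in my history:
    find /tmp/build -name '*.o' -exec rm {} +
    # ...days later, casually recalling "the last find I ran":
    !fi
    # ...and the old delete runs again, against whatever happens to match now.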
My solution, in this particular case, is never to do "find ... rm" on a single line. To remove a list produced by "find", I do "find ..." and then "rm `!!`". That runs the "find" command twice, of course, but the second time, everything has been cached, so it's very fast. (The real disadvantage is that the delay -- for the first "find" -- occurs in the middle of what is conceptually a single operation: after I've typed the "find", but before the "rm".)
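For concreteness, the two-step version looks something like this (again with a made-up pattern; note that backquote substitution like this assumes the matched filenames contain no whitespace):

    # Step 1: run the find by itself and eyeball the output.
    find /tmp/build -name '*.o'
    # Step 2: delete that list, re-running the find via history expansion.
    rm `!!`
    # From now on, "!fi" only recalls the harmless listing command, not a delete.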
So, has anyone else run into this issue? Thoughts? Other ways of dealing with it?