This will silently fail if you happen to have one file that matches test.* (like test.txt) in the current working directory, because test.* will be replaced by the name of this file and that's what find will see.
This will fail in zsh if no files match in the current working directory, because globbing will fail.
You need to quote this to avoid unintuitive results, in any shell.
find /home -type f -name 'test.*'
Not doing this will probably work most of the time, but it will probably confuse you the one time it won't and drive you crazy if you don't realize what is going on.
I've become very aware of this kind of thing by using zsh, which is stricter with failing globs than bash. Also quote anything that contains {, }, [, ], (, ), ?, ! or $ for similar reasons. Beware of ~ too. And & or | obviously, and also ;.
Do yourself a favor: quote the hell out of everything in shells that isn't a simple alphanumeric string or option name. Even if it's only alphanumeric, actually, if it is a parameter value, quote it. This way, when you edit your command and somehow add a special character, you are already covered. It also makes your values stand out, making your command arguably easier to read because the values look more uniform.
edit: and quote with single quotes, unless you need variable expansion or need to quote single quotes themselves (but be extra careful then)
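A minimal reproduction of the gotcha (all paths here are throwaway, made up for the demo):

```shell
# Set up a search tree containing test.log, and a working directory
# containing test.txt.
dir=$(mktemp -d)
mkdir "$dir/tree" "$dir/cwd"
touch "$dir/tree/test.log" "$dir/cwd/test.txt"
cd "$dir/cwd"

# Unquoted: the shell expands test.* to test.txt (the file in the current
# directory) before find runs, so find searches for the literal name
# test.txt and silently misses test.log.
find "$dir/tree" -type f -name test.*

# Quoted: find receives the pattern itself and matches test.log.
find "$dir/tree" -type f -name 'test.*'
```

Only the second invocation prints anything; the first exits successfully with no output, which is exactly the silent failure described above.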
This is because glob expansion happens before running the program.
> The order of expansions is: brace expansion; tilde expansion, parameter and variable expansion, arithmetic expansion, and command substitution (done in a left-to-right fashion); word splitting; and filename expansion.
Globs in an interactive shell should get expanded and then force you to press enter to confirm the command with expanded globs. Are there any shells that implement this?
In zsh, you can press tab after something that will get expanded, and it expands in place. I actually use it quite often to check whether the expansion is right.
However, this does not cover the case where an unexpected globbing happens like in this find situation.
I looked through zshoptions(1) and didn't see anything like this. You can, however, turn off globbing entirely with "unsetopt glob". There are other expansions that occur outside of globbing, though, such as parameter substitution.
It might be possible to implement this using zsh’s line editor, but it would take some digging through the man pages to figure this out (or finding a zsh expert). Try zshzle(1).
Yeah this is probably a much more complicated answer than you were looking for.
Put it at the top of shell scripts (and in ~/.bashrc), so that a glob that doesn't expand will always be an error. That way, you won't have that unquoted test.* work by accident, and thus it's easier to train yourself to always quote.
Or, just use zsh, which has this behavior by default.
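The option being alluded to is presumably bash's failglob (my assumption; nullglob is the related but different option that expands a failed glob to nothing instead of erroring):

```shell
# With failglob set, a glob that matches nothing aborts the command with
# an error instead of being handed to the program verbatim -- the same
# behavior zsh has by default.
bash -c 'shopt -s failglob; cd "$(mktemp -d)"; echo *.log'
# fails with a "no match" error; without failglob, bash would pass the
# literal string *.log through to echo
```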
Also = when prefixed with a space, which works basically like `command -v`, expanding to the full path of the executable. For example `echo =grep` results in `/usr/bin/grep`.
Personally, I find find(1) to be an utter offender against the Unix philosophy of "doing one thing well". It can do a dozen things or so, and not particularly well either. (Execute actions? Delete stuff? Look for a string within files?). Just look at this post and how long the find(1) section is wrt. the ones about which(1), whereis(1), and locate(1).
I am sure that well-versed users can achieve wonderful things with it; myself, I either use fd or pipe "du -a" into grep (or rg), and move on with my life.
Agreed. My most common use case for "finding stuff" on linux boxes is me looking for particular strings in clear text files, so I often end up doing "grep string -Ri {/etc,/var} | less".
Also, though find seems to be POSIX compliant, I just don't like its syntax and how flags are handled.
You cannot "always" do this because some of us frequently or occasionally work in environments where external, non-reviewed software simply isn't allowed.
`fd` seems pretty useful, but I tend to just `find . -name '*whatevs*'`. It's a bit longer, but one less tool to make sure you have available (and find is everywhere), and one less tool to learn. Even just `find . | grep whatevs` is fine.
I find it funny that desktop search engines are not mentioned at all here. They would be the first place to start when using Linux on a desktop (but of course not on a server, i.e. without any desktop environment).
These search engines are very powerful because they do deep file scanning and are based on mature frameworks such as Apache Lucene.
My speedy system tanks when Baloo (and fstrim) tries to run after booting because everything runs at default priority. They really need to set nice and ionice on those processes by default. Thus locate is what I'll usually end up using.
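Done by hand, that looks something like the following (the target process here is a throwaway sleep; for Baloo you would substitute the PID of its indexer, e.g. whatever `pidof baloo_file` returns):

```shell
# Drop a process to the lowest CPU priority and the idle I/O class, so it
# only gets resources the rest of the system isn't using.
sleep 30 & pid=$!
renice -n 19 -p "$pid"       # niceness 19 = lowest CPU priority
command -v ionice >/dev/null && ionice -c 3 -p "$pid"   # class 3 = idle I/O
kill "$pid"
```

Raising a process's niceness doesn't require root, and since Linux 2.6.25 neither does putting your own process in the idle I/O class.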
I tried Plasma (with i3 replacing kwin; it didn't turn out to be a good experience btw, so back to bare i3) some time ago and immediately deactivated baloo_file after seeing it was taking 1.7 GB at boot (maybe it was temporary). I just use fzf when I don't know where something is.
I find the current Nautilus search to be very powerful, especially in searching inside texts & PDFs. It certainly does a far better job than Mac Spotlight or Windows Search. Under the hood, I am pretty certain it runs an extended command with regexes.
The find command existed when UNIX was in its infancy, long before the existence of CLI argument conventions such as double-dash long options, or positional vs. non-positional order.
If you forget what all those system directories are for (/bin, /sbin, /usr/bin, etc), another useful command is "man hier". It's a reference page for the filesystem.
These are all great, but the one I find myself using constantly on source code and other text-oriented files is The Silver Searcher (ag)[1]. It’s not as useful for file _names_, but most of the time, I care about contents and this searches, in realtime, at an incredible speed. Add the -l flag to list only filenames and you’ve got an amazing code location tool.
Not for finding filenames, but for finding files containing stuff, there's the magical `grep` ! Probably my most used command for finding stuff in a massive legacy codebase.
grep --include='*.js' -rn methodName
will recursively search the current directory and all subdirectories for .js files containing the word 'methodName'
Obviously IDEs can do this as well, but since you can supply regex, you can get some pretty complicated cases.
Random arbitrary example but difficult to do with an IDE -
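For instance, something along these lines (the method name and pattern are made up for illustration): finding every call whose first argument is a numeric literal, which is awkward to express in a typical IDE search box.

```shell
# Hypothetical regex search: calls to methodName with a numeric literal as
# the first argument. Demo files are created in a throwaway directory.
demo=$(mktemp -d)
printf 'init();\nmethodName(42, ctx);\nmethodName(name, ctx);\n' > "$demo/app.js"

# -E enables extended regexes; -rn recurses and prints file:line.
grep --include='*.js' -rnE 'methodName\([0-9]+' "$demo"
# matches only the line passing 42, not the one passing a variable
```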
locate is, for whatever reason, tragically slow. The database format it uses is nonsensical and completely optimized for size on very outdated assumptions.
I use an implementation I have written in the shell itself whose database format is nothing more than every file path on the system separated by null bytes, that is simply grepped to find files; the speed difference is absurd.
—— — time locate */meme.png
/storage/home/user/pictures/macro/meme.png
real 0m0.885s
user 0m0.806s
sys 0m0.010s
—— — time greplocate /meme.png$
/storage/home/user/pictures/macro/meme.png
real 0m0.089s
user 0m0.079s
sys 0m0.011s
This implementation is highly naïve and simplistic, and offloads all the searching to GNU Grep, yet outperforms the actual `locate` command by an order of magnitude.
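A sketch of the idea (the naming is mine; the author's actual script isn't shown). The "database" is nothing but every path on the system, NUL-separated, and lookup is a single GNU grep invocation:

```shell
# Database location; override with FILEDB if desired.
db="${FILEDB:-$HOME/.cache/filedb}"

updatedb_simple() {
    # Walk the given root (default /) and record every path, NUL-terminated.
    mkdir -p "$(dirname "$db")"
    find "${1:-/}" -print0 2>/dev/null > "$db"
}

greplocate() {
    # Match an extended regex against each NUL-terminated path. With GNU
    # grep, -z makes NUL the "line" delimiter, so $ anchors at path end.
    grep -z -E -- "$1" "$db" | tr '\000' '\n'
}
```

Usage mirrors the session above: `updatedb_simple` to (re)build the database, then `greplocate '/meme.png$'` to search it.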
And plocate is yet orders of magnitude faster than GNU grep. :-) And updates its database faster. You don't specify which locate you're using, but mlocate and BSD locate are basically obsolete by now.
Thank you! It's slowly becoming the default now (e.g., it will be the default in Debian and Ubuntu from the next releases, and Fedora is in the process of making it replace mlocate right now). It's just a tad too new, only about a year since 1.0.0. :-)
They are simply em-dashes; if an error occurs, one is replaced with a number to indicate the exit code. They are also color-coded, changing to another color to indicate, for instance, when I'm in an SSH session:
—— — true
—— — false
—— 1 sh -c 'exit 120'
120 sh -c 'exit 20'
— 20
It's astonishing that searching for a file name by substring across a file system is still not instantaneous on most systems. On my laptop (a 2GHz quad-core Intel Core i5), `find / -iname 'quux'` takes 2 minutes to find matches across an APFS partition with 2M files. Grepping for a substring in a file with 2M lines takes a few milliseconds. Why don't modern file systems implement something like the `locate` database, except one that is always up-to-date, so that scanning for a file does not require an expensive traversal?
Unfortunately it's not instant - it takes about a second for the tens of thousands of file entries to populate. But fzf searching that list is practically instant. After selecting the file I want, enter just returns me to the command line with the full path to the file. I can then ctrl-a and type 'vim' or 'code' or whatever. It's not a perfect workflow, but it's pretty good for finding files in complex folder structures.
To find what package provides a certain file. Definitely very useful when trying to build packages from source and not sure in what packages the reported missing libs/headers are located.
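The command itself isn't quoted in the comment; each distro family has its own query for it (these are the standard package-manager tools, shown here querying the package manager's own binary as a file guaranteed to be package-owned on that system):

```shell
# Which package owns a given file? Pick the branch for your distro family.
if   command -v dpkg   >/dev/null; then dpkg -S "$(command -v dpkg)"       # Debian/Ubuntu
elif command -v rpm    >/dev/null; then rpm -qf "$(command -v rpm)"        # Fedora/RHEL/openSUSE
elif command -v pacman >/dev/null; then pacman -Qo "$(command -v pacman)"  # Arch
fi
```

For the missing-header use case, you would pass the reported path instead, e.g. `dpkg -S /usr/include/zlib.h`.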
You could also do `strace --trace=open some_app` or `strace --trace=%file some_app` to have the strace do the filtering by itself (the latter will match any syscall that takes a filename)
As a hobbyist Linux [server] user, I've been using which, with hit-and-miss results, and find never really worked for me (I clearly don't have the magical aptitude for it). I've just given whereis a go, and that's perfect for me. Nice.
This entire HN comments page visualizes so effectively why I don't use Linux.
Even when someone gathers enough knowledge to write an introductory guide to using the damn thing, it turns out it's all wrong, has bugs, fails in unexpected ways or is a deprecated method that's being phased out.
You can't make these gotchas affecting Linux distributions go away by blaming someone who presents things the way many people use POSIX interactive shells.
Being a streamer and a .NET advocate does not make one incompetent in POSIX shell. Thing is, POSIX shell is tricky and it's all too easy to fall into traps, even for experienced people.
This kind of thing can be found in any OS, especially widespread ones, which have their roots in the distant past. We would now have enough experience to make things with fewer gotchas, but this would break compatibility and habits, and we can't get rid of the huge existing ecosystem anyway.
In the shell area, there are attempts to fix the syntax, like fish [1]. I'm afraid this kind of thing is doomed to remain niche, because by the time you deeply understand what it is trying to fix, you are probably used enough to the classic POSIX shell that you may not want to change. Fish users also need to go back to bash or zsh to follow a number of tutorials and documentations, so even they can't avoid POSIX shells.
I myself adopted zsh to have the features of fish and keep the bash-like syntax since I need to deal with this syntax either way.
In the Windows world, and actually Unix too, there is also Powershell. Can't comment that much since I don't know it, but it does not seem to have taken off on Unix, and many people on Windows seem to use bash through WSL anyway.
My conclusion is that bash and bash-like shells are ruling the world, even on Windows, and we are stuck with the POSIX shell syntax for now and probably a long time.
Sure, but it's good to also be able to use the default tools, because there are going to be times when fd + rg are (for whatever reason) not available.
Some reasons could be: the tool isn't installed on the machine and you're not allowed (or not supposed) to install it globally or locally, or you're on an embedded machine which doesn't have space for new software.