I wind up making lots of interim files when importing/exporting/munging/analyzing data. To help keep track of these things (CSVs, scripts, and miscellany that I may or may not want to revisit in the future), I have a function called `today` that auto-creates daily scratch directories:
today() {
    TODAY_DIR="$HOME/today/"
    DATE_DIR=$(date +'%Y-%m-%d')
    if [ ! -d "$TODAY_DIR$DATE_DIR" ]; then
        mkdir -p "$TODAY_DIR$DATE_DIR"
    fi
    echo "$TODAY_DIR$DATE_DIR"
}
So you can do stuff like this with less thinking/typing:
cp somefile.csv $(today)
I've been using this for a few years and continually find it handy, both at the command line and in keeping files clustered when I want to dig something up later. It is slightly less helpful if you regularly work past midnight, though!
I call this blingle. I call it after any long running operation that I want to be notified of. It pops a desktop notification and sends a push message to my phone.
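A minimal sketch of such a notifier, assuming notify-send for the desktop and Pushover for the push message (the token and user key are placeholders):

#!/usr/bin/env bash
# blingle (sketch): desktop + phone notification after a long-running command.
msg="${1:-Job finished}"
notify-send "blingle" "$msg"
curl -s \
    --form-string "token=YOUR_PUSHOVER_APP_TOKEN" \
    --form-string "user=YOUR_PUSHOVER_USER_KEY" \
    --form-string "message=$msg" \
    https://api.pushover.net/1/messages.json > /dev/null

An invocation then looks like: make deploy; blingle "deploy finished".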
I have letters like r and b aliased in my bash profile to check for and run a bash script, if it exists, in each project directory (r = ./run.sh, b = ./build.sh).
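Something like this, sketched as small functions rather than literal aliases so the existence check fits inline:

r() { if [ -x ./run.sh ]; then ./run.sh "$@"; else echo "no ./run.sh here" >&2; fi; }
b() { if [ -x ./build.sh ]; then ./build.sh "$@"; else echo "no ./build.sh here" >&2; fi; }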
In each of those scripts, I typically have a one liner depending on what the project requires. A simple build one is:
#!/usr/bin/env bash
make build
And run:
#!/usr/bin/env bash
docker run foo/bar
Or maybe:
#!/usr/bin/env bash
python manage.py runserver
I might also add (source) environment variable settings, etc. Sort of like my own personal decentralized makefile.
Then I add each script to my .git/info/exclude for each project. It saves so much time switching between projects to not have to remember any particular one's build or run commands.
One slight modification: name the build and run scripts something that you would never expect to be in that repo (maybe run-xyz.sh, where xyz is your initials, 10 random characters, etc.).
Then, the filename can be excluded in a global gitignore file.
Yeah, good points. Maybe putting the script(s) inside a .whatever directory inside the project root like some other dev tools do is worth consideration. What do you think?
Really to do this right you should make it function like direnv and create a whitelist of scripts you trust (bonus points for including hashes of the script). On first run it'll ask you to review and trust the script, and then just work on subsequent runs.
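A rough sketch of that trust check, assuming sha256sum is available and a trust list kept in ~/.config/trusted-scripts:

# Run ./run.sh (or a named script) only if its hash is already trusted;
# otherwise show it, ask once, and record the hash.
run_trusted() {
    local script="${1:-./run.sh}" trustfile="$HOME/.config/trusted-scripts" hash
    hash=$(sha256sum "$script" | awk '{print $1}') || return 1
    if ! grep -q "$hash" "$trustfile" 2>/dev/null; then
        "${PAGER:-less}" "$script"
        read -r -p "Trust and run $script? [y/N] " answer
        [ "$answer" = "y" ] || return 1
        echo "$hash  $script" >> "$trustfile"
    fi
    "$script"
}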
This is an awk one-liner that emits the stream of unique things as they are seen; it doesn't require sorted input. It runs at the cost of building the obvious hash in memory, so it can drive you to swap on large inputs, but it's portable, doesn't depend on software that typically isn't on small systems, and it delivers results fast.
I use it all the time when I have some UNIX pipe emitting things and I want to "see" the uniques before I do sort | uniq -c type things.
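For reference, a minimal version of the idea (the CSV column here is just a placeholder):

# Print each line the first time it appears, without sorting; the seen[]
# array is the hash that grows in memory.
cut -d, -f1 data.csv | awk '!seen[$0]++'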
I run an AHK script with a little over 2000 abbreviations (e.g. typing 'abbn' expands to 'abbreviation'). It helps me type 100+ WPM without too much strain.
AHK has been working great for me over the years. My addresses, snippets of emails, expansions are all stored in an ahk file. I use various email addresses for different sites; so a@a would become aj(at)ajonit(dot)com or a@l would become admin(at)learnqtp(dot)com. The possibilities are endless.
A few years back I created a video for my blog readers and published an AHK template. You may download it here: http://www.learnqtp.com/get-productive-with-automation-autoh...
I created a Perl script called anyconnector which allows me to jump around between different Cisco Anyconnect VPN's using details stored in KeePass entries.
e.g.
To connect: anyconnector -c env-name
Disconnect: anyconnector -d
Get status: anyconnector -s
My thinkpad running Linux is a bit temperamental when changing displays, often enumerating an existing display port as a new one.
I use a script to switch to dual external monitors at a standard resolution, and a counterpart script to switch back to the internal high-def display.
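A minimal sketch of that kind of switch, assuming xrandr and hypothetical output names (eDP-1 for the internal panel, HDMI-1 and DP-1 for the externals):

#!/usr/bin/env bash
# Dual external monitors at 1920x1080, side by side, internal panel off.
# Check `xrandr -q` for the real output names and modes.
xrandr --output HDMI-1 --mode 1920x1080 --pos 0x0 \
       --output DP-1 --mode 1920x1080 --pos 1920x0 \
       --output eDP-1 --off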
If only I could reliably fix xfce4's panel placement all of the time...and not have to restart chrome and pycharm/intelliJ on each display change!
Something that I added to my bash profile is this simple `cd` override that triggers `workon` from `virtualenvwrapper` if the folder I'm accessing has a `virtualenv` with the same name.
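A sketch of that override, assuming virtualenvwrapper's workon and WORKON_HOME are set up and each environment is named after its project directory:

cd() {
    builtin cd "$@" || return
    # If a virtualenv with the directory's name exists, activate it.
    local env_name
    env_name=$(basename "$PWD")
    if [ -d "$WORKON_HOME/$env_name" ]; then
        workon "$env_name"
    fi
}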
That seems to be somewhat popular on github, but I rarely receive feedback so I'm not sure if people star because they use them too, or just because they suspect they might.
This Bash function rebases and pushes all my feature branches on the upstream "develop" branch:
rebase-all () {
    # Remember the current branch and stash any local changes.
    old=$(git rev-parse --abbrev-ref HEAD)
    stashed=$(git stash)
    # Loop over my own branches whose names contain a dash (the feature branches).
    for b in $(git branch --format '%(authorname) %(refname:short)' | sed -ne "s/^$(git config --get user.name) //p" | grep -- -); do
        git checkout "$b" && git rebase origin/develop && git push --force || ( git rebase --abort && echo "Could not rebase $b" )
        echo
    done
    # Return to the original branch and restore the stash if one was made.
    git checkout "$old"
    if [ "$stashed" != "No local changes to save" ]; then
        git stash pop
    fi
}
I'm in sales for a surveying/feedback SaaS co. Before demos I brand the survey/account with the prospect's company's colors & logo. During the demo I send custom branded, inline email surveys to a test gmail account to help my prospect understand the survey respondent POV. I also have to delete that test-survey email every hour so that the next prospect I demo isn't privy to who is exploring us. I often have to do 4-8 demos a day so as you can imagine that got old fast.
I wipe the inbox for my test gmail account every hour at the :45 minute mark using a ruby script. I don't automate the sending because it's part of the education of the prospect.
Biggest timesaver in the world for someone who listens to as much music as I do and doesn't want to deal with manually transcoding FLAC files.
iTunes and Apple Music will soon support FLAC natively, though. Previously the transcoding was a matter of necessity, of saving space, and of not wanting to use VLC and lose my main (single) use of my Apple Watch; it's still about the latter two.
I recently also set it to output a manifest that I can feed into rsync on my work computer to pull my library from my backup server.
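The pull side of that is roughly a one-liner; the host and paths here are placeholders:

# Copy only the files listed in the manifest from the backup server.
rsync -av --files-from=manifest.txt backup-server:/srv/music/ ~/Music/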
By default it shows a list of topics; run it with a topic and it displays the details for that topic. I use it to remember how to do less frequent stuff at my day job.
I wrote a Docker helper script, which resets it to a baseline state when I need to clear out cruft post-restart. I call it harpoon, so I can harpoon the whale when I go to restart my dev machine.
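A baseline-reset sketch in that spirit (not the actual harpoon script; the prune flags wipe everything Docker isn't currently using, volumes included):

#!/usr/bin/env bash
# Stop all running containers, then prune containers, images, networks and volumes.
containers=$(docker ps -q)
[ -n "$containers" ] && docker stop $containers
docker system prune -af --volumes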
# Helper function for running a command in each subdirectory under the current one.
function each {
    if [ -z "$1" ]; then
        : # If no command is given, then this is a no-op.
    else
        find . -maxdepth 1 -mindepth 1 -not -path '*/\.*' -type d -exec sh -c "(echo {} && cd {} && $* && echo)" \;
    fi
}
zsh functions:
# Helper function for navigating tmux sessions
export WORKSPACE_ROOT=$HOME/workspace
ws() {
    if [ -z "$1" ]; then
        if [ -z "$TMUX" ]; then
            tmux attach-session
        else
            cd "$WORKSPACE_ROOT"
        fi
    elif [[ $1 == 'ls' ]]; then
        tmux list-sessions
    else
        # Attach if the session exists; otherwise create it from its workspace dir.
        tmux attach -t "$1" || { cd "$WORKSPACE_ROOT/$1/src" && tmux new-session -s "$1"; }
    fi
}
# Helper function for running a command in each subdirectory under the current one.
each() {
    if [ -z "$1" ]; then
        return # If no command is given, just exit
    else
        find . -maxdepth 1 -mindepth 1 -not -path '*/\.*' -type d -exec sh -c "(echo {} && cd {} && $* && echo)" \;
    fi
}
I have a deployment script that is part of my one-click deployment to the customer's webserver. It's kind of a legacy of having really crap interwebs, though: my upload used to be in the 700-900 kbit/sec range, so it strips out anything from the deploy package that isn't absolutely essential, creating a sort of delta of changes.
Makes deployments blistering fast now that I've got 30 Megabit/sec up though... :)
I wrote a docker-wrapped script to create keys and certificates for use with HTTPS for development. These are configured with algorithms which modern browsers won't complain about. It also allows alternate domains.
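Outside Docker, the openssl half of that might look roughly like this (assumes OpenSSL 1.1.1+ for -addext; the domain names are placeholders):

#!/usr/bin/env bash
# Self-signed key + cert with subject alternative names for local development.
# Browsers still need the cert added to their trust store.
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
    -keyout dev.key -out dev.crt \
    -subj "/CN=dev.local" \
    -addext "subjectAltName=DNS:dev.local,DNS:api.dev.local,IP:127.0.0.1"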
`watch-reload [watch dir] [reload domain]`, which watches for changes in a directory and triggers a reload for tabs matching a certain domain using a Chrome extension.
`watch-copy [copy to dir]`, which watches for changes in the current directory and builds and copies to another directory.
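A rough sketch of a watch-copy like that, assuming inotify-tools and a hypothetical make build step:

#!/usr/bin/env bash
# watch-copy (sketch): rebuild and copy to the target directory on every change.
dest="$1"
while inotifywait -r -e modify,create,delete --exclude '\.git' .; do
    make build && cp -r build/. "$dest"
done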
`set-lights [group name] [on/off/high/low]`, which posts commands to a Hue bridge.
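The set-lights part maps onto the Hue bridge's REST API; a simplified sketch with a placeholder bridge address and API username (and a numeric group id rather than a name):

#!/usr/bin/env bash
# set-lights (sketch): set-lights <group id> <on|off|high|low>
BRIDGE="192.168.1.2"          # bridge IP (placeholder)
HUE_USER="your-api-username"  # API key (placeholder)
group="$1"; state="$2"
case "$state" in
    on)   body='{"on":true}' ;;
    off)  body='{"on":false}' ;;
    high) body='{"on":true,"bri":254}' ;;
    low)  body='{"on":true,"bri":50}' ;;
    *)    echo "usage: set-lights <group> on|off|high|low" >&2; exit 1 ;;
esac
curl -s -X PUT -d "$body" "http://$BRIDGE/api/$HUE_USER/groups/$group/action"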
I start projects very often and find myself reaching for my scaffolding [tool](https://github.com/vutran/zel), which pulls minimal dotfiles down from a repo.
Also use it a lot for syncing dotfiles in my user directory between my work and home computers.
I like using a lot of aliases and chaining them together if I can and it makes sense.
The one I use most changes the command prompt to show which git branch I'm on when I'm in that directory. Just google it if you'd like.
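For reference, a minimal version of that prompt tweak looks something like this (needs git 2.22+ for --show-current):

parse_git_branch() {
    local b
    b=$(git branch --show-current 2>/dev/null)
    [ -n "$b" ] && printf ' (%s)' "$b"
}
PS1='\u@\h \w$(parse_git_branch)\$ '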
If you repeat some command a lot or a chain of commands and it takes a lot of tedious typing, then script it.
I have a shell alias that creates a temp directory labeled with the current timestamp and a suffixed tag, then changes into it. Makes it very quick to get a new scratch area for shell fu.
Plus all the scratch dirs are in one place so cleanup is a breeze to free up space.
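A sketch of that helper, written as a function since it needs to cd (the scratch root is an assumption):

scratch() {
    # Timestamped directory under one root, with an optional tag suffix.
    local dir="$HOME/scratch/$(date +'%Y%m%d-%H%M%S')${1:+-$1}"
    mkdir -p "$dir" && cd "$dir"
}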
I typically have a script that prints out my local IP, instead of ifconfig/ipconfig or some OS UI. If I'm hosting a static page with an HTML5 game, it's then pretty easy to access it on another computer or my phone.
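On Linux that can be as small as (macOS would need ifconfig or ipconfig getifaddr instead):

# Print the machine's first local IPv4 address.
hostname -I | awk '{print $1}'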
Classify digital photographs that I need to review by their content. Import/export data from/to a DB, analyze it statistically, and display the info graphically. Pretty standard.
There's a GUI app on top of the script, so not sure if it counts, but I find SelfControl (selfcontrolapp.com) indispensable for staying focused each day.
I installed it recently, but some sites are tricky. For example, I'd block YouTube in the blink of an eye, but what if I really need to look up a video tutorial or something in the middle of a work session?