This is a little project simply to make trivial tools in Go effortless for my personal usage. These tools are almost surely of low utility to most people, but may be instructive nonetheless.
I have CI/CD to build this into a single binary, plus an explode tool that creates busybox-style symlinks for each tool. I have automation in my dotfiles to pull the latest binary at install time and run the explode tool.
Here's a copy-pasteable script to install the leatherman on macOS or Linux:
OS=$([ $(uname -s) = "Darwin" ] && echo "-osx")
LMURL="$(curl -s https://api.github.com/repos/frioux/leatherman/releases/latest |
grep browser_download_url |
cut -d '"' -f 4 |
grep -F leatherman${OS}.xz )"
mkdir -p ~/bin
curl -sL "$LMURL" > ~/bin/leatherman.xz
xz -d -f ~/bin/leatherman.xz
chmod +x ~/bin/leatherman
~/bin/leatherman explode
This assumes that ~/bin is in your $PATH. The explode command will create a symlink for each of the tools listed below.
Each tool takes different args, but to run a tool you can either use a symlink (presumably created by explode):
$ echo "---\nfoo: 1" | yaml2json
{"foo":1}
or use it as a subcommand:
$ echo "---\nfoo: 1" | leatherman yaml2json
{"foo":1}
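The busybox-style dispatch behind those symlinks can be sketched in a few lines of shell; this illustrates the pattern, not leatherman's actual implementation:

```shell
# A multi-call program inspects the name it was invoked as ($0) and
# dispatches to the matching tool. Simulated here with a tiny script.
dir=$(mktemp -d)
cat > "$dir/multitool" <<'EOF'
#!/bin/sh
case $(basename "$0") in
  yaml2json) echo "running yaml2json" ;;
  *)         echo "unknown tool" >&2; exit 1 ;;
esac
EOF
chmod +x "$dir/multitool"
ln -s multitool "$dir/yaml2json"   # what explode does, once per tool
"$dir/yaml2json"                   # prints: running yaml2json
```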
$ <someaddrs.txt addrs "$HOME/mail/gmail.sent/cur/*" >sortedaddrs.txt
Reads emails (in the mutt addrbook format, see below) on stdin and sorts them based on when they were most recently sent to.
Converts email addresses from the standard format ("Hello Friend" <foo@bar>) to the mutt (?) address book format, i.e. tab-separated fields.
Note that this new version ignores the comment because, after actually auditing my address book, most comments are incorrectly recognized by all tools. (For example: <5555555555@vzw.com> (555) 555-5555 should not have a comment of (555).)
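A rough sketch of the conversion; the address-then-name field order here is an assumption for illustration, not taken from the tool itself:

```shell
# Split on " <" and ">", strip the quotes from the display name, and
# emit tab-separated fields. Anything after the ">" (the comment) is dropped.
echo '"Hello Friend" <foo@bar>' |
  awk -F' <|>' '{ name = $1; gsub(/"/, "", name); printf "%s\t%s\n", $2, name }'
```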
Faster version of xbacklight that works by writing directly to /sys. Example:
Increase by 10%:
$ backlight 10
Decrease by 10%:
$ backlight -10
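A sketch of the underlying mechanism, with stand-in values; on a real system the numbers come from files like /sys/class/backlight/intel_backlight/brightness and max_brightness (the device name varies, and this is my illustration rather than the tool's code):

```shell
max=1000    # stand-in for $(cat .../max_brightness)
cur=500     # stand-in for $(cat .../brightness)
delta=10    # the percentage passed on the command line, e.g. `backlight 10`
new=$(( cur + max * delta / 100 ))
echo "$new" # this value would be written back to .../brightness
```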
My personal, digital, wall of clocks.
Reads CSV on stdin and writes JSON on stdout; first line of input is the header, and thus the keys of the JSON.
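Assuming naive comma-splitting (no quoted fields), the transformation can be sketched with awk; the one-JSON-object-per-row shape here is my illustration and may differ from the real tool's output:

```shell
# First line becomes the keys; every later line becomes an object.
printf 'name,age\nalice,30\nbob,25\n' |
  awk -F, 'NR == 1 { split($0, k); next }
           { printf "{"
             for (i = 1; i <= NF; i++)
               printf "%s\"%s\":\"%s\"", (i > 1 ? "," : ""), k[i], $i
             print "}" }'
```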
Reads CSV on stdin and writes Markdown on stdout.
Powerful tool for debouncing lines of input.
Dump the contents of a mozlz4 (aka jsonlz4) file commonly used by Firefox. Takes the name of the file to dump and writes to standard out.
Create undefer files for any lines containing a markdown link to LWN. Lines will be added in no particular order. Lines not containing links (or otherwise running into errors) will be printed to standard out, with errors going to standard error.
Produces a JSON representation of an email from a list of globs. Only headers are currently supported, patches welcome to support bodies.
$ email2json '/home/frew/var/mail/mitsi/cur/*' | head -1 | jq .
{
"Header": {
"Content-Type": "multipart/alternative; boundary=00163642688b8ef3070464661533",
"Date": "Thu, 5 Mar 2009 15:45:17 -0600",
"Delivered-To": "xxx",
"From": "fREW Schmidt <xxx>",
"Message-Id": "<fb3648c60903051345o728960f5l6cfb9e1f324bbf50@mail.gmail.com>",
"Mime-Version": "1.0",
"Received": "by 10.103.115.8 with HTTP; Thu, 5 Mar 2009 13:45:17 -0800 (PST)",
"Subject": "STATION",
"To": "xyzzy@googlegroups.com"
}
}
Reads text on STDIN and writes the same text back, converting any links to Markdown links, with the title of the page as the title of the link. If you set MOZ_COOKIEJAR to the path of your cookies.sqlite, it will use those cookies when loading the page.
Exports entire company directory as JSON.
Exports company org chart as JSON.
Create persistent functions by actually writing scripts. Example usage:
fn count-users 'wc -l < /etc/passwd'
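What fn does can be sketched as writing a tiny wrapper script; the temp directory here is a stand-in for wherever the real tool puts its scripts:

```shell
# Persist a one-liner as an executable script named after the function.
name=count-users
body='wc -l < /etc/passwd'
dir=$(mktemp -d)    # stand-in for a directory on your $PATH
printf '#!/bin/sh\n%s\n' "$body" > "$dir/$name"
chmod +x "$dir/$name"
cat "$dir/$name"
```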
Little tool for generating bcrypt hashes.
Creates time series data by counting lines and grouping them by a given date format.
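Assuming each input line begins with a date already in the chosen format, the counting can be sketched like this (a rough illustration, not the tool's implementation):

```shell
# Count lines grouped by their leading date field.
printf '2018-06-07 login\n2018-06-07 logout\n2018-06-08 login\n' |
  awk '{ count[$1]++ } END { for (d in count) print d, count[d] }' |
  sort
```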
Creates a proxy for https://tgftp.nws.noaa.gov, but over plain HTTP, listening on port 9090. This is because Ubuntu 18.04 ships with taffybar 0.4.6, which only supports HTTP for the weather widgets and has hardcoded URLs.
To install, add this line to your hosts file:
127.0.0.1 tgftp.nws.noaa.gov
And run this iptables command:
iptables -t nat -A OUTPUT -o lo -p tcp --dport 80 -j REDIRECT --to-port 9090
minotaur -include internal -ignore yaml \
~/code/leatherman -- \
sh -c 'go test ~/code/leatherman/...'
Watches one or more directories (before the --) and runs a script when events in those directories occur. The script receives the events as arguments, so you can exit early if only irrelevant files changed. The arguments are of the form $event\t$filename; for example CREATE x.pl. As far as I know the valid events are:
CHMOD
CREATE
REMOVE
RENAME
WRITE
The events are deduplicated and also debounced, so your script will never fire more often than once a second. If events are happening every half second the debouncing will cause the script to never run.
The underlying library supports emitting multiple events in a single line (e.g. CREATE|CHMOD) though I've not seen that on Linux.
minotaur reëmits all output (both stderr and stdout) of the passed script to standard out, so you could make a script like this to experiment with the events with timestamps:
#!/bin/sh
for x in "$@"; do
echo "$x"
done | ts
You can do all kinds of interesting things in the script, for example you could verify that the events deserve a restart, then restart a service, then block till the service can serve traffic, then restart some other related service.
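For instance, here is a handler sketch that only reacts when a Go file was touched; the events are hard-coded stand-ins for the arguments minotaur would pass, and the restart is simulated with an echo:

```shell
#!/bin/sh
# Hypothetical minotaur handler: act only when some event touched a *.go file.
# Each real argument looks like "WRITE<TAB>main.go".
set -- "CHMOD	notes.txt" "WRITE	main.go"   # stand-in events for illustration
handled=no
for event in "$@"; do
  case $event in
    *.go) handled=yes ;;
  esac
done
if [ "$handled" = yes ]; then
  echo "would restart the service"   # stand-in for a real restart command
else
  echo "nothing relevant changed"
fi
```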
The -include and -ignore arguments are optional; by default -include is empty, so matches everything, and -ignore matches .git. You can also pass -verbose to include output about minotaur itself, like which directories it's watching.
Pass hostname and login and this will print the password for the account.
pomotimer 2.5m
or
pomotimer 3m12s
Originally a timer for use with the pomodoro technique. It's a handy timer in any case, since you can pass it arbitrary durations, pause it, reset it, and see its progress.
alluni.pl | prefix-emoji-hist ~/.uni_history
Prints out the deduplicated lines from the passed file, converting characters to Unicode names, and then prints out the lines from STDIN, filtering out what's already been printed. Note that alluni.pl is in my dotfiles.
Quick-and-dirty tool to render email with a Local-Date header included, if Date is not already in local time.
Extracts zipfiles, but does not extract .DS_Store or __MACOSX/. Automatically extracts into a directory named after the zipfile if there is not a single root for all files in the zipfile.
Minimalist RSS client. Outputs links as Markdown on STDOUT. Takes the URL of a feed and a path to a state file. Example usage:
$ rss https://blog.afoolishmanifesto.com/index.xml afm.json
[Announcing shellquote](https://blog.afoolishmanifesto.com/posts/announcing-shellquote/)
[Detecting who used the EC2 metadata server with BCC](https://blog.afoolishmanifesto.com/posts/detecting-who-used-ec2-metadata-server-bcc/)
[Centralized known_hosts for ssh](https://blog.afoolishmanifesto.com/posts/centralized-known-hosts-for-ssh/)
[Buffered Channels in Golang](https://blog.afoolishmanifesto.com/posts/buffered-channels-in-golang/)
[C, Golang, Perl, and Unix](https://blog.afoolishmanifesto.com/posts/c-golang-perl-and-unix/)
Lists all of the available Sweet Maria's coffees as JSON documents, one per line. Here's how you might see the top ten coffees by rating:
$ sm-list | jq -r '[.Score, .Title, .URL ] | @tsv' | sort -n | tail -10
Serves a directory over HTTP; takes a single optional parameter (the directory to serve, defaulting to .):
$ srv ~
Serving /home/frew on [::]:21873
Prepares the arguments such that they can be pasted into an ssh command safely. For example:
$ ssh-quote ls 'f*f'
'sh -c '\''ls '\''\'"''"'f*f'\''\'\'
$ echo 'foo = "bar"' | toml2json
{"foo":"bar"}
Reads TOML on stdin and writes JSON on stdout.
Takes a directory argument, prints contents of each file named before the current date, and then deletes the file.
If the current date were 2018-06-07, the starred files would be printed and then deleted:
* 2018-01-01.txt
* 2018-06-06-awesome-file.md
2018-07-06.txt
$ uni ⢾
'⢾' @ 10430 aka BRAILLE PATTERN DOTS-234568 ( graphic | printable | symbol )
Inspects the unicode characters passed as an argument.
Reads YAML on stdin and writes JSON on stdout.
In an effort to make debugging simpler, I've created three ways to see what leatherman is doing:
LMTRACE=$somefile will write an execution trace to $somefile; look at that with go tool trace $somefile. Since so many of the tools are short-lived, my assumption is that the execution trace will be the most useful.
LMPROF=$somefile will write a CPU profile to $somefile; look at that with go tool pprof -http localhost:10123 $somefile.
If you have a long-running tool, the pprof HTTP endpoint is exposed on localhost:6060/debug/pprof, but picks a random port if that port is in use; the port can be overridden by setting LMHTTPPROF=$someport.