Aside: Although iTerm2 (on macOS) doesn't appear to work with this, iTerm2 actually supports full-color inline images, and has an "imgls" script which does something similar. (There's also "imgcat".)
I've been meaning to add sixel support since forever, but I like my way of doing it so much more because it supports the file formats people actually use, which are generally a much better encoding than sixel. That being said, this is nicer than iTerm2's equivalent (called imgls) because it re-encodes the image to make it smaller over the wire. Then again, I'd rather use a lot of bandwidth than have to install imagemagick everywhere. That's probably just my bias against dependencies showing :)
Not quite the same. iTerm provides an escape sequence to inline base64-encoded images in a common format (e.g. PNG)[1], whereas Sixel is its own image format.
I find iTerm's approach simpler yet more powerful: better compression, thus faster; no dithering, thus better image quality; it even has GIF support. Sixel is a 30-year-old protocol designed for dot-matrix printers; I'm not sure why we should stick to it. I think terminal emulators should adopt iTerm's escape sequence rather than Sixel.
Aside from iTerm, image inlining is supported in Terminology[2]. Not sure about compatibility with iTerm, however.
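For reference, the iTerm2 sequence wraps base64-encoded file data in an OSC 1337 escape. A minimal sketch in shell (the function name `imgcat_sketch` is made up here; the real `imgcat` sets additional options such as width and size):

```shell
# Minimal sketch of iTerm2's inline-image escape sequence (OSC 1337):
#   ESC ] 1337 ; File = key=value;... : <base64 data> BEL
# Any image format the terminal can decode (PNG, JPEG, GIF...) works.
imgcat_sketch() {
    local file="$1"
    local b64 name
    b64=$(base64 < "$file" | tr -d '\n')   # the image payload, base64-encoded
    name=$(printf '%s' "$file" | base64)   # the filename is also base64-encoded
    printf '\033]1337;File=name=%s;inline=1:%s\a' "$name" "$b64"
}
```

Because the payload travels as base64 text inside an escape sequence, this works over ssh as well, unlike URL-based schemes.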
I used the imgcat trick in my shell client project (https://github.com/u59u75u65/hkgbox-rs).
I was looking for a cross platform solution, but by the time I worked on it I didn't know Sixels. One of the things that is cool about imgcat is that it handles GIF animation by default.
It’s worse than that, there’s other shells that support their own ANSI escape sequence for transferring base64 encoded images: kitty (not to be confused with KiTTY - the PuTTY fork) and Terminology (of enlightenment DE) each have their own “standard” and they differ from iTerm2’s. They all have their own CLI tools too.
Then there is sixel - which solved a problem back when TTYs were VDUs, but isn't really suitable for a modern era where it's all terminal emulators running on (U)HD displays. I don't think sixel support is all that widespread, though I'm happy to be corrected there.
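For the curious, here is roughly what Sixel looks like on the wire: a hedged, hand-rolled sketch that draws a small solid red bar (the `red_bar` name is mine; exact parameter handling varies by terminal):

```shell
# Hedged sketch: emit a 60x6-pixel solid red bar in raw Sixel, by hand.
# Sixel data lives inside a DCS sequence: ESC P ... q <data> ESC \
# Each data byte encodes a vertical strip of 6 pixels ('~' = all six on);
# "!60" is a repeat introducer, and colours are given as RGB percentages.
red_bar() {
    printf '\033Pq'          # enter Sixel mode (DCS ... q)
    printf '#0;2;100;0;0'    # define colour register 0 as RGB 100,0,0 (percent)
    printf '#0!60~'          # select colour 0, repeat '~' 60 times
    printf '\033\\'          # string terminator (ST) ends the image
}
```

Six pixels per byte, palette registers, and run-length repeats: very much a printer-era encoding compared to shipping a PNG as base64.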
What we really need is one CLI tool to rule them all. One tool that will detect the shell you’re using and default to the best escape sequence for that shell; falling back to ASCII art when all else fails.
This is something I’ve been actively working on in some of my spare time but it’s not ready for public consumption, yet....
Yeah I did say “terminal emulators” else where in that comment. Not really sure why I said “shells” in the first line. Must have been having a senior moment....
Sidenote: Terminology does not transfer base64-encoded images, but gives a URL (usually a local file, though it could work with HTTP), so it does not work over ssh; but it's faster (no copy/encode/decode).
hterm, the terminal emulator portion of the Chrome (OS) Secure Shell app, supports them.
(Aside: I dislike the vagueness of the term "proprietary." I rarely hear anyone call the non-ANSI VT-100 or xterm sequences "proprietary," even though they got started in exactly the same way as iTerm2's.)
> I dislike the vagueness of the term "proprietary."
iTerm2 calls their escape codes proprietary[0]. As far as I can tell, they are not trying to create a standard. VT320, on the other hand, is an ANSI standard[1].
You can also specify filenames and, of course, use shell wildcards.
However, some may be slow to render (like PDF), so lsix doesn't show them unless you ask specifically.
Regardless of the justification, I don't think creating such exception/edge-case behaviours is ever a good idea in a command-line tool like this. The equivalent in regular ls would be something like not showing more than X files "unless you ask specifically" --- which would just create more confusion than anything. As someone who has used the command line for many years, one property which needs to be present for good usability is consistency. Having a bunch of edge cases and special behaviours makes a tool hard to use for anything but the most basic of operations.
Are PDFs really expected to preview in an image viewer though? I don't really think of them as images, so I see the flag to preview them as more of an "extra feature", which is exactly what flags are for.
If it didn't preview GIFs without a flag, or PNGs, then your point would absolutely stand.
I recall at least one vector drawing program offering PDF as target, and this being expected in some print shops, so yes, in some cases, PDFs can be images.
PDF is a weird format with a bunch of disjoint use cases.
That’s a weird edge case. It’s images to us nerds and hackers, but it’s certainly not images to anyone in print. Whereas PDFs were used to send final magazine pages (for example) which were ready for print.
(Source: I worked for a publishing company in the 90s)
I think the reason for that is more to do with history than anything. The original intended purpose of PDFs was to be a print-quality final layout.
That’s not so much the case these days, and Word documents / PowerPoint presentations have supported embedding fonts too, so they’ve drifted a little into PDF's territory. But from a generalised and historic standpoint: Word documents were for editing and PDFs were for printing.
In my experience of academia, using PDFs as an image format is extremely common for sharing plotted data. The only common alternatives are (E)PS and PNG which are much worse.
You can use PDFs in Xcode. Instead of using a separate image for each of the different resolutions and polluting your fs, it's actually really nice to have a single file with a preview.
I, like gp, prefer consistency and predictability over unique tool-specific defaults. If every command-line tool had its own special quirks, I would need to consult man pages every single time I issue something more complex than a `cd`.
Perhaps lsix should be separated into two tools like find and locate; one that uses cached rendered previews and the other generates them on the fly.
Predictability is a tad subjective (I'll put the consistency part aside for a moment here); e.g. you can think of pdf handling this way: lsix is about images first and foremost, so it makes sense to hide everything else, including pdfs, which are mostly books and papers and other things that thumbnails would hardly even help with. (This reasoning, however, becomes more brittle once we include videos in the equation. Should it be an images-only tool, or should it be about anything thumbnailable?)
Regular `ls` lists files in a variable number of columns depending on the terminal width. Many utils enable or disable color output depending on the destination of that output, and on the terminal's capabilities.
CLI utils are full of special behaviors for the convenience of people. To enable the strict consistency that is mostly necessary for usage by other programs, there are flags.
The moment you pipe ‘ls’ it will fall back to using just one column.
In fact a great deal of CLI tools (and particularly the GNU coreutils) have nice usability features that auto-disable the moment the output is not a TTY.
For example, run the following commands and compare the output:
ls ~
ls ~ | cat
grep --color=auto "$(hostname)" /etc/hosts
grep --color=auto "$(hostname)" /etc/hosts | cat
You might be surprised at just how differently the terminal utilities behave when they detect they’re being piped.
Many CLI tools will detect if they are running in an interactive shell and change their output accordingly.
Git for example will colourise terminal output if it can tell a person is interacting with it, but if a script is calling it then the colour markers won't be added (and there are many, many other examples of similar behaviour).
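The usual mechanism is an isatty() check on stdout; in a shell script the equivalent is `[ -t 1 ]`. A minimal sketch (the `maybe_color` function is illustrative, not from any real tool):

```shell
# Minimal sketch of TTY detection in a shell script:
# "[ -t 1 ]" is true only when file descriptor 1 (stdout) is a terminal,
# so colour codes are emitted interactively but suppressed under a pipe.
maybe_color() {
    if [ -t 1 ]; then
        printf '\033[32m%s\033[0m\n' "$1"   # green when printed to a terminal
    else
        printf '%s\n' "$1"                  # plain when piped or redirected
    fi
}
maybe_color "hello"
```

Run it bare and you get green text; run `maybe_color hello | cat` and the escape codes vanish, which is exactly the ls/grep behaviour described above.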
ls doesn't show the majority of files in my home directory "unless I ask specifically", because of an accident of history that makes files starting with a period "hidden". And yes, this would infuriate me, if I hadn't discovered shell aliases within one day of becoming a linux user. It would also infuriate me if ls took five minutes to render output, and I would also solve that with a shell alias.
You can’t really argue that ls not showing hidden files by default is a quirk. Next you’ll be complaining that rm doesn’t delete root-owned files when run as a regular user, so you created an alias to sudo rm for every operation.
I’ll grant you that hidden files being defined by a prefix on the file name is a weird quirk - but at least that’s something which is consistent across all of Linux and UNIX (including OS X) and something that all tools for those respective platforms behave the same around. And to be fair, it’s a massively old convention that made sense back before metadata was a thing so it’s not even without some rationale.
Ultimately though, the issue here is you don’t want any files hidden by default. Which is the opposite of the point of a hidden file. So you’d have the same complaint if the “hidden” flag were file system metadata (eg on NTFS) as you do with a file name prefix.
You're focusing on the wrong point in my comment. See the other replies by pyg in this thread. ls is not a good example of consistency to contrast with here.
pyg doesn’t say anything you hadn’t, and he received replies making the same points that I made. So I’m not really sure what you’re trying to argue about ls being inconsistent, aside from it hiding files that are widely recognised as hidden by other tools system-wide anyway.
Perhaps the GUI file managers are also inconsistent because they hide hidden files by default too?
I dunno. It seems to me this tool does the right thing, which is to avoid adding minutes to the task of printing a directory listing, which we expect to be instantaneous for local use. But the point I was trying to make is that if, for whatever reason, you prefer waiting two minutes for image previews of PDFs, you can always just `alias lsix="lsix -whatever"`.
Now that I think about it, pdf isn't an image type, so I'm not sure if I would expect a tool like this to handle that anyway.
PDFs have historically been used for print so I can see the rationale behind handling them as images; however I can’t see many publishers using this tool.
toolName /somepath # process all files the tool chooses to
toolName /somepath/* # process every single file, even files it does not support (with a warning)
In the second case, toolName would never see the '*' unless it was quoted. Bash would expand it to match every file, and the toolName binary would see each matching filename as a separate argument.
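A quick way to watch this expansion happen, using printf to stand in for the hypothetical toolName:

```shell
# Demonstrate that the shell, not the tool, expands the '*':
# printf stands in for toolName and receives one argument per matching file.
rm -rf /tmp/globdemo && mkdir -p /tmp/globdemo
touch /tmp/globdemo/a.png /tmp/globdemo/b.pdf

printf 'arg: %s\n' /tmp/globdemo/*     # two lines, one per matching file
printf 'arg: %s\n' '/tmp/globdemo/*'   # one line: the quoted, literal '*'
```

So from toolName's point of view there is no "second case" at all; it just gets an explicit list of paths in argv, exactly as if the user had typed them out.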
Yeah, but those files are known as "hidden files". In other words, it is the creators of those files who purposefully prepend them with a . to hide them from ls and other tools.
One can argue whether this "starting with a . means hidden" is a hack or not, but I think it's manifestly different than not showing particular file types based on size or something like that.
I agree just not showing pdf files by default because they render slowly is a design mistake.
> One can argue whether this "starting with a . means hidden" is a hack or not
The story from Rob Pike is that this came about from a combination of (1) not wanting to show "." and "..", and (2) some sloppy programming in the implementation of that feature.
The thing I'm getting at is not whether one behaviour or another is 'right' - just that they are both inconsistent so in this case 'consistency' is probably not the right design criterion on which to base this criticism.
It's still largely arbitrary. I'm pushing back against the general notion that standard Unix-y command line tools are some shining example of consistency. Most of it is just us confusing familiarity for consistency.
Semi-related fun fact: if you have iimage-mode turned on in Emacs (it might be the default on recent versions?), you can just cat images in eshell and have them show up just like this app. Unfortunately it doesn't do sixel graphics in a terminal yet.
I verified that "lsix" works with MacTerm (my project), which implements Sixels and incidentally also implements the iTerm2 sequences used by "imgls". Rendered areas can be selected and saved or dragged as whole pictures, which may be useful for remote images.
Pixel data doesn't necessarily imply binary data, though. In the Plain PPM format, pixels are encoded as plain text RGB tuples separated by white space. It's not inconceivable for such a simple format that one could write a simple image processor in a few lines of shell + coreutils.
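To illustrate how simple Plain PPM is, a valid image really can be produced with nothing but printf (the `make_ppm` function and the 2x2 contents are just an example):

```shell
# Write a 2x2 Plain PPM ("P3" magic) using only printf: a header with
# width, height and max channel value, then one whitespace-separated
# "R G B" triple per pixel -- here red, green, blue, and white.
make_ppm() {
    printf 'P3\n2 2\n255\n'
    printf '255 0 0   0 255 0\n'
    printf '0 0 255   255 255 255\n'
}
make_ppm > /tmp/tiny.ppm
```

Since everything is plain text separated by whitespace, per-pixel processing could in principle be done with read loops or awk, with no binary handling at all.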
They have a point - I can’t find any traces of a shell library implementing image parsing without calling out to a third-party tool written in not-shell.
You also have a point - I’m seriously debating if I can write one in an evening solely to say that one now exists. (I lost the debate with myself, sadly.)
EDIT: I mean “source language bash”, not “target language bash”, my emscripten friends.
In particular, a NUL byte (value 0) can't be stored in a bash variable, which almost completely rules out the possibility of byte-by-byte processing of general binary data. It's probably possible with absurd hacks.
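A quick demonstration of the limitation (newer bash versions also print a warning to stderr when a command substitution drops a NUL):

```shell
# Bash variables are NUL-terminated C strings internally, so a NUL byte
# is silently dropped when captured via command substitution.
bytes_on_disk=$(printf 'a\0b' | wc -c | tr -d ' ')   # 3 bytes in the stream
captured=$(printf 'a\0b')                            # the NUL is discarded
echo "stream: $bytes_on_disk bytes, variable length: ${#captured}"
```

The stream is 3 bytes but the variable holds only "ab", which is why pure-shell tools for binary formats tend to lean on od/xxd or plain-text encodings like the PPM format above.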
As far as I know it's just xterm on the desktop. It's fascinating that we have this standard that allows us to build programming environments like notebooks -- without resorting to layering hacks on hacks to make it work on the web -- practically cross-platform, yet we don't make use of it.
At repl.it we've been toying around with adding sixel support for xterm.js and we have a prototype up and people in our community are already building things on it.
> What if our CLI programs could output HTML or JSON depending on whether they are piped or printed.
Well, they can, and there are tools that read HTML data from their standard input and render it in a browser.
I don't understand the preoccupation with rendering things like this within terminal emulators. We have graphics displays, X11, and can mount anything locally. If I quickly want to view some thumbnails, I'll use feh, or maybe just a graphical file browser. If I want to view an HTML document I'll open that in my web browser. If I want to plot some data I can use gnuplot. None of this is particularly distracting to my workflow.
On the contrary, I think there is some value to having command line applications output data that is easy to parse and WYSIWYG, because its strength to me lies in the ease with which you can automate workflows using a shell operating on text data. That's sort of lost when all the data you see is filled with invisible markup. This is already the case to some lesser extent with colors and e.g. ls formatting.
XTerm (compiled with the --enable-sixel option) also works; you should launch xterm with the "-ti 340" option. Note that the SIXEL palette is limited to a maximum of 16 colors. http://invisible-island.net/xterm/
Note that it is the tmux maintainer, not the libsixel maintainer, who is saying no. He won't even add support if someone else provides the patch, unless the patch is small. Why? Because:
> "I am not interested in graphics in the terminal and there are few practical uses for them anyway."
Really? I wonder what the *nix environment would look like today if the tools (GNU, et al.) only worked with the workflows that their original creators envisioned. In fact, just recently I had to SSH into a remote server and find a particular image. The filenames are random strings, and the timestamps were useless as the files were recently copied from an archive drive without the `-a` flag. Having image support in the terminal would have saved me enough time to pick up my little one from kindergarten myself instead of having to send someone else to pick him up for me.
Doesn't seem to work on either Ubuntu or Slackware Linux for me... It only outputs some error messages about "missing delegates for this format" and other errors (ImageMagick 6.9.7.4 and bash 4.4.18 in one case, and IM 6.9.4 and bash 4.3.48 in the other; same kind of errors).
By now a lot of people have commented on the tech (sixel), but it really annoys me when projects don't explain the overall workings in simple terms in the readme:
"lsix is a shell script that uses ImageMagick convert tool to list images on a shell screen on terminals that support the Sixel format"
There, was that so hard?!? :-(
EDIT: oops, it does mention it, everywhere actually. 1) need to pay more attention but 2) I was just venting because of so many other repos I have to spelunk to have answers for when a few words in the readme would do :-)
Despite being old as the hills and seemingly unchanged over the years, ImageMagick still manages to be slow. It is a mystery how the developers managed this feat.
It's a shame that netpbm seems to be mostly dead, it always worked much better for the kind of tasks that most people use ImageMagick for, at least for me.
The script seems to depend on ImageMagick "convert" for most of its work; when I run "convert -version" I see "ImageMagick 7.0.6-0 Q16 x86_64 2017-06-12", is this newer than the version you are using?
https://www.iterm2.com/documentation-images.html