It's interesting to think that the goal of '..' was navigation in a command-line shell, a tool that 95% of people developing software never really use, or use only out of necessity (type one command and leave). It's really an interesting thought piece to consider that the idea of the '..' directory is as legacy as the headphone jack.
It's possible the parent is some super-junior developer doing compartmentalized tasks and never leaving the IDE, but for anyone past that point, avoiding the CLI while remaining productive is basically impossible.
Maybe it is just that I overuse the internet, but if you have simple needs, web interfaces are actually really good. Most of them are broken, but for navigating a media list (photos, movies, albums, PDFs, etc., though maybe not text files) the most comfortable interfaces are often on the web.
`history` tells me I used 17 commands this evening; I'm a plain old user in relative terms here [my day job only requires computers for accessing social media].
After developing on Unix variants for over 20 years, I can say that most people developing software do not use the command line. I see this every day... even skilled developers run from command-line git into a GUI tool, instead of really understanding the tooling they're using.
I can't imagine development without my shell at this point.
Even if I didn't use vim+tmux, so much other "stuff" needs the terminal. Compiling, profiling, testing, searching, moving files around, ssh/rsync and so on.
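To make the point concrete, here's a sketch of a few of those everyday tasks, using throwaway files and made-up paths so the commands actually run (the rsync target is shown only as a comment, since the host is hypothetical):

```shell
#!/bin/sh
# A few everyday terminal tasks, with throwaway files (all paths made up).
mkdir -p demo/src demo/logs
printf 'x = 1  # TODO: fix\n' > demo/src/main.py

# searching: find every TODO in the source tree
grep -rn "TODO" demo/src

# moving files around
printf 'log line\n' > demo/app.log
mv demo/app.log demo/logs/

# deploying over ssh would look something like:
#   rsync -avz dist/ user@host:/srv/app/
```

Each of these is one short line; the equivalent IDE actions tend to be spread across several menus.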
Sure, IDEs have most of this stuff, but they're not always as user-friendly as CLI tools, at least once you're acquainted with the latter.
That’s pretty harsh; it really depends on the platform/project. If you’re doing a lot of development on Windows, for example, you’re probably not doing a ton on the command line (not everyone is on a Mac doing web or mobile). Doing Unity3D game development on Windows 10, the only time I really use the command line is when I need to do something in git that SourceTree can’t do. And I know the command line plenty well! There’s just no huge reason for it on this current project.
Plus some languages sort of have their own customs. Like if you were doing Smalltalk a lot of things you’d use a command line for in web development or mobile dev you’d use a workspace in the IDE instead. And then you have Lisp, where people used to joke that Emacs had become its own OS essentially.
Yes, that is exactly what I mean. The majority of people developing software are doing it inside Visual Studio Code or similar tools, where their only exposure to the command line is when they run "create-app".
You keep using the unqualified term "majority" with no way to actually back it up. This would probably be less criticized if you stated it as your opinion (which it is) and not as a definitive statement.
Even a qualified appeal to the majority isn't a very good argument. There are many scenarios where the majority of events is caused by a minority of individuals.
For instance, 20% of developers could be responsible for 80% of all software, or 80% of the most used software, or the software that generates 80% of GDP, etc.
I think there could be a generational difference or "how you learned" angle at play here. For me learning the command line and early computer use went hand in hand. And I simply could not possibly imagine using git or other source control through an IDE and not a CLI. (I know they offer it, but it just doesn't seem natural.)
There may also be a factor in whether you learned to program as a child or at a university where Unix culture was dominant.
I learned to program (in the sense of writing programs in a compiled language) around the time the Macintosh and Amiga came out, before free Unix-like OSes on your PC were much of a thing. So while the Amiga did have a command line, and so did Macintosh Programmer's Workshop, I mostly saw a command line as the obsolete interface associated with MS-DOS. Obviously source code control was not a thing for a kid programming in the 80s.
Even though git is the source control I've used most recently, I don't think its horrible interface has anything to do with its utility. You may like it, you may hate it, but it doesn't have to be the way it is; that's just the personality it has. Kind of like Linus and his grouchiness.
In the early '80s I started on a Commodore with a BASIC interpreter "command line," a few years later spent time learning DOS, and did not encounter Unix until about '93, when a teacher mentioned Minix as a way to learn it. At work I installed Slackware from floppies and was exposed to Sun, SGI, and VAX machines; good times. Took to each like a duck to water. Liked GUIs and GUI programming as well.
Never understood the folks who want to use only one or the other, not both, when they are complementary. Avoiding either is doing yourself a disservice.
My start was on an Apple, but the rest tracks well enough. Did Slackware, then RH 5.2.
SGI handled the command line / GUI split particularly well. Most things had a GUI, and the GUI would invoke the underlying program with the "--gui" or "-verbose" option to get the additional feedback it needed to behave the way one would expect, despite basically being a wrapper for an otherwise CLI program.
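A rough sketch of that wrapper pattern: the front end runs the tool with a verbose flag and turns its chatty output into progress updates. Here `copy_tool` is a made-up stand-in function, not a real IRIX program:

```shell
#!/bin/sh
# `copy_tool` stands in for a CLI program that, with -verbose, reports
# each step of its work on stdout (entirely hypothetical).
copy_tool() {
  if [ "$1" = "-verbose" ]; then
    echo "copying file1"
    echo "copying file2"
    echo "done"
  fi
}

# The "GUI" side: consume the verbose stream line by line. A real GUI
# would advance a progress bar here instead of printing.
copy_tool -verbose | while read -r line; do
  echo "[progress] $line"
done
```

The appeal of the design is that the CLI program stays fully usable on its own; the GUI is a thin, optional layer over the same interface.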
The time I spent on IRIX really solidified when and where the two paradigms make sense. And they both do. There is no one size fits all winner here.
A BASIC interpreter is not what I was calling a command line. I guess I was thinking of it as implying an OS shell. Originally, the party line was that you did not need one on the Macintosh, but besides MPW, AppleScript came out by 1993.
For my part, I never understood why someone would want to go back to not using a GUI, once they had been invented. I mean, using a graphical interface doesn't prevent you from typing commands within windows. It's just a question of whether you limit yourself to the ancient teletype paradigm or not. So I never saw (post-1984) GUI and command line as equally valid and valuable worlds, because a GUI can encompass everything, while the command line doesn't.
It was an interpreter and command-line environment.
There are many reasons you'd still want to use a terminal interface, CLI tools, and a scripting language, all of it easily searchable. They have staying power because they excel at certain repeatable tasks, whereas a GUI is often better for exploring.
No one has invented a lasting, portable GUI command line, so that question is moot for now. Maybe someone could, but the work involved is probably not worth the gain in functionality.
Next, that they are "ancient" is immaterial. They have enough of the features needed to be effective. Paper is ancient for example, and still useful in various situations.
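The "repeatable tasks" point in a few lines of shell: batch-renaming a set of files, which is tedious click-by-click in a GUI. The filenames here are made up for illustration:

```shell
#!/bin/sh
# Create some throwaway files, then normalize their extensions in one loop.
mkdir -p photos
touch photos/IMG_001.jpeg photos/IMG_002.jpeg photos/IMG_003.jpeg

for f in photos/*.jpeg; do
  mv "$f" "${f%.jpeg}.jpg"   # strip the .jpeg suffix, append .jpg
done

ls photos
```

The same loop works unchanged for three files or three thousand, which is exactly where the GUI approach stops scaling.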
Something like AppleScript may work acceptably for scripting, but is not available on 90% of the world's computers.
I'd also recommend giving a newer shell like Fish a try; it is quite helpful.
Command lines and scripting are useful for many things; I don't think they are alternatives to GUIs though.
People still use paper to a fair degree in many offices, as you say, but it's not an alternative to electronic records in the way Coke is an alternative to Pepsi.
Anecdotal, but I learned to program as a kid prior to college using free Ubuntu CDs[0], and became familiar with the command line sort of by necessity.
[0]: Ubuntu used to mail you an install CD, for free, anywhere in the world. In the early 2000s. It was pretty cool.
I've been using Linux for roughly 20 years, and I prefer the command line. But that's only because it's Linux. There's no inherent need to do most things from a command line, except that in this context it's the simplest, most stable, predictable, and documented way, purely for historical and cultural reasons.
Because of the long running Linux/Windows rivalry, hardly anybody can imagine something better than a Unix-clone any more. But the very name Unix was chosen because the original OS was not intended to be the be-all/end-all of OSes.
For the record, I use VS Code, but the fastest way to _open_ VS Code is
    cd project-name
    code .
It also lets me type Git commands directly into the terminal, which is way faster than doing it through VS Code. And there are still plenty of things, like rebasing, which VS Code doesn't have GUI support for.
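For instance, here is a minimal rebase done entirely from the terminal, in a throwaway repo (the repo name and commit messages are made up). It drives `git rebase -i` non-interactively by supplying a sequence editor that marks the last commit as a fixup:

```shell
#!/bin/sh
set -e
# Throwaway repo with three empty commits, the last of which we want
# folded into the one before it.
git init -q rebase-demo
cd rebase-demo
git config user.name demo
git config user.email demo@example.com
git commit -q --allow-empty -m "first"
git commit -q --allow-empty -m "second"
git commit -q --allow-empty -m "oops, fold me in"

# The sequence editor rewrites line 2 of the todo list from "pick" to
# "fixup", merging the last commit into "second" and discarding its message.
GIT_SEQUENCE_EDITOR="sed -i '2s/^pick/fixup/'" git rebase -i HEAD~2

git log --oneline
```

After the rebase, `git log --oneline` shows just two commits, "first" and "second". (The `sed -i` form here assumes GNU sed, as found on Linux.)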
> The majority of people developing software are doing it inside of Visual Studio Code or similar tools
I'm assuming this is an anecdote rather than data? Looking around my office, 19 out of 20 people are using the "xterm and Chrome are the only two apps I run on my laptop" style of development; only one has a graphical IDE plus Chrome.
Maybe if your only experience using a shell is Windows' CMD. I live in the shell. My Linux machines are configured to open a terminal automatically upon login, and the first thing I do on a new Windows installation is get bash working on it, even if it's not for development.