triyanox 3 days ago

Thanks for all the feedback! Let me clarify a few things about lla. The most amazing part of this project wasn't just building another ls alternative - it was the incredible learning journey. Building a systems tool in Rust while implementing a plugin architecture taught me more in a few weeks than months of reading could have.

Yes, it does more than traditional ls, and that's intentional. The plugin system came from scratching my own itch of constantly switching between different terminal tools. Each feature added was a chance to dive deeper into systems programming and Unix internals.

The performance still needs work, and the documentation could be better. But that's the beauty of open source - you ship it, learn from the feedback, and keep improving. Building in public is an incredible way to level up your skills.

For anyone considering a similar project: pick a common tool you use daily and try reimagining it. You'll be surprised how much you learn along the way.

thebeardisred 2 days ago

Thank you for being one of the few projects replacing a POSIX tool that properly sets the expectation that it's for personal use. It causes me no end of consternation to see many tools introduced that provide only the barest minimum of functionality, skip over extended attributes and ACLs, fail to keep compatibility with flags, or don't properly separate STDOUT & STDERR.

While these may be sufficient for a naive developer, such oversights then break many downstream tools.

Again though, thanks for sharing. Bringing your own spin and ideas into the world can be anxiety inducing and I'm pleased you went about this in a helpful and measured way!

concerndc1tizen 2 days ago

Would you mind listing more common mistakes made by CLI developers?

ksenzee 2 days ago

Julia Evans had an interesting thread recently on “social rules” of the terminal: https://social.jvns.ca/@b0rk/113540676612640547

sphars 2 days ago

This is a good, open-source resource for guidelines on creating CLIs, which goes over some common mistakes.

https://clig.dev/

telgareith 2 days ago

These days: not building these such that they can easily spit out JSON and/or XML markup.

two_handfuls 2 days ago

Not obeying the --help flag.

fragmede 2 days ago

Not behaving the same as robust CLI tools: supporting -h and --help, -v and --verbose, and --version.

matheusmoreira 2 days ago

If this is causing you any "consternation" at all, it means you expect too much from unpaid free software developers. The repository doesn't even have a sponsor link.

The software is provided as is, in the hope it will be useful, but without any warranty whatsoever.

All free and open source software licenses contain some version of the above statement.

All of this is implicitly for personal use. In the sense that it's not a product, just something people made because they needed a problem solved.

dimator 2 days ago

Honestly, what's the point of comments like this? No shit, it's done for a personal hobby, you're not breaking new ground with that idea.

However, this is a website of opinions, and gp's opinion is valid, because this forum is where opinions go. It's not as though gp said to stop doing this project.

This pedantic finger wagging is just so rote.

palata 1 day ago

For what it's worth, I think that both the parent and the grandparent are valid opinions.

One says that open source projects should clearly state when they are not meant as a serious replacement for standard tools. The other says that they disagree and that open source projects don't have to give any warning.

I guess I am a little in-between: if you open source your code, I don't think you owe anyone anything (it's already nice to put an open source license on it). If you advertise your open source tool (e.g. on this website), then it is polite to set expectations.

matheusmoreira 1 day ago

The point is expressing my opinion as a fellow free software developer. Mine is just as valid as yours or theirs. And I didn't say their opinions were invalid to begin with.

These hidden "expectations" that people seem to have regarding free and open source software can be incredibly demoralizing. It's something I wish would change. That's why I commented on it.

Chris2048 1 day ago

> These hidden "expectations" that people seem to have regarding free and open source software

Taking this to a logical conclusion: if a plumber/lawyer/<professional> offers services for free, and those services end up killing or massively damaging someone, can they just say the same thing to absolve themselves of all liability?

I also wish to change things, in the opposite direction. FOSS devs should explicitly mark things as not-for-prod, rather than pushing things as prod-ready when they aren't. I think some kind of change will come upon FOSS in the future so people can rely on it, and sadly I think that change will be adoption by corporates (with legal budgets) rather than the FOSS devs/orgs themselves becoming more mature.

matheusmoreira 1 day ago

> If a plumber/lawyer/<professional> offers services for free, and those services end up killing, or massively damaging someone, can they just say the same thing to absolve themselves of all liability?

There is a world of difference between what professionals such as doctors do and what free and open source developers do. It's not even remotely the same. I know because I happen to be both.

And even if they were in any way comparable, professionals get paid handsomely precisely because of the liability and responsibility. If people want this out of free software developers, they should start paying them some serious money.

> I also wish to change things, in the opposite direction.

If you want this, hire a professional to do it for you instead of pushing unwanted responsibility and liability onto the rest of us. I've got more than enough of that at my actual job and I absolutely do not want it in my free software development hobby. Adding liability to free software will kill it.

> FOSS devs should explicitly mark things as not-for-prod; rather than pushing things as prod-ready when they aren't.

It already is. Everything under a free or open source software license is already marked as such. The license says so. You use it at your own risk. Up to you to determine if that's good enough for you to use in production.

Chris2048 1 day ago

> professionals such as doctors do and what free and open source developers do

You are right - there is less in the way of personal liability in the case of devs (but for the odd PII incident here and there), which is precisely why I think there is a disruption coming.

> professionals get paid handsomely precisely because of the liability and responsibility

Devs are, or can also be, well-paid 'professionals'. And all are still capable of free, pro-bono work.

> instead of pushing unwanted responsibility and liability onto the rest of us

I'm not sure what you think I said..

"rather than pushing things as prod-ready when they aren't"

Is wrt the promotion of something as ready-for-production.

I'm not addressing the legal status as dictated by the (unproven) licence, which isn't relevant wrt liability anyway.

imoverclocked 3 days ago

Did anyone here use Genera on an original Lisp machine? It had a pseudo-graphical interface, and a directory listing provided clickable results. It would be really neat if we could use escape sequences to tell the terminal more about what a particular piece of text means.

Feature-request: bring back clickable ls results!

Bonus points for defining a new term type and standard for this.

rphln 3 days ago

There's already `ls --hyperlink` for clickable results, but that depends on your terminal supporting the URL escape sequence.
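
The escape sequence in question is OSC 8; you can poke at it by hand to see whether your terminal handles it (a throwaway example pointing at /etc/hosts):

  printf '\033]8;;file:///etc/hosts\033\\hosts\033]8;;\033\\\n'

If "hosts" comes out clickable, `ls --hyperlink=auto` should work too.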

db48x 3 days ago

This is nice, but a poor substitute for what Genera was doing.

You see, Genera knows the actual type of everything that is clickable. When a program needs an input, objects of the wrong type _lose their interactivity_ for the duration. So if you list the files in some directory, the names of those files are indeed links that you can click on. Clicking on one would bring up a context menu of relevant actions (view, edit, print, delete, etc). If a program asks for a filename as input then clicking on a file instead supplies the file object to the program. Clicking on objects of other types does nothing.

petesergeant 2 days ago

> Genera knows the actual type of everything

I have this side-project fantasy of a very simple terminal pipe-types project. The basic idea is a set of very basic standardized types, demarcated using escape sequences. Dates, filenames, URLs, numbers, possibly one or two number units as well (time periods, file sizes only).

Tools that already produce columnar data (ls) get a flag that lets them output this format, and tools that work with piped data (cut, sort, uniq) get equivalents or modes that let them easily work with this.

Essentially, simple typed tables held in text, with enhancements for existing tooling to know how to deal with it. Would make my day-to-day on the command line much easier.

leafmeal 2 days ago

I think PowerShell works this way essentially. As I understand, all data is structured which makes formatting and piping to other programs much simpler.

db48x 2 days ago

Could be fun :)

But note that on the Lisp Machine/Genera, every type has a presentation and can be “printed” to the REPL. This includes any new classes that you create as part of your own programs. It’s not just a small list of standard types, but every type.

The standard tutorial for the system is to implement Conway’s Game of Life. It has you create a class to hold the game board and then guides you through the process of defining a presentation for it so that it can be displayed easily.

svieira 2 days ago

Arcan is experimenting with something like this (among others): https://arcan-fe.com/2024/09/16/a-spreadsheet-and-a-debugger...

See also:

* NuShell (https://www.nushell.sh/)

two_handfuls 2 days ago

nushell goes in that direction. Programs can output tables, and the shell (or other tools) know how to work with this structured data.
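
For example, something like this filters and sorts on real values rather than on text (a rough sketch; the `size` and `modified` column names are whatever your nushell version's built-in `ls` emits):

  ls | where size > 1mb | sort-by modified | select name size modified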

loa_in_ 2 days ago

I always thought to do that by having a virtual file system that tags my files, so they are available at a specific location if they fit the bill.

ramses0 2 days ago

https://kellyjonbrazil.github.io/jc/docs/parsers/ls.html

...glom on to this: "+JSONSchema" with some sort of UNIX-ish taxonomy. Everything from `man test`, add in `man du`, `date`, `... ago` (relative time) as you'd mentioned.

`jc ls | add_schema...` => `jq ...`

...or `jc ls --with-schema | jq ...`

(it appears as though `jc` already supports schema's, so perhaps it'd be `jc ls --with-types` or something, but there's your starting point!)
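
Even without a schema layer, the plain jc-to-jq chain is already usable; a small sketch (field names per jc's ls parser docs linked above, so double-check against your version):

  # regular files over ~1 MB, largest first
  ls -l | jc --ls | jq -r 'sort_by(-.size) | .[] | select(.size > 1000000) | .filename'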

petesergeant 2 days ago

That's neat and a similar idea. I think JSON probably ends up being too expressive (not just an array of identically-shaped shallow objects), too restrictive (too few useful primitives), and also too verbose a format, but the idea of a wrapping command like that as a starting point is appealing.

ramses0 2 days ago

I'll share this comment from 7 months ago with you:

https://news.ycombinator.com/item?id=40100069

"prefer shallow arrays of 'records', possibly with a deeply nested 'uri'-style identifier"

...the clutch result is: "it can be loaded into a database and treated as a table".

The origin of this technique for me was someone saying back in 2000'ish timeframe (and effectively modernized here):

    sqlite-utils insert example.db ls_lart <( jc ls -lart )
    sqlite3 example.db --json \
      "SELECT COUNT(*) AS c, flags FROM ls_lart GROUP BY flags"
    [
      {
        "c": 9,
        "flags": "-rw-r--r--"
      },
      {
        "c": 2,
        "flags": "drwxr-xr-x"
      }
    ]
...this is a 'trivial' example, but it puts a really fine point on the capabilities it unlocks. You're not restricted to building a single pipeline, you can use full relational queries (eg: `... WHERE date > ...`, `... LEFT JOIN files ON git_status...`), you can refer to things by column names rather than weird regexes or `awk` scripts.

This particular example is "dumb" (but ayyyy, I didn't get a UUOC cat award!) in that you can easily muddle through it in different (existing pipeline) ways, but SQL crushes the primitive POSIX relationship tooling (so old, ugly, and unused they're tough to find!), eg: `comm`, `paste`, `uniq`, `awk`

thaumasiotes 2 days ago

Tab completion has developed some similar features. I've seen shells that will only autocomplete what seem to be appropriate choices.

ptspts 2 days ago

I typically turn this off. Many times it's too slow, and many times it hides local filenames, and I do want local filenames.

Rendello 3 days ago

That's one aspect I prefer in playing with TempleOS over Linux. The rest of the command line is a bit of a pain, with no history, C-as-a-shell, etc.

westurner 3 days ago

  $ man ls | grep '\--hyperlink' -A 1
  --hyperlink[=WHEN]
         hyperlink file names WHEN

mbivert 3 days ago

Maybe some aspects of the Plan9 UI? (rio/9term, plumber; acme as well).

You should be able to get this to work on Unix with plan9port.

dotancohen 3 days ago

  > Feature-request: bring back clickable ls results!
Doesn't your desktop (or distro) have a graphical file manager? On KDE it's Dolphin, which ex-Windows users absolutely love. I don't know what it would be on Gnome or other desktops.

mabster 2 days ago

I'm not going to speak for Linux, but on Mac the Finder is annoying enough that I ended up using CLI for file manipulation (ranger).

shakna 2 days ago

My ssh client also supports mouse events, though.

yjftsjthsd-h 3 days ago

It's not really that, but have you tried ranger?

vunderba 3 days ago

Sounds like a fun project. However, from the readme:

> Efficient file listing: Optimized for speed, even in large directories

What exactly is it doing differently to optimize for speed? Isn't it just using the regular fs lib?

jeffbee 3 days ago

On my system it uses twice as much CPU as plain old ls in a directory with just 13k files. To recursively list a directory with 500k leaf files, lla needs > 10x as much CPU. Apparently it is both slower and with higher complexity.

triyanox 3 days ago

Will definitely prioritize optimization in the next releases. Planning to benchmark against ls on various systems and file counts to get this properly sorted.

niek_pas 2 days ago

Not trying to “gotcha” you, but I would imagine that 10x the CPU of ls is still very little, or am I wrong?

jeffbee 2 days ago

In the case of the 500k tree, `lla` needs 2.5 seconds, so it's pretty substantial.

echoangle 2 days ago

Is listing a lot of files really CPU-limited? Isn’t the problem IO speed?

matheusmoreira 2 days ago

What exactly makes ls faster?

inquisitive-me 3 days ago

But it’s written in rust so it’s super fast. Did you take that into account when running your benchmarks? /s

porridgeraisin 3 days ago

One slept-on filesystem CLI tool on Linux is `gio`. It comes with glib2, and today glib2 is a dependency of vte, polkit, pipewire, ffmpeg, the entire gtk ecosystem... you get the point. So you can basically depend on it being there on most Linux installs, especially desktop.

Checkout the man page: https://www.mankier.com/1/gio

Highlights (a few example invocations below):

- showing progress in `cp` equivalent

- Easy cli interface to freedesktop trash (!)

- tree command

- filesystem changes monitor (inotify wrapper)
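
A few of those spelled out, roughly as the man page above describes them (exact flags may vary across glib2 versions):

  gio copy -p big.iso /mnt/usb/   # cp with a progress bar
  gio trash old-notes.txt         # send a file to the freedesktop trash
  gio trash --empty               # empty the trash
  gio tree ~/projects             # tree-style listing
  gio monitor .                   # watch the current directory for changes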

INTPenis 2 days ago

I had no idea gio could do all those things. I've been using it to mount my smartphone from the CLI.

mellosouls 3 days ago

I clicked on this (without noting "github") expecting an essay on the joys of building an alternative to ls.

This is basically a Show HN without a summary I think.

fwiw:

https://news.ycombinator.com/showhn.html

ivanjermakov 2 days ago

Does the UNIX philosophy hold anymore? Most of the modern CLI tools I've seen here try to be everything at once: file manager, git client, grep.

I wonder if it was always like this or we're getting further and further from the idea of keeping programs simple and open.

Cthulhu_ 2 days ago

I would say it does; those tools rarely reimplement the functions you mention, but are abstractions on top of existing CLIs or libraries that do follow the UNIX philosophy.

This project in particular is not being sold as a drop-in replacement for ls.

monroewalker 3 days ago

Other than colorization, what are people getting out of ls replacements like this? I've recently started using ranger, which might replace my ls usage for the most part, since it not only shows everything in the directory but also has vim-like shortcuts for filtering, sorting, and searching the directory, as well as previewing files and entering other directories.

dhruvkb 2 days ago

Hi, author of `pls`[1] here. `pls` goes above and beyond what is typically possible with `ls` without going so far as to become an entire TUI file explorer like Broot[2].

Among the things it does that `ls` (and other alternatives like `eza`) don't do are:

- icons (SVG icons in terminals that support it, Nerd Fonts otherwise)

- advanced filtering using regex

- advanced sorting across multiple sort bases

- styles and colors using customisable rules

For someone wanting to make the output of `ls` prettier (with a few extra bells and whistles) without having to relearn a new workflow, something like an `ls` replacement makes more sense.

[1]: https://pls.cli.rs [2]: https://dystroy.org/broot/

pmarreck 1 day ago

pls looks useful and I will keep it around, but eza gives me more icons for more things via the following (this is my alias for `l`, basically):

`eza --long --hyperlink --header --all --icons --git --sort name`

also the hyperlink thing is useful

Symbiote 2 days ago

ls does colored output. I'm surprised it's not the default for you.

cb321 2 days ago

If you run `dircolors --print-database|less` you will see that GNU ls only highlights/colors the path/filenames according to a simplistic scheme where a file can only resolve to one type even though on many terminals today "foreground overlays background overlays bold/italic/etc". (https://github.com/c-blake/lc#vector-typemulti-dimensionalit... has a more advanced idea.)

This tool by triyanox -- just from the screen shot if you click through -- will also colorize permission masks and sizes, dates, user & group.

Symbiote 2 days ago

I managed to scroll past the screenshot twice (now and earlier) before it had loaded.

Two settings for ls make some of the colouring less useful to me.

BLOCK_SIZE="'1" formats sizes in bytes with comma separators (the leading apostrophe enables thousands grouping). TIME_STYLE=long-iso formats the dates sensibly.
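
For reference, GNU ls picks these up from the environment, so something like this in a shell profile should do it:

  export BLOCK_SIZE="'1"       # the leading apostrophe turns on thousands grouping
  export TIME_STYLE=long-iso   # 2024-02-23 22:31 style timestamps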

This means entries line up in neater columns.

  -rwxr-xr-x 1 root   root            852 2024-02-23 22:31 zsh5
  -rwxr-xr-x 1 root   root      1,022,760 2024-08-09 04:33 zstd
  lrwxrwxrwx 1 root   root              4 2024-08-09 04:33 zstdcat -> zstd

cb321 2 days ago

You could probably embed raw ANSI SGR color escape sequences { maybe from $(tput) if your terminal might be weird } inside a TIME_STYLE=+FORMAT to colorize the times.
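
Something along these lines, presumably (an untested sketch; the reset at the end matters so the filename column isn't colored as well):

  ls -l --time-style="+$(tput setaf 6)%Y-%m-%d %H:%M$(tput sgr0)"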

In `lc`, mentioned a bit elsewhere in this thread, you can actually color the age like a "heat map" if you want. I.e. more recent times are more toward the red side of the rainbow and older ages toward the other "cooler" side ("cold storage"). Or whatever color scheme you like. So, if you know you're looking for something recent, the color pops out at you. If you like that kind of thing.

hobs 3 days ago

This GitHub page doesn't say anything about why it turned out to be amazing; it seems like a fun side project.

netsharc 3 days ago

Yeah, talk about hiding the headline...

I see a screenshot that looks like the output of ls; ok, it has colors, and some filenames have "!!" behind them. Great success?

triyanox 3 days ago

Haha! Aren't all Rust rewrites about colors? Take `bat`, for example! Btw, the "!!" markers are from the git plugin, a quick way to see my workspace's git status.

fellowniusmonk 3 days ago

Yeah, why use this instead of ls? What makes it worthwhile as a daily driver?

dhruvkb 2 days ago

While you've specifically labeled this as "personal use", it is a commendable project that introduces some interesting new ideas. I might steal some ideas from it for my own `ls` alternative, `pls`[1].

[1]: https://pls.cli.rs

Cthulhu_ 2 days ago

"pls" as in "please give me a list of files"? Does `sudo pls` negate the "please"? :p

eviks 3 days ago

Excellent new idea re plugins; a lot of these tools are too inflexible!

cb321 3 days ago

`lc` mentioned elsethread [1] was always extensible with plugins for formatting and file-typing (but also always supported libmagic-based file-typology). There are other fairly distinctive ideas in `lc`, actually.. the README has a list.

While I like it and it's a good idea, I think the reality is that developers capable enough to write shared library/DLL plugins are more likely to just submit PRs and make such stuff built-in but maybe optional.

[1] https://news.ycombinator.com/item?id=42229841

eviks 3 days ago

"Always" is just 4 years? Lc is also one of these new tools

> more likely to just submit PRs and make such stuff built-in but maybe optional.

Which are more likely to just be rejected by the more conservative maintainers of the tool. That's the empowering beauty of plugins - no such barriers

cb321 3 days ago

Your tone is rather disputatious/critical, but we have literally no dispute here.

bornfreddy 3 days ago

I use git command line interface. Not because it is good (it isn't) or because I enjoy suffering (I think I don't), but because it is a standard on all the machines that have, you know, git.

What good is a ls alternative if I need to install it everywhere I need ls? I'd prefer using the standard ls even if it is not ideal. But maybe that's just me.

mshockwave 3 days ago

This is also one of the reasons I write C++ with vim without any auto-completion or fancy plugins (I do use syntax highlighting, though I think that comes by default with vim nowadays), as well as using GNU screen -- not every machine installs tmux by default, surprisingly. In case I need to log in to some random Linux box, I'm sure I'll be almost as productive as I am on my own machine.

kazinator 2 days ago

You mean, you're almost as unproductive on your development machine as on a random remote system that has no tools. And you somehow regard this as some sort of playing field leveling that generates an advantage.

Imagine a car mechanic that won't use a big hydraulic lift that hoists a car in seconds and lets him walk under it, claiming that by using a manually cranked portable jack, he can be almost as productive when fixing something by the roadside with emergency equipment as he is in his garage.

If you ever meet such a mechanic you can be sure that he programs computers as a hobby.

deredede 3 days ago

I assume this is tongue-in-cheek, but I don't think the comparison works at all.

I spend maybe 1% of my working hours (being generous) using `ls` and something like 50% (likely more) using my editor.

If there is some alternative to `ls` that makes my `ls` workflows 2x faster, my productivity increases by 0.5%. If I use a sub-optimal editor that makes my workflow 2x slower, I lose 25% of my productivity.

When I need to log in to a remote box, I am also very likely to need to use `ls` since I am less familiar with it than with my own machine, whereas I am unlikely to do any sort of heavy development work (typically I just need to edit a couple of configuration files, or do some git operations).

whartung 2 days ago

I did the same thing back in the day.

I developed on SCO (and, later, Unixware) on a PC, all of the clients were running the gamut of Unix OSes: HPUX, DGUX, AIX, SunOS, you name it.

Most of the time was spent on our box in the office, but I was constantly bouncing back and forth to client systems, either on site or over the modem, having to juggle Termcaps and the whole thing. It was a polyglot machine/OS world back then.

Just had to learn to get the best out of a baseline set of Unix tools: vi instead of emacs, awk instead of perl. Master those and never be left wanting in a new environment, so I can hit the ground running. No need to "bootstrap" (if the client would even let you, not always). Couldn't even rely on a C compiler.

easton 3 days ago

I’ve been on machines in the last few years that didn’t have screen either. Maybe it was a minimal install or something, but I specifically remember having to install it to get some long running stuff going.

(Thinking it was Ubuntu server, but guessing someone will correct me)

bee_rider 3 days ago

Tmux vs screen is an odd one; it kinda feels like screen was included in the era when people were actually trying to make the default install on servers kind of nice to use with a functional set of assumed programs. And now, it is fairly widespread just due to legacy.

Nowadays, and possibly for the better (every line of code is a potential bug and every bug is a potential vulnerability) it seems like systems don’t want to include this sort of stuff. So, I’m sure if the decision were made today, tmux or screen, tmux would win. Unfortunately, “none” seems like the real future option…

SoftTalker 3 days ago

Even ls isn't standard on all machines. GNU ls is different from BSD ls.

eviks 3 days ago

What's the point of suffering everywhere if you don't enjoy it? It's not like using a better alternative prevents you from knowing how to use ls; you just fall back to it in those cases where there is no better alternative.

matheusmoreira 1 day ago

Ubiquitousness is certainly a major selling point. The GNU coreutils are everywhere. I've made my peace with bash and make because I know they're always gonna be there.

This doesn't mean there's no value in developing one's own tools. Contributing to other projects can be quite difficult and time consuming. GNU projects are even more so.

We shouldn't limit ourselves to POSIX stuff either. Better software and tools can and should be built. Every attempt is valuable. And who knows? It might just turn into a staple of Linux distributions some day.

dbacar 3 days ago

Categorization and hashes seem to be good ideas, yet you could do all of these with other tools already. You may already know the tool 'exa', a similar ls alternative. Just wanted to mention it.

p2detar 3 days ago

Coloring files of the same file type is my favorite feature. Is the extension used to group them, or MIME-type detection? I guess the extension, since it is faster.

Symbiote 2 days ago

This is also part of GNU ls, at least.

  man dir_colors
I think.

johnisgood 1 day ago

You are right.

Koshkin 1 day ago

I didn't know that ls was missing plugins.

elashri 3 days ago

There seem to be a lot of projects now competing to replace ls (for people's preferences).

For reference, these are the ones I am familiar with. They are all somewhat active, in contrast to things like exa, which is not maintained anymore.

eza: (https://github.com/eza-community/eza)

lsd: (https://github.com/Peltoche/lsd)

colorls: (https://github.com/athityakumar/colorls)

g: (https://github.com/Equationzhao/g)

ls++: (https://github.com/trapd00r/LS_COLORS)

logo-ls: (https://github.com/canta2899/logo-ls) - this is forked because main development stopped 4 years ago.

Any more?

Personally I prefer eza and wrote a zsh plugin that is basically aliases matching what I have in my muscle memory.

iroddis 3 days ago

I’ve tried a few of these, but most of them seem to be following the trend of folding other shell functionality into one tool. Searching for contents (find + grep -H, or ripgrep), filtering (grep), sorting (ls does it natively, or you can use sort, sort -h for sorting human readable sizes), the list goes on and on.

I guess this is a mini lament that many of these tools are moving away from the Unix philosophy of do one thing well, and make it easy to chain.

And a last very small lament that BeOS didn’t succeed, and their filesystem-as-a-database approach didn’t become more standard.

burntsushi 3 days ago

You can still chain ripgrep. I specifically designed it so that you can chain it just like you would a normal grep.
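
For instance, it reads stdin and writes plain match lines when piped, like any grep (the paths here are just placeholders):

  dmesg | rg -i usb                         # filter another command's output
  rg -n TODO src/ | cut -d: -f1 | sort -u   # feed matches to the usual downstream tools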

It does indeed also include other functionality that might traditionally be left to other tools (like filtering files). But this is nothing that GNU grep wasn't already doing itself anyway.

IMO, it's better to view the Unix philosophy as a means to an end and not an end to itself. And IMO, it's important to weigh the benefits of coupling to the user experience.

fsckboy 3 days ago

>view the Unix philosophy as a means to an end and not an end to itself

it won't be a means to an end any more if you don't preserve it, so not breaking that aspect of it has to be one of your ends. if you use it to take ls to a new place but that place is not within the ecosystem, it will be an evolutionary dead end, or worse, the first meteor in the meteor storm that ends all life.

current/traditional unix may not be the be-all/end-all, but replacing it/changing it requires viewing it comprehensively and changing all the tools at once or having a plan to. A good example of this is Plan9

burntsushi 3 days ago

I don't know what you're trying to say and I don't see how it's in conflict with anything I've said.

fsckboy 3 days ago

>not an end to itself

it is an end to itself. the reason it's a means to an end is because that was its end goal. in being a means to an end, it is an end (its end) unto itself, opposite to what you said, imho

burntsushi 2 days ago

I still can't parse what you're saying. The Unix philosophy is a means to an end, where the ultimate end is improved user experience. The means is de-coupling and composition. But there are other means to improving the user experience.

> in being a means to an end, it is an end (its end) unto itself

This either makes zero sense or is vacuously true and clearly not in conflict with what I'm saying.

L3viathan 3 days ago

I think ripgrep specifically is counted in the comment you reply to as a tool that _does_ do one thing well, and that one should use it (or grep) in combination with an ls, instead of giving ls filtering abilities.

burntsushi 3 days ago

I suppose. But I wanted to point out that ripgrep couples functionality, specifically in contradiction to the Unix philosophy. And actually, many commands, including "traditional" tooling, do so as well.

The point is that many pay lip service to the Unix philosophy as if it were an end. But it isn't.

sudahtigabulan 3 days ago

> You can still chain ripgrep. I specifically designed it so that you can chain it just like you would a normal grep.

Headings on when isatty and off when piping the output put me off when I first tried ripgrep. I don't expect the tools to change their output format on me.

Luckily, you made this behavior configurable, so I'm a happy convert now.

burntsushi 3 days ago

> I don't expect the tools to change their output format on me.

You probably do! If you've ever used `ls`, then it does exactly this.
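
Easy to see for yourself:

  ls         # multi-column output when stdout is a terminal
  ls | cat   # one name per line as soon as the output is piped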

sudahtigabulan 3 days ago

If you mean the ANSI color stuff, yes - I do expect these to disappear :)

I meant the "shape" of the output. It just doesn't follow the principle of least surprise.

edit: you probably meant the columns. I forgot about that, I haven't parsed ls(1) output in ages ;)

burntsushi 3 days ago

Yes. The columns. The point is that commands have been changing their output format, not just their colors, based on tty for ages. So the criticism you lodge against ripgrep also applies to some of the most core commands you probably use daily.

I would be quite surprised if you didn't rely on this without even knowing it. Even a simple `ls | wc -l` relies on it.

I say this because it's tiring to see folks lament about this feature in ripgrep as if it's something new that ripgrep does. It's not. It's a well established idiom among Unix command line tools.

volemo 2 days ago

Isn’t “don’t parse ls” like the third commandment of Unix?

burntsushi 2 days ago

You've never done `ls | wc -l`?

BenjiWiebe 2 days ago

I've always assumed that ls doesn't change its output when piped; I've always done ls -1 | wc -l. I guess I can save a few keystrokes now.

eviks 3 days ago

They don't do one thing well since it's all text, not structured data, which makes chained analysis a challenge, which leads to the desire for integration

bayindirh 3 days ago

ls is tabular data, and you can format it (ls -1, ls -l, ls -w, plus sorting, field formatting, and more), and you can cut/parse/format in a standard way. Every field sans the filename is fixed length, can be handled with awk/cut/sed according your daily mood and requirements, etc. etc.

So, ls can be chained very nicely, which I do every day, even without thinking.
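
e.g. something like this (with the caveats about unusual filenames discussed further down the thread):

  ls -l | awk '$5 > 1048576 {print $5, $9}'   # size and name of files over 1 MiB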

You don't need to have a "structured data with fields" to parse it. You just need to think it like a tabular data with line/column numbers (ls -l, etc.) or just line numbers (ls -1).

So, as long as ls does one thing well, it's alright.

Ah, some of the "enhanced" ls tools can't distinguish between pipe and a terminal, and always print color/format escape codes to pipe too, doubling the fun of using them. So, thanks, I'll stick with my standard ls. That one works.

eviks 3 days ago

> You don't need to have a "structured data with fields" to parse it.

You do if you want to have nice things like being able to format your output without having to worry about breaking the dumb tools down the pipe, which can't sort the numbers they don't see:

- 2.1K (this isn't the same as the second)

- 2.1K

- 2.1M

Also, why do I need to count columns like a cave man in 'sort -k 5' instead of doing the obvious "sort by size"?

> print color/format escape codes to pipe too

A problem that would disappear with... structured data!

> Ah, some of the "enhanced" ls tools

so use the other "some" that can?

bayindirh 3 days ago

> which can't sort the numbers they don't see

Then you sort at the point you can see the numbers and discard them later.

> Also, why do I need to count columns like a cave man in 'sort -k 5' instead of doing the obvious "sort by size"

awk can sort the columns for you. Plus, ls can already sort by size. Try "ls -lS" for biggest file first, or "ls -lSr" for smallest file first. Add "-h" to make it human readable.

> A problem that would disappear with... structured data!

No. A problem that would disappear with "a small if block which asks which environment I'm in". If you're in a shell, the "-t" test in sh/bash will tell you that. If you're coding a tool, there are standard ways to do that (via termcap IIRC). Standard UNIX tools have been doing this for decades now.
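
e.g. the shell version of that check:

  if [ -t 1 ]; then
    echo "stdout is a terminal: colors and columns are fine"
  else
    echo "stdout is a pipe or file: keep the output plain"
  fi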

IOW, structured data is not a cure for laziness...

> so use the other "some" that can?

Yes, because their authors are not that lazy.

eviks 3 days ago

> Then you sort at the point you can see the numbers and discard them later

This sort of human overhead is only needed to compensate for the deficiencies of the data structures

> ls can already sort by size

That's the benefit of integration you're arguing against with your deficient piping suggestions

> IOW, structured data is not a cure for laziness...

It is precisely what good design is for - it reduces the need for various dumb workarounds that bad design requires, which means you can be more lazy and avoid said workaround

> Yes, because their authors are not that lazy.

This just ignores the argument, which was that "some of the better new tools don't do that" isn't relevant when some of the better new tools do.

Retr0id 3 days ago

vanilla ls has never been particularly chainable - https://mywiki.wooledge.org/ParsingLs
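
A quick way (in bash) to see the kind of breakage that page describes:

  cd "$(mktemp -d)" && touch $'bad\nname' 'ordinary file'
  ls | wc -l          # 3 lines for 2 files: the embedded newline splits one name
  set -- *; echo $#   # globbing correctly counts 2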

machinestops 3 days ago

A lot of this post hinges on the fact that newlines in filenames were legal, and that people wrote shell without handling quoting correctly. While quoting (as well as ls altering filenames) is still an issue, find -print0, read -d '', and similar are no longer necessary. Newlines are now forbidden in filenames: https://blog.toast.cafe/posix2024-xcu

threePointFive 3 days ago

> Newlines are now forbidden in filenames

No. To quote that article

> A bunch of C functions are now encouraged to report EILSEQ if the last component of a pathname to a file they are to create contains a newline

This, yes, makes newlines in filenames effectively illegal on operating systems strictly conforming to the new POSIX standard. However, older systems will not be enforcing this, and any operating system which exposes a syscall interface that does not require libc (such as Linux) is also not required to emit any errors. The only time, even in the future, that you should NOT worry about handling the newline case is on filesystems where it is expressly forbidden, such as NTFS.

machinestops 3 days ago

Most utilities that create files are encouraged to error on newline filenames, which makes this effective illegality stronger. The post also discusses the future of this encouragement, which is turning it into a requirement.

> However, older systems will not be enforcing this

Eventually, newlines in filenames will go the way of /usr/xpg4/bin/sh.

I'd like to note that up until this point, there hasn't been (and isn't) a fully POSIX-compliant way to do many shell operations on newline-containing filenames. They are already effectively unsupported, and the standard that adds support also discourages them from being created and used. The best way to handle them up until this point has been to not use sh(1).

CJefferson 3 days ago

Linux isn't POSIX compliant, and as far as I know has no plans to ban newlines in filenames, or even add an option to disable newlines.

machinestops 3 days ago

In the past, there have been Linux-based operating systems that have been certified as Single Unix Specification compliant, and part of said specification is POSIX. I would imagine GNU and Busybox and Musl will be willing to implement the changes proposed by POSIX 2024, which inevitably leads down the road of newlines being banned.

CJefferson 2 days ago

How would that work? Checking strings passed to open and rejecting them? Would we then have undeletable files, as we can't refer to their filenames?

pyuser583 3 days ago

I know Linux allows newlines in filenames, but every time I hear it I want to drink.

from-nibly 3 days ago

If you like that philosophy, check out nushell. They go pretty hardcore on it, and they can because of structured output.

amelius 3 days ago

I agree with this.

If they want something that is easy to use in a non-scriptable way, maybe they should replicate Norton Commander instead.

darkest_ruby 3 days ago

Look into far2l

bawolff 3 days ago

Tbh, i dont understand why people want to rewrite ls of all things.

Like don't get me wrong, if they had fun, that's great.

But all i use ls for is getting a list of files. I barely ever even use the -la options. There just doesn't seem like a lot of room for improvement in something so simple.

dhruvkb 2 days ago

Hi, author of `pls`[1] here. I started `pls` as a hobby project to scratch a personal itch: a "prettier" alternative to `ls`, with more colors and customisable icons. I also wanted to learn Rust as a secondary motivation.

But as I added more and more features to it, it has become a good tool that does a number of things that `ls` doesn't do (unless you chain it with other tools like `sort` or `grep`) and even other `ls` replacements don't do.

So even though `ls` is fantastic as-is, it's always fun to build something of your own, add a little more polish in areas that matter to you and put it out there to see if it resonates with more people.

[1]: https://pls.cli.rs

benrutter 3 days ago

I think the standard ls doesn't have much in terms of color/icons, so its simplicity probably makes it a great side project for improving on.

Not a big surface area, some easy improvements. A whole lot less stressful than rewriting grep (although I'm massively grateful Burnt Sushi did such a crazy thing)

triyanox 3 days ago

Thanks @benrutter! You nailed it - ls is like the "Hello World" of system tools. Simple enough that you won't tear your hair out, but meaty enough to learn a ton. Started with "ooh, pretty colors!" and before I knew it I was deep in filesystem APIs and terminal wizardry. Way less scary than tackling grep. Sometimes the best projects are the ones where you can't mess up too badly... well, unless you accidentally delete everything while testing

roywashere 3 days ago

Well, recursive display is nice, I guess, as well as searching on partial filenames

mbivert 3 days ago

Has been roughly doing the job since the 70s (?):

  $ du -a | grep blbl

abnry 3 days ago

> I barely ever even use the -la options.

Certainly I use these less than plain "ls," but digging through hidden files and folders and looking at timestamps is very important for me.

dangus 2 days ago

I use ls -la via the ll alias exclusively. I find it far more readable to my eyes than plain ls.

Hidden files are almost always of interest to me since my job involves configuring servers.

cb321 2 days ago

https://github.com/c-blake/lc shows all files, including hidden files (starting with dot aka dot files) by default, suppressible in output with -xdot or a shell/internal alias to the same effect.

It helps to start with a more extensible/less built-in idea of "file type". "odd permissions" are another type that might interest someone, for example, such as "setgid but not group-executable" or "writable but not readable" or etc.

Yes, I know one can also use `find` or etc. for that, but there's no crime in there being >1 way to see things and, for some people, colors can make things really stand out - as can sort order which is another more color-blind possibility in `lc` as well as the simple filter-or-not of ls -a/-A.

karmakaze 3 days ago

That's the first thing I noticed in the options: it has modified date but not create or access date (for listing or sorting) that I could tell. Of course it could be added, or I could just use `ls`.

ZoomZoomZoom 2 days ago

Take a look at lc (but not the terminal screenshots! ;)): https://github.com/c-blake/lc

lc is a highly configurable "multi-dimensional"[1] file lister written in Nim focused on flexibility and configurability.

Key features:

- Multi-level sorting by combinations of attributes like size, time, and file type, with user-defined precedence

- Configurable file kind sorting order

- Value-dependent coloring for file attributes such as timestamps, permissions, or sizes.

- Abbreviations: Automatically shorten filenames, user/group names or symlink targets.

- File type classification: Integrates libmagic for file type inspection.

- Hyperlink support

- Per-directory configs: custom behaviors for specific directories using local tweak files (.lc).

- Lightweight (~900 lines of code) with only author's CLI library "cligen" and Nim's stdlib as dependencies.

and more.

[1]: https://github.com/c-blake/lc#vector-typemulti-dimensionalit...

treve 3 days ago

It's a rite of passage. I had some colorful 'dir' alternatives on MS-DOS 5 and eventually made my own with Turbo Pascal. Easy & fun afternoon project

dhruvkb 2 days ago

I wanted to plug `pls`[1], a tool that I wrote and maintain. It does a few things that `eza` (another great tool nonetheless, and a massive inspiration) cannot do[2].

[1]: https://pls.cli.rs [2]: https://pls.cli.rs/about/comparison/

triyanox 3 days ago

Thanks for the great list! Yep, eza and g are fantastic - I actually use eza daily and love how g handles git integration. What made me excited to experiment with lla was playing with the plugin architecture. While these other tools have great built-in features, I wanted to see if I could make something where the community could easily add their own capabilities without touching the core code. Kind of like how vim and neovim handle plugins. Got inspired by how people keep building these ls alternatives to scratch their own unique itches. Figured why not make it easier for everyone to scratch their own itch through plugins? Still very much an experiment, but it's been fun seeing what's possible!

vunderba 3 days ago

Eza is great. I was pleasantly surprised at how nice the mime type icons meshed with the terminal.

medv 3 days ago

Also “walk” is great for interactive navigation.

- https://github.com/antonmedv/walk

bastardoperator 3 days ago

I also used eza to replace the tree command with the --tree flag.

yasser_kaddoura 3 days ago

I have these aliases for various purposes:

  # Different options to search for files
  # da=36 cyan timestamps
  alias ls="EZA_COLORS='da=36' eza --time-style=relative --color-scale=age"
  alias lsa="ls --almost-all" # ignore . ..
  alias l="ls --long --classify=always" # show file indicators
  alias la="l --almost-all"

  # Tree view
  alias ltreea="ls --tree"
  alias ltree="ltreea --level=2"

  # Sort by time or size
  alias lt="ls --long --sort=time"
  alias lta="lt --almost-all"

  # lsd is faster than eza
  alias lss="lsd --long --total-size --sort=size --reverse"
  alias lssa="lss --almost-all"

lla seems to go beyond what ls should do, for some reason. Why show git and code complexity info? Just use tools dedicated to these things; otherwise, it will be an unmaintainable mess. If you can solve a problem easily with external tools, then there's no reason to add a feature for it.

tejohnso 2 days ago

That's a great list. I have a similar list, and the aliases grow out of frequently used arguments. For example, I found myself often doing an ls -Altch, and so lsth was born. I find that aliases born of frequently used arguments are easily remembered. Over time that one grew to include a pipe to head, because most of the time I just want to see the top 20 or so most recently modified files in the directory.
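
Presumably something along these lines (the exact flags are a guess from the description):

  alias lsth='ls -Altch | head -20'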

zvr 3 days ago

Creating command-line utilities is nice, but I personally lament the lack of man pages when people write something new.

triyanox 3 days ago

That's the amazing part I'm talking about: the learning experience you get from weeks of working on something like that is better than reading countless pages of documentation.

zvr 3 days ago

Oh, of course the development is fun and exciting and a learning experience.

But before inviting others to use something, please think of how to make its use more clear. After all, I assume you post this so that people use it, not only admire your coding skills. There is a group of people who have learned to read and rely on man pages.

For example, the top-level README says:

> -s, --sort <CRITERIA>: Sort by "name", "size", or "date"

OK, does "date" refer to creation date, modification date, access date? I can understand "size", but does it produce smallest-first or largest-first? It might not matter if... ah, no, there is no -r/--reverse flag. Can I have more than one "criteria" (since the plural is used)?

Getting answers for such questions now means I have to go read the code in src/args.rs and follow to the implementation of the various functions. And in a few days, when I have the same questions again and I have forgotten the options, I will again have to dive into the code.

Please consider providing a short man page. It documents the "calling interface" to your program and makes it easier to use. I usually start writing one even before implementing the whole thing, to clearly articulate what I expect the program to do.

triyanox 3 days ago

Fair critique about the documentation - this needs proper attention. Writing a man page first is a solid approach - it forces clear thinking about the interface before implementation. I'll prioritize adding complete documentation for all options and the plugin system. The code works, but without good docs it's not truly useful.

seb1204 3 days ago

While a man page or good documentation is maybe not too intriguing for you, I consider it essential for other users to adopt a tool. Maybe there are new or modern ways to create man pages that could be stimulating for your learning experience?
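
For example, two fairly low-effort routes, assuming the binary already answers --help/--version (file names here are placeholders):

  help2man --no-info ./lla > lla.1          # scrape --help/--version output into a man page
  pandoc -s -t man docs/lla.1.md -o lla.1   # or write the page in Markdown and convert it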

voidUpdate 2 days ago

I know it's only for personal use, but I've never had any problems with ls not being "high-performance" enough...

tambourine_man 3 days ago

brew support?

triyanox 3 days ago

Great idea! I will be working on it!

iwontberude 3 days ago

The things I take for granted. This is a breath of fresh air! Way to rethink the fundamentals!

skrebbel 3 days ago

I can't tell if you're being sarcastic or not.

iwontberude 2 days ago

For the record I was not being sarcastic but maybe I was feeling a bit too romantic or overly supportive of OP

cb321 2 days ago

I notice prior HN comments of yours mention the physical design of the NeXT cube. I cannot say it will make you not hate software, but you still might appreciate that another alternative ls, https://github.com/c-blake/lc, both re-thinks/breaks more radically with ls-tradition and adapts well to something very similar to a terminal variant of the https://en.wikipedia.org/wiki/Miller_columns used in the NeXT file tree graphical browser/navigator via simple shell process substitution composition. E.g., a 3-level scenario on an 80-column looks like:

    paste <(lc -1m25 ../..) <(lc -1m25 ..) <(lc -1m25 .)
Some shell script that uses $((COLUMNS)) arithmetic to do 2 or 4 or whatever terminal width is a pretty simple exercise for the reader and one might want to pipe to less.
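
Something like this, presumably (assuming -m takes a column width as in the example above):

  cols=${COLUMNS:-$(tput cols)}
  w=$(( (cols - 4) / 3 ))
  paste <(lc -1m"$w" ../..) <(lc -1m"$w" ..) <(lc -1m"$w" .) | less -R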

darkfrancisco 2 days ago

You can guess it is written in Rust before even checking the repo whenever you see that somebody made a clone of some popular systems tool like top, ls, cd, etc.

imp0cat 2 days ago

I know, right?! It's a common theme.

But recently, there were two submissions here that actually turn the "rewrite in Rust" meme into something substantial.

The two factions of C++: https://news.ycombinator.com/item?id=42231489

On "Safe" C++: https://news.ycombinator.com/item?id=42186475

Be warned that the second one is a really long read!

Koshkin 1 day ago

I, for one, have been wishing for a high-performance, extensible alternative to emacs for a long time.