Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage each file takes up. It only prints as many files as fit in your terminal height, so you see just the largest ones. It's been a better experience than du, which isn't always easy to navigate to find big files (or at least I'm not good at it).

Anyway, I found a log file at .local/state/nvim/log that was 70 GB. I deleted it. Hope it doesn't bite me. I've been pushing 95% of disk space for a while, so this was a huge win 👍
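
In case it helps anyone, a minimal sketch of typical dust invocations (assuming it's installed, e.g. with cargo install du-dust, and that the -n and -d flags are still current):

    dust ~          # tree of the largest entries under home, sized to fit the terminal
    dust -n 30 ~    # show the 30 largest entries instead of auto-fitting
    dust -d 2 /var  # limit the tree to two levels deep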

[–] [email protected] 82 points 1 year ago (12 children)

ncdu is the best utility for this type of thing. I use it all the time.

[–] [email protected] 16 points 1 year ago

I install ncdu on any machine I set up, because installing it when it's needed may be tricky

[–] [email protected] 14 points 1 year ago* (last edited 1 year ago)

Try dua. It's like ncdu but uses multiple threads, so it's a lot faster, especially on SSDs.
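
For reference, a quick sketch of dua usage (assuming the dua-cli crate, installable with cargo install dua-cli; the path is just an example):

    dua /var/log      # multi-threaded size aggregation of a directory
    dua i /var/log    # interactive, ncdu-style browser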

[–] KazuyaDarklight 55 points 1 year ago (1 children)

Came in expecting a story of tragedy, congrats. 🎉

[–] [email protected] 2 points 1 year ago (1 children)

But did he even look at the log file? They don't get that big when things are running properly, so it was probably warning him about something. Like "Warning: Whatever you do, don't delete this file. It contains the protocol conversion you will need to interface with the alien computers to prevent their takeover."

[–] [email protected] 8 points 1 year ago

PTSD from the days long ago when the X11 error log would fill up the disk when certain applications were used.

[–] [email protected] 54 points 1 year ago (2 children)

Try ncdu as well. No instructions needed, just run ncdu /path/to/your/directory.

[–] NorthWestWind 11 points 1 year ago

If you want to scan without crossing partitions, run with -x
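
Putting those together, a short sketch (flags per current ncdu; the paths are just examples):

    ncdu -x /               # scan / without crossing into other mounted filesystems
    ncdu -o scan.out /home  # export the scan results to a file...
    ncdu -f scan.out        # ...and browse them later, or on another machine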

[–] [email protected] 38 points 1 year ago (5 children)

I usually use something like du -sh * | sort -hr | less, so you don't need to install anything on your machine.
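
Worth noting: the * glob skips dotfiles, so hidden directories like .cache won't be counted. A variant that includes them (same idea, no glob):

    du -h --max-depth=1 . | sort -hr | less  # covers hidden directories too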

[–] mvirts 8 points 1 year ago

Same, but when it's real bad, sort fails 😅 For some reason my root is always hitting 100%.

I usually go for du -hx | sort -h and rely on my terminal's scrollback.
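
If I had to guess why sort fails there: GNU sort spills to temp files under $TMPDIR when its buffer fills, which goes badly once the disk is already at 100%. A possible workaround, assuming /dev/shm is a tmpfs with some free space:

    du -hx / | sort -h -T /dev/shm  # keep sort's temp files in RAM instead of on the full disk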

[–] [email protected] 5 points 1 year ago (1 children)

dust does more than this one-liner does; it's a whole new tool. I find dust more human-readable by default.

[–] [email protected] 2 points 1 year ago (1 children)

Maybe, but I only need it once a year or so. It's not a task for which I want to install a separate tool.

[–] [email protected] 0 points 1 year ago

Perfect for your use case, but not necessarily for others'. People sharing tools and all the different ways to solve this type of problem benefits everyone.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (2 children)

Almost the same here. Well, du -shc * | sort -hr

I admin around three hundred Linux servers, and this is one of my most common tasks. I use -shc because I like the total too, and I don't bother with less, since it's only the biggest files and dirs I'm interested in, and they show up last, so there's no need to scroll back.

When managing a lot of servers, the storage requirements of installing extra software are never trivial. (Although our storage does do very clever compression and might recognise the duplication of the file even across many VM filesystems, I'm never quite sure that works as advertised on small files.)

[–] pete_the_cat 2 points 1 year ago (1 children)

We'd use du -xh --max-depth=1 | sort -hr

[–] [email protected] 1 points 1 year ago (1 children)

du -xh --max-depth=1 | sort -hr

Interesting. Do you often deal with dirs on different filesystems?

[–] pete_the_cat 1 points 1 year ago (1 children)

Yeah, I was a Linux system admin/engineer for MLB/Disney+ for 5 years. When I was an admin, one of our tasks was clearing out filled filesystems on hosts that alerted.

[–] [email protected] 1 points 1 year ago (1 children)

Sounds pretty similar to what I do now, but I've never needed the -x. Guess that might be quicker when you're nested somewhere with a bunch of NFS/SMB stuff mounted in.

[–] pete_the_cat 2 points 1 year ago

We'd do it from root (/) and drill down from there. It was usually /var/lib or /var/log that was filling up, but occasionally someone would upload a 4.5 GB file to their home folder, which had a quota of 5 GB.

Using ncdu would have been the best way, but that would have required installing it on about seven thousand machines.
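
For anyone curious, that drill-down might look something like this (paths illustrative):

    du -xh --max-depth=1 / | sort -hr | head      # find the biggest top-level directory
    du -xh --max-depth=1 /var | sort -hr | head   # descend into it and repeat
    du -xh --max-depth=1 /var/log | sort -hr | head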

[–] [email protected] 2 points 1 year ago (1 children)

I admin around three hundred linux servers

What do you use for management? Ansible? Puppet? Chef? Something else entirely?

[–] [email protected] 2 points 1 year ago (1 children)

Main tool is Uyuni, but we use Ansible and AWX for building new VMs, and ad-hoc Ansible for some changes.
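
As an illustration (the inventory name is hypothetical), an ad-hoc Ansible run for the disk-usage task discussed above might look like:

    ansible all -i inventory.ini -m shell \
      -a 'du -xh --max-depth=2 /var 2>/dev/null | sort -h | tail -n 20'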

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Interesting; I hadn't heard of Uyuni before. Thanks for the info!

[–] [email protected] 2 points 1 year ago

I'd say head -n25 instead of less since the offending files are probably near the top anyway

[–] [email protected] 2 points 1 year ago (1 children)

Or head instead of less to get the top entries

[–] [email protected] 1 points 1 year ago

With sort -h (without the reverse flag), the biggest ones are generally at the bottom already, which is often what most people care about.
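
A minimal sketch of the two orderings side by side:

    du -sh * | sort -hr | head -n 10  # reverse sort: biggest first, take the top 10
    du -sh * | sort -h | tail -n 10   # plain sort: biggest last, take the bottom 10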

[–] [email protected] 24 points 1 year ago

So like filelight?

[–] badloop 22 points 1 year ago (1 children)
[–] [email protected] 3 points 1 year ago

Yeah, I got turned onto ncdu recently and I've been installing it on every VM I work on now.

[–] [email protected] 9 points 1 year ago (1 children)

A 70 GB log file?? Am I misunderstanding something, or wouldn't that be hundreds of millions of lines?

[–] [email protected] 7 points 1 year ago

I've definitely had to handle 30 GB plain-text files before, so I'm inclined to believe twice that is just as possible.
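
Rough arithmetic backs that up. Assuming an average line length of around 100 bytes:

    70 GB ≈ 70 × 10⁹ bytes ÷ 100 bytes/line ≈ 700 million lines

so "hundreds of millions" is the right ballpark.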

[–] donio 7 points 1 year ago* (last edited 1 year ago)

Maybe other tools support this too, but one thing I like about xdiskusage is that you can pipe regular du output into it. That means I can run du on some remote host that doesn't have anything fancy installed, scp the output back to my desktop, and analyze it there. I can also pre-process the du output before feeding it into xdiskusage.

I also often work with the textual du output directly; just sorting it by size is very often all I need.
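
That workflow might look something like this (remotehost and the path are placeholders):

    ssh remotehost 'du -k /srv' > remote.du  # plain du output; nothing extra needed remotely
    xdiskusage remote.du                     # browse the dump locally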

[–] [email protected] 6 points 1 year ago

You guys aren't using du -sh ./{dir1,dir2} | sort -hr | head?

[–] [email protected] 4 points 1 year ago

I use gdu and never had any issues like that with it