This fine morning KDE greeted me with a particularly nasty warning:
Seems like it is time for some spring cleanup… And by spring, I of course mean winter. But where to start?
Well, the best place is usually to nuke your temp files. There are many ways to do this, but my go-to tool is BleachBit. It is a multi-platform tool that does a remarkably decent job of sweeping up the garbage that crufts up Linux filesystems without doing any major damage. It is also quite fast.
Unfortunately, in my case it only managed to free up about 300MB of space. That’s certainly bigger than the 89MB of free space I had previously, but still not great.
Here is a question: what does it mean when deleting temp files makes almost no difference with respect to unused space on your hard drive? It means that all the space was eaten up by your activity – files you downloaded, software you installed, and so on. So the first thing to do is to clean up your home directory.
If you are like me, you probably have one or more “dump” folders where you just shove random files you don’t feel like filing anywhere else. Mine are:
- ~/Downloads/ which is the default download folder in Chrome
- ~/tmp/ which is where I dump random files, logs, and the like. For example, if I need to quickly photoshop a file and upload it to the internet for lulz, it goes into this directory.
- ~/wkspc/ is a higher level temp dir where I put random tests and code snippets I don’t plan on saving
As a rule, it is usually safe for me to purge these directories outright when I’m running low on space. Whenever I find myself working on something inside ~/wkspc/ for more than a brief, fleeting instant, I usually move it to ~/projects/ and put it under source control. Everything else is fair game.
Sadly, nuking those folders gave me very meager results – probably because most of the garbage I download and generate on a day-to-day basis is rather small. So where was all my space tied up? I decided to find out using the du command:
du -sBM * | sort -nr
This will give you a nice list of folders ordered by size that looks a bit like this:
I actually took this screenshot after doing some cleanup, but you can sort of see how it works. The largest repositories of junk in my case are my Dropbox folder which I can’t really do much about, and my Desktop folder where I had a large directory of work related crap I did not want to get rid of. The rest of the directories looked fairly clean. And yet running df would show that / was 96% full.
Then I got another idea – my previous search explicitly excluded dot-files. So why not check them specifically:
du -sBM .* | sort -nr
Can you say jackpot?
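(A caveat worth noting, since it tripped up a few readers below: in most shells the glob .* also matches the special entries . and .., which can make du tally the parent directory as well. A safer pattern, assuming bash, is something like:)

```shell
# ".*" matches "." and "..", so du may recurse into the parent directory.
# These two globs cover hidden entries while skipping "." and "..":
du -sBM .[!.]* ..?* 2>/dev/null | sort -nr
```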
The VirtualBox directory had grown to over 50GB. Not good! I immediately sprang into action: deleted a bunch of old snapshots, dumped unused VDI files to an external hard drive, and went to town compacting the current ones. How do you compact them, you ask? Using the friendly
VBoxManage modifyhd /path/to/your.vdi compact
Actually, if your VM runs a Windows OS, you should follow the advice listed here:
- Run the VM
- Defrag your disk once or twice
- Zero-out empty space with Sysinternals sdelete
- Then compact it using the command above
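The steps above can be sketched roughly like this (the path and drive letter are placeholders; sdelete runs inside the Windows guest, while VBoxManage runs on the host with the VM powered off):

```shell
# Inside the Windows guest, after defragmenting once or twice:
#   sdelete -z c:    (zero out free space so VirtualBox can reclaim it)
# Then on the host, with the VM shut down:
VBoxManage modifyhd /path/to/your.vdi compact
```

(Newer VirtualBox releases call this subcommand modifymedium, but modifyhd is kept around as an alias.)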
For me, this yielded about 10-20% savings per VDI file, which was not insignificant. But since I was already in cleanup mode I decided to keep going.
Having deleted most things in ~ that I was willing to part with, I turned to software. I can be pretty bad about installing software. Often I will download and install packages that I use once and never touch again. I’m especially fond of downloading web browsers, IDEs, and scripting language runtimes just to see how they feel. Half of these things don’t need to take up space on my hard drive.
So I decided to generate a list of installed packages and order it by size:
dpkg-query -Wf '${Installed-Size}\t${Package}\n' | sort -rn | less
Biggest offenders?
It seems that the biggest packages on my system were Haskell, Google Chrome, OpenOffice and… a shit-load of old kernel image files. See, this is the kind of thing that happens on Linux when you just let the OS upgrade itself whenever it wants. Every time there is a kernel security patch or an update, apt leaves the old kernel version intact. This is good, because you can always revert to your old kernel if the new one breaks everything and ruins your day. But after a while you end up with a long, long list of defunct kernel images and header files. You can actually see the entire list like this:
dpkg -l 'linux-*'
How do you clean this up? Well, you aptitude purge all the kernel files except the current one. You can check what you are running right now via the uname -r command. Then sit down, and prepare to type in a lot of package names…
Or use this script to generate a list of all installed kernel files, except the current one:
dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
I can’t actually claim ownership of this one – this sed monstrosity was made by the folks at the Ubuntu Genius blog. In fact, they went one step further and showed how to automatically purge these things in a single command:
dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -y purge
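Before letting that loose with -y, it is worth doing a dry run first – apt-get’s -s flag simulates the operation, so you can eyeball the list of packages it would purge without changing anything:

```shell
# Same pipeline, but -s makes apt-get simulate the purge without touching anything:
dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -s purge
```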
In my case, the uninstallation took close to an hour to complete, and reclaimed over 8GB of space, without damaging anything important.
For good measure I still went back and uninstalled useless things like extra web browsers, various IDEs, language runtimes, and any GUI tools that hadn’t been touched since I installed them. All in all, I think this was a successful cleanup:
My / is now only at 81% with over 17GB of free space. What do you think? Do you have any disk cleanup tips that you would like to share? Let me know in the comments.
switch to a larger disk …
well I will have to do that soon too
I think I’ve mentioned it before: I do most of the “random” things in
/tmp
. That’s also the directory I have set as my browser’s download directory, so those files don’t build up on me. Cleanup is automatic! I’ve never had a need for a cleanup utility in Linux. There aren’t as many places for cruft to build up as there are in Windows. If I notice a dot directory is getting big I can generally just blow it away. All important configuration is versioned and safe.
Like you, if I’m running low on space one of the main culprits is likely to be VirtualBox images. I’ll just delete a few that I notice I’m not using anymore.
I didn’t realize Linux kernels would take up so much space. I need to keep a more careful eye on those. None of my current systems are old enough (recent OS installs) to have collected a long list of kernels so it’s not a problem at the moment.
Running out of disk space is always a nightmare for me. The problem is that each time I use du to find the culprits, I always end up with the three directories I don’t want to delete: music, images, and movies/videos (surprisingly, music is the largest). As Eric mentioned, the only realistic option that I have is to get a bigger disk. For years I have been contemplating the idea of building a NAS solution like this one, but I haven’t had the money to do it. Maybe next year :S
I buy another harddrive. Most things I do are actually located on an NFS share on my file server, so running out of space means that I need to add another drive or upgrade an existing drive. At $50/TB, it seems cheaper to expand than to spend time after it fills up trying to find room (time is money).
Anything valuable is backed up remotely, everything else is expendable. If a drive dies or I have to wipe out my home directory, it’s no major loss.
@ Chris Wellons:
Nice idea!
Just a note that
ncdu
is a more convenient tool than plain du
You Sir, are a bad influence! Having read your post, I thought I’d see what crap I’d collected over time. After using BleachBit, clearing old kernels and removing some VM snapshots I cleared just under 30GB. It’s very rare I clear anything out, so this was a good idea (just wish it was that easy to clean my house!).
One thing though – I tried your ‘du -sBM .* | sort -nr’ and the only output I got was for
.
..
it didn’t list my hidden dirs at all. Any suggestions?
I have another 5Gb in Downloads, mainly random bits I downloaded and have never used since. They’re next to go.
Does the package manager keep old packages in a cache somewhere so that you can downgrade? I know pacman (the Arch package manager) does and deleting those old packages can free up tons of space.
mcai8sh4 wrote:
Same problem here, using Wheezy. It seems that the dot in “.*” makes du recurse parent directories.
It worked better in Ubuntu 10.04 LTS, but it still parses parent dirs.
@Luke: which Ubuntu version are you using?
That bleach-bit is pretty neat for clearing various caches in ~
I quite like JDiskReport (though I should find an open source equivalent) for generating pie graphs, to visualize the stuff nicely.
~/iso can be a little nasty for me. I don’t tend to delete them – I even had a copy of Ubuntu 6.10 still in there.
@ crackofdusk:
Yes it does. You may use:
apt-get clean
apt-get autoclean
apt-get autoremove
Read the man-page to see the difference between the 3 options.
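Roughly speaking (the man page has the full details):

```shell
sudo apt-get clean       # delete every downloaded .deb from /var/cache/apt/archives
sudo apt-get autoclean   # delete only cached .debs that can no longer be downloaded
sudo apt-get autoremove  # remove packages installed as dependencies that nothing needs anymore
```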
@ ST/op: thanks for the tip. Now I remember using some of those options in my Ubuntu days. My comment was for Luke’s benefit, though. I think you can free a gigabyte or two by purging old packages.
@ ST/op : Thanks for the heads up – I should have mentioned I’m using Ubuntu 12.04
Glad to know it’s not something I’ve mullered whilst playing around with things.
Cheers.
@ Eric:
Yeah, but it is a pain in the ass.
@ Chris Wellons:
I seriously need to start doing this. Putting random shit in /tmp, that is.
@ agn0sis:
Nice. I’m just considering plugging a whole bunch of large drives into a PogoPlug. I also have an un-rooted GoFlex NAS drive which was similarly cheap.
@ Jason Switzer:
Yes, for most things this works well. My biggest source of hard drive space drain are virtual machine images. Putting your VM files on a network is not recommended (you’re gonna have a bad time).
@ tengu911:
I might need to check it out, but it’s ncurses right? This means I can’t pipe output for further processing, no?
@ Ron:
I offload iso files onto an external drive. They are usually huge, and the space adds up quickly.
@ mcai8sh4:
I call that a good influence. ;)
@ ST/op:
I think it’s 10.4. It’s been bothering me to upgrade to the next LTS but I’m ignoring it hoping it will go away. ;)
@ crackofdusk:
It keeps a cache, but I’m not sure if it actually keeps complete copies of old versions.
@ ST/op:
Yep, I usually run apt-get clean regularly to purge the cache whenever I notice I’m low on space.
@ mcai8sh4:
Yeah, it’s probably just me using a version that is too old. :P For some reason that command was working for me. Go figure.
@ tengu911:
Wow that’s really awesome. I used to do du -sh * | sort -h and this gives me pretty much exactly that plus caching and interactive recursion. Thanks!
I used to do that too, but sometimes I want an iso when out and about, and I don’t normally take the external drive with me (well, sometimes the little 2.5″ 500GB, but that’s often full of crap too – I treat it like a dumping ground too much).
Luke, I am always amazed at how you manage to blog about things just when I need them. Over the past two weeks I have been bouncing between 2GB and a few KB of free space – the more I download, the more I delete, but it seems like I am getting less and less space. I always hate just deleting things even though I know I will never need them again, mainly because it would be a pain in the ass.
I just recently bought an external HD on Newegg and will be plugging it into an old laptop I have, making it a pseudo NAS (with the added advantage of being a Mythbuntu player).