
Author Topic: Finding largest files and folders with linux  (Read 1429 times)



  • SCF VIP Member
  • *****
  • Posts: 3507
  • KARMA: 152
  • Gender: Female
Finding largest files and folders with linux
« on: 19. January 2015., 10:26:35 »
I'm transferring my recovered-but-crashed server hard drive to a 4x80GB RAID5 array. That gives me effectively only 150GB, so you can imagine the difficulty of cleaning up the original installation of over 500GB. Cleaning here means: move data to an external hard drive; clone the original failing disk, with its still-booting OS, to the new RAID5; disconnect the old failing drive; boot from the RAID5; then add a new array of 2x1TB disks to hold the user Downloads folders, virtual machines and so on. Of course, the 2x1TB RAID will be mirrored!

One of the problems I found was that my blockchains were using an awful lot of space: the Bitcoin blockchain alone is over 30GB! Luckily, I can move the blockchain files to a new location (off the RAID5 array and onto the new 1TB mirror). I'll either use hard linking (an NTFS feature) or simply a command-line option. I want to keep my wallets on the RAID5; only the blockchains should move. So hard linking is probably my best bet: link the biggest, non-essential blockchain data files away from the RAID5.
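On the Linux side, a symbolic link can serve the same purpose as the NTFS hard-link trick: move the bulky directory, then link it back so software still finds its data at the old path. A minimal sketch of the idea (all paths here are throwaway examples, not my real layout):

```shell
# Sketch of the move-and-link idea with example paths
# ("raid5" and "mirror" stand in for the real mount points).
set -e
demo="$(mktemp -d)"
mkdir -p "$demo/raid5/.bitcoin/blocks" "$demo/mirror"
echo "blockdata" > "$demo/raid5/.bitcoin/blocks/blk00000.dat"

# Move the bulky blocks directory off the small array...
mv "$demo/raid5/.bitcoin/blocks" "$demo/mirror/blocks"
# ...and link it back so the wallet still finds its data at the old path.
ln -s "$demo/mirror/blocks" "$demo/raid5/.bitcoin/blocks"

content="$(cat "$demo/raid5/.bitcoin/blocks/blk00000.dat")"
echo "$content"
rm -rf "$demo"
```

The application never notices the move, while the actual blocks live on the big mirror.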

Anyway, because Windows-based boot DVDs failed to copy (read) some of the source files, I'm moving all my data with RIP Linux. It was easy enough to see that my blockchains used over 100GB, but which folders were using the most space?

How Do I Find The Largest Top 10 Files and Directories On a Linux / UNIX / BSD?

by NIXCRAFT on APRIL 3, 2006

How do I find the largest top files and directories on a Linux or Unix-like operating system?

Sometimes it is necessary to find out which files or directories are eating up all your disk space, or to do so for a particular location such as /tmp, /var or /home.
There is no single command that lists the largest files/directories on a Linux/UNIX/BSD filesystem. However, by combining the following commands (using pipes) you can easily produce such a list:

du command : Estimate file space usage.
sort command : Sort lines of text files or given input data.
head command : Output the first part of its input, i.e. display the first 10 lines (the 10 largest entries).
find command : Search for files.

Type the following command at the shell prompt to find the 10 largest files/directories under /var:
# du -a /var | sort -n -r | head -n 10

Sample outputs:

1008372 /var
313236  /var/www
253964  /var/log
192544  /var/lib
152628  /var/spool
152508  /var/spool/squid
136524  /var/spool/squid/00
95736   /var/log/mrtg.log
74688   /var/log/squid
62544   /var/cache
If you want more human-readable output, try:
$ cd /path/to/some/where
$ du -hsx * | sort -rh | head -10


du -h option : display sizes in human-readable format (e.g., 1K, 234M, 2G).
du -s option : show only a total for each argument (summary).
du -x option : skip directories on different file systems.
sort -r option : reverse the result of comparisons.
sort -h option : compare human-readable numbers (a GNU sort specific option).
head -10 (or -n 10) option : show the first 10 lines.
The above command will only work if GNU sort is installed. On other Unix-like operating systems, use the following version instead (see comments below):

for i in G M K; do du -ah | grep "[0-9]$i" | sort -nr -k 1; done | head -n 11
Sample outputs:

179M   .
84M   ./uploads
57M   ./images
51M   ./images/faq
49M   ./images/faq/2013
48M   ./uploads/cms
37M   ./videos/faq/2013/12
37M   ./videos/faq/2013
37M   ./videos/faq
37M   ./videos
36M   ./uploads/faq
Find the largest file in a directory and its subdirectories using the find command

Type the following GNU find command:

## Warning: only works with GNU find ##
find /path/to/dir/ -printf '%s %p\n'| sort -nr | head -10
find . -printf '%s %p\n'| sort -nr | head -10
Sample outputs:

5700875 ./images/faq/2013/11/iftop-outputs.gif
5459671 ./videos/faq/2013/12/glances/glances.webm
5091119 ./videos/faq/2013/12/glances/glances.ogv
4706278 ./images/faq/2013/09/
3911341 ./videos/faq/2013/12/vim-exit/vim-exit.ogv
3640181 ./videos/faq/2013/12/python-subprocess/python-subprocess.webm
3571712 ./images/faq/2013/12/glances-demo-large.gif
3222684 ./videos/faq/2013/12/vim-exit/vim-exit.mp4
3198164 ./videos/faq/2013/12/python-subprocess/python-subprocess.ogv
3056537 ./images/faq/2013/08/debian-as-parent-distribution.png.bak
To skip directories and display only files, type:

find /path/to/search/ -type f -printf '%s %p\n'| sort -nr | head -10

find /path/to/search/ -type f -iname "*.mp4" -printf '%s %p\n'| sort -nr | head -10
Hunt down disk space hogs with ducks

Use the following bash shell alias:

alias ducks='du -cks * | sort -rn | head'
Run it as follows to get the top 10 files/dirs eating your disk space:
$ ducks
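A quick way to convince yourself the alias's pipeline works is to run it against a throwaway directory with two files of known size (the file names below are just for the demo):

```shell
# Sanity-check the ducks pipeline: in a scratch directory, the larger
# of two files should sort to the top (right after du's "total" line).
demo="$(mktemp -d)"
cd "$demo"
dd if=/dev/zero of=big.bin bs=1024 count=64 2>/dev/null   # ~64 KB
dd if=/dev/zero of=small.bin bs=1024 count=4 2>/dev/null  # ~4 KB

# Same pipeline as the alias; line 1 is the grand total (du -c),
# line 2 is the biggest individual entry.
top="$(du -cks * | sort -rn | head | awk 'NR==2 {print $2}')"
echo "$top"
cd /
rm -rf "$demo"
```

The `-c` in the alias is why a `total` line appears first; drop it if you only want the per-entry sizes.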

1 Anonymous April 12, 2006 at 11:13 am
Great, but what if I only want the largest files and not the directories?


2 nixcraft April 12, 2006 at 6:26 pm
To find the largest files only, use the ls command as follows in the current directory:
ls -lSh . | head -5
-rw-r--r-- 1 vivek vivek 267M 2004-08-04 15:37 WindowsXP-KB835935-SP2-ENU.exe
-rw-r--r-- 1 vivek vivek 96M 2005-12-30 14:03 VMware-workstation-5.5.1-19175.tar.gz
ls -lSh /bin | head -5
You can also use the find command (but not du):
find /var -type f -ls | sort -k 7 -r -n | head -10

Hope this helps

nixcraft April 12, 2006 at 6:35 pm
And yes, to find the smallest files, use:
ls -lSr /var

Or use the find command with the -size flag:
find / -type f -size +20000k -exec ls -lh {} ; | awk '{ print $8 ": " $5 }'

Read man page of find for more info.


4 Spechal June 22, 2011 at 5:47 pm
"find / -type f -size +20000k -exec ls -lh {} ; | awk '{ print $8 ": " $5 }'"

needs the semicolon after -exec escaped:

find / -type f -size +20000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'

Also, I find this output easier to read:

find . -type f -size +20000k -exec ls -lh {} \; | awk '{print $5": "$8}'


  • SCF Global Moderator
  • *****
  • Posts: 765
  • KARMA: 101
  • Gender: Male
Re: Finding largest files and folders with linux
« Reply #1 on: 19. January 2015., 23:57:18 »
+1 for usefulness :p
