You could try looking for large files that may be taking up a lot of
space. I run the following script on my server weekly via cron to send
me a list of the 100 largest files. It has been helpful in the past,
e.g. when a log file grew huge because a runaway process kept logging
errors.
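Before hunting for individual files, it can help to see which directories
hold the space. A quick sketch using du (assumes GNU coreutils for
sort -h; -x keeps it on one filesystem):

```shell
#!/bin/sh
# Show the 20 largest top-level directories under /, human-readable,
# largest first. 2>/dev/null hides "Permission denied" noise.
du -x -d 1 -h / 2>/dev/null | sort -rh | head -n 20
```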
Here's the cron entry:
0 3 * * 0 /root/admin_stuff/bin/find_big_files > /dev/null 2>&1
Here's the script:
#!/bin/bash
#
# find_big_files: Find the top 100 biggest files & mail results to root
#
DATE=$(date)
LOGFILE=/root/admin_stuff/dat/admin_log.log
echo "$DATE - Running $0" >> "$LOGFILE"
find / -type f -exec ls -l {} \; 2>/dev/null \
    | grep -v web | grep -v "disk-" | grep -v backups | grep -v kcore \
    | sort -n -r -b -k5 | head -n 100 \
    | /usr/bin/mail -s "Weekly listing of 100 biggest files" root
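Running ls -l once per file can be slow on a big filesystem. If your find
is GNU find, -printf can emit the size and path directly with no extra
processes (a sketch, not a drop-in replacement for the mail step above):

```shell
#!/bin/sh
# List the 100 largest files, size in bytes first, largest at the top.
# Assumes GNU findutils (-printf); -xdev stays on one filesystem.
find / -xdev -type f -printf '%s\t%p\n' 2>/dev/null \
    | sort -rn | head -n 100
```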
You may need to tweak it a bit since I skip some files (including my web
site which is about 28GB).
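If the things you skip are whole directory trees (like a web site or a
backups directory), find's -prune is faster than grep -v because the
tree is never walked at all. A sketch — /var/www and /backups here are
placeholder paths, substitute whatever you want excluded:

```shell
#!/bin/sh
# Skip entire directories at traversal time instead of filtering output.
# The pruned paths are examples only; adjust to your own layout.
find / \( -path /var/www -o -path /backups -o -path /proc \) -prune \
    -o -type f -print 2>/dev/null
```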