
Automatically Delete Old Files from a Directory



I'm currently using some syslog-ng servers as collectors (destinations) for my F5 device logs; Splunk then reads those syslog-ng servers and ingests the data.  As you can imagine, these files get pretty huge, and if you don't keep an eye on it, the /logs/ directory/partition fills up on our Red Hat Linux servers.  The best solution is to run a daily job that removes files older than two days, and here is how I do that.

First, you identify the directory; in my case it's /logs/.

Next, you determine how long you want to keep the files; in my case, two days' worth is fine.

Then you decide what time each day you want to remove the old files.  I want to remove them before the day really starts.

Finally, you edit your crontab file:

sudo vi /etc/crontab

and add a line like the following, which runs every day at 5am.  Since the server is in Central time, that means it runs at 6am Eastern:

0 5 * * * root /usr/bin/find /logs/ -type f -mtime +1 -exec rm {} +
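For reference, the fields in that line break down like this (the extra user field is specific to /etc/crontab; per-user crontabs edited with crontab -e omit it):

```shell
# minute  hour  day-of-month  month  day-of-week  user   command
#   0      5         *          *         *       root   /usr/bin/find /logs/ -type f -mtime +1 -exec rm {} +
```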

The +1 covers two days because find counts age in whole 24-hour periods, discarding any fractional part: -mtime +1 matches files more than one full 24-hour period old, which in practice means files last modified at least 48 hours (two days) ago.
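That rounding behavior is easy to confirm with a quick sketch in a throwaway directory (the file names and temp directory here are made up for illustration, and GNU touch with -d is assumed, as on Red Hat):

```shell
#!/bin/sh
# Demonstrate -mtime +1 semantics in a hypothetical temp dir, not /logs/.
tmp=$(mktemp -d)
touch -d '3 days ago'  "$tmp/old.log"   # 72h old: 72/24 = 3, and 3 > 1, so matched
touch -d '30 hours ago' "$tmp/mid.log"  # 30h old: 30/24 rounds down to 1, NOT matched
touch "$tmp/new.log"                    # just created: 0, not matched

# Using -print instead of -exec rm is a safe dry run of the cron job's find.
find "$tmp" -type f -mtime +1 -print    # lists only old.log

rm -r "$tmp"
```

Running the same find with -print against /logs/ before adding the cron entry is a cheap way to confirm exactly which files would be deleted.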

