Hi,
to make sure that you have a reasonably large pacman package cache for downgrades, you need automatic pacman cache management.
I made an automatic pacman cache management setup that uses only about 3-4 GB for a 60-day rolling cache.
sudo pacman -S pacman-contrib
sudo mkdir -p /etc/pacman.d/scripts
sudo mkdir -p /etc/pacman.d/hooks
Make
/etc/pacman.d/scripts/cleancache.sh with the editor of your choice.
#!/bin/bash
#
# Days to keep packages in the cache
DAYSIC=60
# Newest package versions to keep
PACVERS=2
#
FILECOUNT1=0
FILECOUNT2=0
CANDIPAC=""
#
# Count the files older than $DAYSIC days
FILECOUNT1=$(/usr/bin/find /var/cache/pacman/pkg/ -mindepth 1 -mtime +"$DAYSIC" | wc -l)
if [ "$FILECOUNT1" -gt 0 ]; then
    echo -n "Delete $FILECOUNT1 files older than $DAYSIC days "
fi
#
# Dry run: let paccache report how many excess package versions exist
CANDIPAC=$(/usr/bin/paccache -d -k "$PACVERS")
if ! echo "$CANDIPAC" | grep -sqie 'no candidate packages found'; then
    # Extract the candidate count from paccache's "finished dry run: N candidates" line
    FILECOUNT2=$(echo "$CANDIPAC" | grep -sie 'run.*candidates' | cut -f2 -d ':' | cut -f2 -d ' ' | sed 's/^[\t ]*//g')
    if [ "$FILECOUNT1" -gt 0 ]; then
        echo "and $FILECOUNT2 files more than $PACVERS versions from the cache directory "
    else
        echo "Delete $FILECOUNT2 files more than $PACVERS versions from the cache directory "
    fi
fi
#
# Delete files older than $DAYSIC days from the cache dir
/usr/bin/find /var/cache/pacman/pkg/ -mindepth 1 -mtime +"$DAYSIC" -delete
# Keep only the newest $PACVERS versions of each package
/usr/bin/paccache -r -q -k "$PACVERS"
# Show the resulting cache size
/usr/bin/du -smh /var/cache/pacman/pkg
Make
/etc/pacman.d/hooks/cleancache.hook with the editor of your choice.
[Trigger]
Type = File
Operation = Install
Operation = Upgrade
Operation = Remove
Target = *
[Action]
When = PostTransaction
Exec = /etc/pacman.d/scripts/cleancache.sh
sudo chmod +x /etc/pacman.d/scripts/cleancache.sh
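To check that everything works, you can run the script once by hand (this simply performs one cleanup pass, the same thing the hook will trigger later):
sudo /etc/pacman.d/scripts/cleancache.sh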
Now every time you update your system, your cache gets managed automatically.
Have fun.
This is a very good idea for users wanting to keep a cache. I have a suggestion, since it should be easy to implement: a maximum size for packages to be kept, e.g. discard files over ~50 MB and their .sig companions :)
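Something like this rough sketch is what I have in mind (the ~50 MB cutoff is just my example, and I'm assuming the usual cache path and that a package may have a detached .sig next to it):
find /var/cache/pacman/pkg/ -name '*.pkg.tar.*' ! -name '*.sig' -size +50M -print0 |
while IFS= read -r -d '' pkg; do
    # drop the oversized package together with its .sig companion, if any
    rm -f -- "$pkg" "$pkg.sig"
done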
Hi Hitman,
> maximum size for package to be kept
This contradicts the goal of having a complete cache of all updates from at least the last 60 days.
Many people have the storage space for more, e.g. 90 days.
If the latest LibreOffice, at 150+ MB, is broken, I also want to be able to downgrade it.
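Downgrading from the cache is then a single command (the filename below is only a placeholder; point it at the version that actually sits in your cache):
sudo pacman -U /var/cache/pacman/pkg/libreoffice-fresh-<old-version>-x86_64.pkg.tar.zst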
> .sig companions
Those are just a few bytes, not worth an exception.
This is what the output of the cache management system looks like:
(4/5) cleancache.hook
Delete 2 files older than 60 days and 3 files more than 2 versions from the cache directory
3.2G /var/cache/pacman/pkg
Looks nice!
From my personal experience, most breakages were because of libraries; that's my logic for thinking about a size limitation. Yes, the list won't be complete, but it further strengthens the qualities of this: an even better planned and leaner cache dir, with more versions taking the same ~3 GB of room.
Yeah, I was looking at how you execute the find command: it grabs all the files in one batch and deletes them right away, so for exceptions (well, inclusions here) the packages and their .sig files would need to be paired. By the way, find doesn't need to run twice: capture the list once, reuse it for FILECOUNT1, and pipe it into a loop when deleting :)
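For illustration, the single-find variant could look roughly like this (untested sketch; variable names borrowed from the script above):
# read the old files into an array once, then count and delete from it
mapfile -t OLDFILES < <(/usr/bin/find /var/cache/pacman/pkg/ -mindepth 1 -mtime +$DAYSIC)
FILECOUNT1=${#OLDFILES[@]}
for f in "${OLDFILES[@]}"; do
    rm -f -- "$f"
done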
Hi Hitman,
look, my goal was to keep the packages from the last update and the one before that.
So far, my pacman package cache, together with tail -n 500 /var/log/pacman.log, has enabled me to fix
all update regressions in minutes.
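For example, to see what the last update touched (pacman logs every change as 'upgraded foo (old -> new)'):
grep 'upgraded' /var/log/pacman.log | tail -n 20
The suspect can then be downgraded with pacman -U straight from the cache.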
That was my goal when developing this automatic pacman package cache management system.
Another goal was to make the script as clear and easy to understand as possible.
I don't want to spend more than 3 seconds understanding what a 5-year-old script/program is doing.
Simplicity before effectiveness.
You are welcome to make your own more sophisticated version :)
Btw. I have developed a robust, fast, well-working and (with little modification) cron-job-suitable cloud backup script in this spirit.
* All data and dir/file names are securely encrypted.
* Data-economical differential backups/syncs.
If there is interest, I'm happy to publish it here.
That is correct, even though I don't usually do this, hehe.
Of course the suggestion remains a suggestion.
Seems like an interesting backup solution; as long as you're sure it's safe for the destination, why not post it too.