On Debian-derived systems using apt, I would think it would be a common desire to be able to undo a package upgrade by not only removing the new package, but reinstalling the old one. Generally you can do this, and fairly easily, with the right arguments to apt-get (or aptitude); see the example below. But there are some cases where the usual approach fails. One of them is an old system that is no longer being supported: you can't simply download the older package. There are also circumstances other than upgrading that might necessitate a package reinstall, and those will likewise be thwarted if you can't download the original package.

In theory, your old packages can be found in /var/cache/apt/archives, provided you haven't run "apt-get clean". And the autoclean apt-get[1] sub-command is actually smart enough not to delete a package that is no longer available for download.

1. http://linux.die.net/man/8/apt-get

But what happens when the package is available for download at the time autoclean is run, but later, when you try to upgrade to a newer version, the older version is no longer available for download?

By default, a modern Ubuntu system uses:

/etc/apt/apt.conf.d/10periodic:

APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Download-Upgradeable-Packages "0";
APT::Periodic::AutocleanInterval "0";

and:

/etc/apt/apt.conf.d/20archive:

APT::Archives::MaxAge "30";
APT::Archives::MinAge "2";
APT::Archives::MaxSize "500";

to configure the script at:

/etc/cron.daily/apt

I haven't been able to find any documentation for these variables other than the inline comments in the script. The above says that the system will not autoclean, but will delete cached packages older than 30 days, and will delete the largest packages when the cache size exceeds 500 MB.

What seems to be missing is a policy directive that succinctly specifies that I want the current and previous version of every package retained.

It's also interesting that no one has implemented an apt-get or aptitude sub-command to intelligently revert a specified package to the prior version, automatically making use of /var/cache/apt/archives. (For that matter, I don't think either apt-get or aptitude makes use of /var/cache/apt/archives other than to write files to it. I don't think there are any sub-commands that will look for packages there. I guess they assume that if things break badly enough, you'll manually dig up old packages there and use 'dpkg -i' to install them, as sketched below.)

I should be able to run something like:

  aptitude revert package

to have aptitude look up the prior successfully installed version of the package, find it in /var/cache/apt/archives, install it, then ask if I want to put a version hold on the package.

I know one of the responses to this question is going to be to use apt-proxy[2] or Apt-Cacher-NG[3], but these tools are about sharing a cache of packages across multiple machines on your LAN. Great if you have multiple machines of the same type; less useful otherwise, and more overhead if you don't. More importantly, a caching proxy doesn't know what has been successfully installed on a machine, and so may expire the prior version from its cache. Apt knows.

2. http://apt-proxy.sourceforge.net/
3. https://www.unix-ag.uni-kl.de/~bloch/acng/html/index.html

I also saw some answers to this question suggesting the use of APTonCD[4], which, as the name implies, is a tool for creating a backup of your packages on CD or DVD, and which includes directions on how to make those archives directly usable from apt by adding them to your sources list (a do-it-yourself version of that trick is sketched below).
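To make the revert scenario concrete: when the older version is still available for download, apt-get can already do the downgrade with the right arguments. A sketch only, where "foo" and the version string are hypothetical stand-ins:

  # list the versions apt knows about for the package
  apt-cache madison foo

  # install an explicit older version
  sudo apt-get install foo=1.2-1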
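When the older version is no longer downloadable, the manual fallback mentioned above is to dig the .deb out of the cache by hand. Again a sketch with hypothetical names:

  # see what versions are still sitting in the cache
  ls /var/cache/apt/archives/foo_*.deb

  # install the prior version directly from the cache
  sudo dpkg -i /var/cache/apt/archives/foo_1.2-1_amd64.deb

  # hold the package so the next upgrade doesn't undo the revert
  echo "foo hold" | sudo dpkg --set-selections

This is essentially what an "aptitude revert" sub-command could automate, presumably consulting dpkg's log to pick the prior successfully installed version rather than just the newest leftover file in the cache.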
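And as for making a directory of saved .debs directly usable from apt (the trick APTonCD's media relies on), dpkg-scanpackages, from the dpkg-dev package, can generate the index apt needs. A sketch, assuming the .debs have been copied to a hypothetical /var/local/apt-archive:

  cd /var/local/apt-archive
  dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

then add a line like this to /etc/apt/sources.list:

  deb file:/var/local/apt-archive ./

Apt will likely complain that the packages are unauthenticated, since there's no signed Release file, but it can then see and install anything in the directory.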
Presuming you can configure APTonCD to archive to a file system instead of a DVD (and if not, a lower-level tool, apt-move[5], can do the job), it still wouldn't intelligently implement the policy I'm looking for, with automatic expiration of older versions.

4. http://aptoncd.sourceforge.net/
5. http://sourceforge.net/projects/apt-move/

The cron.daily/apt script also seems to support some notion of a backup:

# APT::Periodic::BackupArchiveInterval "0";
# - Backup after n-days if archive contents changed.(0=disable)
#
# APT::Periodic::BackupLevel "3";
# - Backup level.(0=disable), 1 is invalid.
#
# Dir::Cache::Backup "backup/";
# - Set periodic package backup directory

but none of these parameters is altered in /etc/apt/apt.conf.d/. A quick read of the script suggests it creates directories like backup0, backup1, backup2, ... up to APT::Periodic::BackupLevel, containing hard links to the files in /var/cache/apt/archives that have been added since the last backup run. (There are no /var/cache/apt/archives/backup* directories on the system I'm examining, so the "0" setting on the first variable apparently disables the feature.)

That's a little bit closer to what I want, but the archiving and expiration of older packages should be driven by the act of installing newer package versions, rather than by a daily cron job. In any case, I think the end result would have the same drawbacks as APTonCD or apt-move. Possibly even more, as at least those two tools create a properly structured repository that can be accessed directly with apt-get, rather than a pile of files that has to be manually sifted through and managed with dpkg.

Another obvious solution is just general backups. Sure, that works. But a big reason for using package management is that you can leverage its metadata to intelligently add and remove specific files from your system as a collection. Resorting to a general-purpose backup solution means you've exhausted the limits of what the packaging system can do for you, and that's just not the case here.

Searches on this topic turned up the suggestions above, but not what I was expecting, which was some apt.conf parameter to enable this behavior. It seems like it should be a common need.

 -Tom

-- 
Tom Metro
Venture Logic, Newton, MA, USA
"Enterprise solutions through open source."
Professional Profile: http://tmetro.venturelogic.com/