Keeping many Debian servers up to date with apt-proxy

Posted by kgfullerton on Mon 23 Jan 2006 at 14:10

Maybe, like me, you've got more than one Debian box on your network - either at home or at work - and you want to keep them up to date with apt, but you're on a slow link or metered bandwidth. If so, apt-proxy could be the answer for you.

apt-proxy is a Python-based daemon that caches the apt requests that pass through it and stores a copy of the files locally, so each .deb only needs to be downloaded from the Internet once.

Installation is simple - just run

apt-get install apt-proxy
on the machine you want to act as your proxy server. Once installation is complete you'll have a daemon listening on port 9999.

Downloaded packages are stored in /var/cache/apt-proxy by default. This and many other options - including the upstream servers to download from, and the port and interface that apt-proxy listens on - can be changed in the config file /etc/apt-proxy/apt-proxy-v2.conf.
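For reference, a minimal apt-proxy-v2.conf might look something like the following - the address and backend mirror here are assumptions, so adjust them for your own network:

    [DEFAULT]
    ;; Interface and port apt-proxy listens on (hypothetical LAN address)
    address = 192.168.1.1
    port = 9999
    ;; Where cached packages are stored
    cache_dir = /var/cache/apt-proxy

    [debian]
    ;; Upstream mirror(s) to fetch from
    backends = http://ftp.debian.org/debian

The section name ([debian] here) becomes the first path component that clients request from the proxy.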

The only step left is to edit /etc/apt/sources.list on all your Debian machines so they point at http://$APT_PROXY_MACHINE:9999/debian.
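For example, if your proxy machine were called proxybox (a hypothetical hostname - substitute your own), a client's sources.list entry might read:

    deb http://proxybox:9999/debian stable main

where the /debian path component should match a backend section name in apt-proxy-v2.conf.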

Now the next apt-get update and apt-get upgrade should be fast on the rest of the machines on your network.



Posted by Kellen (85.178.xx.xx) on Mon 23 Jan 2006 at 15:25
Shouldn't you leave the debian repositories there as fallbacks and just have your local cache/proxy listed first?


Posted by dopehouse (84.130.xx.xx) on Mon 23 Jan 2006 at 15:52
I think there is no need for a fallback in the clients' sources.list. If the proxy is offline, you have a higher-priority problem to solve before doing any updates on the clients. And in apt-proxy-v2.conf you can specify multiple servers to download from, so if one of the repositories is down, the proxy switches to the next one.

Here is an example cut from my apt-proxy-v2.conf:
backends =

If the first server is down, then the next one will be used.
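For illustration, a backends setting generally lists several mirrors, tried in order - the mirrors below are just examples, not necessarily the ones from my config:

    [debian]
    backends =
        http://ftp.debian.org/debian
        http://ftp2.de.debian.org/debian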


Posted by GoodTimes (146.180.xx.xx) on Mon 23 Jan 2006 at 17:24
In the interest of NOT RTFM'ing, can you show a little more of what your sources.list looks like?

I wanted to try this without reading ANY of the apt-proxy documentation, just to see how easy it would be to drop in for my servers.

But a sources.list that just has one line


didn't work. Sure, if I read the documentation I'd probably be able to figure it out... but that wasn't the point...


Through correctness comes ease
-The Destroyer series


Posted by Anonymous (84.92.xx.xx) on Mon 23 Jan 2006 at 21:34
Apt-cacher is much better. Apt-proxy doesn't work properly for many people (see the bug reports, and note the age of some of them), and the workarounds people had posted didn't even seem to work on my system (unstable). No problems with apt-cacher.


Posted by pjs (217.70.xx.xx) on Tue 24 Jan 2006 at 20:42
I've been using approx without any problem. The first one I tried was apt-proxy, but I noticed it was a bit unstable; with approx I have no complaints. Maybe some day I'll try apt-cacher too.


Posted by ajt (204.193.xx.xx) on Fri 17 Apr 2009 at 09:49

Could you elaborate on using approx? Do you still use it?

"It's Not Magic, It's Work"


Posted by KLFMANiK (195.28.xx.xx) on Wed 25 Jan 2006 at 07:17
i'm using http-replicator:

HTTP Replicator is a general purpose, replicating HTTP proxy server. All downloads through the proxy are checked against a private cache, which is an exact copy of the remote file structure. If the requested file is in the cache, replicator sends it out at LAN speeds. If not in the cache, it will simultaneously download the file and stream it to multiple clients. No matter how many machines request the same file, only one copy comes down the Internet pipe. This is very useful for maintaining a cache of Debian or Gentoo packages.

so i have normal sources.list and apt.conf with Acquire::http::Proxy
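in case it helps anyone, the relevant apt.conf line for pointing apt at a caching proxy looks like this (hostname assumed - use your own replicator machine; the port matches http-replicator's usual 8080):

    Acquire::http::Proxy "http://my-http-replicator:8080";

with this in place the sources.list entries stay untouched, which is the whole point.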

a long time ago i used apt-proxy, but now i'm happy with http-replicator ... the best feature is the exact copy of the remote file structure - is this possible with approx, apt-proxy-v2 or apt-cacher, or must i use apt-move for an archive? and is apt-proxy ready for Packages.diff?


Posted by dkg (216.254.xx.xx) on Wed 25 Jan 2006 at 15:35
I also use a general-purpose HTTP/FTP proxy for apt caching. I use squid, which has a very nice debian package.

Similar to you, i have a line in /etc/apt/apt.conf.d/99local that reads something like:

Acquire::http::Proxy "";

Using a generic proxy like squid seems better to me than using an apt-specific proxy because:

  • it has a wider userbase than any apt-specific proxy (more debugging, better community support)
  • it can be reused for other proxying tasks
  • it is deliberately agnostic about the data being fetched (no changes need to be made to the proxy if the apt protocol/repository layout shifts subtly)
I don't really see an advantage to the special-purpose apt-proxy or its ilk over a more generic tool, but that's probably because i haven't thought about it enough. I'd like to hear other people's arguments for (or against!) them, though. Anyone?
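if you want to try the squid route, two directives worth tuning are maximum_object_size (squid's default is too small for large .debs) and refresh_pattern; the following is only a sketch, and the size limit is a guess:

    # allow large packages into the cache
    maximum_object_size 512 MB
    # published .deb files never change, so cache them aggressively
    refresh_pattern -i \.deb$ 129600 100% 129600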


Posted by KLFMANiK (195.28.xx.xx) on Wed 25 Jan 2006 at 16:30
so squid is fine for general proxying tasks, but can you copy a specific .deb package out of the cache? or can you specify how long a .deb package is preserved in the cache, or keep the previous version (the number of most recent versions to keep)?

http-replicator is fine - i used many backports, so i made an archive debian cd with the packages i used - so i can create a cd/dvd with only my lovely packages ;-)

or i can mirror some subdirectory with all its packages (e.g. xorg), then copy it into the http-replicator cache and have a complete archive ...

http-replicator can run in static mode:

Static mode: files are known to never change so files that are present are served from cache directly without contacting the server.

but http-replicator doesn't speak the ftp protocol ...

so if some apt proxy utility offered failover across multiple debian mirrors (http or ftp), mirror maintenance, no sources.list changes and an exact copy of the directory structure - it would be a great util

i'm still using dselect, so with http-replicator i have:

DSelect::Options "-o Acquire::http::Proxy='http://my-http-replicator:8080'";
DSelect::UpdateOptions "-o Acquire::http::Proxy='none'";

and i'm using apt-rsync too on slow links ...


Posted by pnomblot (82.127.xx.xx) on Tue 29 May 2007 at 13:05
Could you describe how to set up apt-proxy when it is itself installed behind a firewall (with basic access authentication)?


Posted by Anonymous (143.238.xx.xx) on Tue 24 Jul 2007 at 04:42
If you are getting this error after modifying your /etc/apt/sources.list and doing an apt-get update:

E: Type ''; is not known on line 10 in source list /etc/apt/sources.list

Then, like me, you probably placed something silly such as:

into your /etc/apt/sources.list rather than the correctly formed:

deb testing main

a great tutorial. thanks for sharing (c:

A Noni-Moose

