There are occasions where you may need to manually purge the local DNS cache and/or the web object cache of a Blue Coat ProxySG appliance. Both caches will eventually age out on their own, but it can be helpful to speed up the process by flushing them manually.
While this can all be done from the web interface, I generally prefer the CLI (if available). The Blue Coat ProxySG appliances that I manage are set up for SSH access; you may need to confirm that SSH is enabled on yours (telnet may be enabled instead).
Let’s start by connecting to the Blue Coat ProxySG appliance (proxysg.acme.org):
[root@linuxhost etc]# ssh -l admin proxysg.acme.org
admin@proxysg.acme.org's password:
proxysg.acme.org - Blue Coat SG510 Series>
Once we’re connected we need to go into privileged mode to issue the commands:
proxysg.acme.org - Blue Coat SG510 Series>enable
Enable Password:
Now that we’re in privileged mode we can clear the web content cache with the following command:
proxysg.acme.org - Blue Coat SG510 Series#clear-cache
  ok
And to clear the DNS cache we can use the following command:
proxysg.acme.org - Blue Coat SG510 Series#purge-dns-cache
  ok
And don’t forget to log out when you’re all done.
proxysg.acme.org - Blue Coat SG510 Series#exit
Connection to proxysg.acme.org closed.
Cheers!
For what it’s worth, on our 400s running SGOS 5.2.4.8 Proxy Edition, the purge-dns-cache command doesn’t exist.
Michael,
How often does the ProxySG 810 update its cache information?
The reason I ask: many sites are dynamic and require constant updating to prevent stale information. I was forced to create a policy to always verify constantly changing sites.
Hi Jim,
There are some configuration options that can help control the behavior, but it doesn’t operate much differently than a normal web browser: it uses the HTTP response headers to determine whether the content has changed. There are usually a few websites that don’t follow best practices, so you may need to exclude them from caching via policy (see the sketch below). I’ve only come across a few at my organization… in the early days I tried to explain to the site operators that they weren’t following best practices and had their web servers misconfigured. In the end it’s easier to just add them to a policy rule that disables caching for that URL.
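For reference, that kind of exception is normally written in CPL. Here’s a minimal sketch, assuming it’s installed in the Local Policy file; badcache.example.com is a hypothetical hostname standing in for the misbehaving site:

; Minimal CPL sketch -- badcache.example.com is a hypothetical hostname
<Cache>
    ; never serve this site's objects from cache
    url.domain=badcache.example.com cache(no)
    ; or, less drastic: keep caching but revalidate with the origin on every request
    ; url.domain=badcache.example.com always_verify(yes)

The always_verify alternative is closer to what you describe: the objects stay cacheable, but the proxy revalidates them against the origin server before serving them.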
Cheers!