Source: http://www.isaserver.org/tutorials/Understanding-Web-Caching-Concepts-ISA-Firewall.html

Types of Web Caching

As stated above, there are two basic types of Web caching:

  • Forward caching
  • Reverse caching

The 2006 ISA Firewall performs both of these, so let’s look at each a little more closely.
Forward Caching

One way to reduce Internet bandwidth consumption is to store frequently-accessed Web objects on the local network, where they can be retrieved by internal users without going out to a server on the Internet. This is called forward Web caching, and it has the added advantage of making access for internal users faster because they are retrieving the Web objects (such as pages, graphics, audio and video files) over a fast LAN connection, typically 1 Gbps, instead of a slower Internet connection at perhaps 5-15 Mbps.
All Web caching servers support forward caching, which accelerates the response to outbound requests when users on the internal network ask for a Web object that lives on a server on the Internet.
Forward caching takes place when a user on a network protected by the 2006 ISA Firewall requests Web content. The requested content is placed in the Web cache after the first user retrieves it; the next user (and every subsequent user) who requests the same content has it delivered from the Web cache on the 2006 ISA Firewall instead of from the Internet Web server. This reduces traffic on the Internet connection and lowers overall network costs. In addition, content is delivered to the user much more quickly from cache than from the actual Web server, which improves user satisfaction and productivity.
The primary “bottom line” benefit of 2006 ISA Firewall’s forward caching is cost savings realized by reduced bandwidth usage on the Internet connection.
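To make the mechanics concrete, here is a minimal forward-caching sketch in Python. It is only an illustration of the general technique, not ISA code: the in-memory dictionary and the function name forward_fetch are assumptions made for the example, whereas a real caching proxy uses a managed RAM/disk cache with expiration rules.

import urllib.request

web_cache = {}  # illustrative in-memory store, keyed by URL

def forward_fetch(url):
    # Cache hit: serve the object over the fast LAN with no Internet traffic.
    if url in web_cache:
        return web_cache[url]
    # Cache miss: retrieve the object from the Internet Web server once...
    with urllib.request.urlopen(url) as response:
        body = response.read()
    # ...and store it so subsequent internal users are served from the cache.
    web_cache[url] = body
    return body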
Reverse Caching

Reverse caching, in contrast, reduces traffic on the internal network and speeds access for external users when the company hosts its own Web sites. Frequently-requested objects on the internal Web servers are cached at the network edge, on a proxy server, so that the load on the Web servers is reduced.
In generic caching documentation, reverse caches are sometimes referred to as “gateway caches” or “surrogate caches.”
Reverse caching is appropriate when your organization hosts its own Web sites and makes them available to external Internet or intranet users. The caching server stores the objects that are most frequently requested from the internal Web servers and serves them to Internet users, which speeds access for those users, lightens the load on the internal Web servers, and reduces traffic on the internal network.
Reverse caching takes place when a user on the Internet makes a request for Web content that is located on a Web server published by an ISA Firewall Web Publishing Rule. The ISA Firewall retrieves the content from the Web server on an internal network or another network protected by the firewall and returns that information to the Internet user who requested the content. The ISA Firewall caches the content it retrieves from the Web server on the internal network. When subsequent users request the same information, the content is served from the ISA Firewall’s cache instead of being retrieved from the originating Web site.
There are two principal benefits to the reverse caching scenario:

  • Reverse caching reduces bandwidth usage on the internal network.
  • Reverse caching allows Web content to be available when the Web server is offline.

How Reverse Caching Reduces Bandwidth Usage

Reverse caching reduces bandwidth usage on the internal network because cached content is served directly from the ISA Firewall; the request never has to travel across the internal network to the published Web server, so that bandwidth remains available to internal network users. Corporate networks that are already struggling with insufficient bandwidth benefit noticeably from this configuration.
How Reverse Caching Increases Availability of Web Content

There is an even more compelling advantage to reverse caching: its ability to make Web site content available when the Web server is offline. This can be part of a high-availability plan for your Web services.
Web servers can go offline for several reasons, for example when routine maintenance needs to be performed or after the server experiences a hardware or software failure. Whatever the cause, downtime creates a negative experience for Internet users who try to access content on the site, ranging from a minor inconvenience to a serious problem. The big advantage of the ISA Firewall’s reverse caching feature is that you can take the Web server offline and still have the site’s content available to Internet users, because the content is served from the ISA Firewall’s cache.
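As a rough illustration of this availability benefit, the sketch below extends the same caching idea: when the published Web server cannot be reached, the previously cached copy is returned instead. The names reverse_cache and reverse_fetch and the fixed five-second timeout are assumptions for the example; the real ISA Firewall governs this behavior through its cache rules and object lifetimes.

import urllib.error
import urllib.request

reverse_cache = {}  # illustrative in-memory store, keyed by URL

def reverse_fetch(url):
    try:
        # Normal case: fetch from the published internal Web server
        # and refresh the cached copy.
        with urllib.request.urlopen(url, timeout=5) as response:
            reverse_cache[url] = response.read()
    except (urllib.error.URLError, OSError):
        # Web server offline: fall back to the cached copy, if one exists,
        # so Internet users still receive the site's content.
        if url not in reverse_cache:
            raise
    return reverse_cache[url]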
Web Caching Architectures

Multiple Web-caching servers can be used together to provide for more efficient caching. There are two basic caching architectures that use multiple caching servers working together:

  • Distributed Caching
  • Hierarchical Caching

As the name implies, distributed caching distributes, or spreads, the cached Web objects across two or more caching servers. These servers are all on the same level on the network. The figure below illustrates how distributed caching works.

Hierarchical caching works a little differently. In this setup, caching servers are placed at different levels on the network: downstream proxies forward requests they cannot satisfy to upstream caching servers. For example, a caching server is placed at each branch office, and these servers communicate with the caching array at the main office. A request is serviced first from the local cache, then from the centralized cache, and only then from the Internet server.
Hierarchical caching is illustrated in the figure below.
Hierarchical caching is more efficient in terms of bandwidth usage, but distributed caching is more efficient in terms of disk space usage.

Finally, you can combine the two methods to create a hybrid caching architecture. The combination gives you the “best of both worlds,” improving performance and efficiency. A hybrid caching architecture is shown in the figure below.
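A hierarchical lookup can be sketched as a chain of caches, each consulting its parent on a miss before the request finally goes out to the origin server on the Internet. This is a conceptual sketch only; the CacheNode class and its method names are assumptions made for the example and are not part of any ISA interface.

import urllib.request

def fetch_from_origin(url):
    # Last resort: go out to the Internet Web server.
    with urllib.request.urlopen(url) as response:
        return response.read()

class CacheNode:
    """One caching server in the hierarchy (e.g. a branch office proxy)."""

    def __init__(self, parent=None):
        self.store = {}       # objects cached on this server
        self.parent = parent  # upstream cache; None at the top of the hierarchy

    def get(self, url):
        # 1. Try the local cache first.
        if url in self.store:
            return self.store[url]
        # 2. On a miss, ask the upstream cache; only the top-level
        #    cache goes out to the Internet.
        body = self.parent.get(url) if self.parent else fetch_from_origin(url)
        self.store[url] = body
        return body

# The main office caches for every branch; each branch caches for its own users.
main_office = CacheNode()
branch_office = CacheNode(parent=main_office)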

Web Caching Protocols

When multiple Web caching servers work together, they need a way to communicate with each other, so that if the Web object requested by the client is not found in a server’s cache, it can query other caching servers before the “last resort” of going out and retrieving the document from the Internet.
There are a number of different protocols that can be used for communications between Web caching servers. The most popular of these are the following:

  • Cache Array Routing Protocol (CARP), a hash-based protocol that allows multiple caching proxies to be arrayed as a single logical cache. A hash function determines which cache a request should be sent to, and the same function can be used by the Web Proxy client to locate content in a distributed cache (a rough sketch of this hash-based selection appears below).
  • Internet Cache Protocol (ICP), a message-based protocol defined in RFC 2186 that is based on UDP/IP and was originally used for hierarchical caching by the Harvest project, from which the Squid open-source caching software was derived.
  • HyperText Caching Protocol (HTCP), which permits full request and response headers to be used in cache management.
  • Web Cache Coordination Protocol (WCCP), a router-based protocol that takes the job of distributing requests away from the caches themselves: caches join service groups, and the router computes the hash that decides which cache receives each request.
  • Cache digests, a hash-based protocol that is implemented in Squid, which uses an array of bits called a Bloom filter to code a summary of documents stored by each proxy.

ISA 2006 Enterprise Edition uses CARP to extend the caching feature set included with the ISA 2006 Firewall’s Standard Edition.
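The key property of CARP is that any Web Proxy client or array member can compute, from the requested URL alone, which array member should hold the object, so the caches do not need to query one another. The sketch below uses the same “highest score wins” (rendezvous) style of hashing; the exact hash function and the per-member load factors defined by CARP differ, so treat the member names and the carp_style_pick function as assumptions for illustration only.

import hashlib

def carp_style_pick(url, members):
    # Score every array member against the URL and pick the highest score.
    # CARP defines its own hash and weights members by load factor; this
    # generic rendezvous hash only demonstrates the principle.
    def score(member):
        digest = hashlib.md5((member + url).encode()).digest()
        return int.from_bytes(digest[:8], "big")
    return max(members, key=score)

# Every client and every cache computes the same answer independently,
# so requests for a given URL always land on the same array member.
array = ["proxy1.example.local", "proxy2.example.local", "proxy3.example.local"]
print(carp_style_pick("http://www.isaserver.org/tutorials/", array))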
Summary

In this article we did a short review of the Web caching features included with the ISA Firewall. We saw that the ISA firewall is able to perform Web caching for outbound requests, which is called Forward Caching. The ISA Firewall is also able to perform Web caching for inbound requests, which is called Reverse Caching. There are several types of ISA Web caching architectures that you can build out. The two most common are distributed caching and hierarchical caching. Distributed caching helps conserve disk space and hierarchical caching helps reduce bandwidth. You can combine distributed and hierarchical caching to get the best of both worlds.






