Next: Proxy Server Up: Binonymizer - A Two-Way Previous: Requirements


The system consists of two main parts: a Web server that is enhanced by several standard modules, and the ``scrambler''. The Web server is responsible for answering user requests and for sending the requested documents to the clients. The scrambler runs as an independent process that communicates with the Web server. Its two main functions are scrambling plain URLs and resolving scrambled URLs (SURLs).
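The scrambler's two core operations could be sketched roughly as follows. This is a minimal illustration only; the class name, the table-based token scheme, and the /surl/ path prefix are assumptions for the sketch, not the system's actual design:

```python
import secrets

class Scrambler:
    """Illustrative sketch of the scrambler's two functions:
    scrambling plain URLs into opaque SURLs, and resolving
    SURLs back to plain URLs."""

    def __init__(self):
        # token -> plain URL (an assumed in-memory table)
        self._table = {}

    def scramble(self, url):
        # Replace the plain URL with a random, opaque token.
        token = secrets.token_urlsafe(16)
        self._table[token] = url
        return "/surl/" + token

    def resolve(self, surl):
        # Map an SURL back to its plain URL; None if unknown.
        token = surl.rsplit("/", 1)[-1]
        return self._table.get(token)

scrambler = Scrambler()
surl = scrambler.scramble("http://example.org/doc.html")
```

A real implementation would additionally tie each token to a session, as described below.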

The following figure shows the architecture of the system.

Figure 1: Architecture
[Image: Architecture.eps]

For a better understanding of how the system works, we trace the life cycle of a scrambled URL:

The first user request contains a plain, valid URL. The server locates the corresponding document and parses it. Each URL found in the document is sent to the scrambler. Two cases have to be distinguished: either the URL refers to a document that resides locally on the same server, or it is an external URL. In the latter case, a proxy is needed to retrieve the requested documents from external servers; this proxy is described in a section of its own. In either case, it does not matter which media type is referenced or which HTML tag contains the URL.
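The parsing step might look roughly like this sketch using Python's standard HTML parser. The class name, the restriction to href/src attributes, and the hostname comparison are simplifying assumptions; the point is that every URL-bearing attribute is collected regardless of tag or media type, and external hosts are flagged for the proxy case:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class URLCollector(HTMLParser):
    """Collects every URL found in a document and classifies it
    as local (same server) or external (proxy needed)."""

    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.urls = []  # list of (url, is_external) pairs

    def handle_starttag(self, tag, attrs):
        # Inspect URL-bearing attributes on any tag.
        for name, value in attrs:
            if name in ("href", "src") and value:
                host = urlparse(value).netloc
                external = bool(host) and host != self.own_host
                self.urls.append((value, external))

parser = URLCollector("www.example.org")
parser.feed('<a href="/local.html">x</a><img src="http://other.net/pic.png">')
```

Each collected URL would then be handed to the scrambler and replaced by the returned SURL before the document is delivered.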

After a document has been parsed and all URLs have been substituted, the Web server sends the document to the user. This connection is secured by an SSL or TLS connection between the Web server and the client; the content of the document is therefore protected against packet sniffing.

Current Web browsers store the URLs of visited web sites in a history list, even the URLs of securely received documents. This makes it possible for a tool to retrieve the same documents the user has requested. To prevent this attack, SURLs are only valid within the same SSL/TLS session between client and server.

If the user requests an SURL, the browser sends the SURL back to the Web server. The Web server determines whether the requested URL is scrambled and contacts the scrambler if necessary. The scrambler receives the SURL and examines it. If the SURL is not valid within the current session, the scrambler denies access to the requested document. Otherwise, it returns the corresponding plain URL to the Web server, which can then retrieve the document.
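The resolution step could be sketched as follows, assuming a hypothetical lookup table that records which session each token was issued in; the /surl/ prefix convention and the table layout are illustrative assumptions:

```python
SURL_PREFIX = "/surl/"

# Assumed scrambler state: token -> (plain URL, issuing session id)
table = {"abc": ("http://example.org/doc.html", "sess-1")}

def resolve_request(path, session_id):
    """Return the plain URL to serve, or None to deny access."""
    if not path.startswith(SURL_PREFIX):
        return path  # a plain URL: serve it directly
    entry = table.get(path[len(SURL_PREFIX):])
    if entry is None or entry[1] != session_id:
        return None  # unknown token or wrong session: deny
    return entry[0]  # valid SURL: hand the plain URL to the server
```

The denial branch is what enforces the session restriction described above: a recorded SURL replayed in a later session resolves to nothing.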

Tim Wellhausen