6.1 Workstations and Servers

6.1.3 Client-server relationship
The client-server computing model distributes processing across multiple computers. Distributed processing enables access to remote systems for the purpose of sharing information and network resources. In a client-server environment, the client and server share or distribute processing responsibilities. Most network operating systems are designed around the client-server model to provide network services to users. A computer on a network can be referred to as a host, workstation, client, or server. Any computer running TCP/IP, whether a workstation or a server, is considered a host.
An example of a client-server relationship is a File Transfer Protocol (FTP) session. FTP is a universal method of transferring a file from one computer to another. For the client to transfer a file to or from the server, the server must be running the FTP daemon or service. The client requests the file to be transferred, and the server provides the services necessary to receive or send it.

The Internet is another good example of a distributed processing client-server relationship. The client, or front end, typically handles user presentation functions such as screen formatting, input forms, and data editing. This is done with a browser, such as Netscape or Internet Explorer. Web browsers send requests to web servers. When the browser requests data from the server, the server responds, and the browser displays the HTML data that it receives. The server, or back end, handles the client's requests for web pages and provides HTTP or WWW services.

A third example of a client-server relationship is a database server with a data entry or query client in a LAN. The client, or front end, might be running an application written in C or Java, while the server, or back end, could be running Oracle or other database management software. In this case, the client handles formatting and presentation tasks for the user, and the server provides database storage and data retrieval services.
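The request-response pattern behind all of these examples can be sketched in a few lines of code. The following is a minimal illustration, not a real FTP or web service: a toy server thread accepts one connection and answers a request, and a client connects, sends a request, and reads the reply. The port choice, message text, and buffer size are illustrative assumptions.

```python
# Minimal sketch of the client-server request/response pattern.
# The "daemon" here is a toy server that answers a single request.
import socket
import threading

def run_server(sock):
    """Server (back end): wait for a request, do the work, send a reply."""
    conn, _addr = sock.accept()
    with conn:
        request = conn.recv(1024).decode()
        # The server performs the processing and returns only the result.
        conn.sendall(f"HELLO {request}".encode())

# Listen on a free local port (port 0 lets the OS pick one).
server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))
server_sock.listen(1)
port = server_sock.getsockname()[1]
threading.Thread(target=run_server, args=(server_sock,), daemon=True).start()

# Client (front end): open a connection, send a request, read the reply.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"client")
    reply = client.recv(1024).decode()

print(reply)  # HELLO client
```

Real services such as FTP and HTTP follow this same shape, with a standardized message format layered on top of the socket connection.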
In a typical file server environment, the client might have to retrieve large portions of the database files and process them locally, which can generate excess network traffic. With the client-server model, the client presents a request to the server, and the server database engine might process 100,000 records but pass only a few back to the client to satisfy the request. Servers are typically much more powerful than client computers and are better suited to processing large amounts of data. With client-server computing, the large database is stored on the server, and the processing takes place there as well. The client only has to create the query. A relatively small amount of data or results is passed across the network, which satisfies the client query while using less network bandwidth. The graphic shows an example of client-server computing. Note that the workstation and server would normally be connected to the LAN by a hub or switch.
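The database example above can be made concrete with a short sketch. Here sqlite3 runs in-process rather than over a network, so it illustrates the principle rather than real network traffic; the table name, row count, and filter condition are illustrative assumptions. The point is that the database engine scans all 100,000 rows, but only the matching rows come back to the client.

```python
# Sketch: the client sends a query, the database engine processes
# many records, and only a small result set returns to the client.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, i % 1000) for i in range(100_000)],  # 100,000 records
)

# The "server" scans all 100,000 rows but returns only the matches.
rows = conn.execute(
    "SELECT id, amount FROM orders WHERE amount > 997"
).fetchall()

print(len(rows))  # 200 of 100,000 rows come back to the client
```

In a file server environment, by contrast, the client would have to copy the table across the network and run the filter itself, moving all 100,000 rows instead of 200.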
The distribution of functions in client-server networks brings substantial advantages, but also incurs some costs. Although the aggregation of resources on server systems brings greater security, simpler access, and coordinated control, the server introduces a single point of failure into the network. Without an operational server, the network cannot function at all. Additionally, servers require trained, expert staff to administer and maintain them, which increases the expense of running the network. Server systems require additional hardware and specialized software that adds substantially to the cost.