Time-sharing


Time-sharing is the sharing of a computing resource among many users by means of multiprogramming and multi-tasking. Its introduction in the 1960s, and emergence as the prominent model of computing in the 1970s, represents a major technological shift in the history of computing.

By allowing a large number of users to interact concurrently with a single computer, time-sharing dramatically lowered the cost of providing computing capability, made it possible for individuals and organizations to use a computer without owning one, and promoted the interactive use of computers and the development of new interactive applications.

History

Batch processing

The earliest computers were extremely expensive devices, and very slow. Machines were typically dedicated to a particular set of tasks and operated from a control panel, with the operator manually entering small programs via switches in order to load and run other programs. These programs might take hours, even weeks, to run. As computers grew in speed, run times dropped, and the time taken to start up the next program became a significant concern. Batch processing methodologies evolved to decrease this dead time by queuing up programs so that as soon as one completed, the next would start.
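The queuing idea can be sketched in a few lines of Python. The job names and run times below are hypothetical; the point is only that jobs run back-to-back, so the machine is never idle between them the way it was when an operator loaded each program by hand.

```python
from collections import deque

# A FIFO batch queue of (job name, run time in minutes); values are illustrative.
queue = deque([("payroll", 30), ("inventory", 45), ("report", 10)])

clock = 0  # elapsed machine time in minutes
while queue:
    name, runtime = queue.popleft()
    print(f"t={clock:3d}: start {name} ({runtime} min)")
    clock += runtime  # the next job starts the instant this one finishes
print(f"t={clock:3d}: batch complete")
```

With no dead time between jobs, the batch finishes in exactly the sum of the individual run times (85 minutes here), which is what made batching attractive on machines whose every cycle was expensive.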

To support a batch processing operation, a number of card punches or paper tape writers would be used by programmers, who would use these inexpensive machines to write their programs "offline". Once typed, the programs were submitted to the operations team, who would schedule them for running. Important programs were run quickly; the turnaround for less important ones was unpredictable. When the program was finally run, the output, generally printed, would be returned to the programmer. The complete process might take days, during which the programmer might never see the computer.

The alternative, allowing the user to operate the computer directly, was generally far too expensive to consider, because the machine sat idle through the long delays while the user slowly entered code. This limited developments in direct interactivity to organizations that could afford to waste computing cycles, large universities for the most part. Programmers at the universities decried the dehumanizing workflow that batch processing imposed, to the point that Stanford students made a short film humorously critiquing it. They experimented with new ways to directly interact with the computer, a field today known as human-computer interaction.
