English Language Texts
   This page contains the third part of an English-language text on "The History of the Computer", which groups A and B1 studying with Цапаева Ю.А. are to translate orally into Russian.
   If you wish, you can download an archived Word version of this English text (6,117 bytes, RAR 2.0).
   Advances in the 1950's
   Early in the 1950's, two important engineering discoveries changed the image of the electronic computer field from one of fast but unreliable hardware to one of relatively high reliability and even greater capability. These discoveries were the magnetic core memory and the transistor circuit element. They quickly found their way into new models of digital computers. By the 1960's, RAM capacities had increased from 8,000 to 64,000 words in commercially available machines, with access times of 2 to 3 μs (microseconds). These machines were very expensive to purchase or even to rent and were particularly expensive to operate because of the cost of programming. Such computers were mostly found in large computer centers operated by industry, government, and private laboratories, staffed with many programmers and support personnel. This situation led to modes of operation that enabled the machines' high potential to be shared.
   One such mode is batch processing, in which problems are prepared and then held ready for computation on a relatively cheap storage medium. Magnetic drums, magnetic-disk packs, or magnetic tapes were usually used. When the computer finishes with a problem, it "dumps" the whole problem (program and results) onto one of these peripheral storage units and starts on a new problem.
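The batch cycle described above can be sketched as a simple job queue. This is a hypothetical illustration in modern Python, not any historical system's actual code; the job names and structure are invented:

```python
from collections import deque

def run_batch(jobs):
    """Process prepared jobs one at a time; dump each finished
    job's name and result to 'storage' before starting the next,
    mirroring the batch cycle described above."""
    queue = deque(jobs)       # problems prepared in advance, held ready
    storage = []              # stands in for a drum, disk pack, or tape
    while queue:
        job = queue.popleft()                    # take the next waiting problem
        result = job["program"](job["input"])    # compute it
        storage.append({"job": job["name"], "result": result})  # "dump" it
    return storage

# Example: two prepared problems held ready for computation
jobs = [
    {"name": "payroll", "program": sum, "input": [100, 250, 75]},
    {"name": "square",  "program": lambda x: x * x, "input": 12},
]
print(run_batch(jobs))
```

The point of the structure is that the processor never waits on a user: it drains the queue problem by problem, touching peripheral storage only between jobs.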
   Another mode for fast, powerful machines is called time-sharing. In time-sharing, the computer processes many jobs in such rapid succession that each job runs as if the other jobs did not exist, thus keeping each "customer" satisfied. Such operating modes need elaborate executive programs to attend to the administration of the various tasks.
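The "rapid succession" idea is essentially round-robin scheduling. A minimal sketch, with jobs modeled as Python generators (an assumption for illustration; real executive programs switched hardware state, not generators):

```python
def time_share(jobs, quantum=1):
    """Round-robin scheduler: give each job a short time slice in
    rapid succession, so every 'customer' sees steady progress."""
    jobs = list(jobs)
    finished = []
    while jobs:
        job = jobs.pop(0)                 # next job in line
        try:
            for _ in range(quantum):
                next(job["steps"])        # run one step of this job
            jobs.append(job)              # slice used up; back of the line
        except StopIteration:
            finished.append(job["name"])  # job complete
    return finished

def count_to(n):
    """A toy 'job' that takes n steps of work."""
    for i in range(n):
        yield i

jobs = [{"name": "A", "steps": count_to(3)},
        {"name": "B", "steps": count_to(1)}]
print(time_share(jobs))
```

Because each slice is short, short jobs finish early even when a long job was submitted first, which is exactly why each "customer" stays satisfied.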
   Advances in the 1960's
   In the 1960's, efforts to design and develop the fastest possible computer with the greatest capacity reached a turning point with the LARC machine, built for the Livermore Radiation Laboratories of the University of California by the Sperry-Rand Corporation, and the Stretch computer by IBM. The LARC had a base memory of 98,000 words and multiplied in 10 μs (microseconds). Stretch was made with several degrees of memory having slower access for the ranks of greater capacity, the fastest access time being less than 1 μs and the total capacity in the vicinity of 100,000,000 words.
   During this period, the major computer manufacturers began to offer a range of capabilities and prices, as well as accessories such as:
    · Consoles
    · Card feeders
    · Page printers
    · Cathode-ray-tube displays
    · Graphing devices
   These were widely used in businesses for such things as:
    · Accounting
    · Payroll
    · Inventory control
    · Ordering supplies
    · Billing
   CPU's for these uses did not have to be very fast arithmetically; they were usually used to access large numbers of records on file and keep them up to date. By far the largest number of computer systems were sold for such simpler uses as hospitals (keeping track of patient records, medications, and treatments given). They were also used in libraries, such as the National Medical Library retrieval system, and in the Chemical Abstracts System, where computer records on file now cover nearly all known chemical compounds.
   More Recent Advances
   The trend during the 1970's was, to some extent, away from very powerful, single-purpose computers and toward a larger range of applications for cheaper computer systems. Most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, now used computers of smaller capability for controlling and regulating their jobs.
   In the 1960's, the problems in programming applications were an obstacle to the independence of medium-sized on-site computers, but gains in applications-programming language technology removed these obstacles. Applications languages were now available for controlling a great range of manufacturing processes, for using machine tools with computers, and for many other things. Moreover, a new revolution in computer hardware was under way, involving shrinking of computer-logic circuitry and of components by what are called large-scale integration (LSI) techniques. In the 1950's it was realized that "scaling down" the size of electronic digital computer circuits and parts would increase speed and efficiency, and thereby improve performance, if only a way could be found to do this. About 1960, photo printing of conductive circuit boards to eliminate wiring became more developed. Then it became possible to build resistors and capacitors into the circuitry by the same process. In the 1970's, vacuum deposition of transistors became the norm, and entire assemblies, with adders, shifting registers, and counters, became available on tiny "chips".
   In the 1980's, very-large-scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, became more and more common. In the 1970's, many companies, some new to the computer field, introduced programmable minicomputers supplied with software packages. The "shrinking" trend continued with the introduction of personal computers (PC's), programmable machines small enough and inexpensive enough to be purchased and used by individuals.
   Many companies, such as Apple Computer and Radio Shack, introduced very successful PC's in the 1970's, encouraged in part by a fad in computer (video) games. In the 1980's some friction occurred in the crowded PC field, with Apple and IBM staying strong. In the manufacture of semiconductor chips, the Intel and Motorola Corporations were very competitive into the 1980's, although Japanese firms were making strong economic advances, especially in the area of memory chips. By the late 1980's, some personal computers were run by microprocessors that, handling 32 bits of data at a time, could process about 4,000,000 instructions per second.
   Microprocessors equipped with read-only memory (ROM), which stores constantly used, unchanging programs, now performed an increased number of process-control, testing, monitoring, and diagnostic functions, such as automobile ignition control, automobile-engine diagnosis, and production-line inspection duties.
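The ROM role described above, a fixed program consulted but never modified at run time, can be sketched as a read-only lookup table. The fault codes below are invented for illustration and do not correspond to any real automotive standard:

```python
from types import MappingProxyType

# A read-only mapping stands in for ROM: contents fixed at "manufacture",
# so the diagnostic program can consult but never alter them.
# These fault codes are hypothetical examples.
DIAGNOSTIC_ROM = MappingProxyType({
    0x01: "ignition timing out of range",
    0x02: "sensor voltage low",
    0x03: "production-line tolerance exceeded",
})

def diagnose(code):
    """Look up a fault code in the fixed table."""
    return DIAGNOSTIC_ROM.get(code, "unknown fault")

print(diagnose(0x02))
```

Any attempt to write into `DIAGNOSTIC_ROM` raises a `TypeError`, which loosely models the "unchanging program" property the paragraph describes.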
   Cray Research and Control Data Inc. dominated the field of supercomputers, the most powerful computer systems, through the 1970's and 1980's. In the early 1980's, however, the Japanese government announced a gigantic plan to design and build a new generation of supercomputers. This new generation, the so-called "fifth" generation, uses new technologies in very-large-scale integration, along with new programming languages, and will be capable of amazing feats in the area of artificial intelligence, such as voice recognition.
   Progress in the area of software has not matched the great advances in hardware. Software has become the major cost of many systems because programming productivity has not increased very quickly. New programming techniques, such as object-oriented programming, have been developed to help relieve this problem. Despite difficulties with software, however, the cost per calculation of computers is rapidly lessening, and their convenience and efficiency are expected to increase in the near future.
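The productivity gain object-oriented programming promises comes largely from writing shared behavior once and overriding only what differs. A minimal sketch, with class and device names invented for illustration:

```python
class Peripheral:
    """Base class: common behavior written once and reused,
    the kind of code reuse object-oriented programming promotes."""
    def __init__(self, name):
        self.name = name

    def describe(self):
        return f"{self.name}: {self.status()}"

    def status(self):
        return "idle"

class Printer(Peripheral):
    """Subclass overrides only the part that differs."""
    def __init__(self, name, pages=0):
        super().__init__(name)
        self.pages = pages

    def status(self):
        return f"{self.pages} pages printed"

# The same describe() call works for every device, old or new.
devices = [Peripheral("console"), Printer("page printer", pages=42)]
for d in devices:
    print(d.describe())
```

Adding a new device type means writing one small subclass rather than revisiting every routine that handles devices, which is where the productivity relief comes from.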
   The computer field continues to experience huge growth. Computer networking, computer mail, and electronic publishing are just a few of the applications that have grown in recent years. Advances in technology continue to produce cheaper and more powerful computers, offering the promise that in the near future, computers or terminals will reside in most, if not all, homes, offices, and schools.