Computing: A Concise History
MIT Press
The history of computing could be told as the story of hardware and
software, or the story of the Internet, or the story of "smart" hand-held
devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In
this concise and accessible account of the invention and development of digital
technology, computer historian Paul Ceruzzi offers a broader and more useful
perspective. He identifies four major threads that run throughout all of computing's
technological development: digitization, the coding of information, computation, and
control in binary form (ones and zeros); the convergence of multiple streams of
techniques, devices, and machines, yielding more than the sum of their parts; the
steady advance of electronic technology, famously characterized by "Moore's
Law"; and the human-machine interface. Ceruzzi guides us through computing
history, telling how a Bell Labs mathematician coined the word "digital"
in 1942 (to describe a high-speed method of calculating used in anti-aircraft
devices), and recounting the development of the punch card (for use in the 1890 U.S.
Census). He describes the ENIAC, built for scientific and military applications; the
UNIVAC, the first general-purpose commercial computer; and ARPANET, the Internet's precursor.
Ceruzzi's account traces the world-changing evolution of the computer from a
room-sized ensemble of machinery to a "minicomputer" to a desktop computer
to a pocket-sized smartphone. He describes the development of the silicon chip,
which stored ever-increasing amounts of data and enabled ever-smaller
devices. He visits that hotbed of innovation, Silicon Valley, and brings the story up
to the present with the Internet, the World Wide Web, and social
networking.