Introduction to Computing History (01:28)
World War II code breaking efforts led to the invention of computers. A computer is just a machine that manipulates data according to a user's instructions.
Antikythera Mechanism (00:52)
The first known precursor to the computer was discovered in the wreck of an ancient Roman ship in 1900 but not understood until decades afterward. It calculated the positions of the sun and moon.
Father of the Computer (01:28)
In the nineteenth century, Charles Babbage conceived designs for mechanical computers, but none was built in his lifetime; a working model was completed only in 1991.
First Computer Programmer (02:12)
Early computers ran on punch cards, originally designed to automate weaving. Countess Augusta Ada Byron, better known as Ada Lovelace, wrote an analysis of Babbage's computers and created programs for them.
Binary System Invented (02:56)
Binary code, a two-digit numbering system, made digital computers possible; its earliest formulation was long lost and forgotten. Calculators work because the buttons you press feed the machine binary codes; they don't literally add up decimal numbers.
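The idea that a calculator manipulates two-digit codes rather than decimal numbers can be illustrated with a short sketch. This is not how any particular calculator is built, just a minimal demonstration that addition can be carried out purely with bit-level switch logic:

```python
def add_binary(a: int, b: int) -> int:
    """Add two non-negative integers using only bitwise operations,
    the way on/off switch logic does, with no decimal arithmetic."""
    while b:
        carry = a & b    # positions where both bits are 1 produce a carry
        a = a ^ b        # XOR combines the bits, ignoring the carry
        b = carry << 1   # shift the carry into the next higher position
    return a

print(bin(6), bin(3))       # the two-digit (0/1) codes for 6 and 3
print(add_binary(6, 3))     # → 9
```

Each pass of the loop is just the AND, XOR, and shift operations that switching circuits perform directly, which is why binary made digital computers possible.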
Turing and Code Breaking (03:05)
Before automated calculators, people relied on printed mathematical tables to perform complex calculations, which could take days to complete. Alan Turing's theoretical work and his World War II code breaking led to the creation of modern computers.
IBM and Punch Card Business (02:27)
IBM initially based its business on punch cards for computers, and worried non-punch card computers would harm its card-processing business. Whether a hole was punched or not was the basis for a kind of binary code.
The first computer resembling those of today was ENIAC. Computers speak the language of on-and-off switches; after ENIAC, transistors replaced vacuum tubes.
Microprocessors made possible computers small enough to use at home, though at first only hobbyists could use them.
Apple and Floppy Disks (03:35)
Steve Wozniak, using a new design, created an easy-to-use computer. In the Apple II, Wozniak and Jobs used audio cassettes, which store data by magnetic recording, as the first portable storage medium. Moore's Law is discussed.
Powerful supercomputers take on functions like modeling the human brain. A machine's physical size increases the time signals take to travel, limiting speed; one solution may be to send signals using light rather than electricity.
Credits: Electric Mind: Quirky Science (00:43)