Summary: The document describes the history of computing from the first mechanical calculators to modern computers, through successive generations marked by the evolution of technologies (transistors, integrated circuits, microprocessors). It also presents the main concepts of computing as well as significant programming languages and operating systems.

Chapter 1: Introduction to Computer Science

1. Definition of information
Sense 1: Information is news, data, or documentation about something or someone, brought to a person's attention. Example: a news report.

Sense 2: In computing and telecommunications, information is an element of knowledge (voice, data, image) that can be stored, processed, or transmitted using a medium and a standardized coding scheme.
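As a concrete illustration of a "standardized coding scheme" (an addition to the original notes, not from them): text characters are commonly coded as numbers following the ASCII/Unicode standards, and those numbers are stored as binary digits.

```python
# Text is information coded in a standardized way:
# each character maps to a number (Unicode/ASCII), stored as binary.
ch = "A"
code = ord(ch)               # the standardized numeric code of 'A'
bits = format(code, "08b")   # its 8-bit binary representation

print(code)   # 65
print(bits)   # 01000001
```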

It exists in several forms:

. Auditory form: sound, music
. Visual form: image, text
. Audiovisual form: games, films, cartoons
2. Information science
An interdisciplinary science that studies the coding and measurement of information, as well as its modes of transmission and storage.

3. Information processing
Information processing is a logical sequence of actions that transforms data into results.

Examples:

. constantine → (uppercase, bold, underline) → CONSTANTINE

. 3, 15, 30, 2 → (30 - 15) + (3 * 2) → 21
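The two examples above can be sketched in Python (the function names are illustrative, not from the original notes):

```python
# "Information processing": data -> sequence of actions -> result.

def to_upper(text: str) -> str:
    """Transform input text to uppercase (bold/underline are display styling)."""
    return text.upper()

def combine(a: int, b: int, c: int, d: int) -> int:
    """Apply the arithmetic processing (c - b) + (a * d) to four input numbers."""
    return (c - b) + (a * d)

print(to_upper("constantine"))   # CONSTANTINE
print(combine(3, 15, 30, 2))     # (30 - 15) + (3 * 2) = 21
```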

4. Definition of computer science

The French word "informatique" (computer science) is the contraction of two words: information and automatique (automatic).
Computer science is defined as the science that studies the automatic processing of information by a computer.

5. Computer (hardware)
A computer is a machine (a set of electronic circuits) that enables the automatic processing of information.
a. Computer hardware
Central unit: the brain of the computer; it is responsible for processing the information.
Peripherals: devices connected to the central unit that allow the input and/or output of information.

There are three types of peripherals:

. Input devices: send information (data) to the central unit.

Example: keyboard, mouse, microphone, scanner...

. Output devices: return information (results) from the central unit.

Example: screen, printer, speaker...

. Storage devices (input/output): save information permanently.

Example: hard disk, flash drive, CD-ROM...

6. Software

Software is a set of programs that enables the use of the computer's hardware. There are two main categories of software:

. System software (operating system): the central set of programs of a computer that serves as an interface between the hardware and the application software.

Example: Windows, Linux, macOS

. Application software: programs used directly by the user to carry out a task.

Example: to draw, we use Paint; to process text, we use Microsoft Word.

7. Computer system

A computer system is the set of hardware and software resources.

Computer system = Hardware + software.


History of computing

1. The reasons
Humans have always sought to improve their way of calculating, for two reasons:

. humans are slow;
. humans make mistakes!

2. Prehistory
Since ancient times, humans have aided their calculations with pebbles placed on the ground or on a flat stone.

- 700 BC: pebbles strung on rods and cleverly arranged form what is called the Chinese abacus.

3. Ancestors and precursors


The 17th century sees the birth of mechanical calculators.

1642: Blaise Pascal, at the age of 19, creates the 'Pascaline', a mechanical calculating machine based on
gear wheels, capable of performing addition and subtraction.

1673: Leibniz, a great mathematician, improves the Pascaline by adding multiplication and division.

1805: Jacquard creates automatic looms, which use "programs" in the form of punched cards.

1822-1833: Charles Babbage invents two mechanical machines: a specialized calculator (the Difference Engine) and a universal calculator (the Analytical Engine). These machines remained at the design stage.

1847: The English logician George Boole publishes his book 'The Mathematical Analysis of Logic', in which he defines the so-called 'Boolean' logical operators, based on the two values 0/1 coding True/False.
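Boolean operators remain the basis of every modern programming language; a minimal illustration in Python (added here for clarity, not part of the original notes):

```python
# Boolean logic: two values (0/1, False/True) and the basic operators.
a, b = True, False

print(int(a))     # True is coded as 1
print(a and b)    # AND: True only if both operands are True -> False
print(a or b)     # OR: True if at least one operand is True -> True
print(not a)      # NOT: inverts the value -> False
```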

- 1936-1938: the German Konrad Zuse builds an automatic mechanical calculator, the Z1.

- 1936: Alan Turing proposes his definition of 'Turing machines'.

1945: John von Neumann writes a report in which he proposes the internal architecture of a universal calculator (computer), now called the 'von Neumann architecture'.

Contemporary history

From this date on, the computer exists; its hardware history is therefore the evolution of technological progress, which is usually broken down into generations.

1. First generation: the monsters

1946: construction of the ENIAC (Electronic Numerical Integrator and Computer): it weighs 30 tonnes, occupies 160 m², and its memory consists of 18,000 vacuum tubes; its power is equivalent to that of a small present-day calculator.

1947: Invention of the transistor

1949: construction of the EDVAC, the first computer built according to the von Neumann architecture, storing its data on magnetic disks.

1952: IBM commercializes its first vacuum-tube computers, the IBM 701, followed by the IBM 650.

1955: IBM launches the IBM 704, developed by Gene Amdahl. It is the first commercial machine with a math (floating-point) coprocessor. Power: 5 kFLOPS.

2. Second generation: integration of the transistor (1956 - 1960)

-1956: Creation of the first transistor computer, the TRADIC.

1957: creation of the first universal programming language, FORTRAN (FORmula TRANslator), by John Backus of IBM.

- 1958: the IBM 7044, with 64 kilobytes of memory, is IBM's first computer to integrate transistors; John McCarthy invents LISP, the first Artificial Intelligence language.

1959: design of COBOL (COmmon Business Oriented Language), a programming language specialized for management and the banking sector.

1960: design of ALGOL (ALGOrithmic Language), an advanced scientific programming language.

3. Third generation: integrated circuits

1962: the term "informatique" (computer science) is coined in France by contraction of "information" and "automatique".

1964: use of integrated circuits (miniature electronic circuits)

1969: first attempt at remote file transfer over the ARPANET network, ancestor of the Internet; invention of the PASCAL language.

1971: introduction of floppy disks for the IBM 370; design of the LOGO language, intended as an educational introduction to programming concepts.
4. Fourth generation: microcomputers

1971: the first Intel microprocessor (the Intel 4004) contains 2,300 transistors and executes 60,000 instructions per second.

- 1972: design of the C language, particularly suited to the programming of operating systems.

1974: Motorola markets its first 8-bit processor, the 6800

1975: Bill Gates commercializes the BASIC language and creates the company Microsoft with Paul Allen.

- 1976: Steve Jobs (21 years old, working at Atari) and Steve Wozniak (26 years old, working at Hewlett-Packard) finish their computer, which they name the Apple Computer. They found Apple Inc. on April 1, 1976.

1981: IBM launches the PC (for Personal Computer, which means 'personal computer').

5. Fifth generation: the graphical interface and networks

1984: Apple's Macintosh introduces the graphical interface (menus, icons...) and the mouse for the first time; design of the C++ language, an object-oriented version of the C language.

1985: Microsoft introduces its new graphical interface Microsoft Windows 1.0

1989: The era of the Internet begins and the World Wide Web is invented.

1995: Windows 95 generalizes the graphical interface on PCs.

1998: birth of Google
