Project: Information Technology
Class: X-B
School: “Alush Bardhi”, Farke
Subject: English
Accepted by: Fitim Domi
Prepared by: Amelja Aga
Sadber Hafuzi
Melisa Bela
Beraldo Shadini
INFORMATION TECHNOLOGY (IT)
• Information technology (IT) is a set of related fields that encompass computer systems,
software, programming languages, data and information processing, and storage. IT forms
part of information and communications technology (ICT). An information technology
system (IT system) is generally an information system, a communications system, or, more
specifically speaking, a computer system — including all hardware, software, and peripheral
equipment — operated by a limited group of IT users, and an IT project usually refers to the
commissioning and implementation of an IT system. IT systems play a vital role in
facilitating efficient data management, enhancing communication networks, and supporting
organizational processes across various industries. Successful IT projects require meticulous
planning, seamless integration, and ongoing maintenance to ensure optimal functionality
and alignment with organizational objectives.
• Although humans have been storing, retrieving, manipulating, and
communicating information since the earliest writing systems were
developed, the term information technology in its modern sense first
appeared in a 1958 article published in the Harvard Business Review; authors
Harold J. Leavitt and Thomas L. Whisler commented that "the new technology
does not yet have a single established name. We shall call it information
technology (IT)." Their definition consists of three categories: techniques for
processing, the application of statistical and mathematical methods to
decision-making, and the simulation of higher-order thinking through
computer programs.
HISTORY
• Ideas of computer science were first discussed before the 1950s at the
Massachusetts Institute of Technology (MIT) and Harvard University, where researchers
debated and began thinking about computer circuits and numerical calculations. As time
went on, the field of information technology and computer science became more
complex and was able to handle the processing of more data, and scholarly articles began
to be published by different organizations. Looking at early computing, Alan Turing,
J. Presper Eckert, and John Mauchly are considered some of the major pioneers of
computer technology in the mid-1900s; most of their efforts were focused on designing
the first digital computer. Alongside that work, topics such as artificial intelligence
began to be raised, as Turing was starting to question what the technology of the time
period could do.
• Devices have been used to aid computation for thousands of years, probably initially in the form
of a tally stick. The Antikythera mechanism, dating from about the beginning of the first century
BC, is generally considered the earliest known mechanical analog computer, and the earliest
known geared mechanism.[12] Comparable geared devices did not emerge in Europe until the
16th century, and it was not until 1645 that the first mechanical calculator capable of performing
the four basic arithmetical operations was developed. Electronic computers, using either relays or
valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941,
was the world's first programmable computer, and by modern standards one of the first
machines that could be considered a complete computing machine. During the Second World
War, Colossus, the first electronic digital computer, was developed to decrypt German messages.
Although it was programmable, it was not general-purpose, being designed to perform only a
single task. It also lacked the ability to store its program in memory; programming was carried
out using plugs and switches to alter the internal wiring. The first recognizably modern electronic
digital stored-program computer was the Manchester Baby, which ran its first program on 21
June 1948. The development of transistors in the late 1940s at Bell Laboratories allowed a new
generation of computers to be designed with greatly reduced power consumption. The first
commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and
had a power consumption of 25 kilowatts. By comparison, the first transistorized computer,
developed at the University of Manchester and operational by November 1953, consumed only
150 watts in its final version.
DATA PROCESSING
• Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which
data was represented by a series of holes, a technology now obsolete. Electronic data storage, which is
used in modern computers, dates from World War II, when a form of delay-line memory was developed to
remove the clutter from radar signals, the first practical application of which was the mercury delay line.
The first random-access digital storage device was the Williams tube, which was based on a standard
cathode ray tube. However, the information stored in it and in delay-line memory was volatile in that
it had to be continuously refreshed, and thus was lost once power was removed. The earliest form of
non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1,
the world's first commercially available general-purpose electronic computer. IBM introduced the first hard
disk drive in 1956, as a component of their 305 RAMAC computer system. Most digital data today is still
stored magnetically on hard disks, or optically on media such as CD-ROMs. Until 2002 most
information was stored on analog devices, but that year digital storage capacity exceeded analog for the
first time. As of 2007, almost 94% of the data stored worldwide was held digitally: 52% on hard disks,
28% on optical devices, and 11% on digital magnetic tape. It has been estimated that the worldwide
capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295
exabytes in 2007, doubling roughly every 3 years.
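The "doubling roughly every 3 years" figure can be checked with a short exponential-growth calculation, using the 3 and 295 exabyte figures given above:

```python
import math

# Estimated worldwide storage capacity (figures from the text):
# about 3 exabytes in 1986, 295 exabytes in 2007.
start_eb, end_eb = 3, 295
years = 2007 - 1986  # 21 years

# A doubling time T satisfies 2**(years / T) = end / start,
# so T = years * log(2) / log(end / start).
doubling_time = years * math.log(2) / math.log(end_eb / start_eb)
print(round(doubling_time, 1))  # → 3.2
```

The result, about 3.2 years, agrees with the rough 3-year doubling stated in the text.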
SERVICES
• Email
• Email is the technology, and the services built on it, for sending and receiving electronic
messages (called "letters" or "electronic letters") over a distributed (including global)
computer network. In its composition of elements and principle of operation, electronic mail
largely mirrors the system of regular (paper) mail, borrowing both its terms (mail, letter,
envelope, attachment, box, delivery, and others) and its characteristic features: ease of use,
message transmission delays, and sufficient reliability combined with no guarantee of
delivery. The advantages of e-mail include: addresses of the form user_name@domain_name
(for example, [email protected]) that are easy for a person to perceive and remember; the
ability to transfer both plain and formatted text, as well as arbitrary files; independence
of servers (in the general case, they address each other directly); sufficiently high reliability
of message delivery; and ease of use by both humans and programs.
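The user_name@domain_name structure described above can be illustrated with a minimal sketch. This is a hypothetical helper, not a full address validator, and the example address is made up:

```python
def split_address(address):
    """Split 'user_name@domain_name' into its two parts (minimal sketch,
    not an RFC 5322 validator)."""
    user, sep, domain = address.partition("@")
    if not sep or not user or not domain:
        raise ValueError("expected user_name@domain_name")
    return user, domain

user, domain = split_address("somebody@example.com")
print(user, domain)  # → somebody example.com
```

Real mail software performs far stricter checks, but the split at "@" is the same structural idea.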
• Search system
• A software and hardware complex with a web interface that provides the ability
to search for information on the Internet. "Search engine" usually refers to the
site that hosts the interface (front-end) of the system. The software part is
the search engine proper: a set of programs that provides the system's
functionality and is usually a trade secret of the company that develops it.
Most search engines look for information on World Wide Web sites, but there
are also systems that can look for files on FTP servers, items in online
stores, and information on Usenet newsgroups. Improving search is one of the
priorities of the modern Internet (see the Deep Web article about the main
problems in the work of search engines).
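The software side of a search engine can be sketched in miniature with an inverted index, the core data structure behind keyword search: each word maps to the set of documents containing it. The documents here are invented for illustration:

```python
from collections import defaultdict

# Toy document collection (made-up snippets).
documents = {
    1: "information technology systems",
    2: "search engines index information",
    3: "technology in modern business",
}

# Build the inverted index: word -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def search(query):
    """Return ids of documents containing every word in the query."""
    results = set(documents)
    for word in query.lower().split():
        results &= index.get(word, set())
    return sorted(results)

print(search("information"))  # → [1, 2]
```

Production engines add crawling, ranking, and enormous scale on top, but the word-to-documents mapping is the same basic idea.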
• Commercial effects
• Companies in the information technology field are often discussed as a group as the "tech
sector" or the "tech industry."[51][52][53] These titles can be misleading at times and should
not be confused with "tech companies", which are generally large-scale, for-profit corporations
that sell consumer technology and software. It is also worth noting that, from a business
perspective, information technology departments are a "cost center" the majority of the time.
A cost center is a department or staff which incurs expenses, or "costs", within a company
rather than generating profits or revenue streams. Modern businesses rely heavily on
technology for their day-to-day operations, so the expenses allocated to cover technology
that facilitates business in a more efficient manner are usually seen as "just the cost of doing
business." IT departments are allocated funds by senior leadership and must attempt to
achieve the desired deliverables while staying within that budget. Government and the
private sector might have different funding mechanisms, but the principles are more or less
the same. This is an often overlooked reason for the rapid interest in automation and artificial
intelligence: the constant pressure to do more with less is opening the door for
automation to take control of at least some minor operations in large companies.
• Many companies now have IT departments for managing the computers,
networks, and other technical areas of their businesses. Companies have
also sought to integrate IT with business outcomes and decision-making
through a BizOps or business operations department. In a business context,
the Information Technology Association of America has defined information
technology as "the study, design, development, application,
implementation, support, or management of computer-based information
systems". The responsibilities of those working in the field
include network administration, software development and installation, and
the planning and management of an organization's technology life cycle, by
which hardware and software are maintained, upgraded, and replaced.