INTRO TO COMPUTING
INTRODUCTION TO THE COURSE
FIRST SEMESTER PRELIM
INTRODUCTION TO COMPUTING

What is Computing?

Definition: Computing is any activity that is goal-oriented and requires, benefits from, or creates computers. It encompasses the use and development of algorithms, software, hardware, data structures, and networks.

Disciplines: According to CMO No. 25, s. 2015, computing includes several disciplines:

o Computer Science
o Information Systems
o Information Technology
o Software Engineering
o Computer Engineering

Knowledge Areas in Computing (CHED Framework)

Area | Description
Algorithms & Complexity | The study of computational problems and their efficiency.
Programming Fundamentals | Writing code in multiple languages.
Software Development | Designing, testing, and maintaining software systems.
Computer Architecture | Understanding hardware components and system design.
Operating Systems | Managing hardware-software interaction.
Networking & Communication | Data transmission and network protocols.
Data Management | Storing, managing, and retrieving data.
Human-Computer Interaction | Designing user-friendly interfaces.
Security | Protecting systems and data from threats.
Social & Professional Issues | Ethics, laws, and societal impact.
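To give a feel for the "Algorithms & Complexity" area, here is a small illustrative Python sketch (not part of the CHED framework itself; the function names are invented for this example). It counts the comparisons made by linear search versus binary search on the same sorted list, showing why efficiency is studied:

```python
def linear_search(items, target):
    """Check every element in turn: O(n) comparisons in the worst case."""
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            return steps
    return steps

def binary_search(items, target):
    """Halve the sorted search range each step: O(log n) comparisons."""
    low, high, steps = 0, len(items) - 1, 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if items[mid] == target:
            return steps
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return steps

data = list(range(1, 1001))       # sorted list 1..1000
print(linear_search(data, 1000))  # 1000 comparisons
print(binary_search(data, 1000))  # about 10 comparisons
```

On a thousand items, binary search needs roughly ten comparisons where linear search may need a thousand; this gap is exactly what the knowledge area quantifies.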
Why are these Knowledge Areas Important?

They provide a foundation for solving complex technological problems.

They enable adaptability as technologies evolve.

They prepare students for a variety of roles in the computing industry.

They help in choosing specializations and career paths.

The Computer Industry - An Overview

Definition: The computer industry consists of companies that develop, manufacture, market, maintain, and support computers and related products and services.

Key Sectors:

o Hardware Manufacturing
o Software Development
o IT Services & Consulting
o Telecommunications
o Cloud Computing
o Cybersecurity
o Artificial Intelligence & Machine Learning
o Gaming & Entertainment Tech

Career Opportunities in the Computer Industry

Sector | Possible Careers
Hardware | Hardware Engineer, Embedded Systems Developer
Software | Software Developer, Mobile App Developer
IT Services | Network Administrator, System Analyst
Cybersecurity | Security Analyst, Ethical Hacker
Cloud Computing | Cloud Architect, DevOps Engineer
AI & ML | AI Researcher, Data Scientist
Gaming | Game Developer, UX Designer
Education & Academia | Professor, Curriculum Developer

Functions of Jobs in an IT/IS Department

Common Job Functions:

o System Administrators: Maintain networks and servers.
o Database Administrators: Manage databases and ensure data integrity.
o Network Engineers: Design and manage communication networks.
o Help Desk Support: Provide assistance to users with technical issues.
o Web Developers: Create and maintain websites.
o Project Managers: Oversee tech-related projects.
o Security Analysts: Monitor and protect digital assets.

Summary

Knowledge Areas in Computing: Key areas include Algorithms, Programming, Software, Networking, Data, Security, and Human-Computer Interaction (HCI).

Computer Industry: Consists of diverse sectors with various career paths.

IT/IS Department Roles: These roles are crucial for an organization's efficiency and innovation.

WHY ARE CERTIFICATIONS IMPORTANT?

Professional Validation: Certifications prove your knowledge and skills.

Career Advancement: They enhance your employability and salary potential.

Specialization: They allow you to focus on a specific area, such as networking, security, or cloud computing.

Industry Recognition: Many employers prefer or require certified professionals.

Lifelong Learning: They keep you updated with evolving technologies.

Major Certification Providers

Provider | Description
CompTIA | Provides entry-level and foundational certifications (e.g., A+, Network+, Security+).
Cisco | Offers networking certifications (e.g., CCNA, CCNP, CCIE).
Microsoft | Offers cloud and enterprise certifications (e.g., Azure, Microsoft 365).
Amazon Web Services (AWS) | Provides cloud computing certifications (e.g., AWS Certified Solutions Architect).
Google Cloud | Offers Google Cloud certifications for developers and architects.
(ISC)² | Provides cybersecurity certifications (e.g., CISSP, SSCP).
ISACA | Offers IT governance, audit, and security certifications (e.g., CISA, CISM).
PMI | Provides project management certifications (e.g., PMP).

Popular Certifications in Computing

Certification | Area | Provider
CompTIA A+ | IT Fundamentals | CompTIA
Cisco CCNA | Networking | Cisco
AWS Certified Solutions Architect | Cloud Computing | Amazon
CISSP | Cybersecurity | (ISC)²
PMP | Project Management | PMI
Microsoft Azure Administrator | Cloud | Microsoft
CEH (Certified Ethical Hacker) | Security | EC-Council
Google Cloud Professional Cloud Architect | Cloud | Google
CISA | IT Auditing | ISACA
Certified Data Scientist (CDS) | Data Science | Various

Certification Pathways

Entry-Level Certifications:

o CompTIA A+, Network+, Security+
o Google IT Support Certificate

Professional-Level Certifications:

o Cisco CCNP, Microsoft Azure Administrator
o AWS Certified Developer

Expert-Level Certifications:

o CISSP, CISA, CCIE
o AWS Certified Solutions Architect – Professional

Latest Trends in Computing (2024–2025)

Trend | Description
Artificial Intelligence (AI) | Includes machine learning, deep learning, and generative AI.
Quantum Computing | Solving complex problems using quantum mechanics.
Edge Computing | Processing data closer to the source to reduce latency.
Cybersecurity Advancements | AI-driven threat detection and zero-trust architecture.
Cloud Computing Evolution | Multi-cloud, hybrid cloud, and serverless computing.
Internet of Things (IoT) | Connecting everyday devices to the internet for smart applications.
Blockchain & Decentralized Tech | Used beyond cryptocurrency in supply chain, healthcare, and more.
Ethical AI & Responsible Tech | Ensuring fairness, transparency, and accountability in AI systems.

Emerging Technologies Shaping the Future

Technology | Impact
AI & Machine Learning | Automating tasks, improving decision-making, and enabling smart systems.
Extended Reality (XR) | Includes AR, VR, and MR, used in gaming, education, and remote work.
5G and Beyond | Enabling faster connectivity for real-time applications and IoT growth.
Digital Twins | Virtual replicas of physical systems used in manufacturing and healthcare.
Autonomous Systems | Self-driving cars, drones, and robotics.
Green Computing | Energy-efficient computing to reduce environmental impact.
Low-Code/No-Code Development | Democratizing software development for non-programmers.
Natural Language Processing (NLP) | Enhancing human-computer interaction through voice and text.

How to Stay Updated with Trends

Follow tech news (e.g., TechCrunch, The Verge, Wired).

Join professional organizations (e.g., ACM, IEEE, CompTIA).

Attend conferences (e.g., CES, RSA Conference, Google I/O).

Enroll in online courses (e.g., Coursera, Udemy, edX).

Participate in webinars and workshops.

Get certified in new technologies.

Join online communities (e.g., Reddit, Stack Overflow, GitHub).

Summary

Certifications:

o Help validate skills, boost career growth, and offer specialization.
o Offered by providers such as CompTIA, Cisco, Microsoft, and AWS.

Emerging Trends:

o AI, quantum computing, edge computing, cybersecurity, and green tech are among the emerging trends.
o These trends are shaping the future of computing and the global economy.

Module 2.0: Introduction to Computers - Evolution of Computers

I. Introduction: What is a computer?

Computer: An electronic device that is designed to manipulate data or information. It can store, retrieve, and process information based on program instructions. A computer essentially performs calculations and executes logical operations at high speeds.

Basic Functions of a Computer:

1. Input: The process of receiving data from devices such as keyboards, mice, scanners, and digital cameras.

2. Processing: The manipulation and analysis of input data according to instructions.

3. Output: The presentation of processed data in a usable format, like displaying it on a screen, printing it, or playing audio/video.

4. Storage: The function of saving data for future use on devices like hard drives, SSDs, or external storage.
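The input-process-output-storage cycle above can be made concrete with a small illustrative Python sketch (the function and variable names are invented for this example): it receives a list of numbers (input), computes their average (processing), prints the result (output), and saves it for later use (storage):

```python
storage = []  # stands in for a disk file or database

def run_cycle(raw_numbers):
    # Input: receive raw data (a list standing in for keyboard input)
    data = list(raw_numbers)
    # Processing: manipulate the data according to instructions
    average = sum(data) / len(data)
    # Output: present the processed data in a usable format
    print(f"Average of {data} is {average}")
    # Storage: save the result for future use
    storage.append(average)
    return average

result = run_cycle([80, 90, 100])  # prints the average and stores 90.0
```

Real programs follow the same four-step shape, only with keyboards and sensors for input, screens for output, and drives for storage.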
II. Early Computing Devices

Abacus: This is likely the most ancient calculating device, dating back thousands of years. It is a manual tool with beads on rods or wires that are manipulated to perform arithmetic operations such as addition, subtraction, multiplication, and division.

Napier's Bones: Invented by John Napier in the 17th century, this was a manual calculation aid. It was a set of rods with multiplication tables engraved on them, simplifying multiplication and division.

Pascaline: Created by Blaise Pascal in the 17th century, this mechanical calculator could perform addition and subtraction. It was a significant innovation because it mechanized arithmetic operations.

Difference Engine & Analytical Engine: Charles Babbage, known as the "father of computers," conceived these two machines in the 19th century.

o Difference Engine: A mechanical computer designed to accurately print mathematical tables using the method of differences.
o Analytical Engine: Considered the precursor to modern computers, this machine was programmable and could perform a variety of calculations, though it was never fully built due to technological limitations at the time.

III. First-Generation Computers

Technology: These computers relied on vacuum tubes for computation. These were glass tubes with a vacuum and electrodes that controlled the flow of electrons to perform basic logic and arithmetic.

Characteristics:

o Size: Enormous, often filling entire rooms.
o Speed: Relatively slow.
o Power Consumption: Very high due to the vacuum tubes, which also generated a lot of heat.
o Programming: Difficult and required specialized knowledge.

Key Examples:

o ENIAC (Electronic Numerical Integrator and Computer): One of the first electronic, general-purpose computers, developed during World War II to calculate artillery firing tables.
o UNIVAC (Universal Automatic Computer): The first commercial computer, used for both scientific and commercial purposes.

IV. Second-Generation Computers

Technology: The key innovation was the replacement of vacuum tubes with transistors. Transistors were smaller, more reliable, consumed less power, and generated less heat.

Characteristics: Computers became smaller, faster, and more reliable.

Programming: Assembly language was introduced, which made programming easier than using machine code by using mnemonics. The foundation for high-level languages like FORTRAN and COBOL was also laid.

Operating Systems: Batch operating systems were developed to improve efficiency by grouping multiple jobs together and processing them sequentially.

V. Third-Generation Computers

Technology: This generation was defined by the introduction of integrated circuits (ICs). These tiny chips could pack thousands of transistors, significantly reducing size and increasing processing power.

Characteristics: Computers became much smaller and more efficient, generating less heat and consuming less power. Increased processing power allowed them to handle complex tasks faster.

Operating Systems: Operating systems became more sophisticated, managing hardware and software resources and allowing multiple users to interact with the computer simultaneously.

Time-Sharing Systems: This development allowed multiple users to share a single computer's resources by dividing processing time into small intervals.

VI. Fourth-Generation Computers

Technology: This era was marked by the development of microprocessors, which put an entire CPU on a single chip. This led to the creation of smaller and more affordable microcomputers.

Personal Computers (PCs): Microprocessors fueled the rise of PCs, which were designed for individual use in homes, offices, and schools.

Graphical User Interfaces (GUIs): GUIs were introduced to make computers more user-friendly by replacing command-line interfaces with icons and menus, allowing for more intuitive interaction.

Networking and the Internet: Networking became essential, allowing computers to communicate and share information. The invention of the internet provided a global network of interconnected computers.

VII. Fifth-Generation Computers

Key Concept: The fifth generation is primarily characterized by the integration of Artificial Intelligence (AI).

Key Terms:

o Artificial Intelligence (AI): The ability for machines to think, reason, and learn like humans.
o Expert Systems: A part of AI that emulates a human expert's judgment in a specific field.
o Parallel Processing: A process where multiple processors work on different parts of a problem at the same time to increase processing speed.
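The parallel-processing idea defined above can be sketched in a few lines of Python. This is an illustrative example only: it splits one summation into parts and hands them to a pool of workers via the standard concurrent.futures module (threads here stand in for the separate physical processors a real fifth-generation system would use; CPU-bound Python code would normally use processes instead):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(numbers, workers=4):
    """Split one problem into parts and let several workers handle them at once."""
    chunk = len(numbers) // workers or 1
    parts = [numbers[i:i + chunk] for i in range(0, len(numbers), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partial_sums = list(pool.map(sum, parts))  # each worker sums one part
    return sum(partial_sums)  # combine the partial results

print(parallel_sum(list(range(1, 101))))  # 5050, same answer as one processor
```

The answer is identical to a single-processor sum; the point is that the work was divided, which is how parallel systems gain speed on large problems.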
Future Trends and Possibilities:

o Natural Language Processing: Computers that can fluently understand and respond to human language.
o Robotics: Advanced robots that can perform complex tasks autonomously.
o Quantum Computing: Using quantum mechanics to achieve exponentially faster computations.
o Neuromorphic Computing: Building computers inspired by the structure of the human brain.
o Augmented and Virtual Reality: Immersive experiences that blur the line between the physical and digital worlds.

Summary Table of Key Concepts:

Generation | Core Technology | Key Developments
First Generation | Vacuum Tubes | ENIAC and UNIVAC; slow speed, large size, high power consumption
Second Generation | Transistors | Smaller, faster, more reliable computers; assembly language and early high-level languages like FORTRAN and COBOL
Third Generation | Integrated Circuits (ICs) | Miniaturization, increased processing power, sophisticated operating systems, and time-sharing systems
Fourth Generation | Microprocessors | Personal Computers (PCs), Graphical User Interfaces (GUIs), networking, and the Internet
Fifth Generation | Artificial Intelligence (AI) | Parallel processing, supercomputers, and emerging technologies like quantum computing and natural language processing

MODULE 2.1 - INTRODUCTION TO COMPUTERS - ADVANTAGES AND DISADVANTAGES OF COMPUTERS

I. Advantages of Computers

Education: Computers have made education accessible through online courses and virtual classrooms. They also provide access to a vast array of educational materials and powerful research tools.

Communication: Computers facilitate fast and efficient communication through email, connect people globally via social media, and enable face-to-face interaction across distances through video conferencing.

Remote Work: Computers allow people to work from anywhere with an internet connection, which increases flexibility and improves work-life balance.

Healthcare: Computers improve patient care and reduce errors through electronic health records. They also provide remote healthcare services via telemedicine and accelerate medical research.

Business: Computers increase efficiency and productivity by automating repetitive tasks. E-commerce also creates new business opportunities and expands market reach.

Entertainment: Computers offer immersive and interactive experiences through gaming, enable digital music storage and streaming, and provide access to large libraries of films and TV shows. Virtual reality simulates real-world or imaginary environments.

II. Disadvantages of Computers

Health Problems: Prolonged screen exposure can lead to eye fatigue, dryness, and headaches. Sedentary work can lead to obesity and other health issues. Excessive use can lead to addiction, resulting in mental and physical problems.

Privacy Concerns:

o Data Breach: A cyber attack can compromise sensitive personal information.
o Identity Theft: Stealing personal data to conduct fraudulent activities.
o Surveillance: Government and corporate surveillance of online activities is a concern.

Dependence:

o Over-reliance on Technology: Excessive reliance on computers can erode problem-solving and critical thinking skills.
o Job Replacement: Automation and artificial intelligence can cause unemployment in various sectors.

Digital Divide: Not everyone has access to a computer or internet connection, which creates inequality of access.

Environmental Consequences: The disposal of electronic waste (e-waste) leads to pollution. Data centers and other computer equipment consume a large amount of energy.

III. Ethical Considerations

Key Terms and Definitions:

o Copyright: Restricts the use of original works, including music, literature, and software. Copying a copyrighted work without permission is illegal.
o Plagiarism: Submitting someone else's work as your own without proper citation.
o Cyberbullying: Bullying or harassing someone using electronic communication.
o Online Harassment: Includes online threats, stalking, and other forms of harassment.
o Digital Footprint: The information available about a person online, which can have lasting effects.
o Privacy: A person's right to their own personal data.
o Security: The protection of information from unauthorized access, use, disclosure, modification, or destruction.

Summary Table of Key Concepts:

Ethical Issue | Description
Intellectual Property Rights | Deals with copyright and plagiarism, ensuring original works are not used or copied without permission or citation.
Online Ethics | Focuses on issues like cyberbullying and online harassment, and the lasting effects of a person's digital footprint.
Privacy vs. Security | Involves the delicate balance between keeping personal information safe (privacy) and protecting information from unauthorized access (security).

MODULE 2.2: INTRODUCTION TO COMPUTERS: ELEMENTS, DATA VS. INFORMATION

I. Introduction

What is a computer? A computer is an electronic device that processes data according to instructions provided by computer programs. Computers can perform a wide range of tasks, from basic calculations to complex operations involving artificial intelligence (AI).

Who invented the computer? The invention of the computer cannot be attributed to a single individual. It was an evolution involving the work of numerous scientists, engineers, and mathematicians.

o Charles Babbage is often referred to as the "father of the computer" for promoting the concept of programming and automating computation.
o Ada Lovelace is credited with writing the first algorithm intended for a machine.
o George Boole developed a logic system that laid the groundwork for the binary system used in programming and digital circuits.
o Alan Turing formalized the concepts of "algorithm" and "computation" with his Turing machine.
o John von Neumann proposed the principle of stored programs in 1945, a significant innovation.
o John Bardeen, Walter Brattain, and William Shockley invented the transistor in 1947, which began replacing vacuum tubes in computers.
o Robert Noyce's idea for putting an integrated circuit (IC) on a silicon chip in 1959 paved the way for miniaturizing computer components.
o Ted Hoff, Federico Faggin, Stanley Mazor, and Masatoshi Shima designed the first commercially available microprocessor, released by Intel in 1971, which made personal computers (PCs) possible.

The Importance of Computers in Today's World: Computers are an integral part of modern life, revolutionizing fields like education, business, healthcare, and entertainment.

o Education: They provide access to knowledge, enable personalized learning, and facilitate collaboration.
o Business: They increase efficiency and productivity, enable global connectivity, and aid in data-driven decision-making.
o Healthcare: They assist in diagnosis and treatment, facilitate medical research, and enable telemedicine.
o Entertainment: They have revolutionized the creation and consumption of digital content, and enable virtual experiences.
o Society: They drive economic growth, social change, and global connectivity.

II. Elements of a Computer

Computer Hardware: The physical components of a computer. Hardware and software are complementary, and a device can only function when both work together.

o Internal Components: Those necessary for the proper functioning of the computer.

Motherboard: A circuit board that holds the CPU and other essential hardware, acting as a central hub.

CPU (Central Processing Unit): The "brain" of the computer that processes and executes digital instructions.

RAM (Random Access Memory): Temporary memory storage that makes information immediately accessible to programs; it is volatile, so data is cleared when the computer is off.

Hard drive: A physical storage device for permanent and temporary data.
SSD (Solid-state drive): A non-volatile storage device that uses flash memory and can safely store data when the computer is powered down.

Graphics processing unit (GPU): A chip that processes graphical data.

Network interface card (NIC): A circuit board or chip that enables the computer to connect to a network.

o External Components (Peripherals): Items attached to the computer to add or enhance functionality.

Input Components: Devices that provide instructions to the software. Examples include a mouse, keyboard, microphone, camera, and touchpad.

Output Components: Devices that render results from the software's execution. Examples include a monitor, printer, speakers, and headphones.

Computer Software: A collection of programs and procedures that perform tasks on a computer.

o System Software: The interface between application software and the system. It is written in a low-level language and manages system resources. An operating system (OS) is a main part of system software, managing resources like the CPU, printer, and hard disk.

o Application Software: Runs at the user's request on the platform provided by system software. It is written in a high-level language and performs specific tasks, such as managing documents or developing visuals.

III. Data and Information

Key Terms and Definitions:

o Data: Raw, unprocessed facts, including numbers, symbols, and text. Data is raw and unstructured, lacking meaningful context.

o Information: Data that has been processed, organized, and interpreted to add meaning and value. Information is the comprehensible output derived from raw data that helps inform decisions and strategies. It provides context and insights.

Summary Table of Key Concepts:

Category | Data | Information
Form | Raw, unprocessed, and unstructured. | Processed, organized, and structured.
Context | Lacks meaningful context. | Provides context and insights.
Purpose | Simple and unorganized elements. | Refined output that facilitates strategic decision-making.
Example | The number of visitors to a website in one month. | Understanding that changes to a website led to an increase or decrease in monthly site visitors.
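The website-visitor example above can be made concrete in code. This illustrative Python sketch (the visit records are invented) turns raw per-visit records (data) into a monthly comparison a manager could act on (information):

```python
from collections import Counter

# Data: raw, unprocessed facts -- one entry per individual site visit
visits = ["2024-01", "2024-01", "2024-01", "2024-02", "2024-02",
          "2024-02", "2024-02", "2024-02"]

# Processing: organize the raw records into monthly totals
monthly = Counter(visits)

# Information: processed data with context that supports a decision
change = monthly["2024-02"] - monthly["2024-01"]
trend = "increased" if change > 0 else "decreased or stayed flat"
print(f"Monthly visitors {trend} by {abs(change)} after the January changes.")
```

The list of timestamps alone says nothing useful; only after processing does it carry the insight (an increase of two visitors) that informs a decision.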
IV. Knowledge and Computers

Knowledge: The application of information to solve problems or achieve goals.

Role of Computers in Creating Knowledge:

o Storing and processing vast amounts of data quickly and efficiently.
o Facilitating collaboration among people worldwide.
o Analyzing large datasets to identify patterns and trends.
o Simulating real-world scenarios to test hypotheses.

Impact of Computers on Decision-Making:

o Providing access to vast amounts of information.
o Enabling data-driven decisions by analyzing trends and patterns.
o Supporting complex decision-making by modeling complex systems.
o Automating decision-making processes, freeing up human resources.

MODULE 2.3: INTRODUCTION TO COMPUTERS: TYPES AND CLASSIFICATIONS

I. Types of Computer Systems

Analog Computers: These computers operate on continuous, physical quantities like voltage, pressure, or temperature. They represent data as continuous electrical signals. They are fast but not highly accurate and are used for specific, real-time measurements and simulations.

Digital Computers: These computers operate on discrete data represented by binary digits (0s and 1s). They are highly accurate, fast, and versatile.
Digital computers are the most common type of computer and are used in a wide range of applications, including PCs, laptops, and smartphones.
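How binary digits represent values can be shown directly in Python; a brief illustrative sketch:

```python
# A digital computer stores every value as binary digits (bits).
n = 13
bits = format(n, 'b')      # decimal 13 -> the bit pattern '1101'
back = int(bits, 2)        # the bit pattern '1101' -> decimal 13

# Text is digital too: each character maps to a number, hence to bits.
letter_code = ord('A')                    # 'A' is stored as the number 65
letter_bits = format(letter_code, '08b')  # ... i.e. the byte '01000001'

print(bits, back, letter_code, letter_bits)  # 1101 13 65 01000001
```

Because everything reduces to exact 0s and 1s, digital machines avoid the drift and imprecision of analog signals, which is why the text above calls them highly accurate.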
Hybrid Computers: These computers combine the best features of both analog and digital computers. They can handle both continuous and discrete data, making them useful in complex fields like medical sciences, robotics, and industrial process control.

II. Components of a Computer System

Hardware: The physical, tangible parts of a computer system, such as the CPU, motherboard, keyboard, and monitor.

Software: The set of instructions and programs that tell the hardware what to do. It is intangible and includes applications like word processors and operating systems like Windows.

Firmware: A type of software permanently stored on a hardware device's read-only memory (ROM). It provides low-level control for the device's hardware. An example is the Basic Input/Output System (BIOS).

III. Classification of Computers

Summary Table of Key Concepts:

Type of Computer | Description | Examples and Use Cases
Personal Computer (PC) | A general-purpose computer for individual use. | Desktops, laptops, tablets, and smartphones for home, office, and school tasks.
Microcontrollers | Small, low-cost computers designed to perform specific, dedicated tasks. | Embedded systems like microwave ovens, washing machines, and digital cameras.
Workstations | High-performance computers designed for technical or scientific applications. | Engineering, 3D modeling, video editing, and scientific research.
Servers | Powerful computers that manage network resources and provide services to other computers. | Web servers, database servers, and file servers for hosting websites and storing data.
Mainframes | Large, powerful computers designed for large-scale data processing and complex tasks. | Large organizations like banks, airlines, and government agencies.
Supercomputers | The fastest and most powerful computers in the world, designed for complex calculations. | Weather forecasting, scientific simulations, and nuclear research.

MODULE 2.4 - INTRODUCTION TO COMPUTERS: RESEARCH AND APPLICATIONS IN DIFFERENT COMPUTING FIELDS

I. Artificial Intelligence and Machine Learning (AI/ML)

Definition: Artificial intelligence (AI) is a field of science concerned with creating computers and machines that can reason, learn, and act in ways that would typically require human intelligence. It is a broad field encompassing computer science, data analytics, linguistics, and neuroscience.

How AI Works: AI systems learn and improve by analyzing large amounts of data to find patterns. This process is guided by algorithms, which are sets of rules or instructions.

Types of AI:

o Reactive machines: Limited AI that reacts to stimuli based on preprogrammed rules, without using memory or learning from new data.
o Limited memory: Most modern AI uses memory to improve over time by being trained with new data. Deep learning is a subset of this type of AI.
o Theory of mind: A theoretical AI that could emulate the human mind and make decisions like a human, including recognizing emotions.
o Self-aware: A mythical machine that is aware of its own existence and has human-level intellectual and emotional capabilities.
o Artificial Narrow Intelligence (ANI): All current AI is considered narrow, as it can only perform specific sets of actions.
o Artificial General Intelligence (AGI): The ability for a machine to "sense, think, and act" like a human, which does not yet exist.

AI Training Models:

o Supervised learning: Uses labeled data to map a specific input to a known output.
o Unsupervised learning: Learns patterns from unlabeled data without a known end result, and is good for pattern matching and descriptive modeling.
o Semi-supervised learning: A mixed approach where some data is labeled, and the algorithm must organize and structure the data to achieve a known result.
o Reinforcement learning: An "agent" learns a task through trial and error, receiving positive or negative reinforcement.
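As an illustrative sketch of the supervised-learning idea above (the data and names are invented for this example), the snippet below "trains" on labeled input-output pairs by fitting a least-squares line, then predicts an unseen input:

```python
# Labeled training data: each input x is paired with a known output y.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # the hidden rule here is y = 2x

# "Training": choose the slope and intercept that minimize squared error.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# "Inference": map a new, unlabeled input to a predicted output.
def predict(x):
    return slope * x + intercept

print(predict(5.0))  # close to 10.0: the learned mapping generalizes
```

Real supervised models (neural networks, decision trees) are far more complex, but the workflow is the same: fit parameters to labeled examples, then apply the learned mapping to new inputs.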
Artificial Neural Networks: A training model based on the human brain, composed of artificial neurons (perceptrons) that classify and analyze data.

o Feedforward neural networks (FF): Data flows in one direction through layers of neurons.
o Recurrent neural networks (RNN): Use time-series data and have "memory" of what happened in previous layers.
o Long/short term memory (LSTM): An advanced form of RNN that can remember what happened several layers ago using "memory cells".
o Convolutional neural networks (CNN): Most often used for image recognition, they filter different parts of an image through distinct layers.
o Generative adversarial networks (GAN): Two neural networks compete against each other, with one (the generator) creating examples and the other (the discriminator) attempting to prove them true or false.
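A single artificial neuron of the kind these networks are built from can be sketched in a few lines; the weights below are hand-picked for illustration (not learned) so that the neuron behaves as a logical AND gate:

```python
def perceptron(inputs, weights, bias):
    """Weighted sum followed by a step activation: the basic artificial neuron."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# Hand-chosen weights make this neuron fire only when both inputs are 1 (AND).
weights, bias = [1.0, 1.0], -1.5
for a in (0, 1):
    for b in (0, 1):
        print(a, b, perceptron([a, b], weights, bias))
```

A trained network differs only in scale: many such neurons are stacked in layers, and the weights and biases are adjusted automatically from data rather than set by hand.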
II. Data Science and Analytics

Definition: A multidisciplinary field that combines statistics, computer science, and domain expertise to extract insights from data. It involves collecting, cleaning, analyzing, and interpreting data to make informed decisions.

Key Terms and Definitions:

o Data collection: Gathering data from various sources such as databases, surveys, and sensors.
o Data cleaning: Identifying and correcting errors, inconsistencies, and missing values in data.
o Descriptive statistics: Summarizing and describing data using measures like mean, median, and mode.
o Inferential statistics: Making inferences about a population based on a data sample.
o Data Mining: The process of discovering patterns and relationships within a dataset.
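The cleaning step and the descriptive measures named above can be illustrated with Python's standard statistics module (the scores are invented sample data):

```python
import statistics

raw = [88, 92, 75, 92, None, 60, 85]   # raw survey scores with one missing value

# Data cleaning: drop the missing value before analysis.
scores = [s for s in raw if s is not None]

# Descriptive statistics: summarize the cleaned data.
print("mean:", statistics.mean(scores))      # arithmetic average
print("median:", statistics.median(scores))  # middle value when sorted
print("mode:", statistics.mode(scores))      # most frequent value
```

On this sample the mean is 82, the median 86.5, and the mode 92; inferential statistics would go one step further and use such a sample to estimate properties of the whole population.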
Applications: Business intelligence, market research, healthcare analytics, and fraud detection.

III. Cybersecurity

Definition: The practice of protecting computer systems, networks, and data from unauthorized access, use, or destruction.

Types of Cyber Threats:

o Malware: Malicious software like viruses, worms, and ransomware.
o Phishing: Social engineering attacks that trick individuals into revealing sensitive information.
o Hacking: Unauthorized access to computer systems, often with malicious intent.

Cybersecurity Best Practices:

o Encryption: Protecting data by converting it into a secret code.
o Firewalls: Devices that filter network traffic to block unauthorized access.
o Access controls: Restricting access to sensitive data based on user roles.
o Regular Updates: Keeping software and operating systems updated with the latest security patches.
o Employee Training: Educating employees about cybersecurity risks.

Applications: Network security, data protection, incident response, and compliance with regulations like GDPR.

IV. Cloud Computing

Definition: A model of delivering IT services over the internet, allowing users to access resources on-demand without needing physical infrastructure.

Cloud Service Models:

o Infrastructure as a Service (IaaS): Provides fundamental computing resources like servers, storage, and networking.
o Platform as a Service (PaaS): Offers a cloud-based platform for developers to build and manage applications.
o Software as a Service (SaaS): Delivers applications over the internet, accessible through a web browser.

Cloud Deployment Models:

o Public Cloud: Shared resources accessible to the general public.
o Private Cloud: Dedicated resources for a
single organization.
o Hybrid Cloud: A combination of public
and private clouds.
V. Internet of Things (IoT)
Definition: The interconnectedness of physical
devices, vehicles, and other objects embedded with
electronics, software, sensors, and network
connectivity. These devices collect and exchange
data to automate tasks and improve efficiency.
Components: Sensors, actuators, and networks.
Applications: Smart homes, smart cities, industrial
IoT, healthcare, and agriculture.
VI. Robotics
Definition: The branch of engineering that deals
with the design, construction, operation, and
application of robots. Robots are programmable
machines that can perform tasks autonomously or
with human guidance.
Types of Robots:
o Industrial Robots: Used in manufacturing
for repetitive tasks like assembly and
welding.
o Service Robots: Interact with humans in
service-oriented environments like
healthcare and hospitality.
o Autonomous Robots: Operate
independently without human intervention,
using sensors and AI.
Components: Sensors, actuators, and control
systems.
Applications: Manufacturing, healthcare, and
exploration.
VII. Human-Computer Interaction (HCI)
Definition: The study of the interaction between
humans and computers, focusing on designing
technology that is easy to use, efficient, and
enjoyable.
User Interface Design Principles: Clarity,
consistency, efficiency, aesthetics, affordances,
feedback, and flexibility.
User Experience Research Methods: User
testing, surveys, interviews, heuristic evaluation,
and eye tracking.
Applications: Website design, mobile app
development, virtual and augmented reality, and
wearable technology.