Specific Issues in
Science, Technology,
and Society
Lesson 1
The Information Age
Objectives
At the end of the lesson, the students should be able to:
define the Information Age;
discuss the history of the Information Age; and
understand the factors that need to be considered in checking website
sources.
Introduction
Highly modernized, automated, data-driven, and technologically advanced:
these words best describe our society today, as evidenced by how quickly
information can be transferred or shared.
Different areas of society, such as communication, economics, industry,
health, and the environment, have been tremendously influenced.
Despite the gains brought by the growing development of information technology,
the rapid spread of information also has disadvantages. This lesson will
discuss the history of the Information Age and the impact of technological
advancements on society.
Life is accompanied by an endless transmission of information that takes place
within and outside the human body.
Information is “knowledge communicated or obtained concerning a specific
fact or circumstance” (Webster’s Encyclopedic Unabridged Dictionary).
Hence, information is a very important tool for survival.
The Information Age is defined as a “period starting in the last quarter of the
20th century when information became effortlessly accessible through
publications and through the management of information by computers and
computer networks” (Vocabulary.com, n.d.).
The means of conveying symbolic information (e.g., writing, math, and other
codes) among humans has evolved with increasing speed.
The Information Age is also called the Digital Age and the New Media Age
because it is associated with the development of computers.
According to James R. Messenger, who proposed the Theory of the Information
Age in 1982, “the Information Age is a true new age based upon the
interconnection of computers via telecommunications, with these information
systems operating on both a real-time and as-needed basis.
Furthermore, the primary factors driving this new age forward are
convenience and user-friendliness which, in turn, will create user
dependence.”
As man evolved, information and its dissemination have also evolved in many ways.
Eventually, we no longer kept information to ourselves; instead, we began to
share and manage it in different ways.
Information got ahead of us.
It started to grow at a rate we were unprepared to handle.
Because of the abundance of information, collecting and managing it became
difficult, starting in the 1960s and 1970s.
During the 1980s, real changes set in.
Richard Wurman called it “Information Anxiety”.
In the 1990s, information became the currency in the business world.
Information was the preferred medium of exchange, and information managers
served as information officers.
In the present generation, there is no doubt that information has turned out to be
a commodity: an overdeveloped, mass-produced, and unspecialized product.
Soon, we became overloaded with it.
Different authors have diverse, contrasting ideas on the evolution of the
Information Age.
In spite of this, we can still say that information is a very important tool that
helps improve our way of life.
One thing is for sure: the Information Age will continue to move forward,
far beyond what our minds could imagine.
In his article “Truths of the Information Age” (n.d.), Robert Harris detailed
some facts on the Information Age. These are the following:
1. Information must compete. There is a need for information to stand out
and be recognized in the increasing clutter.
2. Newer is equated with truer. We forget the truth that any fact or value
can endure.
3. Selection is a viewpoint. Choose multiple sources for your information if
you want to receive a more balanced view of reality.
4. The media sells what the culture buys. In other words, information is
driven by cultural priorities.
5. The early word gets the perm. The first media channel to expose an issue
often defines the context, terms, and attitudes surrounding it.
6. You are what you eat and so is your brain. Do not draw conclusions unless
all ideas and information are presented to you.
7. Anything in great demand will be counterfeited. The demand for
incredible knowledge, scandals, and secrets is ever-present; hence, many
events are fabricated by tabloids, publicists, or other agents of information
fraud.
8. Ideas are seen as controversial. It is almost certainly impossible to make
any assertion that will not find some supporters and some detractors.
9. Undead information walks ever on. Rumors, lies, disinformation, and
gossip never truly die down. They persist and continue to circulate.
10. Media presence creates the story. When the media are present, especially
film or television crews, people behave much differently from the way they
otherwise would.
11. The medium selects the message. Television is mainly pictorial, partially
aural, and slightly textual, so visual stories are emphasized: fires, chases, and
disasters.
12. The whole truth is a pursuit. The information that reaches us is usually
selected, verbally charged, filtered, slanted, and sometimes, fabricated.
What is neglected is often even more important than what is included.
Computer
Among the most important contributions of the Information Age to society is
the computer: an electronic device that stores and processes data (information).
It runs programs that contain the exact, step-by-step directions needed to solve
a problem (UShistory.org, 2017).
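To make this concrete, the short sketch below (an illustrative example, not taken from the cited source) shows a program as a sequence of exact, step-by-step directions: in this case, directions for computing the average of a list of hypothetical exam scores.

```python
# A program is a set of exact, step-by-step directions for solving a problem.
# Step 1: Start with the input data (hypothetical exam scores).
scores = [85, 92, 78, 90]

# Step 2: Add all the scores together.
total = sum(scores)

# Step 3: Divide the total by the number of scores.
average = total / len(scores)

# Step 4: Report the result.
print("Average score:", average)  # prints: Average score: 86.25
```

The computer executes each direction in order, exactly as written; change one step and the result changes accordingly.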
Types of Computer
Computers are associated with numerous terms and descriptions, most of which
suggest a computer’s dimensions, intended use, or power.
While the term “computer” can apply to virtually any device that has a
microprocessor in it, most people think of a computer as a device that receives
input from the user through a mouse (a hand-guided pointing device) or
keyboard, processes it in some fashion, and presents the result on a screen.
1. Personal Computer (PC) – a single-user instrument.
PCs were first known as microcomputers since they were complete computers
but built on a smaller scale than the enormous systems operated by most
businesses.
2. Desktop Computer – a PC that is not designed for portability.
It is set up in a permanent spot.
A workstation is simply a desktop computer that has a more powerful processor,
additional memory, and enhanced capabilities for performing a special group of
tasks, such as 3D graphics or game development.
Most desktops offer more storage, power, and versatility than their portable
versions (UShistory.org, 2017).
3. Laptops – portable computers that integrate the essentials of a desktop
computer in a battery-powered package somewhat larger than a typical hardcover
book.
They are commonly called notebooks.
4. Personal Digital Assistants (PDAs) – tightly integrated computers that usually
have no keyboards but rely on a touch screen for user input.
They are usually smaller than a paperback, lightweight, and battery-powered
(UShistory.org, 2017).
5. Server – a computer that has been optimized to provide network services to
other computers.
Servers usually boast powerful processors, tons of memory, and large hard drives
(UShistory.org, 2017).
6. Mainframes – huge computer systems that can fill an entire room.
The term is used especially by large firms to describe the large, expensive
machines that process millions of transactions every day.
The term “mainframe” has largely been replaced by “enterprise server.”
Although some supercomputers are single computer systems, most comprise
multiple high-performance, parallel computers working as a single system
(UShistory.org, 2017).
7. Wearable Computers – computers that are integrated into cellphones,
watches, and other small objects or places.
They perform common computer applications such as databases, email,
multimedia, and schedulers (UShistory.org, 2017).
The World Wide Web (Internet)
Several historians trace the origin of the Internet to Claude E. Shannon, an
American mathematician considered the “Father of Information Theory.”
He worked at Bell Laboratories, and at the age of 32, he published a paper
proposing that information could be quantitatively encoded as a sequence of
ones and zeroes.
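Shannon’s insight can be illustrated with a small sketch (a simplified teaching example, not the scheme from his paper): any text can be written as a sequence of ones and zeroes and recovered from it.

```python
# Encode a message as ones and zeroes (8 bits per character), then decode it.
message = "HI"

# Each character has a numeric code (e.g., 'H' is 72); write it as 8 binary digits.
bits = "".join(format(ord(ch), "08b") for ch in message)
print(bits)  # 0100100001001001

# Decoding reverses the process: read the bits 8 at a time and map them back.
decoded = "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
print(decoded)  # HI
```

Because any symbol can be reduced to such a bit sequence, the same wires, disks, and networks can carry text, numbers, images, and sound alike.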
The Internet is a worldwide system of interconnected networks that facilitate data
transmission among innumerable computers.
It was developed during the 1970s by the Department of Defense.
In case of an attack, military advisers suggested the advantage of being able to
operate one computer from another terminal.
In the early days, the Internet was used mainly by scientists to communicate
with other scientists.
The Internet remained under government control until 1984 (Rouse, 2014).
One early problem faced by Internet users was speed.
Phone lines could only transmit information at a limited rate.
The development of fiber-optic cables allowed billions of bits of
information to be received every minute.
Companies like Intel developed faster microprocessors so personal
computers could process the incoming signals at a more rapid rate
(UShistory.org, 2017).
Sergey Brin and Larry Page, directors of a Stanford research project, built a
search engine that ranked results to reflect page popularity after they
determined that the most popular result would frequently be the most useful.
After talking family, friends, and other investors into contributing $1
million, the researchers launched their company in 1998.
Google is now the world’s most popular search engine, accepting more than
200 million queries daily.
Back then, new forms of communication were also introduced.
Electronic mail, or email, was a suitable way to send a message to fellow
workers, business partners, or friends.
Messages could be sent and received at the convenience of the individual.
A letter that took several days to arrive could be read in minutes.
Internet service providers like America Online and CompuServe set up
electronic chat rooms.
These were open areas of cyberspace where interested parties could join in a
conversation with perfect strangers.
“Surfing the net” became a pastime in and of itself (UShistory.org, 2017).
Consequently, companies whose businesses are built on digitized information have
become valuable and powerful in a relatively short period of time; the current
Information Age has spawned its own breed of wealthy influential brokers, from
Microsoft’s Bill Gates to Apple’s Steve Jobs to Facebook’s Mark Zuckerberg.
Critics charged that the Internet created a technological divide that widened
the gap between the higher and lower classes of society.
Those who could not afford a computer or a monthly access fee were denied
these possibilities.
Many decried the impersonal nature of electronic communication compared to a
telephone call or a handwritten letter.
Moreover, the unregulated and loose nature of the Internet allowed pornography
to be broadcast to millions of homes.
Protecting children from these influences, or even from meeting violent
predators, would prove to be difficult.
Nowadays, crimes in various forms are rampant because of the use of social media.
Cyberbullying is an issue that poses alarm worldwide.
Consequently, we need to be aware of the possible harm and damage due to abuse
of these advances in the Information Age.
Application of Computers in Science and Research
One of the significant applications of computers for science and research is evident
in the field of bioinformatics.
Bioinformatics is the application of information technology to store, organize, and
analyze vast amounts of biological data, which are available in the form of
sequences and structures of proteins (the building blocks of organisms) and
nucleic acids (the information carriers) (Madan, n.d.).
Early interest in bioinformatics was established because of a need to create
databases of biological sequences.
The human brain cannot store all the genetic sequences of organisms, and this
huge amount of data can only be stored, analyzed, and used efficiently
with the help of computers.
While the initial databases of protein sequences were maintained at
individual laboratories, the development of a consolidated formal database,
known as SWISS-PROT protein sequence database, was initiated in 1986.
It now has about 70,000 protein sequences from more than 5,000 model
organisms, a small fraction of all known organisms.
The enormous variety of divergent data resources is now available for study
and research by both academic institutions and industries.
These are made available as public-domain information in the larger interest
of the research community through the Internet (www.ncbi.nlm.nih.gov) and
CD-ROMs (on request from www.rcsb.org).
These databases are constantly updated with additional entries (Madan, n.d.).
Computers and software tools are widely used for generating these databases
and for identifying the function of proteins, modeling the structure of proteins,
determining the coding (useful) regions of nucleic acid sequences, finding
suitable drug compounds from a larger pool, and optimizing the drug
development process by predicting possible targets.
Some of the software tools which are handy in the analysis include BLAST (used
for comparing sequences), Annotator (an interactive genome analysis tool), and
GeneFinder (a tool to identify coding regions and splice sites) (Madan, n.d.).
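Tools such as BLAST rely on sophisticated, fast heuristics, but the basic idea of sequence comparison can be shown with a deliberately simplified sketch (an illustration for this lesson, not the BLAST algorithm): count the positions at which two equal-length DNA sequences agree.

```python
# A deliberately simplified view of sequence comparison: the percent identity
# of two equal-length DNA sequences. Real tools such as BLAST use fast
# heuristics and also handle insertions and deletions, which this sketch ignores.
def percent_identity(seq_a: str, seq_b: str) -> float:
    if len(seq_a) != len(seq_b):
        raise ValueError("This simple sketch requires equal-length sequences.")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Hypothetical sequences differing at one of eight positions.
print(percent_identity("ATGCCGTA", "ATGACGTA"))  # prints: 87.5
```

A high percent identity suggests that two sequences, and possibly their biological functions, are related, which is why such comparisons are central to annotating new genes and proteins.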
The sequence information generated by the human genome research, initiated in
1988, has now been stored as a primary information source for future
applications in medicine.
The available data are so huge that, if compiled in books, they would run into
200 volumes of 1,000 pages each, and reading alone (ignoring the understanding
factor) would require 26 years of working around the clock.
For a population of about five billion human beings with two individuals differing
in three million bases, the genomic sequence difference database would have
about 15,000,000 billion entries.
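The multiplication behind that estimate is easy to verify (a quick check of the figures quoted above, not an independent calculation):

```python
# Checking the estimate quoted above using the passage's own figures.
population = 5_000_000_000      # about five billion human beings
differing_bases = 3_000_000     # two individuals differ in about three million bases

entries = population * differing_bases
print(f"{entries:,}")            # prints: 15,000,000,000,000,000
print(entries / 1e9, "billion")  # prints: 15000000.0 billion
```

That is, 5 × 10^9 people times 3 × 10^6 differing bases gives 1.5 × 10^16, or 15,000,000 billion entries.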
The present challenge in handling such a huge volume of data is to improve
database design, develop software for database access and manipulation, and
devise data-entry procedures to compensate for the varied computer procedures
and systems used in different laboratories.
The much-celebrated complete human genome sequence, which was formally
announced on the 26th of June 2000, involved more than 500 × 10^18 (500 million
trillion) calculations during the process of assembling the sequences alone.
This can be considered the biggest exercise in the history of computational
biology (Madan, n.d.).
Moreover, from the pharmaceutical industry’s point of view, bioinformatics is
the key to rational drug discovery.
It reduces the number of trials in the screening of drug compounds and in
identifying potential drug targets for a particular disease using high-power
computing workstations and software like Insight.
This profound application of bioinformatics in genome sequencing has led to a
new era in pharmacology: pharmacogenomics, where potential targets for
drug development are hypothesized from the genome sequences.
Molecular modeling, which requires a lot of calculations, has become faster
due to advances in computer processors and their architecture (Madan,
n.d.).
In plant biotechnology, bioinformatics is found to be useful in the areas of
identifying disease-resistance genes and designing plants with high nutritional
value (Madan, n.d.).