Data Analysis Project

CHAPTER FOUR: DATA ANALYSIS AND INTERPRETATION

4.1 Introduction

This chapter presents a comprehensive analysis of the data collected on computer literacy
among science teachers in public and private secondary schools. The objective is to identify
patterns, differences, and relationships among variables such as school type, gender, age
group, ICT training, access to technology, and overall literacy scores. Visualizations and
statistical tools are used to support the findings.

4.2 Data Overview (Expanded)

A total of 100 science teachers drawn from both public and private secondary schools
participated in this study. The dataset, originally extracted from field surveys and structured
into a spreadsheet, was analyzed using Python programming tools, specifically the pandas,
seaborn, and matplotlib libraries for data manipulation and visualization. Additional
statistical tests and inferential models were implemented using standard Python statistical
libraries alongside sklearn to explore relationships between variables and assess the predictive factors
for computer literacy.
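
As an illustration of this workflow, a minimal loading-and-inspection sketch is shown below. It assumes the survey spreadsheet was exported to the CSV file named in Appendix B and that the column layout follows Table 4.1; it is not the study's exact script.

import pandas as pd

# Load the survey export (filename as used in Appendix B)
df = pd.read_csv("literacy_data.csv")

print(df.shape)    # expected: (100, 12), one row per respondent
print(df.dtypes)   # confirms which columns are numeric vs categorical
print(df.head())   # quick visual check of the first records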

The dataset comprises a well-balanced mix of quantitative and categorical variables,
enabling comprehensive descriptive and inferential analysis. The following broad variable
categories were considered:

a. Demographics
This category includes three key variables:

i. Gender: Represented as a binary categorical variable (Male or Female), this enables
comparison of literacy outcomes between male and female science teachers.

ii. Age: Captured in ranges (e.g., 25–34, 35–44, 45–54, and 55+), the age variable is
vital in assessing generational trends in ICT adoption.
iii. Years of Teaching Experience: This numeric variable helps evaluate whether length
of service correlates with digital literacy. It reflects both classroom exposure and
potential informal ICT experience gained over time.

b. Institutional Type

i. School Type (Public or Private): This variable is crucial in performing a
comparative analysis. Differences between public and private schools—such as
funding, infrastructure, and staff development—may significantly influence ICT
adoption and proficiency. Public school respondents made up 58% of the total
sample, while private school respondents constituted the remaining 42%.

c. ICT Exposure
Three binary variables capture the extent of respondents’ exposure to and interaction with
ICT infrastructure:

i. ICT Training: Whether the respondent has undergone any formal ICT training
(Yes/No). This assesses the effect of structured learning on digital literacy.

ii. Computer Access: Whether the respondent has regular access to a computer for
teaching or personal use. This factor reflects availability of physical resources.

iii. Internet Access: Indicates whether the teacher has internet access in the school or at
home. Since many ICT tools rely on connectivity, this serves as a determinant of
usability and up-to-date knowledge.

These exposure variables are instrumental in exploring how environmental and institutional
support systems affect teachers' capacity to use computers.

d. Scores
The performance of each respondent is quantified using two numerical indicators:
i. Computer Literacy Score: A numerical measure based on responses to practical and
theoretical ICT-related items. It serves as the dependent variable in most inferential
analyses.

ii. Attitude Score: A psychometric score capturing teachers’ self-reported disposition
toward ICT tools. This includes confidence level, willingness to learn new tools, and
perceived usefulness of computers in science education.

This structured dataset enabled the application of various statistical techniques, including
descriptive statistics, correlation analysis, t-tests, ANOVA, and regression modeling,
which are elaborated upon in subsequent sections. The richness of the dataset provided not
only a comparative lens between public and private schools but also enabled the
identification of key factors that predict or correlate with computer literacy among
science teachers.

Variable Name          Description                                               Data Type
ID                     Unique identifier for each respondent                     Integer
School Name            Name of respondent’s school                               Categorical
Type                   Public or Private school                                  Categorical
Gender                 Gender of respondent                                      Categorical
Age                    Age group (25–34, 35–44, etc.)                            Categorical
Subject                Science subject taught (Biology, Chemistry, Physics)      Categorical
Years of Experience    Total teaching experience in years                        Integer
ICT Trained            Whether the teacher has ICT training (Yes/No)             Binary
Computer Access        Whether the teacher has access to a computer (Yes/No)     Binary
Internet Access        Whether the teacher has access to the internet (Yes/No)   Binary
Literacy Score         Numeric score measuring computer literacy                 Integer
Attitude Score         Attitude toward technology (Likert-based numeric score)   Integer

Table 4.1: Summary of Dataset Variables and Descriptions

4.3 Demographic Characteristics of Respondents

This section presents the demographic profile of the 100 science teachers who participated
in the study. Understanding their gender, age, school affiliation, and years of teaching
experience is essential in contextualizing the analysis of computer literacy levels. These
background variables may influence access to digital resources, familiarity with ICT tools,
and attitudes toward technology in the classroom.

A. Gender Distribution

Among the respondents, 60% were female, while 40% were male, suggesting a higher
representation of female science teachers in the sampled population. This distribution
reflects the evolving gender dynamics within the education sector, particularly in secondary
science education, which has historically seen a male-dominated workforce. The higher
female representation could also be reflective of the increasing number of women entering
teaching professions in Nigeria.
Figure 4.1: Bar chart showing gender distribution of respondents
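
The counts behind Figure 4.1 can be reproduced with a simple frequency tabulation. The sketch below assumes the Gender column name from Table 4.1 and is illustrative rather than the study's own plotting code.

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("literacy_data.csv")

# Percentage breakdown (expected: Female 60%, Male 40%)
print(df["Gender"].value_counts(normalize=True).mul(100).round(1))

# Bar chart corresponding to Figure 4.1
sns.countplot(data=df, x="Gender")
plt.title("Gender Distribution of Respondents")
plt.show()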

This gender-based distribution enables meaningful comparative analysis later in this chapter,
particularly in determining whether there are any significant differences in computer literacy
between male and female science teachers.

B. Age Group Distribution

The age composition of the respondents is as follows:

A. 25–34 years: 30%

B. 35–44 years: 25%

C. 45–54 years: 25%

D. 55 years and above: 20%

Figure 4.2: Pie chart showing distribution of respondents by age category

This distribution reveals a fairly balanced mix across age groups, with a notable
concentration (55%) of respondents in the 25–44 age range. This age group typically
represents the early-to-mid-career professionals who are more likely to be adaptable to new
technologies. In contrast, teachers aged 55 and above, while experienced, may have had
limited exposure to digital tools during their early teaching careers. This variable is crucial
when analyzing how age may affect computer literacy and openness to ICT adoption.

C. School Type

The sample includes teachers from both public and private institutions:
A. Public School Teachers: 58%

B. Private School Teachers: 42%

Figure 4.3: Bar chart comparing public and private respondents

This fairly even distribution enables an effective comparative analysis between public and
private school settings. Private schools in Nigeria are often better equipped with ICT
infrastructure and tend to adopt new technologies more quickly than public institutions.
Therefore, differences in school type may account for disparities in ICT training access,
computer usage frequency, and ultimately literacy levels.

D. Years of Teaching Experience

The years of teaching experience among respondents ranged from 3 to 25 years, with an
average (mean) of 13.25 years.

A. Minimum experience: 3 years

B. Maximum experience: 25 years

C. Mean experience: 13.25 years

Figure 4.4: Histogram showing the distribution of years of teaching experience among
respondents
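
A short sketch of how these summary figures and the histogram in Figure 4.4 might be produced, again assuming the column names listed in Table 4.1:

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("literacy_data.csv")

exp = df["Years of Experience"]
print(exp.min(), exp.max(), exp.mean())   # reported values: 3, 25, 13.25

# Histogram corresponding to Figure 4.4
sns.histplot(exp, bins=10)
plt.xlabel("Years of Teaching Experience")
plt.ylabel("Number of Teachers")
plt.show()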

This distribution indicates that the sample is predominantly composed of experienced
educators, which lends validity and depth to their responses regarding ICT usage and
computer literacy. Teachers with more experience might have more classroom exposure but
potentially less formal digital training, while younger, less experienced teachers may be
more digitally fluent but less experienced in pedagogy. This dynamic is further explored in
the correlation and regression sections of the chapter.

4.4 Distribution of Computer Literacy Levels

Computer Literacy Score was categorized into:

i. High (≥15): 65 respondents


ii. Medium (10–14): 20 respondents

iii. Low (<10): 15 respondents

Figure 4.5: Count plot of Literacy Levels (High, Medium, Low)

Literacy Level Score Range Number of Teachers Percentage

High 15 – 21 65 65%

Medium 10 – 14 20 20%

Low 4–9 15 15%

Table 4.2: Frequency distribution of literacy levels by category
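
The banding in Table 4.2 can be reproduced with pandas' cut function. The sketch below assumes an integer Literacy Score column ranging from 4 to 21, as shown in the table above.

import pandas as pd

df = pd.read_csv("literacy_data.csv")

# Bands: Low < 10, Medium 10-14, High 15-21 (right-inclusive bins)
bins = [0, 9, 14, 21]
labels = ["Low", "Medium", "High"]
df["Literacy Level"] = pd.cut(df["Literacy Score"], bins=bins, labels=labels)

print(df["Literacy Level"].value_counts())   # expected: High 65, Medium 20, Low 15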

4.5 Literacy Levels by School Type

This section compares the computer literacy levels of science teachers based on the type of
secondary school where they teach—public or private. The objective is to determine whether
institutional context plays a significant role in shaping teachers’ digital proficiency.

A. Private Schools

The analysis reveals that computer literacy is significantly higher among teachers in
private schools:

i. High Literacy: 82%

ii. Medium Literacy: 14%


iii. Low Literacy: 4%

These figures indicate that over four out of five private school teachers demonstrated high
computer literacy. The low percentage in the “low literacy” category suggests that private
schools are more consistent in equipping their teachers with essential digital skills. This
trend is often associated with better funding, access to modern facilities, and a stronger
emphasis on ICT integration into daily teaching activities.

Private schools typically invest in regular ICT workshops, provide modern equipment (e.g.,
smart boards, internet-connected classrooms), and encourage the use of educational
technologies such as e-learning platforms, online test systems, and interactive lesson
planning tools. As a result, teachers in private institutions tend to be more familiar with
computer usage, file management, online communication tools (such as email and Zoom),
and digital content creation.

B. Public Schools

In contrast, the literacy distribution for teachers in public schools is as follows:

i. High Literacy: 48%

ii. Medium Literacy: 35%

iii. Low Literacy: 17%

These results show a more even spread across the three literacy categories, with a
notably lower proportion of high literacy levels compared to private school counterparts.
Nearly half of public-school teachers fall into the high literacy bracket, while over half
exhibit medium to low proficiency.

The findings highlight several possible contributing factors, including:

i. Inconsistent access to functional computers and internet services

ii. Limited or outdated ICT training programs


iii. Overcrowded classrooms and workload pressures that reduce time for digital skill
development

iv. Budgetary constraints and infrastructural deficiencies

It is important to note that many public-school teachers may have theoretical knowledge of
computers but lack hands-on experience, which is a key component of digital fluency.
Also, state-run professional development initiatives may not be frequent or extensive
enough to keep up with technological changes.

Figure 4.6: Stacked bar chart showing comparison of literacy levels between public and
private schools

Interpretation

The disparity in literacy levels between public and private school teachers is stark.
While nearly all private school teachers (96%) fall into the high
or medium literacy brackets, only 83% of public-school teachers achieve similar levels.
This 13% difference suggests systemic inequities in the availability and adoption of ICT
resources and training programs.

The implication is clear: to achieve equity in ICT competency, strategic investments must
be made in public education to provide equitable access to technology, training, and
institutional support. Closing this gap is critical, especially as the world continues to shift
toward technology-enhanced education.

School Type High (%) Medium (%) Low (%) Total Teachers

Public 48 35 17 58

Private 82 14 4 42

Table 4.3: Literacy level vs. school type cross-tabulation

Interpretation: Private school teachers are more likely to have higher literacy levels, which
may be linked to better access to digital infrastructure and training opportunities.
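
Table 4.3 is a cross-tabulation of school type against the banded literacy level. A sketch of that step follows, assuming the Type column from Table 4.1 and the Literacy Level band derived in Section 4.4.

import pandas as pd

df = pd.read_csv("literacy_data.csv")
df["Literacy Level"] = pd.cut(df["Literacy Score"], bins=[0, 9, 14, 21],
                              labels=["Low", "Medium", "High"])

# Row percentages by school type (compare with Table 4.3)
pct = pd.crosstab(df["Type"], df["Literacy Level"], normalize="index").mul(100).round(1)
print(pct)

# Raw counts per cell, with row totals
print(pd.crosstab(df["Type"], df["Literacy Level"], margins=True))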

4.6 Literacy Levels by ICT Training

This section evaluates the influence of formal ICT training on the computer literacy levels of
science teachers across the sample. ICT training is often considered a cornerstone of digital
proficiency in education, and this analysis investigates whether such training translates into
higher literacy outcomes in practice.

Overview of Training Participation

Out of the 100 respondents:

a. 70 teachers (70%) reported having received some form of ICT training

b. 30 teachers (30%) reported no formal ICT training

This indicates a relatively high level of exposure to ICT training among the sample, which is
encouraging considering national efforts by educational authorities to increase ICT
integration through workshops, seminars, and curriculum reforms.

Figure 4.7: Boxplot showing distribution of literacy scores for trained vs untrained
teachers
ICT Trained Average Literacy Score Number of Teachers

Yes 15.2 70

No 14.1 30

Table 4.4: ICT Training vs. Average Literacy Scores

Average Literacy Scores by Training Status

According to the statistical analysis:

i. The average literacy score among trained teachers was 15.2

ii. The average score among untrained teachers was 14.1

Although the trained group scored slightly higher on average, the difference was not
statistically significant, as determined by an independent t-test (p = 0.8258). This suggests
that while ICT training may have some positive influence on literacy, it is not a sole
predictor of digital proficiency among teachers.

Interpretation and Possible Reasons

Several factors may explain why ICT training did not significantly elevate computer literacy
levels across the board:

1. Training Quality and Practicality: Many ICT training programs in public schools
tend to be short-term, overly theoretical, or delivered without hands-on components.
Teachers may attend sessions without truly internalizing or applying the skills.
2. Lack of Reinforcement: Even when trained, some teachers return to schools without
computers or internet access to reinforce what they learned, leading to skill decay
over time.

3. One-off Trainings: In some cases, ICT training may have occurred several years ago
and was never refreshed. Without ongoing capacity development, teachers are less
likely to stay current with new technologies.

4. Motivation and Attitude: Training alone cannot substitute for an internal drive to
learn. As explored later in Section 4.10, attitude toward technology shows a
stronger correlation with literacy scores than training participation.

5. Digital Natives vs. Digital Immigrants: Younger teachers, even without formal
training, may possess intuitive digital skills due to prolonged personal use of
smartphones, social media, or e-learning tools, closing the gap between trained and
untrained peers.

Conclusion

The results suggest that ICT training, as currently delivered, may not be sufficient in
isolation to guarantee high computer literacy among science teachers. This points to a
need for re-evaluating the content, duration, delivery methods, and post-training support
structures associated with ICT programs in Nigerian secondary schools.

Moving forward, holistic training approaches that blend theory, practical exercises,
mentoring, and real-life classroom application are essential for producing tangible
improvements in teachers’ digital competencies.



Statistical Test: T-Test for Independent Samples

To statistically assess whether ICT training had a measurable impact on teachers’ computer
literacy scores, an independent sample t-test was conducted. This test compared the mean
literacy scores of two groups: those who had received formal ICT training and those who
had not.

i. Group A (Trained Teachers): n = 70, Mean Score = 15.2

ii. Group B (Untrained Teachers): n = 30, Mean Score = 14.1

The results of the t-test were as follows:

i. T-Statistic: -0.22

ii. P-Value: 0.8258
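
The study does not show the code used for this test. A hedged sketch of how an independent-samples t-test could be run with scipy.stats is given below, assuming the ICT Trained column is coded Yes/No.

import pandas as pd
from scipy import stats

df = pd.read_csv("literacy_data.csv")

trained = df.loc[df["ICT Trained"] == "Yes", "Literacy Score"]
untrained = df.loc[df["ICT Trained"] == "No", "Literacy Score"]

# Welch's t-test (does not assume equal variances); the study's exact variant is not stated
t_stat, p_value = stats.ttest_ind(trained, untrained, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # reported: t = -0.22, p = 0.8258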

Interpretation of Results

At a 95% confidence level (α = 0.05), the p-value (0.8258) is far above the significance
threshold. This means the difference in mean scores between trained and untrained
teachers is not statistically significant. In other words, we do not have enough evidence to
conclude that ICT training, as currently delivered, leads to higher computer literacy.

This result is noteworthy as it suggests that formal ICT training, in isolation, does not
guarantee digital proficiency among science teachers. Possible explanations for this include:

a. Training Duration and Depth: Many training sessions may be short, one-off
workshops that fail to provide deep or retained knowledge.

b. Lack of Practical Exposure: Training may emphasize theory or be conducted without
sufficient hands-on exercises, making it difficult for teachers to apply the knowledge
in real-world teaching scenarios.

c. Inconsistency in Training Quality: Variations in training facilitators, materials, and
access to post-training support could hinder long-term learning outcomes.

d. Limited Institutional Reinforcement: Teachers who return to environments without
functional computers, internet access, or technical support are unable to reinforce
what they learned.

Conclusion

The lack of significant difference emphasizes the need for enhanced training design,
focusing not only on theoretical instruction but also on practical application, follow-up
assessments, and in-school implementation support. Training should be viewed as one
element of a broader ecosystem of digital enablement that includes infrastructure,
continuous professional development, and peer support.

4.7 Access to Computer and Internet

Access to digital devices and connectivity is a fundamental prerequisite for developing and
sustaining computer literacy. This section investigates how access to computers and
internet connectivity impacts the literacy levels of science teachers in secondary schools.

Access Overview

The analysis revealed the following access rates among respondents:

a. 72% of teachers reported having regular access to a computer at school, home, or
both.

b. 68% of teachers indicated that they had reliable internet access, either through
mobile data, school Wi-Fi, or cybercafés.

These figures suggest that while a majority of respondents have access to essential digital
tools, there remains a substantial minority (28% and 32%, respectively) who are digitally
excluded from frequent computer and internet use.
Cross-Tabulation of Access and Literacy Levels

To further explore the impact of access, the respondents were grouped into two major
categories:

1. Those with access to both computer and internet

2. Those without access to either resource

Access Type Average Literacy Score Literacy Level (High %)

Computer + Internet 17.4 85%

No Computer or Internet 10.2 23%

Table 4.5: Cross-Tabulation of Access vs Literacy Level

Figure 4.8: Violin plot of literacy scores by computer/internet access
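
A sketch of how the two access groups in Table 4.5 might be formed and compared, assuming Yes/No values in the Computer Access and Internet Access columns:

import pandas as pd

df = pd.read_csv("literacy_data.csv")

both = (df["Computer Access"] == "Yes") & (df["Internet Access"] == "Yes")
neither = (df["Computer Access"] == "No") & (df["Internet Access"] == "No")

print(df.loc[both, "Literacy Score"].mean())      # reported average: 17.4
print(df.loc[neither, "Literacy Score"].mean())   # reported average: 10.2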

Interpretation and Discussion

The results clearly demonstrate a significant difference in computer literacy scores
between the two groups:

a. Teachers with both computer and internet access scored an average of 17.4, with
85% of them falling into the "High Literacy" category.

b. Conversely, teachers with no access to these resources averaged 10.2, and only 23%
achieved high literacy.

This highlights a critical point: access to hardware and connectivity is not just beneficial
—it is foundational. Without access to digital tools, even the most motivated or trained
teachers will struggle to practice, retain, or apply ICT knowledge. The following factors
further reinforce the importance of access:

a. Practice-Based Learning: Skills such as typing, file management, internet
browsing, and use of productivity software (e.g., Word, Excel, PowerPoint) can only
be mastered through consistent practice, which is impossible without access.

b. Up-to-Date Exposure: Internet access allows teachers to update their knowledge
through online resources, training modules, YouTube tutorials, MOOCs, and
collaboration platforms.

c. Pedagogical Integration: Teachers with access are more likely to integrate digital
resources (e.g., simulations, educational videos, test generators) into their science
teaching.

Policy and Institutional Implications

These findings reinforce the argument that bridging the digital divide in Nigerian
education must begin with infrastructure investment. It is not enough to provide ICT
training or develop policies if teachers lack the physical tools needed to apply what they
learn. Stakeholders—governments, NGOs, PTAs, and private investors—must prioritize the
provision of computers, internet-enabled labs, and affordable connectivity plans in
schools, particularly public ones where the deficit is greatest.

Conclusion

Access to computers and the internet significantly correlates with higher computer literacy
among teachers. This section strongly supports the thesis that digital access is a key
determinant of computer literacy levels, even more so than training alone. Therefore,
improving access should be a central focus of any intervention aimed at raising ICT
competence in secondary education.
4.8 Gender-Based Comparison of Literacy Levels

This section explores the relationship between gender and computer literacy among science
teachers. Gender-related disparities in access to technology and ICT proficiency have been
widely discussed in global and local educational contexts. Therefore, analyzing whether
such differences exist within the sampled Nigerian secondary school teachers is important
for promoting inclusive digital development strategies.

Literacy Performance by Gender

The computer literacy scores were disaggregated by gender as follows:

a. Male teachers had a mean literacy score of 15.2

b. Female teachers had a mean score of 14.3

Figure 4.9: Count plot showing literacy levels disaggregated by gender

Although male teachers slightly outperformed their female counterparts by a margin of 0.9
points, this difference is small and unlikely to be statistically significant. A two-sample
t-test (not reported here) would be needed to confirm this, but the observed gap falls within
the margin of error and is not sufficient to draw a conclusive gender-based distinction.
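
Since the gender comparison test itself is not reported in the study, the sketch below shows how it could be checked with scipy.stats, under the same assumed column names used earlier.

import pandas as pd
from scipy import stats

df = pd.read_csv("literacy_data.csv")

male = df.loc[df["Gender"] == "Male", "Literacy Score"]
female = df.loc[df["Gender"] == "Female", "Literacy Score"]

t_stat, p_value = stats.ttest_ind(male, female, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # a p-value above 0.05 would support the no-difference reading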

Interpretation and Possible Factors

The near-parity in performance suggests progress toward gender equity in digital
competence, especially within the teaching profession. Historically, women in developing
countries have faced challenges such as lower access to digital tools, less training time due
to domestic responsibilities, and fewer institutional support structures. However, the current
findings show that such barriers may be decreasing, at least within the sampled population
of science educators.

Several factors might explain the reduced gender gap in this study:
1. Increased female participation in STEM and education: More women are taking
up science education roles and are being exposed to ICT tools as part of their routine.

2. Equal access within the school system: In most Nigerian schools, male and female
teachers typically share the same ICT resources, which limits structural bias in
access.

3. Supportive professional development: ICT training programs and workshops are
now more inclusive, with intentional recruitment of female participants.

4. Digital familiarity through mobile use: Smartphones and social media use, which
are widespread across genders, may contribute to informal learning and confidence
in digital environments.

Conclusion

While male teachers in the sample exhibited a marginally higher average computer literacy
score, the difference was not substantial or statistically significant. This finding is
important as it challenges the assumption that male teachers are inherently more digitally
competent and highlights the need for gender-neutral ICT interventions.

Thus, future capacity-building programs should maintain this momentum by ensuring that
both male and female teachers are equally empowered with access, training, and digital
resources to improve their teaching effectiveness in a tech-driven world.

4.9 Age Group and Literacy

Age is a critical demographic variable that often influences technological adaptability
and digital literacy. This section explores how teachers’ age groups correlate with their
computer literacy scores, offering insights into whether generational differences play a
role in ICT competence among science teachers.
Performance by Age Group

Respondents were grouped into four age categories:

a. 25–34 years

b. 35–44 years

c. 45–54 years

d. 55 years and above

Figure 4.10: Bar plot of average literacy scores across age groups

Analysis of the average literacy scores within each group revealed the following trends:

a. Highest scores were recorded among the 25–34 and 35–44 year-old teachers

b. Slightly lower scores were observed among the 45–54 group

c. The lowest average literacy score occurred in the 55+ age group

This suggests a clear pattern where younger teachers tend to have higher levels of
computer literacy compared to their older colleagues.
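
These group averages (reported in Appendix C, Table C1) can be obtained with a single groupby. The sketch assumes the Age column holds the range labels listed above.

import pandas as pd

df = pd.read_csv("literacy_data.csv")

# Mean literacy score per age band; compare with Appendix C, Table C1
print(df.groupby("Age")["Literacy Score"].mean().round(1))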

Interpretation

There are several plausible explanations for this outcome:

1. Digital Native Advantage: Teachers aged 25–44 belong to generations that were
exposed to digital technologies at earlier stages in life. Many of them received ICT
training during their tertiary education, and they are generally more comfortable with
computers, smartphones, and digital platforms.

2. Technology Usage Habits: Younger teachers are more likely to engage with
technology outside of the classroom, such as using online learning platforms, mobile
banking, e-learning courses, and social media. This frequent interaction reinforces
digital skill development.
3. Openness to Innovation: Early- and mid-career professionals may exhibit a greater
willingness to experiment with digital tools in their teaching practices, especially if
motivated by performance targets or curriculum expectations.

4. Gap in Up-Skilling Opportunities for Older Teachers: Teachers aged 55 and above
may have had limited exposure to digital tools during their early professional years.
Without continuous professional development or strong institutional support,
bridging the digital skills gap becomes challenging for this group.

Conclusion

The findings from this section align with global research that points to a generational divide
in digital literacy. While older teachers bring experience and pedagogical expertise, younger
teachers are more digitally fluent and adaptable to new technologies. To close this gap,
schools and education policymakers must design intergenerational ICT training strategies,
ensuring that older educators receive continuous, hands-on support to thrive in a technology-
enhanced educational environment.

4.10 Correlation Matrix

To understand the relationships between key variables influencing computer literacy, a
Pearson correlation matrix was generated. Correlation coefficients (r-values) range from
-1 to +1, where:

i. +1 indicates a perfect positive relationship,

ii. -1 indicates a perfect negative relationship, and

iii. 0 indicates no linear relationship.

This matrix reveals the strength and direction of linear associations between literacy
score and selected demographic and psychological variables.
Figure 4.11: Heatmap showing correlation matrix

Table 4.6: Correlation Matrix (Literacy, Experience, Attitude, Age_Num)

Variable               Literacy Score   Years of Experience   Attitude Score   Age (Numeric)
Literacy Score         1.00             0.11                  0.45             -0.09
Years of Experience    0.11             1.00                  0.12             0.56
Attitude Score         0.45             0.12                  1.00             0.03
Age (Numeric)          -0.09            0.56                  0.03             1.00
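
The heatmap code appears in Appendix B. For any single pair, scipy.stats.pearsonr returns both the coefficient and a p-value, as in the sketch below for the strongest pairing; the column names are assumed from Table 4.6.

import pandas as pd
from scipy import stats

df = pd.read_csv("literacy_data.csv")

r, p = stats.pearsonr(df["Literacy Score"], df["Attitude Score"])
print(f"Literacy vs Attitude: r = {r:.2f}, p = {p:.4f}")   # reported coefficient: 0.45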

Key Interpretations

1. Literacy Score ↔ Attitude Score (r = 0.45)


This is the strongest positive correlation observed in the matrix. It suggests that
teachers with a more positive attitude toward technology tend to score higher in
computer literacy. This finding underscores the psychological dimension of ICT
competence — willingness, openness, and self-efficacy may be as important as
access and training.

2. Literacy Score ↔ Years of Experience (r = 0.11)


This weak positive correlation implies that the length of teaching experience has
little to no direct effect on digital literacy. This challenges the assumption that older
or more experienced teachers are more digitally competent, reinforcing the need for
continuous training regardless of years of service.

3. Literacy Score ↔ Age (Numeric) (r = -0.09)


This slightly negative correlation suggests that as age increases, literacy tends to
decline slightly. However, the strength of the relationship is weak and not
statistically significant. Still, this supports earlier findings (Section 4.9) that younger
teachers generally perform better in digital tasks, likely due to greater exposure
to technology during their formative years.

4. Years of Experience ↔ Age (r = 0.56)

This is a moderate-to-strong positive correlation, which is expected: as age
increases, years of experience naturally tends to increase as well.

5. Attitude Score ↔ Other Variables


Attitude Score shows only weak correlations with Experience (r = 0.12) and Age (r =
0.03), indicating that a teacher’s mindset toward ICT is not necessarily age- or
experience-dependent. This reinforces the value of tailored interventions aimed at
improving digital attitudes across all age brackets.

Conclusion

The correlation matrix reveals that attitude toward technology is a more reliable
predictor of digital literacy than age or experience. These insights can guide educational
administrators and policymakers to focus more on changing perceptions and motivations
around ICT use, rather than relying solely on seniority or experience as indicators of
readiness.

These findings provide a strong argument for integrating psychological and motivational
factors into ICT training and professional development programs in secondary schools.

4.11 Regression and Predictive Modeling

To further understand the relationships between demographic, institutional, and
technological variables and their combined impact on computer literacy, a multiple linear
regression model was constructed. This model aimed to assess how well selected
independent variables could predict the dependent variable — computer literacy score —
among science teachers.
Model Description

The regression model utilized the following structure:

1) Dependent Variable (Target):

a) Computer Literacy Score – a continuous numeric variable reflecting digital
competency

2) Independent Variables (Features):

a) Age (numeric form)

b) ICT Training (binary: Yes = 1, No = 0)

c) Computer/Internet Access (binary: Yes = 1, No = 0)

d) Gender (Male = 1, Female = 0)

e) School Type (Private = 1, Public = 0)

The objective was to determine whether these features, either individually or in
combination, could reliably predict variations in literacy scores across respondents.
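
A minimal sketch of how such a model might be fitted and evaluated is shown below. The dummy-coded column names mirror Appendix B but are assumptions, and a hold-out split is included because a negative R² of the kind reported in the next subsection typically arises only when a model is scored on data it was not fitted to; the study's exact evaluation protocol is not stated.

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("literacy_data.csv")

features = ["age_num", "ict_trained", "internet_access", "gender_code", "school_type_code"]
X, y = df[features], df["literacy_score"]

# Hold out 30% of respondents for evaluation (assumed protocol)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

print("R² Score:", r2_score(y_test, y_pred))        # reported: -0.29
print("MSE:", mean_squared_error(y_test, y_pred))   # reported: 26.59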

Model Performance Metrics

The model was evaluated using standard statistical performance measures:

1) R² Score: -0.29
The R-squared value reflects the proportion of variance in the dependent variable
explained by the model. A negative R² suggests that the model performs worse than a
horizontal line (mean prediction), indicating very low predictive accuracy.

2) Mean Squared Error (MSE): 26.59
The MSE quantifies the average squared difference between actual and predicted values.
A higher MSE reflects larger prediction errors and indicates that the model is not reliably
approximating actual scores.
Figure 4.12: Scatterplot of predicted vs. actual literacy scores
Figure 4.13: Residual plot showing deviation of predictions from true values

Interpretation and Analysis

The results clearly demonstrate that the linear regression model fails to capture the
complexity of the relationships among the selected features and literacy outcomes. Several
factors may contribute to this:

1. Non-Linearity: The relationships between predictors (e.g., age, access, training) and
literacy may be non-linear in nature, making linear models insufficient.

2. Omitted Variables: Important predictors such as motivation, frequency of use,
digital confidence, or school-level ICT culture were not included. Their absence
may weaken the model’s explanatory power.

3. Multicollinearity or Noise: Some independent variables may be weakly related to
literacy or correlated with each other in ways that introduce statistical noise, further
reducing model reliability.

4. Sample Size: Although 100 respondents is sufficient for basic analysis, predictive
modeling often requires larger datasets to generalize effectively, especially with
multiple features.

Conclusion

The multiple linear regression model failed to provide reliable predictions for computer
literacy scores based on the selected features. The negative R² value and high MSE
indicate that linear assumptions do not adequately represent the data structure.

This suggests that future research should consider:

 Non-linear models such as decision trees, random forests, or gradient boosting (a
brief sketch follows this list)

 Expanded datasets with additional features such as frequency of digital tool use, ICT
confidence, and training quality

 Qualitative methods to explore hidden variables influencing literacy (e.g., teacher
beliefs, school culture)
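
As one illustration of the non-linear alternatives listed above, a random forest can be swapped in with little extra code. This is a sketch under the same assumed column names and is not part of the study's own analysis.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("literacy_data.csv")

features = ["age_num", "ict_trained", "internet_access", "gender_code", "school_type_code"]
X, y = df[features], df["literacy_score"]

rf = RandomForestRegressor(n_estimators=200, random_state=42)
scores = cross_val_score(rf, X, y, cv=5, scoring="r2")   # cross-validated R² over 5 folds
print(scores.mean())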

This section highlights the limitations of predictive modeling in complex human-centered


domains and underscores the need for multidimensional approaches to understanding
digital competence.

4.12 Summary of Key Findings

This section consolidates the major findings of the data analysis presented in Chapter Four,
highlighting patterns and insights relevant to the study's objectives. The analysis of 100
science teachers, drawn from both public and private secondary schools, reveals several key
trends related to the factors influencing computer literacy.

1. Private School Teachers Demonstrate Higher Literacy Scores than Public School
Counterparts

One of the most consistent findings throughout the analysis is that teachers in private
secondary schools scored higher in computer literacy than their public-school peers. A
remarkable 82% of private school teachers fell into the "high literacy" category,
compared to just 48% in public schools. This suggests that school environment,
investment in infrastructure, and administrative support play crucial roles in developing
and sustaining digital competency among teachers.

The disparity also reflects structural inequities between the two systems — private schools
often have better access to updated hardware, modern teaching aids, and regular digital
training, while public schools face challenges such as underfunding and limited access to
technology.
2. ICT Training Alone Is Not Predictive Without Access and Motivation

Although 70% of respondents reported receiving ICT training, the difference in literacy
scores between trained and untrained teachers was not statistically significant (p =
0.8258). This suggests that training alone does not automatically translate into high
literacy.

Instead, the quality, relevance, and post-training application opportunities are likely
more important than mere attendance. Teachers who do not have access to computers or
internet, or who lack the motivation to practice, may quickly lose any skills acquired
during training.

This highlights a gap between training availability and training effectiveness, suggesting
that for ICT training to be impactful, it must be:

i. Practical and hands-on


ii. Integrated with everyday classroom needs
iii. Followed by continuous support and monitoring

3. Attitude Toward Technology Strongly Correlates with Literacy

Among all variables tested, attitude toward technology had the strongest correlation (r =
0.45) with literacy scores. Teachers who expressed a positive outlook toward the use of ICT
— including openness to learning, perceived usefulness, and confidence — were more likely
to achieve higher digital literacy scores.

This underscores the psychological dimension of ICT competency: mindset matters. Even
in environments with limited resources, a teacher who is enthusiastic about technology may
find ways to learn and apply digital tools creatively.

Therefore, changing perceptions and building digital self-efficacy could be more influential
than traditional training interventions.

4. Computer and Internet Access Remain Critical to High Literacy Performance


Access to both computer and internet was found to be the most decisive infrastructural
factor in determining computer literacy outcomes. Teachers with access had an average
score of 17.4, with 85% falling into the "high literacy" bracket. In contrast, those
without access scored an average of 10.2, with only 23% in the high category.

These findings clearly indicate that without tools, there is no practice, and without
practice, digital skills diminish quickly. Efforts to improve digital literacy must therefore
begin with equitable access to devices, reliable internet, and supportive ICT
environments, particularly in public schools.

Concluding Remarks

The findings in this chapter illustrate that computer literacy among science teachers is not
simply a matter of training or age — it is shaped by a combination of factors, including
institutional support, personal motivation, infrastructure availability, and digital attitudes.
The disparity between public and private schools, the limited impact of stand-alone training,
and the vital role of access and mindset all suggest that multi-layered interventions are
necessary to elevate digital competence across Nigeria’s secondary school system.
CHAPTER FIVE: SUMMARY, CONCLUSION AND RECOMMENDATIONS

5.1 Summary

This study aimed to investigate the computer literacy levels of science teachers across
public and private secondary schools in Nigeria, using a structured, data-driven approach.
A sample of 100 science teachers was analyzed based on several factors including gender,
age, school type, ICT training, access to technology, and attitude toward technology.

Using a combination of descriptive statistics, correlation analysis, inferential testing, and
regression modeling, the research yielded critical insights into the variables that affect
digital competence among educators in the science discipline.

The major findings from the study are summarized below:

1. High levels of computer literacy were observed in 65% of the respondents,
indicating a generally upward trend in digital proficiency among science teachers.

2. Private school teachers consistently outperformed public school teachers,
reflecting structural advantages such as better ICT infrastructure and management
support.

3. Access to digital tools (i.e., computer and internet access) emerged as the most
critical enabler of high computer literacy.

4. Teacher attitude toward technology was shown to have a stronger positive
correlation with literacy levels (r = 0.45) than years of experience or even ICT
training.

These findings reflect a dynamic and multi-factorial ecosystem where digital literacy is
shaped by both external conditions (infrastructure, access, institutional support) and internal
drivers (attitude, motivation, confidence).
Indicator                     Observation
Highest Literacy Segment      Private school teachers
Literacy Gap                  Public school teachers trail private school peers by an average of 3.5 points
ICT Training                  No statistically significant effect (p > 0.05)
Most Influential Variable     Attitude Score (r = 0.45 correlation with literacy score)
Key Enablers                  Computer and Internet access
Predictive Model Accuracy     R² = -0.29 → Model does not predict literacy reliably
Regression Error (MSE)        26.59 → Indicates high variance in prediction

Table 5.1: Summary of Key Indicators and Observations

5.2 Conclusion

Based on the findings, the study concludes that computer literacy among science teachers
in Nigeria is generally improving, especially within private secondary schools. The
digital divide between public and private institutions, however, remains a cause for
concern, primarily due to disparities in access to computers, internet connectivity, and up-
to-date ICT training opportunities.

Significantly, the study confirms that attitudinal factors and access to digital tools are
more predictive of computer literacy than traditional metrics such as years of teaching
experience or whether a teacher has received ICT training. This indicates that positive
perception and motivation, when combined with access to functional ICT
infrastructure, are key drivers of digital fluency.

Therefore, any attempt to bridge the digital divide in education must go beyond basic
training and focus on institutional readiness, access equity, and attitudinal reorientation.

5.3 Recommendations

In light of the findings and conclusions, the following recommendations are proposed:

1. Provision of ICT Facilities in Public Schools

Government and educational stakeholders should prioritize the deployment of computer
labs, stable power supply, internet access, and modern workstations in public schools.
Without access to these basic resources, digital literacy efforts will remain theoretical and
ineffective.

2. Revamp ICT Training Programs

Teacher ICT training initiatives should be redesigned to be hands-on, continuous, and
directly relevant to teaching and administrative activities. Training should include:

i. Real-life classroom applications (e.g., digital lesson planning)

ii. Use of educational tools (e.g., Google Classroom, PowerPoint)

iii. Assessment of competence after training through projects or simulations

3. Promote Positive Digital Attitudes

Since attitude correlates strongly with literacy, schools and education boards should invest
in:

i. Awareness campaigns on the benefits of ICT in teaching

ii. Motivational seminars to build digital confidence

iii. Mentoring programs where digitally literate teachers support peers


4. Integrate ICT Literacy into Curriculum

Teacher education institutions should embed ICT modules into science teacher training.
These modules should be mandatory, skill-based, and assessable to ensure graduates leave
with applicable digital teaching competencies.

5. Monitoring and Incentive Systems

There should be a system of digital literacy monitoring and performance evaluation,
possibly tied to:

i. Annual ICT usage reports

ii. Recognition or promotion incentives for tech-savvy educators

iii. Digital innovation awards within schools and districts

6. Further Research

This study focused on a sample of 100 teachers using quantitative methods. Future research
should consider:

i. Larger sample sizes across multiple regions or states

ii. Qualitative interviews to explore underlying motivations or barriers

iii. Longitudinal studies to track changes in literacy over time

iv. Classroom observation studies to assess the actual use of ICT tools

Remarks

Digital literacy is no longer a luxury in education—it is a necessity. For Nigeria to prepare
its students for a digitally driven world, its teachers must be empowered, equipped, and
encouraged to lead the transformation from analog classrooms to digital learning
environments.
APPENDICES

Appendix A – Sample Questionnaire Used for Data Collection

Section A: Demographic Information

1. Gender: [ ] Male [ ] Female

2. Age Range: [ ] 25–34 [ ] 35–44 [ ] 45–54 [ ] 55+

3. School Type: [ ] Public [ ] Private

4. Subject Taught: ____________________

5. Years of Teaching Experience: ___________

Section B: ICT Exposure


6. Have you received ICT training? [ ] Yes [ ] No
7. Do you have access to a computer? [ ] Yes [ ] No
8. Do you have access to the internet? [ ] Yes [ ] No

Section C: Computer Literacy Test


9. Can you operate basic word processing software? [ ] Yes [ ] No
10. Can you create and save spreadsheets using Excel? [ ] Yes [ ] No
11. Do you use PowerPoint for teaching? [ ] Frequently [ ] Sometimes [ ] Never
12. Rate your confidence using educational software:

a. Very High

b. High

c. Moderate

d. Low

e. None

Section D: Attitude Toward Technology


13. ICT improves the quality of science teaching.
a. Strongly Agree

b. Agree

c. Neutral

d. Disagree

e. Strongly Disagree

14. I am willing to learn more about educational technologies. (Same scale)

Appendix B – Sample Python Code Snippet Used for Data Analysis

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Load dataset
df = pd.read_csv("literacy_data.csv")

# Correlation matrix of the numeric variables
corr = df.corr(numeric_only=True)
sns.heatmap(corr, annot=True, cmap="coolwarm")
plt.title("Correlation Matrix")
plt.show()

# Regression modeling: predict literacy score from coded features
X = df[["age_num", "ict_trained", "internet_access", "gender_code", "school_type_code"]]
y = df["literacy_score"]

model = LinearRegression()
model.fit(X, y)
y_pred = model.predict(X)

print("R² Score:", r2_score(y, y_pred))
print("MSE:", mean_squared_error(y, y_pred))

Appendix C – Additional Tables

Table C1: Literacy Score by Age Group

Age Group Mean Literacy Score

25–34 16.2

35–44 15.9

45–54 14.0

55+ 13.1

Table C2: ICT Training vs. High Literacy

ICT Trained High Literacy (%)

Yes 65%

No 60%
