
Viva Voce Questions

1. Define verbal-linguistic intelligence.


Ans. People who possess verbal-linguistic intelligence are skilled with words. They enjoy reading, express
themselves well in writing, and are able to easily recognise and understand the meaning and sounds of
words.
2. What is AI?
Ans. AI is the simulation of human intelligence in machines that are designed to think and act like humans.
3. How do machines become artificially intelligent?
Ans. Machines become intelligent once they are trained with sufficient data to help them achieve their
tasks. They learn and get better over time with more information.
4. Give any two applications of AI around us.
Ans. Two applications of AI are search engines and streaming platforms.
5. Give any two examples of smart machines that cannot be considered as examples of AI.
Ans. Automatic washing machines and smart TVs.
6. List the three domains of AI.
Ans. The three domains of AI are Data Science, Computer Vision, and Natural Language Processing.
7. Explain the difference between AI, ML, and DL.
Ans. AI refers to the overall field of creating intelligent machines. ML is a subfield of AI that involves training
machines to learn from data. DL is a type of ML that uses artificial neural networks for complex data
modelling.
8. State any two applications of NLP.
Ans. Email filters (such as spam filters) and sentiment analysis are common applications of NLP.
9. Define AI ethics.
Ans. AI ethics refers to a set of moral principles and techniques that form the guidelines for the development
and deployment of AI to ensure ethical and responsible use of the technology.
10. Define AI bias.
Ans. AI bias occurs when an algorithm produces results that are biased because it is trained on biased data.
11. Describe problem scoping.
Ans. Problem scoping refers to the process of choosing a problem that we wish to address using AI. It involves
identifying the problem's scope, objectives, and constraints in order to establish a clear understanding of
what needs to be addressed.
12. Define the 4Ws problem canvas.
Ans. The 4Ws problem canvas is a structured framework that helps with problem scoping. It consists of four
crucial parameters that we need to know while solving a problem: Who? What? Where? Why?

13. Define training data and testing data.


Ans. Training data is the initial dataset used to train an AI module. Testing data is used to evaluate the
performance of the AI module.
14. What is the role of data exploration in the AI project cycle?
Ans. The role of data exploration in the AI project cycle is to analyse the data and identify any patterns,
relationships, or anomalies in it.
15. What is modelling in the AI project cycle?
Ans. Modelling in the AI project cycle involves building a model that can be used to solve the problem defined
in the planning step. This may involve using machine learning algorithms or other AI techniques.
16. What is the role of evaluation in the AI project cycle?
Ans. The role of evaluation in the AI project cycle is to assess the performance of the AI model and determine
its accuracy and effectiveness in solving the problem it was designed to address.
17. What is data acquisition?
Ans. Data acquisition refers to the process of collecting and preparing data that will be used to train and
evaluate AI models.
18. What is data visualisation?
Ans. Data visualisation refers to the process of creating graphical representations of data to help understand and
communicate insights.
19. Describe rule-based model.
Ans. A rule-based model follows the rules defined by the developer and produces the required output. It
typically uses labelled data.
20. Define learning-based approach.
Ans. In a learning-based approach, the machine learns on its own. It can handle both labelled and unlabelled data.
21. Define supervised learning.
Ans. Supervised learning is a task-driven process where models learn to make predictions or decisions based on
input data and corresponding output labels.
22. Mention the types of supervised learning models.
Ans. Classification and regression are the types of supervised learning models.
23. Define clustering.
Ans. Clustering is an unsupervised machine learning approach where the model groups the dataset into different
clusters or groups based on algorithms or rules that the machine generates on its own.
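For illustration, a minimal clustering sketch using scikit-learn's KMeans (assuming scikit-learn is
installed; the points are made up):

    import numpy as np
    from sklearn.cluster import KMeans

    # Six 2-D points forming two loose groups; no labels are provided
    X = np.array([[1, 2], [1, 4], [1, 0],
                  [10, 2], [10, 4], [10, 0]])
    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    print(kmeans.labels_)  # cluster assignment the model worked out on its own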
24. What are neural networks?
Ans. Neural networks are computational networks that are designed to mimic the structure of the human brain
and are inspired by how the human brain interprets and processes information.
25. What is the purpose of the NumPy library in Python?
Ans. The NumPy library in Python is used for numerical computing and provides support for multi-dimensional
arrays and matrices. It is widely used in scientific computing, data analysis, and machine learning, and
provides functions for performing mathematical operations on arrays and matrices.
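For illustration, a minimal NumPy sketch (assuming NumPy is installed; the values are arbitrary):

    import numpy as np

    # A 2x3 array: element-wise maths, aggregation, and matrix transpose
    a = np.array([[1, 2, 3], [4, 5, 6]])
    print(a * 2)      # element-wise multiplication
    print(a.mean())   # mean of all elements
    print(a.T)        # transpose, shape (3, 2)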
26. What is the purpose of the Pandas library in Python?
Ans. The Pandas library in Python is used for data analysis and provides functions for manipulating, cleaning,
transforming, and visualising data. It provides data structures, such as Series and DataFrames, which are
similar to arrays and tables in Excel, and offers functions for working with missing data, handling
outliers, grouping data, and more.
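A minimal Pandas sketch covering DataFrames, missing data, and grouping (the sample data is made up):

    import pandas as pd

    # A small DataFrame with one missing score
    df = pd.DataFrame({"name": ["Asha", "Ravi", "Asha"],
                       "score": [85, None, 72]})
    df["score"] = df["score"].fillna(0)        # handle missing data
    print(df.groupby("name")["score"].sum())   # grouping data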
27. What is the Python library Matplotlib used for?
Ans. Matplotlib is a comprehensive library for creating static, animated, and interactive visualisations in
Python. It is often used for creating plots, charts, and graphs.
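A minimal Matplotlib sketch of a line chart (the data values are made up):

    import matplotlib.pyplot as plt

    hours = [1, 2, 3, 4, 5]
    marks = [35, 50, 62, 71, 80]
    plt.plot(hours, marks)           # line plot of marks against hours
    plt.xlabel("Hours studied")
    plt.ylabel("Marks scored")
    plt.title("Study time vs marks")
    plt.show()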
28. Describe scatter plots.
Ans. Scatter plots are used to plot discontinuous data, i.e., individual data points that do not have any continuity in flow.
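A scatter-plot version of the same idea, with plt.scatter drawing each pair as an individual point (sample data):

    import matplotlib.pyplot as plt

    heights = [150, 155, 160, 165, 170]
    weights = [48, 52, 55, 61, 66]
    plt.scatter(heights, weights)    # one marker per (height, weight) pair
    plt.xlabel("Height (cm)")
    plt.ylabel("Weight (kg)")
    plt.show()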
29. What is meant by resolution?
Ans. The number of pixels in an image is called its resolution. It determines the level of detail or clarity of
images displayed on screens or captured by cameras.
30. Define CNN.
Ans. CNN is a deep learning algorithm commonly used in image recognition and processing. It analyses an
image, extracts the most relevant features, and reduces its size to make it manageable while still preserving
its important features.
31. What is a kernel?
Ans. A kernel is also known as a convolution matrix or mask. In image processing, convolution is performed by
sliding a kernel, usually a matrix of size 3x3 or 5x5, sequentially over different portions of the image.
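A minimal NumPy sketch of sliding a 3x3 kernel over a toy 5x5 image (no padding; the values are illustrative):

    import numpy as np

    image = np.arange(25, dtype=float).reshape(5, 5)  # toy 5x5 "image"
    kernel = np.ones((3, 3)) / 9.0                    # 3x3 averaging (blur) kernel

    # At each position, multiply the 3x3 patch by the kernel and sum the result
    out = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            out[i, j] = np.sum(image[i:i+3, j:j+3] * kernel)
    print(out)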
32. Name the four layers of CNN.
Ans. The different layers of CNN are listed below:
o Convolution Layer
o Rectified Linear Unit (ReLU)
o Pooling Layer
o Fully Connected Layer
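A minimal Keras sketch showing the four layers in this order (assuming TensorFlow is installed; the input
shape and layer sizes are arbitrary):

    from tensorflow.keras import layers, models

    model = models.Sequential([
        layers.Input((28, 28, 1)),                     # e.g. grayscale images
        layers.Conv2D(16, (3, 3), activation="relu"),  # convolution + ReLU
        layers.MaxPooling2D((2, 2)),                   # pooling layer
        layers.Flatten(),                              # flatten before the dense layer
        layers.Dense(10, activation="softmax"),        # fully connected layer
    ])
    model.summary()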
33. What is automatic summarization in NLP?
Ans. Automatic summarization is the process of creating a shorter version of a text while retaining its main
ideas and key points.
34. What is sentiment analysis in NLP?
Ans. Sentiment analysis is the process of determining the emotional tone of a text. It is used to determine
whether the tone of the text is positive, negative, or neutral.
35. What is text classification in NLP?
Ans. Text classification is the process of categorising text into different classes based on its contents. It is used
for applications such as spam filtering, sentiment analysis, and topic identification.
36. What are chatbots in NLP?
Ans. Chatbots are computer programs designed to mimic conversations with human users, especially over the
internet.
37. What is text normalisation in NLP?
Ans. Text normalisation is the process of converting text data into a standard form that can be easily processed
and understood by computers.
38. Define sentence and word tokenization in NLP.
Ans. Sentence and word tokenization is the process of breaking the text data into sentences and words for
further processing.
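A minimal NLTK sketch (assuming nltk is installed and its 'punkt' tokenizer data has been downloaded):

    from nltk.tokenize import sent_tokenize, word_tokenize

    text = "AI is everywhere. NLP helps machines understand text."
    print(sent_tokenize(text))  # two sentences
    print(word_tokenize(text))  # individual words and punctuation tokens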
39. What are text lemmatization and stemming in NLP?
Ans. Lemmatization and stemming are processes used to reduce words to their base form to reduce the
dimensions of the text data.
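A minimal NLTK sketch contrasting the two (assuming the 'wordnet' data is available for the lemmatizer):

    from nltk.stem import PorterStemmer, WordNetLemmatizer

    print(PorterStemmer().stem("studies"))           # 'studi'  (crude rule-based cut)
    print(WordNetLemmatizer().lemmatize("studies"))  # 'study'  (dictionary base form)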
40. What are stop words in NLP?
Ans. Stop words are common words in a language that do not carry much meaning and are usually removed
from the text data.
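A short stop-word-removal sketch with NLTK (assuming the 'stopwords' corpus has been downloaded):

    from nltk.corpus import stopwords

    stop = set(stopwords.words("english"))
    words = ["this", "movie", "is", "really", "good"]
    print([w for w in words if w not in stop])  # ['movie', 'really', 'good']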
41. What is bag of words in NLP?
Ans. Bag of words is a technique used to represent text data as numerical features for further processing.
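A minimal bag-of-words sketch using scikit-learn's CountVectorizer (assuming scikit-learn is installed;
the corpus is made up):

    from sklearn.feature_extraction.text import CountVectorizer

    docs = ["the cat sat", "the cat sat on the mat"]
    vec = CountVectorizer()
    counts = vec.fit_transform(docs)
    print(vec.get_feature_names_out())  # vocabulary learned from the corpus
    print(counts.toarray())             # word counts per document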
42. What are term frequency and inverse document frequency?
Ans. Term frequency and inverse document frequency are measures used to weigh the importance of words in
text data.
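A matching TF-IDF sketch with scikit-learn, using the same toy corpus as above:

    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["the cat sat", "the cat sat on the mat"]
    tfidf = TfidfVectorizer()
    weights = tfidf.fit_transform(docs)
    print(weights.toarray())  # rarer words such as 'on' and 'mat' get higher weight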
43. What are the applications of TFIDF in NLP?
Ans. TFIDF is used in NLP to determine the relevance of words in text data and is used in various
applications, such as text classification, text summarization, and document retrieval.
44. What is NLTK?
Ans. NLTK stands for Natural Language Toolkit and it is a popular Python library used for NLP tasks.
45. What are the main pre-processing steps in NLP?
Ans. The main pre-processing steps in NLP are text normalisation, sentence and word tokenization, text
lemmatization and stemming, stop words removal, and bag of words creation.
46. What is the purpose of text normalisation in NLP?
Ans. The purpose of text normalisation is to convert the text data into a standard form that can be easily
processed and understood by computers.
47. Why is TFIDF used in NLP?
Ans. TFIDF is used in NLP to determine the relevance of words in text data and weigh their importance.
48. What is a confusion matrix?
Ans. A confusion matrix is a table that is used to evaluate the performance of a classification algorithm. It helps
to determine the number of true positive, true negative, false positive, and false negative predictions made
by the algorithm.
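A minimal sketch with scikit-learn (the labels are made up for illustration):

    from sklearn.metrics import confusion_matrix

    y_true = [1, 0, 1, 1, 0, 0, 1, 0]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
    # Rows are actual classes, columns are predicted classes
    print(confusion_matrix(y_true, y_pred))
    # [[3 1]   -> TN=3, FP=1
    #  [1 3]]  -> FN=1, TP=3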
49. What is a true positive?
Ans. A true positive is a prediction made by the AI model that is accurate and the actual outcome is positive.
50. What is a false positive?
Ans. A false positive is a prediction made by the AI model that is incorrect and the actual outcome is negative.
51. What is a true negative?
Ans. A true negative is a prediction made by the AI model that is accurate and the actual outcome is negative.
52. What is a false negative?
Ans. A false negative is a prediction made by the AI model that is incorrect and the actual outcome is positive.
53. What is accuracy?
Ans. Accuracy is a measure of the percentage of correct predictions made by the AI model.
54. What is precision?
Ans. Precision is a measure of the percentage of positive predictions made by the AI model that are actually
correct.
55. What is recall?
Ans. Recall is a measure of the percentage of positive cases that the AI model correctly identifies.
56. What is F1 score?
Ans. F1 score is a measure that combines precision and recall to give a single score that represents the
performance of the AI model.
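The four metrics in questions 53-56 can all be computed from the confusion-matrix counts; a quick sketch
with illustrative counts:

    TP, TN, FP, FN = 3, 3, 1, 1  # counts taken from a confusion matrix

    accuracy  = (TP + TN) / (TP + TN + FP + FN)
    precision = TP / (TP + FP)
    recall    = TP / (TP + FN)
    f1 = 2 * precision * recall / (precision + recall)
    print(accuracy, precision, recall, f1)  # 0.75 0.75 0.75 0.75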
57. How does a confusion matrix help in evaluating the performance of an AI model?
Ans. A confusion matrix helps in evaluating the performance of an AI model by determining the number of true
positive, true negative, false positive, and false negative predictions made by the algorithm.
58. Why is F1 score important in AI model evaluation?
Ans. F1 score is important in AI model evaluation because it combines precision and recall to give a single
score that represents the performance of the AI model.
59. How does AI model evaluation help in improving the performance of the AI model?
Ans. AI model evaluation helps in improving the performance of the AI model by determining the strengths and
weaknesses of the model and providing insights into where improvements can be made.
60. How does Precision differ from Recall?
Ans. Precision refers to the ability of a model to not predict positive outcomes for negative examples, while
recall refers to the ability of a model to identify all positive examples in a dataset.
61. Can a model have high accuracy but low recall?
Ans. Yes, a model can have high accuracy but low recall if it makes many false negative predictions, leading to
a high proportion of actual positive examples being missed. This typically happens when positive cases are
rare, so the many correct negative predictions keep overall accuracy high.
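A quick numeric sketch of this situation, using a made-up imbalanced dataset where the model predicts every
case as negative:

    # 95 actual negatives, 5 actual positives; all 100 predicted negative
    TP, FN, TN, FP = 0, 5, 95, 0
    accuracy = (TP + TN) / 100   # 0.95 -> looks excellent
    recall = TP / (TP + FN)      # 0.0  -> every positive case is missed
    print(accuracy, recall)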
