Logistic Regression
• Logistic regression predicts a categorical dependent variable, so the outcome must be a categorical or discrete value: Yes or No, 0 or 1, True or False, etc. Rather than outputting the exact values 0 and 1, it gives probabilistic values that lie between 0 and 1.
• Logistic Regression is similar to Linear Regression except in how the two are used: Linear Regression is used for solving regression problems, whereas Logistic Regression is used for solving classification problems.
• In logistic regression, instead of fitting a straight regression line, we fit an "S"-shaped logistic function, whose output is bounded by the two extreme values 0 and 1.
• The sigmoid function is a mathematical function used to map the predicted values to probabilities: σ(x) = 1 / (1 + e^(−x)).
• It maps any real value into a value within the range 0 to 1.
• The output of logistic regression must lie between 0 and 1 and cannot go beyond these limits, so it forms an "S"-shaped curve. The S-shaped curve is called the sigmoid function or the logistic function.
• In logistic regression, we use a threshold value that decides between the classes 0 and 1: probabilities above the threshold are mapped to 1, and probabilities below the threshold are mapped to 0.
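The sigmoid mapping and thresholding described above can be sketched as follows (the 0.5 cutoff is an illustrative choice, not fixed by the algorithm):

```python
import math

def sigmoid(x):
    # Maps any real value into the range (0, 1): the "S"-shaped curve.
    return 1.0 / (1.0 + math.exp(-x))

def classify(x, threshold=0.5):
    # Probabilities above the threshold map to class 1, below it to class 0.
    return 1 if sigmoid(x) >= threshold else 0

print(sigmoid(0))      # 0.5: the midpoint of the S-curve
print(classify(2.0))   # large positive input -> class 1
print(classify(-2.0))  # large negative input -> class 0
```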
• Types of Logistic Regression:
• On the basis of the categories, Logistic Regression can be classified into three types:
• Binomial: In binomial logistic regression, there can be only two possible types of the dependent variable, such as 0 or 1, Pass or Fail, etc.
• Multinomial: In multinomial logistic regression, there can be 3 or more possible unordered types of the dependent variable, such as "cat", "dog", or "sheep".
• Ordinal: In ordinal logistic regression, there can be 3 or more possible ordered types of the dependent variable, such as "low", "medium", or "high".
Example of Logistic Regression
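A minimal binomial example can be sketched from scratch with gradient descent on the log-loss. The toy dataset (hours studied → pass/fail) is purely illustrative and not from the source:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy 1-D dataset (hypothetical): hours studied -> pass (1) / fail (0).
xs = [0.5, 1.0, 1.5, 2.0, 3.0, 3.5, 4.0, 5.0]
ys = [0,   0,   0,   0,   1,   1,   1,   1]

# Fit weight w and bias b by gradient descent on the log-loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        gw += (p - y) * x   # gradient of log-loss w.r.t. w
        gb += (p - y)       # gradient of log-loss w.r.t. b
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

# Predicted probability of passing after 4 hours of study:
print(round(sigmoid(w * 4.0 + b), 3))
```

With a 0.5 threshold, 4 hours is classified as "pass" and 1 hour as "fail", matching the S-shaped decision behavior described above.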
Naïve Bayes Classifier
• Naïve Bayes algorithm is a supervised
learning algorithm, which is based
on Bayes theorem and used for solving
classification problems.
• It is mainly used in text classification, which involves high-dimensional training datasets.
• The Naïve Bayes classifier is one of the simplest and most effective classification algorithms; it helps build fast machine learning models that can make quick predictions.
• It is a probabilistic classifier, which means it predicts on the basis of the probability of an object.
• Some popular applications of the Naïve Bayes algorithm are spam filtering, sentiment analysis, and classifying articles.
Bayes' Theorem:
• Bayes' theorem is also known as Bayes' Rule
or Bayes' law, which is used to determine the
probability of a hypothesis with prior
knowledge. It depends on the conditional
probability.
• The formula for Bayes' theorem is given as:
• P(A|B) = P(B|A) · P(A) / P(B)
• Where,
• P(A|B) is Posterior probability: Probability of
hypothesis A on the observed event B.
• P(B|A) is Likelihood probability: Probability of the evidence given that hypothesis A is true.
• P(A) is Prior Probability: Probability of
hypothesis before observing the evidence.
• P(B) is Marginal Probability: Probability of
Evidence.
Problem: Given that the weather is sunny, should the player play or not?
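One way to answer this is with Bayes' theorem over frequency counts. The source gives no data table, so the counts below are purely hypothetical, chosen only to illustrate the computation:

```python
# Hypothetical counts over 14 observed days (not from the source).
total_days     = 14
play_days      = 10   # days the player played
sunny_days     = 4    # days that were sunny
sunny_and_play = 3    # sunny days on which the player played

p_play             = play_days / total_days          # P(Play)
p_sunny            = sunny_days / total_days         # P(Sunny)
p_sunny_given_play = sunny_and_play / play_days      # P(Sunny|Play)

# Bayes' theorem: P(Play|Sunny) = P(Sunny|Play) * P(Play) / P(Sunny)
p_play_given_sunny = p_sunny_given_play * p_play / p_sunny
print(round(p_play_given_sunny, 3))  # 0.75: under these counts, the player should play
```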
Example of Naïve Bayes (picnic day)
• You are planning a picnic today, but
the morning is cloudy
• Oh no! 50% of all rainy days start off
cloudy!
• But cloudy mornings are common
(about 40% of days start cloudy)
• And this is usually a dry month (only
3 of 30 days tend to be rainy, or
10%)
• What is the chance of rain during
the day?
• We will use Rain to mean rain during
the day, and Cloud to mean cloudy
morning.
• The chance of Rain given Cloud is
written P(Rain|Cloud)
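Plugging the numbers above into Bayes' theorem gives the answer directly:

```python
p_cloud_given_rain = 0.5   # 50% of rainy days start off cloudy
p_rain             = 0.1   # 3 of 30 days tend to be rainy
p_cloud            = 0.4   # about 40% of days start cloudy

# P(Rain|Cloud) = P(Cloud|Rain) * P(Rain) / P(Cloud)
p_rain_given_cloud = p_cloud_given_rain * p_rain / p_cloud
print(round(p_rain_given_cloud, 3))  # 0.125 -> about a 12.5% chance of rain
```

So despite the cloudy morning, rain is fairly unlikely: the low prior P(Rain) outweighs the evidence of the cloud.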
Example of Naïve Bayes