
Experiment No. 2
Student Name: MARAM LEELA KRISHNA SUBRAHMANYAM
UID: 21BCS8803
Branch: CSE-AIML Section/Group: 20AML-6B
Semester: 5th Date of Performance: 22/08/2022
Subject Name: Soft Computing Lab Subject Code: 20-CSP-347

1. Aim/Overview of the practical:

a) To design a simple neural network with and without an activation function.
b) To implement basic logic gates (AND, OR, NOT) using the McCulloch-Pitts model.
2. Task to be done:
To implement logic gates using neural networks, and to implement simple neural
networks with and without an activation function.

3. Theory:
A neural network, also known as an artificial neural network (ANN), is the basic
building block of deep learning. It consists of layers of neurons stacked together
to form a larger architecture.

Each neuron computes a weighted sum of its inputs, usually followed by an activation
function. Learning in a neural network means updating these weights, and the update
is driven by how well the network is performing: how close its predictions are to
the actual labels. The value at the output layer is calculated by passing the inputs
through the network layer by layer and computing the value of each neuron. This
process is called forward propagation.

In the McCulloch-Pitts model, a neuron outputs 1 (fires) when the weighted sum of
its inputs reaches a fixed threshold, and 0 otherwise. Choosing suitable weights and
thresholds lets a single such neuron implement basic logic gates.
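As a minimal sketch of this idea (the values below are illustrative, not part of the
lab code), a single neuron's forward pass in Python looks like:

import numpy as np

x = np.array([1.0, 0.0, 1.0])    # example inputs (hypothetical values)
w = np.array([0.5, -0.2, 0.8])   # example weights (hypothetical values)
b = 0.1                          # bias term

z = np.dot(w, x) + b             # weighted sum: the output with no activation
a = 1 / (1 + np.exp(-z))         # sigmoid activation squashes z into (0, 1)
print(z, a)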
4. Code:

a) Without Activation Function:


import pandas as pd
from sklearn.linear_model import LinearRegression

# Load the dataset (assumed to contain 'YearsExperience' and 'Salary' columns)
df = pd.read_csv("Salary.csv")
print(df.head())

# Fit a linear model of salary against years of experience
model = LinearRegression()
model.fit(df[['YearsExperience']], df['Salary'])

# Predict salaries for 3.4 and 1.1 years of experience
print(model.predict([[3.4]]))
print(model.predict([[1.1]]))
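In effect, this linear model behaves like a single neuron with no activation
function: each prediction is just the weighted sum model.coef_ · x + model.intercept_,
computed from the fitted coefficient and intercept.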

With Activation Function (I used the sigmoid function for the implementation):


from numpy import exp, array, random, dot

class NeuralNetwork():
    def __init__(self):
        # Seed the random number generator so results are reproducible
        random.seed(1)
        # One neuron with 3 input connections; weights start in (-1, 1)
        self.synaptic_weights = 2 * random.random((3, 1)) - 1

    def __sigmoid(self, x):
        # Squash the weighted sum into the range (0, 1)
        return 1 / (1 + exp(-x))

    def __sigmoid_derivative(self, x):
        # Gradient of the sigmoid, expressed in terms of its output
        return x * (1 - x)

    def train(self, training_set_inputs, training_set_outputs, number_of_training_iterations):
        for iteration in range(number_of_training_iterations):
            # Forward pass: predict outputs for the training inputs
            output = self.think(training_set_inputs)
            # Error between the desired and predicted outputs
            error = training_set_outputs - output
            # Adjust weights in proportion to the error and the sigmoid gradient
            adjustment = dot(training_set_inputs.T, error * self.__sigmoid_derivative(output))
            self.synaptic_weights += adjustment

    def think(self, inputs):
        # Pass the inputs through the single neuron
        return self.__sigmoid(dot(inputs, self.synaptic_weights))

if __name__ == "__main__":
    neural_network = NeuralNetwork()
    print("Random starting synaptic weights: \n", neural_network.synaptic_weights)

    # Training set: 4 examples, each with 3 inputs and 1 output
    training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
    training_set_outputs = array([[0, 1, 1, 0]]).T

    neural_network.train(training_set_inputs, training_set_outputs, 10000)
    print("New synaptic weights after training: \n", neural_network.synaptic_weights)

    # Test on an input the network has not seen
    print("Considering new situation [1, 0, 0] -> ?: ", neural_network.think(array([1, 0, 0])))

b) Implementation of AND Gate:


import numpy as np
import pandas as pd

def threshold(x):
    # The McCulloch-Pitts neuron fires when the weighted sum reaches 2
    return 1 if x >= 2 else 0

def fire(data, weights, output):
    for x in data:
        weighted_sum = np.inner(x, weights)
        output.append(threshold(weighted_sum))

data = [[0, 0], [0, 1], [1, 0], [1, 1]]
weights = [1, 1]
output = []
fire(data, weights, output)

# Tabulate the truth table
t = pd.DataFrame()
t['X1'] = [0, 0, 1, 1]
t['X2'] = [0, 1, 0, 1]
t['y'] = pd.Series(output)
print(t)
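With weights [1, 1], only the input [1, 1] produces a weighted sum reaching the
threshold of 2, so the printed y column is 0, 0, 0, 1, matching the AND truth table.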

Implementation of OR Gate:

import numpy as np
import pandas as pd

def threshold(x):
    # With unit weights, any threshold in (0, 1] gives OR; 0.5 is used here
    return 1 if x >= 0.5 else 0

def fire(data, weights, output):
    for x in data:
        weighted_sum = np.inner(x, weights)
        output.append(threshold(weighted_sum))

data = [[0, 0], [0, 1], [1, 0], [1, 1]]
weights = [1, 1]
output = []
fire(data, weights, output)

# Tabulate the truth table
t = pd.DataFrame()
t['X1'] = [0, 0, 1, 1]
t['X2'] = [0, 1, 0, 1]
t['y'] = pd.Series(output)
print(t)
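Here every input except [0, 0] produces a weighted sum of at least 1, which clears
the threshold of 0.5, so the printed y column is 0, 1, 1, 1, matching the OR truth table.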

Implementation of NOT Gate:


import numpy as np
import pandas as pd

def threshold(x):
    # The neuron fires when the weighted sum is at least 0
    return 1 if x >= 0 else 0

def fire(data, weights, output):
    for x in data:
        weighted_sum = np.inner(x, weights)
        output.append(threshold(weighted_sum))

# Each input is a one-element vector so np.inner yields a plain scalar
data = [[0], [1]]
weights = [-1]
output = []
fire(data, weights, output)

# Tabulate the truth table
t = pd.DataFrame()
t['X1'] = [0, 1]
t['y'] = pd.Series(output)
print(t)
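The single weight of -1 acts as an inhibitory connection in the McCulloch-Pitts
sense: input 0 gives a weighted sum of 0 (the neuron fires), while input 1 gives
-1 (it does not), which inverts the input.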

5. Outputs:

Without Activation Function

With Activation Function


Output for AND Gate

Output for OR Gate

Output for NOT Gate


Learning outcomes (What I have learnt):

1. Learnt how to implement a neural network with and without an activation function.

2. Learnt how to implement logic gates using neural networks.

3. Learnt about various functions in the NumPy and Pandas libraries.


Evaluation Grid:

Sr. No.    Parameters    Marks Obtained    Maximum Marks
1.
2.
3.
