
Synthax

Introduction

This notebook examines optimisation approaches used in deep learning: it builds a model from scratch using JAX and demonstrates the symbolic form of the underlying expressions using SymPy.
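The repository's exact model is defined in the notebook itself; as a minimal sketch of the SymPy side, the snippet below builds a hypothetical two-parameter loss expression (purely illustrative, not the project's actual model) and derives its gradient and Hessian symbolically:

```python
import sympy as sp

# Hypothetical scalar loss in two parameters (illustrative only,
# not the repository's actual model).
w, b = sp.symbols('w b')
loss = (2 * w + b - 1) ** 2 + sp.Rational(1, 2) * w ** 2

# Symbolic gradient: first derivative with respect to each parameter.
grad = [sp.diff(loss, v) for v in (w, b)]

# Symbolic Hessian: the matrix of second derivatives used by
# Newton-style updates.
hess = sp.hessian(loss, (w, b))

print(grad)
print(hess)
```

Expressions like these can then be inspected, simplified, or lambdified into numerical functions, which is what makes SymPy useful for demonstrating what an optimiser is actually computing.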

Use Case

The main purpose of this project is to understand the behaviour of two optimiser variants: gradient descent and Newton's second-order update, which uses the Hessian matrix. To compare them, a minimal viable model is built from scratch in JAX, and various processes involved in deep learning are demonstrated along the way.
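The contrast between the two updates can be sketched as follows. This is a minimal illustration on an assumed ill-conditioned quadratic loss (not the repository's model): gradient descent rescales the gradient by a single learning rate, while the Newton update rescales it by the inverse Hessian, which solves a quadratic in one step.

```python
import jax
import jax.numpy as jnp

# Illustrative ill-conditioned quadratic loss (assumption, not the
# project's actual model): curvature differs sharply per direction.
def loss(p):
    return 0.5 * (10.0 * p[0] ** 2 + p[1] ** 2)

grad_fn = jax.grad(loss)
hess_fn = jax.hessian(loss)

def gd_step(p, lr=0.05):
    # First-order update: step against the gradient, scaled by lr.
    return p - lr * grad_fn(p)

def newton_step(p):
    # Second-order update: rescale the gradient by the inverse Hessian.
    return p - jnp.linalg.solve(hess_fn(p), grad_fn(p))

p_gd = p_newton = jnp.array([1.0, 1.0])
for _ in range(10):
    p_gd = gd_step(p_gd)
p_newton = newton_step(p_newton)  # one Newton step minimises a quadratic

print(loss(p_gd), loss(p_newton))
```

On this surface, ten gradient-descent steps still leave residual loss along the flat direction, whereas the single Newton step lands at the minimum; this asymmetry is the behaviour the notebook sets out to study.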

Key Highlights

  • Uses JAX for efficient numerical optimisation.
  • Uses SymPy's symbolic mathematics for clear formulation and demonstration.
  • Tests and compares plain gradient descent against Newton's second-order update.
  • Visualises the optimisation process to illustrate its complexity.
  • Contrasts the less common Newton second-order (Hessian-based) update with the more widely used gradient descent algorithm.
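The visualisation highlight can be sketched as a contour plot with the optimiser's trajectory overlaid. The surface and hyperparameters below are assumptions for illustration, not the notebook's actual figures:

```python
import jax
import jax.numpy as jnp
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted runs
import matplotlib.pyplot as plt

# Hypothetical two-parameter loss surface, chosen only so the
# contours are non-trivial.
def loss(p):
    return jnp.sin(p[0]) + p[0] ** 2 + p[1] ** 2

grad_fn = jax.grad(loss)

# Record the gradient-descent trajectory so it can be drawn on top
# of the contours.
p = jnp.array([2.0, 2.0])
trajectory = [p]
for _ in range(30):
    p = p - 0.1 * grad_fn(p)
    trajectory.append(p)
traj = jnp.stack(trajectory)

# Contour plot of the surface with the optimisation path overlaid.
xs = jnp.linspace(-3.0, 3.0, 100)
X, Y = jnp.meshgrid(xs, xs)
Z = jnp.sin(X) + X ** 2 + Y ** 2
plt.contour(X, Y, Z, levels=20)
plt.plot(traj[:, 0], traj[:, 1], "o-", markersize=3)
plt.xlabel("parameter 1")
plt.ylabel("parameter 2")
plt.savefig("descent_path.png")
```

Plotting the path rather than just the final loss makes the geometry of the update visible, e.g. how gradient descent zig-zags where curvature is uneven.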

About

A JAX + SymPy based project that explores optimization in ML using the Hessian matrix and gradients.
