Fitting a Function to Artificial Data: An Introduction to the Gradient Descent Algorithm

Welcome to this educational Jupyter notebook, where we will explore the Gradient Descent algorithm and its variations in the context of fitting a function to artificial data.

Here, I will cover three key variants: (batch) Gradient Descent, Stochastic Gradient Descent, and Mini-Batch Gradient Descent. I will begin by explicitly coding the closed forms of the derivatives and later demonstrate how TensorFlow can be used to compute the derivatives automatically.
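To illustrate the difference between the two approaches, here is a minimal sketch (not the notebook's exact code) that fits a line y = w*x + b to synthetic data twice: once with hand-derived closed-form gradients of the mean-squared error, and once with TensorFlow's GradientTape computing the same gradients automatically. The data-generating function, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

# Artificial 1-D data (illustrative; the notebook's own data may differ)
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200).astype(np.float32)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=200).astype(np.float32)

# --- Variant 1: closed-form gradients of the MSE for y_hat = w*x + b ---
w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    err = (w * x + b) - y
    grad_w = 2.0 * np.mean(err * x)   # d(MSE)/dw, derived by hand
    grad_b = 2.0 * np.mean(err)       # d(MSE)/db, derived by hand
    w -= lr * grad_w
    b -= lr * grad_b

# --- Variant 2: the same update, with TensorFlow computing the gradients ---
w_tf = tf.Variable(0.0)
b_tf = tf.Variable(0.0)
for _ in range(500):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square((w_tf * x + b_tf) - y))
    dw, db = tape.gradient(loss, [w_tf, b_tf])
    w_tf.assign_sub(lr * dw)
    b_tf.assign_sub(lr * db)

print(w, b, float(w_tf), float(b_tf))  # both pairs should approach (2, 1)
```

Stochastic and Mini-Batch Gradient Descent use the same update rule; the only difference is that each step computes the gradient on a single example or a small random batch instead of the full dataset.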

Throughout this notebook, I will keep the setting simple, with one-dimensional input and one-dimensional output, allowing us to grasp the core principles of these algorithms without unnecessary complexity.

I will begin by fitting a line, followed by second-, third-, and sixth-degree polynomials. Additionally, I will investigate fitting a function built from the first five components of a Fourier series to our dataset. Lastly, we will experiment with fitting mixture functions.
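To make these model families concrete, the following sketch shows one possible way to parametrize them: a polynomial with trainable coefficients and a truncated Fourier series with five cosine/sine pairs. The function names, signatures, and the period parameter are illustrative assumptions, not the notebook's actual definitions; gradient descent would adjust the coefficients of either model to fit the data.

```python
import numpy as np

def polynomial(x, coeffs):
    """Evaluate c_0 + c_1*x + ... + c_n*x**n; gradient descent tunes the c_i."""
    return sum(c * x**i for i, c in enumerate(coeffs))

def fourier_partial_sum(x, a0, a, b, period=2.0):
    """Truncated Fourier series with len(a) cosine/sine components."""
    k = np.arange(1, len(a) + 1)                    # harmonic indices 1..K
    angles = 2.0 * np.pi * np.outer(x, k) / period  # shape (len(x), K)
    return a0 / 2.0 + np.cos(angles) @ a + np.sin(angles) @ b

# Example: a third-degree polynomial and a five-component Fourier sum on a grid
x = np.linspace(-1.0, 1.0, 50)
y_poly = polynomial(x, coeffs=[0.5, -1.0, 0.3, 2.0])
y_fourier = fourier_partial_sum(x, a0=0.0, a=0.1 * np.ones(5), b=np.zeros(5))
```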

Let's get started and see the power of Gradient Descent in action!
