From the course: Machine Learning in Telecommunication: From Basics to Real-World Cases

Gradient descent: Fine-tuning network models

(bright music) - [Instructor] Now that we have an understanding of the cost function, let's talk about how we can minimize it. The goal is to bridge the gap between the predicted values and the actual output. This is where gradient descent comes into play. It is an optimization algorithm that helps us minimize the cost function, and it does this by updating the model parameters theta zero and theta one iteratively. In simple terms, gradient descent adjusts the slope and intercept of the regression line, moving them in a direction that minimizes the cost function. As we keep iterating, the gap between the predicted and the actual values gets smaller, and eventually it comes down to the point where we reach the global minimum. Now, how does gradient descent work? The core idea behind gradient descent is to keep adjusting the values of theta zero and theta one based on the gradient at that point. So, we keep on changing theta zero and theta one. But along with that, there are certain…
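To make the update concrete, here is a minimal sketch in Python; it is not the course's code. It assumes the cost function is the mean-squared-error of simple linear regression, J(theta0, theta1) = 1/(2m) * sum((theta0 + theta1*x - y)^2), and it assumes a fixed learning rate alpha, a parameter the transcript is cut off before introducing.

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, iterations=1000):
    """Batch gradient descent for simple linear regression.

    Minimizes the mean-squared-error cost
    J(theta0, theta1) = 1/(2m) * sum((theta0 + theta1*x - y)^2)
    by nudging both parameters downhill on each iteration.
    """
    m = len(y)
    theta0, theta1 = 0.0, 0.0  # arbitrary starting point
    for _ in range(iterations):
        predictions = theta0 + theta1 * x  # predicted values h(x)
        errors = predictions - y           # gap: predicted minus actual
        # Gradients of J with respect to each parameter
        grad0 = errors.sum() / m
        grad1 = (errors * x).sum() / m
        # Update both parameters simultaneously, scaled by the learning rate
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Tiny demo on synthetic data generated as y ≈ 2 + 3x
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 2 + 3 * x + rng.normal(0, 0.1, 100)
theta0, theta1 = gradient_descent(x, y, alpha=0.5, iterations=2000)
print(f"theta0 ≈ {theta0:.2f}, theta1 ≈ {theta1:.2f}")
```

Note that both gradients are computed at the current point before either parameter moves. Because the mean-squared-error cost of linear regression is convex, a suitably small alpha lets the iterates settle at the global minimum the instructor mentions.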
