From the course: Machine Learning in Telecommunication: From Basics to Real-World Cases
Gradient descent: Fine-tuning network models
(bright music) - [Instructor] Now that we have an understanding of the cost function, let's talk about how we can minimize it. The goal is to bridge the gap between the predicted values and the actual output. This is where gradient descent comes into play. It is an optimization algorithm that helps us minimize the cost function, and it does this by updating the model parameters theta zero and theta one iteratively. In simple terms, gradient descent adjusts the slope and intercept of the regression line, moving them in the direction that minimizes the cost function. As we keep iterating, the gap between the predicted and the actual values gets smaller, and eventually it comes down to a level where we reach the global minimum. Now, how does gradient descent work? The core idea behind gradient descent is to keep adjusting the values of theta zero and theta one based on the gradient at that point. So, we keep on changing theta zero and theta one. But along with that, there are certain…
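The iterative update described above can be sketched in plain Python. This is a minimal illustration (not the course's own code), assuming a mean-squared-error cost for simple linear regression; `alpha` is the learning rate, and `theta0`/`theta1` correspond to the intercept and slope mentioned in the transcript.

```python
def gradient_descent(x, y, alpha=0.05, iterations=5000):
    """Fit y ~ theta0 + theta1 * x by iteratively stepping
    against the gradient of the mean-squared-error cost."""
    m = len(x)
    theta0, theta1 = 0.0, 0.0  # start from an arbitrary initial guess
    for _ in range(iterations):
        # Predictions with the current parameters
        preds = [theta0 + theta1 * xi for xi in x]
        # Gradients of J = (1/2m) * sum((pred - y)^2)
        grad0 = sum(p - yi for p, yi in zip(preds, y)) / m
        grad1 = sum((p - yi) * xi for p, yi, xi in zip(preds, y, x)) / m
        # Move each parameter in the direction that reduces the cost
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1
```

For example, fitting data generated from the line y = 2x + 1 should drive `theta1` toward 2 and `theta0` toward 1 as the gap between predictions and actual values shrinks over the iterations.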
Contents

- Linear regression basics for telecom analytics (2m 42s)
- Using hypothesis testing to predict network performance (5m 38s)
- Cost function explained: Measuring telecom model accuracy (5m 3s)
- Gradient descent: Fine-tuning network models (5m 30s)
- Overfitting vs. underfitting: Optimizing for telecom predictions (3m 36s)