Basics of Machine learning series: Loss Functions

Hello! Today I will be writing a post to give a brief overview of loss functions and what purpose they serve in the world of machine learning and optimization.

Loss functions are something I briefly covered in my post on gradient descent, where they play a central role.

Why do we need loss functions?

Before describing the different types of loss functions, it is important to understand exactly why we need them. Simply put, a loss function maps the performance of your machine learning algorithm on a given input (or set of inputs) to a single number. The closer our loss is to zero, the closer our algorithm is to performing optimally. The reason we need loss functions is also quite simple: they give us a direction in which we can move to improve our machine learning model (I use the word direction on purpose, as it ties in nicely with gradient descent).
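To make the "direction" idea concrete, here is a toy sketch (not from the post itself; all values are made up for illustration): for a one-parameter model prediction = w * x, the gradient of the squared loss tells us which way to nudge w.

```python
x, target = 2.0, 10.0   # one training example; the ideal parameter is w = 5.0
w = 1.0                 # current (bad) parameter value
lr = 0.05               # learning rate

for _ in range(100):
    prediction = w * x
    grad = 2 * (prediction - target) * x  # d/dw of (prediction - target)^2
    w -= lr * grad                        # move against the gradient

print(round(w, 3))  # w converges toward 5.0
```

The loss itself never tells us the right answer directly; its gradient just points downhill, and repeated small steps in that direction improve the model.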

Mean Squared Loss

For different types of machine learning problems, loss functions work a little differently, but the main idea behind them stays the same. One of the most popular and most intuitive loss functions is mean squared loss, also known as L2 loss. Here, you take the output of the machine learning algorithm, subtract it from the desired outcome, and square the difference. Summing these squared differences over all the inputs and taking the average yields your loss. The formula for mean squared loss is as follows:

MSE = (1/n) Σᵢ (Eᵢ − Aᵢ)², where Eᵢ is the expected output for input i, Aᵢ is the actual output, and n is the number of inputs.
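A minimal sketch of this formula in Python, using NumPy (the function name and example values are my own, purely for illustration):

```python
import numpy as np

def mean_squared_loss(expected, actual):
    """Average of the squared differences between expected and actual outputs."""
    expected = np.asarray(expected, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return np.mean((expected - actual) ** 2)

# Two inputs, each off by 2: ((1-3)^2 + (2-4)^2) / 2 = 4.0
print(mean_squared_loss([1.0, 2.0], [3.0, 4.0]))  # 4.0
```

A perfect model (actual equal to expected everywhere) yields a loss of exactly zero, matching the intuition above.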

The mean squared loss function has its own pros and cons. Because each error is squared before being averaged, large errors are amplified, so a few outliers can dominate your overall loss. However, if your dataset is not expected to contain many outliers, mean squared loss works quite well.
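A quick demonstration of that sensitivity (the data here is invented for illustration): a single wildly wrong prediction can swamp the contribution of every well-predicted point.

```python
import numpy as np

def mean_squared_loss(expected, actual):
    return np.mean((np.asarray(expected, float) - np.asarray(actual, float)) ** 2)

expected = [1.0, 2.0, 3.0, 4.0]
clean    = [1.1, 2.1, 2.9, 4.1]   # small errors everywhere
outlier  = [1.1, 2.1, 2.9, 14.0]  # one prediction off by 10

print(mean_squared_loss(expected, clean))    # small loss (~0.01)
print(mean_squared_loss(expected, outlier))  # dominated by the single large error
```

The outlier version's loss comes almost entirely from one data point, which is why robustness-sensitive applications sometimes prefer alternatives such as mean absolute (L1) loss.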

Mean squared loss works best for regression-type problems, where the output is a number or some other continuous value.

This has been just one example of a loss function. I will be sure to write follow-up posts covering other types of loss functions as well!
