Data Skeptic

[MINI] Backpropagation

15 min · April 7, 2017

Backpropagation is a common algorithm for training a neural network. It works by computing the gradient of the overall error with respect to each weight, and using stochastic gradient descent to iteratively fine-tune the weights of the network. In this episode, we compare this concept to finding a location on a map, marble maze games, and golf.
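The update rule described above can be sketched in a few lines of Python. This is a minimal illustration (not from the episode), assuming a single sigmoid neuron trained on one hypothetical example with squared error; the starting weights and learning rate are made up for the demo. The backward pass applies the chain rule to get the gradient of the loss with respect to each parameter, and gradient descent steps against that gradient.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy setup: one neuron, one input, squared error on a single example.
x, target = 1.5, 0.0
w, b = 0.8, 0.2          # arbitrary starting parameters
lr = 0.5                 # learning rate for the gradient-descent step

losses = []
for step in range(50):
    # Forward pass: compute the prediction and the loss.
    z = w * x + b
    y = sigmoid(z)
    losses.append(0.5 * (y - target) ** 2)

    # Backward pass: chain rule gives the gradient of the loss
    # with respect to each parameter.
    dloss_dy = y - target
    dy_dz = y * (1.0 - y)          # derivative of the sigmoid
    dloss_dz = dloss_dy * dy_dz
    dloss_dw = dloss_dz * x
    dloss_db = dloss_dz

    # Gradient-descent update: step opposite the gradient.
    w -= lr * dloss_dw
    b -= lr * dloss_db

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Repeating this for many weights and many randomly sampled examples is, in essence, stochastic gradient descent; backpropagation is the efficient way of computing all those per-weight gradients at once in a multi-layer network.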

Data Skeptic with Kyle Polich is available on several platforms. The information on this page comes from public podcast feeds.