14 docs tagged with "Deep Learning"

Activation Functions & Derivatives

Activation functions are what make neural networks more than just glorified linear regression. They add non-linearity, which gives neural networks their superpower: the ability to learn complex patterns and behaviors. Their derivatives are critical for training the network effectively through backpropagation.

Weight Initialization in Neural Networks

You’ve learned how to do forward propagation, backpropagation, and gradient descent. But now comes a crucial design decision: How do we initialize the weights?
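To make the design decision concrete, here is a hedged sketch of two standard initialization schemes (He and Xavier/Glorot); the specific fan-in/fan-out sizes are illustrative, not taken from the doc:

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 256, 128  # hypothetical layer sizes

# He initialization (often paired with ReLU): variance 2 / fan_in
w_he = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Xavier/Glorot initialization (often paired with tanh/sigmoid):
# variance 2 / (fan_in + fan_out)
w_xavier = rng.normal(0.0, np.sqrt(2.0 / (fan_in + fan_out)),
                      size=(fan_in, fan_out))
```

Both schemes scale the random draw by the layer's size so activations and gradients keep a roughly constant variance from layer to layer, avoiding the vanishing or exploding signals that naive initialization can cause.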

weights

In machine learning, weights are crucial parameters that are learned during the training of a model, such as a neural network. They help the model make predictions by assigning importance to the inputs it receives. Let's break down the concept further:
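A tiny illustrative example (the feature values and weights below are made up): a single neuron combines its inputs as a weighted sum, so a larger weight means that input matters more to the prediction.

```python
import numpy as np

inputs = np.array([0.5, 0.8, 0.2])   # hypothetical normalized feature values
weights = np.array([0.9, 0.1, 0.4])  # learned importance of each input
bias = 0.1

# The neuron's pre-activation output is the weighted sum plus a bias.
output = np.dot(inputs, weights) + bias
```

Training adjusts `weights` (and `bias`) via gradient descent so this weighted sum produces better predictions over time.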