Glossary

A list of common deep learning terms and their definitions.

Activation Function

Function applied to a neuron's output that introduces non-linearity into a neural network, enabling it to learn complex patterns; common choices include ReLU, sigmoid, and tanh.
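
For illustration, a minimal NumPy sketch: without the non-linearity, the two stacked linear layers below would collapse into a single linear map.

```python
import numpy as np

def relu(x):
    # ReLU activation: zero out negatives, pass positives through
    return np.maximum(0, x)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))    # a batch of 4 inputs with 3 features
W1 = rng.normal(size=(3, 5))   # first linear layer
W2 = rng.normal(size=(5, 2))   # second linear layer

hidden = relu(x @ W1)          # non-linearity between the layers
output = hidden @ W2
print(output.shape)            # (4, 2)
```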

Backpropagation

Algorithm that computes the gradients of the loss function with respect to each parameter by applying the chain rule backwards through the network.
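
A minimal sketch of the idea in plain NumPy, for a one-hidden-layer network with mean squared error (shapes and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 3))            # inputs
y = rng.normal(size=(8, 1))            # targets
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))

# Forward pass
h = np.tanh(x @ W1)                    # hidden activations
pred = h @ W2
loss = np.mean((pred - y) ** 2)

# Backward pass: chain rule, applied layer by layer from the loss
dpred = 2 * (pred - y) / len(y)        # dL/dpred
dW2 = h.T @ dpred                      # dL/dW2
dh = dpred @ W2.T                      # dL/dh
dW1 = x.T @ (dh * (1 - h ** 2))        # tanh'(z) = 1 - tanh(z)^2
```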

Batch Normalisation

Normalises the inputs of each layer over the current mini-batch to improve the stability and speed of training.
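
The normalisation step itself, sketched in NumPy (gamma and beta are the layer's learnable scale and shift; in practice, running statistics replace the batch statistics at inference time):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalise each feature over the batch to zero mean and unit
    # variance, then apply the learnable scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=(32, 10))
out = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(axis=0).round(6))   # ~0 per feature
print(out.std(axis=0).round(3))    # ~1 per feature
```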

Cost Function

Another term for loss function; it measures prediction error.

Dropout

Regularisation technique that randomly deactivates a subset of neurons during each training step to prevent overfitting.
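
A sketch of the common "inverted dropout" formulation: each unit is dropped with probability p during training, and the survivors are rescaled so the expected activation is unchanged; at inference time dropout is simply disabled.

```python
import numpy as np

def dropout(x, p=0.5, training=True):
    if not training:
        return x                       # no dropout at inference time
    mask = np.random.default_rng().random(x.shape) >= p
    return x * mask / (1 - p)          # rescale to preserve the expectation
```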

Epoch

A single pass through the entire training dataset.
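
In a typical training loop this looks as follows (a skeleton; the actual training step is elided):

```python
data = list(range(100))
batches = [data[i:i + 10] for i in range(0, len(data), 10)]

num_epochs = 3
for epoch in range(num_epochs):
    # One epoch: the model sees every training example exactly once
    for batch in batches:
        pass                           # the training step would go here
    print(f"epoch {epoch + 1}: {len(batches)} batches processed")
```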

Gradient Descent

Optimisation algorithm that minimises the loss function by iteratively updating model parameters in the direction opposite to the gradient.
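
A minimal sketch on the one-dimensional loss L(w) = (w - 3)^2, whose gradient is 2(w - 3); the iterates converge to the minimiser w = 3.

```python
w = 0.0                     # initial parameter value
lr = 0.1                    # learning rate (step size)
for step in range(50):
    grad = 2 * (w - 3)      # dL/dw for L(w) = (w - 3)^2
    w -= lr * grad          # step against the gradient
print(w)                    # ~3.0, the minimiser
```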

Learning Rate

Hyperparameter controlling the update step size during gradient descent.
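
Its effect on the same toy loss as above: too small and progress is slow, too large and the iterates diverge (the exact thresholds are specific to this loss).

```python
def run(lr, steps=20, w=10.0):
    for _ in range(steps):
        w -= lr * 2 * (w - 3)   # gradient step on L(w) = (w - 3)^2
    return w

print(run(0.01))   # still far from 3: step size too small
print(run(0.1))    # close to 3: reasonable step size
print(run(1.1))    # enormous magnitude: step size too large, diverges
```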

Loss Function

Function measuring how far predictions deviate from the actual target values; training aims to minimise it.
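
Two common examples sketched in NumPy: mean squared error for regression and binary cross-entropy for classification (the epsilon keeps the logarithm finite):

```python
import numpy as np

def mse(pred, target):
    # Mean squared error: average squared difference
    return np.mean((pred - target) ** 2)

def binary_cross_entropy(pred, target, eps=1e-12):
    pred = np.clip(pred, eps, 1 - eps)   # guard log(0)
    return -np.mean(target * np.log(pred)
                    + (1 - target) * np.log(1 - pred))

print(mse(np.array([1.0, 2.0]), np.array([1.5, 2.0])))             # 0.125
print(binary_cross_entropy(np.array([0.9, 0.2]), np.array([1, 0])))
```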

Optimiser

Algorithm used to adjust the weights and biases of the network to minimise the loss function; common choices include SGD and Adam.
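
Plain gradient descent is the simplest optimiser; variants such as momentum, RMSProp, and Adam differ in how they turn gradients into updates. A sketch of SGD with momentum (illustrative, not any particular library's API):

```python
class MomentumSGD:
    """Illustrative optimiser: accumulate a velocity, step along it."""

    def __init__(self, lr=0.01, momentum=0.9):
        self.lr, self.momentum = lr, momentum
        self.velocity = {}

    def step(self, params, grads):
        # params and grads are dicts mapping parameter names to values
        for name, grad in grads.items():
            v = self.momentum * self.velocity.get(name, 0.0) - self.lr * grad
            self.velocity[name] = v
            params[name] = params[name] + v
        return params
```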

Overfitting

When a model learns the training data too well, including noise, and performs poorly on new data.

Regularisation

Techniques used to prevent overfitting, for example by constraining the model during training (see Dropout) or by adding a penalty to the loss function (see Weight Decay).
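
For example, L2 regularisation adds the squared magnitude of the weights to the loss, discouraging large weights (the coefficient lam controls the strength and is illustrative here):

```python
import numpy as np

def l2_regularised_loss(pred, target, weights, lam=0.01):
    data_loss = np.mean((pred - target) ** 2)              # fit the data...
    penalty = lam * sum(np.sum(w ** 2) for w in weights)   # ...keep weights small
    return data_loss + penalty

w = [np.array([3.0, -4.0])]
print(l2_regularised_loss(np.array([1.0]), np.array([0.0]), w))  # 1.0 + 0.25
```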

Vanishing Gradient Problem

Issue where gradients shrink towards zero as they are propagated backwards through many layers, inhibiting learning in the early layers of deep networks.
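
A small demonstration of why this happens with saturating activations: the sigmoid's derivative s(1 - s) is at most 0.25, so the chain-rule product across many layers shrinks geometrically (the depth of 30 is arbitrary).

```python
import numpy as np

rng = np.random.default_rng(0)

grad = 1.0
for _ in range(30):                        # 30 layers deep
    s = 1 / (1 + np.exp(-rng.normal()))    # sigmoid of a random pre-activation
    grad *= s * (1 - s)                    # sigmoid'(z) = s(1 - s) <= 0.25
print(grad)                                # vanishingly small
```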

Weight Initialisation

Method for setting the initial values of the weights before training; a poor choice can lead to vanishing or exploding gradients.
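
A sketch of two standard schemes, both scaling random values by the layer's fan-in: Xavier/Glorot initialisation (suited to tanh and sigmoid) and He initialisation (suited to ReLU).

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier: variance 2 / (fan_in + fan_out)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He: variance 2 / fan_in, suited to ReLU layers
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = he_init(256, 128)
print(W.std())   # close to sqrt(2 / 256) ≈ 0.088
```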

Weight Decay

Regularisation technique that adds a penalty to the loss function, typically proportional to the squared magnitude (L2 norm) of the weights.
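
In the update rule, this penalty is equivalent to shrinking each weight slightly on every step, hence the name (a sketch; lam is the decay coefficient):

```python
def sgd_with_weight_decay(w, grad, lr=0.01, lam=1e-4):
    # Adding lam/2 * ||w||^2 to the loss contributes lam * w
    # to the gradient, which shrinks w a little on every step.
    return w - lr * (grad + lam * w)
```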

Training Set

Subset of the dataset used to train the model.

Validation Set

Subset of the dataset used to evaluate the model during training, for example when tuning hyperparameters or deciding when to stop.

Test Set

Subset of the dataset used to evaluate the final model performance.
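
A common way to produce the three splits defined above (the 80/10/10 ratio is one conventional choice, not a rule):

```python
import numpy as np

rng = np.random.default_rng(0)
indices = rng.permutation(1000)    # shuffle 1000 example indices

n_train, n_val = 800, 100          # 80% / 10% / 10%
train_idx = indices[:n_train]
val_idx = indices[n_train:n_train + n_val]
test_idx = indices[n_train + n_val:]

print(len(train_idx), len(val_idx), len(test_idx))   # 800 100 100
```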
