Overview
A framework in which two neural networks compete: a generator creates synthetic data while a discriminator learns to distinguish it from real data, driving both networks to improve.
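The adversarial setup can be sketched in miniature: a minimal, illustrative toy in pure Python, not from this glossary entry. Here the "generator" is a linear map G(z) = a·z + b of Gaussian noise, the "discriminator" is a single logistic unit D(x) = sigmoid(w·x + c), and both are trained with alternating SGD steps (the non-saturating generator loss -log D(G(z))). All names (`train_gan`, `a`, `b`, `w`, `c`) and the toy target distribution N(4, 0.5) are assumptions for illustration.

```python
import math
import random

def sigmoid(t):
    t = max(-30.0, min(30.0, t))  # clamp for numerical stability
    return 1.0 / (1.0 + math.exp(-t))

def train_gan(steps=5000, lr=0.02, seed=0):
    """Toy 1-D GAN: generator G(z) = a*z + b vs. discriminator sigmoid(w*x + c)."""
    rng = random.Random(seed)
    a, b = 1.0, 0.0   # generator parameters
    w, c = 0.0, 0.0   # discriminator parameters
    for _ in range(steps):
        # --- discriminator step: push D(real) -> 1 and D(fake) -> 0 ---
        x_real = rng.gauss(4.0, 0.5)           # "real" data drawn from N(4, 0.5)
        x_fake = a * rng.gauss(0.0, 1.0) + b   # synthetic sample from the generator
        d_real = sigmoid(w * x_real + c)
        d_fake = sigmoid(w * x_fake + c)
        # gradients of -log D(real) - log(1 - D(fake)) w.r.t. w and c
        w -= lr * (-(1.0 - d_real) * x_real + d_fake * x_fake)
        c -= lr * (-(1.0 - d_real) + d_fake)
        # --- generator step: non-saturating loss -log D(G(z)) ---
        z = rng.gauss(0.0, 1.0)
        x_fake = a * z + b
        d_fake = sigmoid(w * x_fake + c)
        grad_x = -(1.0 - d_fake) * w   # d(loss)/d(x_fake)
        a -= lr * grad_x * z           # chain rule: dx_fake/da = z
        b -= lr * grad_x               # dx_fake/db = 1
    return a, b
```

Since E[z] = 0, the offset `b` returned by `train_gan()` approximates the mean of the generated distribution; as training proceeds it is pushed toward the real data's mean, which is the competitive dynamic the definition above describes.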
More in Deep Learning
Skip Connection
Architectures: A neural network shortcut that allows the output of one layer to bypass intermediate layers and be added to a later layer's output.
Convolutional Neural Network
Architectures: A deep learning architecture designed for processing structured grid data like images, using convolutional filters to detect features.
Variational Autoencoder
Architectures: A generative model that learns a probabilistic latent space representation, enabling generation of new data samples.
Positional Encoding
Training & Optimisation: A technique that injects information about the position of tokens in a sequence into transformer architectures.
Residual Connection
Training & Optimisation: A skip connection that adds a layer's input directly to its output, enabling gradient flow through deep networks and allowing training of architectures with hundreds of layers.
Knowledge Distillation
Architectures: A model compression technique where a smaller student model learns to mimic the behaviour of a larger teacher model.
Batch Normalisation
Architectures: A technique that normalises layer inputs during training to stabilise and accelerate deep neural network learning.
Gradient Checkpointing
Architectures: A memory optimisation that trades computation for memory by recomputing intermediate activations during the backward pass instead of storing them all during the forward pass.