Deep Learning Architectures

Data Parallelism

Overview

Data parallelism is a distributed training strategy that replicates the full model on each of several devices, splits every training batch into shards processed simultaneously, and synchronises gradients across devices after each step so all replicas stay identical.
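The step described above can be simulated on one machine: each "device" holds a copy of the parameters, computes a gradient on its own shard of the batch, and the gradients are averaged (an all-reduce) before a single shared update. This is a minimal NumPy sketch with illustrative function names (`local_gradient`, `data_parallel_step` are not from any library), not a real multi-device implementation.

```python
import numpy as np

def local_gradient(w, X, y):
    # Gradient of mean squared error for a linear model on one device's shard.
    pred = X @ w
    return 2 * X.T @ (pred - y) / len(y)

def data_parallel_step(w, X, y, n_devices, lr=0.1):
    # Split the batch into equal shards, one per simulated device.
    X_shards = np.array_split(X, n_devices)
    y_shards = np.array_split(y, n_devices)
    # Each replica computes a gradient on its shard using identical weights.
    grads = [local_gradient(w.copy(), Xs, ys) for Xs, ys in zip(X_shards, y_shards)]
    # All-reduce: average the gradients so every replica applies the same update.
    avg_grad = np.mean(grads, axis=0)
    return w - lr * avg_grad
```

When the shards are equal in size, the averaged gradient equals the full-batch gradient, so a data-parallel step matches a single-device step on the whole batch; real systems (e.g. PyTorch `DistributedDataParallel`) overlap this gradient synchronisation with the backward pass for efficiency.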
