
Distributed Training APIs
=========================

SageMaker distributed training libraries offer both data parallel and model parallel training strategies. They combine software and hardware technologies to improve inter-GPU and inter-node communications. They extend SageMaker’s training capabilities with built-in options that require only small code changes to your training scripts.
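For example, a data parallel training job can be launched from the SageMaker Python SDK by passing a ``distribution`` argument to a framework estimator. The snippet below is a minimal sketch: the entry point script, IAM role, instance type, and framework versions are placeholders that you should adapt to your own environment.

.. code-block:: python

    from sagemaker.pytorch import PyTorch

    # Enable the SageMaker data parallel library through the
    # ``distribution`` argument of the framework estimator.
    estimator = PyTorch(
        entry_point="train.py",  # placeholder: your training script
        role="arn:aws:iam::111122223333:role/SageMakerRole",  # placeholder IAM role
        instance_count=2,
        instance_type="ml.p4d.24xlarge",  # example GPU instance type
        framework_version="1.13.1",       # example framework version
        py_version="py39",
        distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
    )

    estimator.fit()

Model parallel training is configured through the same ``distribution`` argument, using the ``modelparallel`` key of the ``smdistributed`` dictionary together with an MPI configuration; see the pages below for details.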

.. toctree::
   :maxdepth: 3

   smd_data_parallel
   smd_model_parallel