Habana Blog

News & Discussion
Tagged: Large Language Models

Training Llama and Bloom 13 Billion Parameter LLMs with 3D Parallelism on Habana® Gaudi2®

One of the main challenges in training Large Language Models (LLMs) is that they are often too large to fit on a single node, and even when they do fit, training may be too slow. To address this, training can be parallelized across multiple Gaudi accelerators (HPUs), as sketched below.
3D-Parallelism, DeepSpeed, GenAI, Large Language Models
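
Since the teaser only names the technique, here is a small illustrative sketch of the arithmetic behind 3D parallelism: the tensor-parallel (TP) and pipeline-parallel (PP) degrees are chosen explicitly (in Megatron-DeepSpeed, via the --tensor-model-parallel-size and --pipeline-model-parallel-size flags), and the data-parallel (DP) degree is whatever factor of the device count remains. The cluster size and degrees below are hypothetical, not a recommendation.

```python
# Sketch: how the three parallelism degrees factor a Gaudi2 cluster.
# TP and PP are chosen explicitly; DP is whatever remains.

def data_parallel_size(world_size: int, tp: int, pp: int) -> int:
    """DP degree implied by TP x PP on a cluster of `world_size` HPUs."""
    assert world_size % (tp * pp) == 0, "TP * PP must divide the world size"
    return world_size // (tp * pp)

if __name__ == "__main__":
    world_size = 64   # e.g. 8 Gaudi2 nodes x 8 HPUs each (hypothetical)
    tp, pp = 8, 4     # hypothetical tensor/pipeline degrees
    dp = data_parallel_size(world_size, tp, pp)
    print(f"TP={tp} x PP={pp} x DP={dp} = {tp * pp * dp} HPUs")
    # In Megatron-DeepSpeed, TP and PP map to the launcher flags
    # --tensor-model-parallel-size and --pipeline-model-parallel-size;
    # the data-parallel degree is inferred from the world size.
```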

Porting a model to Megatron-DeepSpeed with Habana Gaudi

If you want to train a large model using Megatron-DeepSpeed but the model you want is not included in the implementation, you can port it into the Megatron-DeepSpeed package. Assuming your model is transformer-based, you can add your implementation easily by basing it on the existing code, as sketched below.
3D-Parallelism, DeepSpeed, GenAI, Large Language Models
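
To make the hook concrete, here is a schematic sketch of the `model_provider` entry point that Megatron-DeepSpeed pretraining scripts expose, modeled on `pretrain_gpt.py`. The module paths and `GPTModel` arguments vary between releases, so treat them as assumptions to check against your checkout; a port replaces the returned module with its own transformer built from the existing parallel layers.

```python
# Schematic sketch of the hook a ported model plugs into, modeled on
# Megatron-DeepSpeed's pretrain_gpt.py. Imports and signatures differ
# between releases; verify them against your version of the package.
from megatron import get_args
from megatron.model import GPTModel


def model_provider(pre_process=True, post_process=True):
    """Build the network. For a new transformer-based architecture,
    return your own module here instead of GPTModel, reusing the
    existing parallel layers (attention, MLP, embeddings)."""
    args = get_args()  # hyperparameters parsed from the launch flags
    return GPTModel(
        num_tokentypes=0,
        parallel_output=True,
        pre_process=pre_process,    # first pipeline stage owns embeddings
        post_process=post_process,  # last stage owns the LM head / loss
    )
# The provider is then handed to the package's pretrain() driver
# together with a dataset provider and a forward_step function.
```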

Memory-Efficient Training on Habana® Gaudi® with DeepSpeed

One of the key challenges in Large Language Model (LLM) training is reducing the memory required for training without sacrificing compute/communication efficiency or model accuracy; the sketch below shows one such optimization in DeepSpeed.
DeepSpeed, developer, Gaudi, Large Language Models
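
As a concrete starting point, below is a minimal sketch of wiring a model into DeepSpeed with ZeRO stage 1 (optimizer-state sharding), one of the memory optimizations this kind of training relies on. The model, batch size, and learning rate are placeholders; the config keys are standard DeepSpeed options, and on Gaudi the script would normally be started through the DeepSpeed launcher across multiple HPUs.

```python
# Minimal sketch: enabling ZeRO stage 1 via a DeepSpeed config dict.
# Placeholder model and hyperparameters, for illustration only.
import torch
import deepspeed

model = torch.nn.Linear(4096, 4096)  # stand-in for a real LLM

ds_config = {
    "train_batch_size": 32,
    "bf16": {"enabled": True},  # Gaudi trains natively in BF16
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {
        "stage": 1,  # shard optimizer states across data-parallel workers
    },
}

# Returns an engine that handles the sharded optimizer state internally.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

Higher ZeRO stages extend the same idea: stage 2 also shards gradients, and stage 3 shards the parameters themselves, trading extra communication for further memory savings.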