Optimizing Large Language Model Inference on Gaudi2 with Hugging Face Optimum-Habana
We have optimized inference for additional Hugging Face Large Language Models on Gaudi2 using the Optimum Habana library (a minimal generation sketch follows this entry).
DeepSpeed, Hugging Face, Inference
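To give a flavor of what running one of these models on Gaudi hardware looks like, here is a minimal text-generation sketch. It assumes a machine where the Habana SynapseAI PyTorch bridge (habana_frameworks) is installed so that the "hpu" device is available; the model name and prompt are placeholder choices, and Optimum Habana's own generation utilities layer further Gaudi-specific optimizations on top of this basic flow.

```python
# Minimal text-generation sketch on a Gaudi/Gaudi2 device (assumes the Habana
# PyTorch bridge is installed; model and prompt are illustrative placeholders).
import torch
import habana_frameworks.torch.core  # noqa: F401  - registers the "hpu" device with PyTorch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = torch.device("hpu")

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)
model.eval()

inputs = tokenizer("Habana Gaudi2 is", return_tensors="pt").to(device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```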
Faster Training and Inference: Habana Gaudi®-2 vs Nvidia A100 80GB
In this article, you will learn how to use Habana® Gaudi®2 to accelerate model training and inference, and train bigger models with 🤗 Optimum Habana (a training sketch follows this entry).
developer, Gaudi2, Hugging Face
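To illustrate the training side, here is a minimal sketch of the GaudiTrainer API that Optimum Habana provides. It assumes a Gaudi2 machine with the optimum-habana package installed; the BERT model, the IMDb dataset slice, and the hyperparameters are placeholder choices for illustration, not the article's benchmark setup.

```python
# Minimal 🤗 Optimum Habana training sketch (assumes a Gaudi2 machine with
# optimum-habana installed; model, dataset, and hyperparameters are placeholders).
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.habana import GaudiConfig, GaudiTrainer, GaudiTrainingArguments

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tiny dataset slice, tokenized for a quick illustrative run.
dataset = load_dataset("imdb", split="train[:1%]")
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)

# Gaudi-specific knobs (mixed precision, fused optimizers, ...) come from a
# GaudiConfig hosted under the Habana organization on the Hugging Face Hub.
gaudi_config = GaudiConfig.from_pretrained("Habana/bert-base-uncased")

args = GaudiTrainingArguments(
    output_dir="./out",
    use_habana=True,      # run on HPUs
    use_lazy_mode=True,    # Gaudi lazy execution mode
    per_device_train_batch_size=8,
    num_train_epochs=1,
)

trainer = GaudiTrainer(
    model=model,
    gaudi_config=gaudi_config,
    args=args,
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```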
Fine tuning GPT2 with Hugging Face and Habana Gaudi
In this tutorial, we will demonstrate fine-tuning a GPT2 model on Habana Gaudi AI processors using the Hugging Face optimum-habana library with DeepSpeed (a compact sketch follows this entry).
DeepSpeed, developer, Fine Tuning, Gaudi, GPT, GPT2, Hugging Face
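As a rough outline of what the tutorial covers, here is a compact GPT2 fine-tuning sketch. It assumes a Gaudi machine with optimum-habana and Habana's DeepSpeed fork installed; the WikiText slice, the hyperparameters, and the ds_config.json path are illustrative placeholders, and multi-card DeepSpeed runs are normally launched through the library's gaudi_spawn.py helper rather than plain python.

```python
# Compact GPT2 causal-LM fine-tuning sketch with optimum-habana and DeepSpeed
# (dataset, hyperparameters, and the DeepSpeed config path are placeholders).
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, DataCollatorForLanguageModeling
from optimum.habana import GaudiConfig, GaudiTrainer, GaudiTrainingArguments

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)
dataset = dataset.filter(lambda ex: len(ex["input_ids"]) > 0)  # drop empty lines

args = GaudiTrainingArguments(
    output_dir="./gpt2-finetuned",
    use_habana=True,
    use_lazy_mode=True,
    per_device_train_batch_size=4,
    num_train_epochs=1,
    deepspeed="ds_config.json",  # hypothetical path to a DeepSpeed (ZeRO) config file
)

trainer = GaudiTrainer(
    model=model,
    gaudi_config=GaudiConfig.from_pretrained("Habana/gpt2"),
    args=args,
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```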