Intel® Gaudi® software suite

The Intel® Gaudi® software suite, like our accelerator hardware, was purpose-designed to optimize deep learning performance, efficiency, and, most importantly for developers, ease of use. With support for popular frameworks and models, Intel Gaudi software aims to make development fast and easy, using the code and tools developers already know and prefer. In essence, Intel Gaudi software and its many tools and support resources are designed to meet deep learning developers where you are, enabling you to develop what you want, how you want.

Intel Gaudi software goal: ease migration

Intel Gaudi software eases migration of existing software to Intel Gaudi AI accelerators, preserving software investments and making it easy to build new models, for both training and deployment of the numerous and growing models defining deep learning, generative AI, and large language models.
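
As a concrete illustration of that migration path, here is a minimal sketch of porting an existing PyTorch training loop to Gaudi. It assumes the Intel Gaudi PyTorch bridge (habana_frameworks.torch) is installed and that the default lazy execution mode is used; MyModel and train_loader are placeholders for your own model and data loader, not part of any Intel Gaudi API.

import torch
import habana_frameworks.torch.core as htcore  # registers the "hpu" device with PyTorch

device = torch.device("hpu")    # previously: torch.device("cuda")
model = MyModel().to(device)    # MyModel: placeholder for your own nn.Module
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = torch.nn.CrossEntropyLoss()

for inputs, labels in train_loader:    # train_loader: placeholder DataLoader
    inputs, labels = inputs.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    loss.backward()
    htcore.mark_step()    # flush the accumulated lazy-mode graph
    optimizer.step()
    htcore.mark_step()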

Ready to use Intel Gaudi software?
“During the research for this blog post I discovered that ‘Gaudi’ means ‘fun’ in German. I cannot think of a better way to describe the experience I have had with DL1 thus far.”
-Chaim Rand, ML Algorithm Developer, Mobileye

Simplified development, the way you want to develop

Intel Gaudi software:
optimized for deep learning training and inference

Intel Gaudi software suite

integrating frameworks, tools, drivers and libraries

We offer extensive support for data scientists, developers,
and IT and systems administrators with:

Developer Site: featuring documentation, customer-available software, how-to content, and a community forum

GitHub: providing reference models, setup and install
instructions, snapshot scripts for analysis and debug, custom kernel
examples, the roadmap, and an “Issues” section tracking to-dos, bugs,
feature requests, and more

“So, if you have existing code, you can migrate it in minutes. And… the first time I tried this, it took me 10 minutes and that included reading the docs. And I launched my accelerated script and it worked out of the box. I have to say this is one of the simplest development experiences I’ve ever seen.”
-Julien Simon, Chief Evangelist, Hugging Face

Intel Gaudi AI accelerator
software ecosystem for deep learning

Habana ecosystem

The Habana ecosystem brings together leading software providers, tools, and code to accelerate development of state-of-the-art deep learning models based on the PyTorch, TensorFlow, PyTorch Lightning, and DeepSpeed frameworks. Our collaboration with software partners such as Hugging Face enables Intel to rapidly expand support for the latest popular models, easing your development process and enabling the explosive array of deep learning applications emerging daily. Intel’s software team leverages open code, models, and capabilities from trusted partners who share our focus on developing models for computer vision, natural language processing, generative AI, and multi-modal applications.

Hugging Face: over 50,000 AI models and 90,000+ GitHub stars.
The Optimum Habana library gives customers using Intel Gaudi
AI accelerators access to the entire Hugging Face model universe.
Check out the list of Habana-optimized Hugging Face models here.
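
As a rough sketch of how the Optimum Habana library is typically used, the example below fine-tunes a Hub model on Gaudi with GaudiTrainer, a drop-in replacement for the Transformers Trainer. It assumes the optimum-habana and transformers packages are installed; the argument names shown (use_habana, use_lazy_mode, gaudi_config_name) follow the Optimum Habana documentation and may differ between versions, and train_dataset is a placeholder for your own tokenized dataset.

from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

# Load any Hugging Face Hub checkpoint as usual; "bert-base-uncased" is just an example.
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# GaudiTrainingArguments mirrors transformers.TrainingArguments plus Gaudi-specific switches.
training_args = GaudiTrainingArguments(
    output_dir="./results",
    use_habana=True,                               # run training on Intel Gaudi (HPU)
    use_lazy_mode=True,                            # use the lazy-mode execution backend
    gaudi_config_name="Habana/bert-base-uncased",  # Gaudi config hosted on the Hub
    per_device_train_batch_size=8,
)

trainer = GaudiTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # placeholder: your tokenized training set
    tokenizer=tokenizer,
)
trainer.train()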

PyTorch Lightning: acceleration of PyTorch deep learning workloads

DeepSpeed: easy-to-use deep learning
optimization software that enables scale and
speed, with a particular focus on large-scale models

Cnvrg.io: MLOps support for customers
using Intel Gaudi processors

Solutions focused on
Business and End-user Outcomes

Our model development focuses on industries and applications where AI is proven to generate the greatest near-term business
revenue and deliver rich consumer experiences.

For more information on how Intel Gaudi AI accelerators are enabling high-value applications using computer vision, NLP,
generative AI, large language, and multi-modal models, see our industry solutions >