Red Hat AI learning hub

Explore learning materials and tools designed to help you use Red Hat® OpenShift® AI and Red Hat® Enterprise Linux® AI, organized by the tasks you need to accomplish.

This content is curated by Red Hat experts, but it may not be tested on every supported configuration.

Featured resources

Red Hat AI Foundations

No-cost training covering AI essentials and effective use of Red Hat AI.

Training

Try Developer Sandbox for Red Hat OpenShift AI

Start a no-cost trial, which includes instant access to your own minimal developer cluster, hosted and managed by Red Hat.

Trial

Try Red Hat Enterprise Linux AI

Get a trial version of RHEL AI, our foundation model platform to develop, train, test, and run Granite family LLMs for enterprise applications.

Trial

Get to know Red Hat AI

Red Hat OpenShift AI Technical Overview (AI067)

An introduction to operationalizing AI/ML with Red Hat OpenShift AI.

Training course

Red Hat Enterprise Linux AI Technical Overview (AI096)

Explore how RHEL AI addresses common challenges in adopting AI, such as cost, data transparency, and deployment complexity.

Training course

InstructLab on RHEL AI

Explore InstructLab on RHEL AI, including resources for developers.

Product page

OpenShift AI tutorial - Fraud detection example

A step-by-step guide for using OpenShift AI to train an example model in JupyterLab, deploy the model, and refine the model by using automated pipelines.

Documentation
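
For orientation, the sketch below shows the kind of workbench training step a tutorial like this walks through: fit a small classifier on synthetic transaction-style data and save the result for later serving. It is illustrative only; the tutorial's actual dataset, framework, and model differ.

    # Illustrative only: train a tiny fraud-style classifier in a workbench
    # notebook and save it for a later deployment step. The real tutorial
    # uses its own dataset and model; this shows the shape of the workflow.
    import joblib
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for transaction features and (rare) fraud labels.
    X, y = make_classification(n_samples=5000, n_features=10,
                               weights=[0.97, 0.03], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))

    # Persist the trained model so a pipeline or serving step can pick it up.
    joblib.dump(model, "fraud-model.joblib")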

RHEL AI: Try out LLMs the easy way

Learn how RHEL AI provides a security-focused, low-cost environment for experimenting with LLMs.

Learning path

Foundation model platform for generative AI

Get started quickly with gen AI and deliver results with a trusted, security-focused Red Hat Enterprise Linux platform.

Documentation

Red Hat OpenShift AI

Learn about the artificial intelligence platform that runs on top of Red Hat OpenShift in this developer-focused product page.

Product page

AI solutions from Red Hat

Explore Red Hat’s AI solutions.

Solutions page

Customize models

What is InstructLab and how do you use it? 9 quick videos to help you get started

Nine short videos that introduce InstructLab and walk you through getting started.

Blog

How to train an LLM using InstructLab

Join Red Hat’s Senior Director of Technical Marketing, Grant Shipley, as he demonstrates how to train an LLM with InstructLab.

Video

RAG vs. Fine Tuning

Join Cedric Clyburn as he explores the differences and use cases of Retrieval Augmented Generation (RAG) and fine-tuning in enhancing LLMs.

Video
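
As a rough illustration of the retrieval half of RAG, the sketch below indexes a few documents with TF-IDF (standing in for an embedding model), retrieves the closest match for a question, and builds an augmented prompt. A real setup would use an embedding model, a vector store, and an LLM call for the final answer; those pieces are assumptions here.

    # Minimal RAG-style retrieval sketch. TF-IDF stands in for an embedding
    # model; printing the augmented prompt stands in for the LLM call.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    documents = [
        "OpenShift AI provides workbenches for developing models in JupyterLab.",
        "InstructLab lets you add knowledge and skills to Granite models.",
        "Data science pipelines automate training and deployment steps.",
    ]
    question = "How do I automate model training steps?"

    vectorizer = TfidfVectorizer().fit(documents)
    doc_vectors = vectorizer.transform(documents)
    query_vector = vectorizer.transform([question])

    # Retrieve the document most similar to the question.
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    best_doc = documents[scores.argmax()]

    # Augment the prompt with the retrieved context before sending it to an LLM.
    prompt = f"Answer using only this context:\n{best_doc}\n\nQuestion: {question}"
    print(prompt)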

Serve models

Serving models

Test trained models and integrate them into intelligent applications in Red Hat OpenShift AI Self-Managed.

Documentation
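
Once a model is deployed on a model server, clients typically call its REST inference endpoint. The sketch below assumes a KServe-style endpoint that speaks the Open Inference Protocol (v2 REST); the URL, model name, input tensor name, and shape are placeholders.

    # Hypothetical client call to a deployed model's v2 REST inference endpoint.
    # The endpoint URL, model name, and tensor details are placeholders.
    import requests

    INFER_URL = "https://fraud-model.example.com/v2/models/fraud/infer"  # placeholder

    payload = {
        "inputs": [
            {
                "name": "dense_input",   # placeholder input tensor name
                "shape": [1, 10],
                "datatype": "FP32",
                "data": [0.1] * 10,      # flattened tensor contents
            }
        ]
    }

    response = requests.post(INFER_URL, json=payload, timeout=30)
    response.raise_for_status()
    print(response.json()["outputs"])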

Install Red Hat Device Edge on NVIDIA Jetson Orin and IGX Orin

Install Red Hat Device Edge on an NVIDIA® Jetson Orin™/NVIDIA IGX Orin™ Developer Kit and explore new features brought by rpm-ostree.

Learning path

Fine-tuning and serving an open source foundation model with Red Hat OpenShift AI

Learn how to fine-tune and deploy a HuggingFace GPT-2 model consisting of 137M parameters on a WikiText dataset using Red Hat OpenShift AI.

Blog
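
The blog's workflow can be approximated with the Hugging Face Transformers Trainer; the sketch below fine-tunes the "gpt2" checkpoint on WikiText-2 for one epoch. Hyperparameters and paths are illustrative, and the blog's exact setup (dataset variant, distributed training on OpenShift AI) may differ.

    # Illustrative fine-tuning of GPT-2 on WikiText-2 with Hugging Face Transformers.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    dataset = load_dataset("wikitext", "wikitext-2-raw-v1")

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
    tokenized = tokenized.filter(lambda ex: len(ex["input_ids"]) > 0)  # drop empty lines

    args = TrainingArguments(
        output_dir="gpt2-wikitext",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        logging_steps=100,
    )
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=tokenized["train"],
        data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
    )
    trainer.train()
    trainer.save_model("gpt2-wikitext")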

Build and scale applications

Introduction to Python Programming and to Red Hat OpenShift AI (AI252)

An introduction to Python programming, and creating and managing AI/ML workloads with Red Hat OpenShift AI.

Training course

Creating Machine Learning Models with Python and Red Hat OpenShift AI (AI253)

An introduction to machine learning concepts with Python and how to use Red Hat OpenShift AI to train ML models.

Training course

Developing and deploying AI/ML apps on RHOAI (AI267)

An introduction to developing and deploying AI/ML applications on Red Hat OpenShift AI.

Training course

Working in your data science IDE

Prepare your data science integrated development environment (IDE) for developing machine learning models.

Documentation

How to get started with large language models and Node.js

Learn how to access an LLM using Node.js and LangChain.js, and explore LangChain.js APIs that simplify common requirements.

Learning path

Getting started with Podman AI Lab

Take a tour of Podman AI Lab and walk through each of its capabilities. Discover how to integrate generative AI into new and existing applications.

Article

From Podman AI Lab to OpenShift AI

Learn how to go from a chatbot recipe in the Podman AI Lab extension to a RAG chatbot deployed on OpenShift and OpenShift AI.

Learning path

Configure hardware accelerators

Hardware accelerators

Instructions for installing and configuring GPU Operators on OpenShift.

Documentation

Enabling accelerators

Learn how to enable a variety of hardware accelerators for use in OpenShift AI.

Documentation

Working with accelerators

Use accelerators to optimize the performance of your end-to-end data science workflows in OpenShift AI.

Documentation
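
A quick way to confirm that an accelerator is actually visible from a workbench notebook is to ask the ML framework directly; the PyTorch check below is one common approach, assuming a CUDA-enabled workbench image with torch installed.

    # Check whether a GPU is visible to PyTorch inside a workbench.
    # Assumes a CUDA-enabled workbench image with torch installed.
    import torch

    if torch.cuda.is_available():
        print(f"GPUs visible: {torch.cuda.device_count()}")
        print(f"Device 0: {torch.cuda.get_device_name(0)}")
    else:
        print("No CUDA device visible; check the accelerator profile on the workbench.")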

How to make the most of your GPUs (part 1 - time-slicing)

Explore different strategies supported by the NVIDIA GPU operator to oversubscribe available GPU resources.

Blog

How to make the most of your GPUs (part 2 - Multi-instance GPU)

Learn about dividing GPUs into isolated and static instances for concurrent usage by different applications.

Blog

How to Enable Hardware Accelerators on OpenShift (part 1)

Read about using hardware accelerators with Red Hat OpenShift.

Blog

How to enable Hardware Accelerators on OpenShift, SRO Building Blocks (part 2)

Read part 2 of this blog on using hardware accelerators with OpenShift.

Blog

Automate workflows

Automating end-to-end machine learning and model training tasks

Learn how to solve key data science challenges with OpenShift AI pipelines.

Interactive experience

Working with data science pipelines

Enhance your data science projects on OpenShift AI by building portable ML workflows with data science pipelines.

Documentation
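
Data science pipelines in OpenShift AI are built on Kubeflow Pipelines, so workflows are usually defined with the kfp SDK and compiled to YAML before being imported into a pipeline server. The sketch below is a minimal two-step example; the component contents and names are placeholders.

    # Minimal Kubeflow Pipelines (kfp v2) definition compiled to YAML for
    # import into a data science pipeline server. Steps are placeholders.
    from kfp import dsl, compiler

    @dsl.component(base_image="python:3.11")
    def preprocess(rows: int) -> int:
        # Placeholder preprocessing step.
        return rows * 2

    @dsl.component(base_image="python:3.11")
    def train(rows: int) -> str:
        # Placeholder training step.
        return f"trained on {rows} rows"

    @dsl.pipeline(name="example-training-pipeline")
    def training_pipeline(rows: int = 100):
        prep = preprocess(rows=rows)
        train(rows=prep.output)

    if __name__ == "__main__":
        compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")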

Integrate a private AI coding assistant into your CDE using Ollama, Continue, and OpenShift Dev Spaces

Streamline your cloud development workflow by deploying and integrating a private AI assistant that addresses privacy, security, and intellectual property concerns.

Learning path
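
At the core of this setup is a locally hosted model served by Ollama, which exposes a simple HTTP API that tools like Continue talk to. The sketch below calls that API directly from Python; the host and model name are assumptions for illustration.

    # Query a locally running Ollama server directly. Assumes Ollama is
    # listening on its default port and that the named model has been pulled.
    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
    payload = {
        "model": "granite-code",   # example model name; substitute one you have pulled
        "prompt": "Write a Python function that reverses a string.",
        "stream": False,
    }

    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    print(response.json()["response"])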

Manage and configure storage

Working with data in an S3-compatible object store

Learn how to work with data stored in an S3-compatible object store from your workbench.

Documentation
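
From a workbench, the usual pattern is to read the connection details that a data connection injects as environment variables and hand them to boto3. The variable names below follow the common OpenShift AI data connection convention, but treat them as an assumption and verify them on your cluster.

    # List and upload objects in an S3-compatible bucket from a workbench,
    # using credentials injected by a data connection. The environment
    # variable names are an assumption based on the common convention.
    import os
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url=os.environ["AWS_S3_ENDPOINT"],
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    )
    bucket = os.environ["AWS_S3_BUCKET"]

    # Upload a local artifact, then list what is in the bucket.
    s3.upload_file("fraud-model.joblib", bucket, "models/fraud-model.joblib")
    for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
        print(obj["Key"])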

Managing OpenShift AI

Manage OpenShift AI users and groups, dashboard interface and applications, deployment resources, accelerators, distributed workloads, and data backup.

Documentation

Configuring cluster storage

Add cluster storage to a project and connect that storage to the project's workbench.

Documentation

Managing resources

Manage custom workbench images, cluster PVC size, user groups, and Jupyter notebook servers.

Documentation

Troubleshoot Red Hat AI

Troubleshoot with Red Hat support

Open a support case with us or connect with an expert.

Support

Red Hat Enterprise Linux AI

Browse the release notes and full product documentation for RHEL AI.

Documentation

Red Hat OpenShift AI

Browse the release notes and full product documentation for Red Hat OpenShift AI Self-Managed and Cloud Service.

Documentation

docs.instructlab.ai

Browse InstructLab resources and documentation.

InstructLab documentation

Common RHEL AI issues

Learn how to collect system information and resolve common errors in RHEL AI. (Sign-in required)

Knowledge base article

Troubleshooting common installation problems

Resolve problems when installing Red Hat OpenShift AI in a disconnected environment.

Documentation