Model Garden
Featured Models
Dolly 2.0 Inference
Dolly 2.0 – The World’s First, Truly Open Instruction-Tuned LLM on IPUs – Inference
OpenAssistant Pythia 12B Inference
OpenAssistant Pythia 12B is an open-source and commercially usable chat-based assistant model trained on the OpenAssistant Conversations Dataset (OASST1).
Whisper Inference
Speech Transcription on IPUs using OpenAI's Whisper - Inference
GPT-J 6B Fine-tuning
Fine-tuning GPT-J 6B on IPUs with PyTorch for text entailment.
Flan-T5-Large/XL Inference
Flan-T5-Large/XL inference on IPUs with Hugging Face.
Stable Diffusion Text-to-Image Inference
The popular latent diffusion model for generative AI with support for text-to-image on IPUs using Hugging Face Optimum.
YOLOv4 Inference
YOLOv4 - You Only Look Once - a convolutional neural network model that performs object detection tasks on IPUs using PyTorch.
BERT-Large Fine-tuning
Hugging Face Optimum implementation for fine-tuning a BERT-Large transformer model.
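Many of the entries in this garden are Hugging Face Optimum implementations. As a rough illustration of what that flow looks like, here is a minimal sketch, assuming the optimum-graphcore package and an IPU config such as 91ƵAPP/bert-large-ipu on the Hugging Face Hub; the exact model, dataset, and preprocessing in each Model Garden example differ.

```python
# Minimal sketch of a Hugging Face Optimum fine-tuning flow on IPUs.
# Assumes the `optimum-graphcore` package and the `91ƵAPP/bert-large-ipu`
# IPU config on the Hugging Face Hub; dataset choice is illustrative.
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-large-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased")
ipu_config = IPUConfig.from_pretrained("91ƵAPP/bert-large-ipu")

args = IPUTrainingArguments(output_dir="out", per_device_train_batch_size=2)

dataset = load_dataset("glue", "sst2", split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["sentence"], truncation=True,
                                           padding="max_length", max_length=128))

# IPUTrainer mirrors the familiar transformers Trainer, plus an IPU config.
trainer = IPUTrainer(model=model, ipu_config=ipu_config, args=args,
                     train_dataset=dataset, tokenizer=tokenizer)
trainer.train()
```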
Library
Dolly 2.0 Inference
Dolly 2.0 – The World’s First, Truly Open Instruction-Tuned LLM on IPUs – Inference
OpenAssistant Pythia 12B Inference
OpenAssistant Pythia 12B is an open-source and commercially usable chat-based assistant model trained on the OpenAssistant Conversations Dataset (OASST1).
Whisper Inference
Speech Transcription on IPUs using OpenAI's Whisper - Inference
Llama 2 Inference
Run inference on IPUs with Llama 2, Meta's latest open-source large language model.
Stable Diffusion 2 Text-to-Image Inference
The popular latent diffusion model for generative AI with support for text-to-image on IPUs using Hugging Face Optimum.
Stable Diffusion Text-to-Image Inference
The popular latent diffusion model for generative AI with support for text-to-image on IPUs using Hugging Face Optimum.
Stable Diffusion Image-to-Image Inference
The popular latent diffusion model for generative AI with support for image-to-image on IPUs using Hugging Face Optimum.
Stable Diffusion Inpainting Inference
The popular latent diffusion model for generative AI with support for inpainting on IPUs using Hugging Face Optimum.
GPT-J 6B Fine-tuning
Fine-tuning GPT-J 6B on IPUs with PyTorch for text entailment.
GPT-J 6B Inference
Text generation inference on IPUs using GPT-J 6B with PyTorch.
RGCN Training
Training a GNN for fraud detection using a Relational Graph Convolutional Network (RGCN) on IPUs with PyG (PyTorch Geometric).
GPT-3 Fine-tuning
Fine-tuning GPT-3 (Generative Pre-trained Transformer 3), OpenAI's state-of-the-art language model, on IPUs.
GPT-3 Inference
Inference on IPUs with GPT-3 (Generative Pre-trained Transformer 3), OpenAI's state-of-the-art language model.
GPT2-Large Training
GPT2-L training in PyTorch leveraging the Hugging Face Transformers library.
GPT2-Large Inference
GPT2-L inference in PyTorch leveraging the Hugging Face Transformers library.
GPT2-Medium Training
GPT2-M training in PyTorch leveraging the Hugging Face Transformers library.
GPT2-Medium Fine-tuning
Hugging Face Optimum implementation for fine-tuning a GPT2-Medium transformer model.
GPT2-Medium Inference
GPT2-M inference in PyTorch leveraging the Hugging Face Transformers library.
GPT2-Small Training
GPT2-S training in PyTorch leveraging the Hugging Face Transformers library.
GPT2-Small Fine-tuning
Hugging Face Optimum implementation for fine-tuning a GPT2-Small transformer model.
GPT2-Small Inference
GPT2-S inference in PyTorch leveraging the Hugging Face Transformers library.
Flan-T5-Large/XL Inference
Flan-T5-Large/XL inference on IPUs with Hugging Face.
T5-Small Fine-Tuning
Summarization on IPUs using T5-Small with Hugging Face Optimum - Fine-Tuning
MT5-Small Fine-Tuning
Machine Translation on IPUs using MT5-Small with Hugging Face - Fine-tuning
MT5-Large Inference
Zero-Shot Text Classification on IPUs using MT5-Large with Hugging Face - Inference
GPS++ Training
A hybrid GNN/Transformer for molecular property prediction, trained on IPUs with the PCQM4Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.
GPS++ Inference
A hybrid GNN/Transformer for molecular property prediction inference on IPUs, trained on the PCQM4Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.
Distributed KGE - TransE (256) Training
Knowledge graph embedding (KGE) for link-prediction training on IPUs using Poplar with the WikiKG90Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.
Distributed KGE - TransE (256) Inference
Knowledge graph embedding (KGE) for link-prediction inference on IPUs using Poplar with the WikiKG90Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.
Distributed KGE - TransE (256) Training
Knowledge graph embedding (KGE) for link-prediction training on IPUs using PyTorch with the WikiKG90Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.
BERT-Large Training
BERT-Large (Bidirectional Encoder Representations from Transformers) using PyTorch for NLP training on IPUs.
BERT-Large Training
BERT-Large (Bidirectional Encoder Representations from Transformers) using TensorFlow 1 for NLP training on IPUs.
BERT-Large Inference
BERT-Large (Bidirectional Encoder Representations from Transformers) for NLP inference on IPUs with TensorFlow 1.
BERT-Large Training
BERT-Large (Bidirectional Encoder Representations from Transformers) using TensorFlow 2 for NLP training on IPUs.
BERT-Large Training
BERT-Large (Bidirectional Encoder Representations from Transformers) using PopART for NLP training on IPUs.
BERT-Large Inference
BERT-Large (Bidirectional Encoder Representations from Transformers) using PopART for NLP inference on IPUs.
BERT-Large Fine-tuning
Hugging Face Optimum implementation for fine-tuning a BERT-Large transformer model.
BERT-Large Pretraining
Hugging Face Optimum implementation for pretraining a BERT-Large transformer model.
DistilBERT Training
DistilBERT is a small, fast, cheap and light Transformer model distilled from BERT-Base, trained on IPUs using Hugging Face Optimum.
BERT-Base Training
BERT-Base (Bidirectional Encoder Representations from Transformers) using PyTorch for NLP training on IPUs.
BERT-Base Training
BERT-Base (Bidirectional Encoder Representations from Transformers) using TensorFlow 2 for NLP training on IPUs.
BERT-Base Training
BERT-Base (Bidirectional Encoder Representations from Transformers) using TensorFlow 1 for NLP training on IPUs.
BERT-Base Training
BERT-Base (Bidirectional Encoder Representations from Transformers) using PopART for NLP training on IPUs.
BERT-Base Inference
BERT-Base (Bidirectional Encoder Representations from Transformers) using PopART for NLP inference on IPUs.
BERT-Base Training
BERT-Base pre-training and SQuAD fine-tuning using Baidu's PaddlePaddle framework on IPUs.
BERT-Base Pretraining
Hugging Face Optimum implementation for pretraining a BERT-Base transformer model using bert-base-uncased.
BERT-Base Fine-tuning
Hugging Face Optimum implementation for fine-tuning a BERT-Base transformer model using bert-base-uncased on the squad dataset.
RoBERTa-Large Training
Hugging Face Optimum implementation for training RoBERTa-Large - a transformer model for sequence classification, token classification or question answering.
RoBERTa-Base Fine-tuning
Hugging Face Optimum implementation for fine-tuning RoBERTa-Base on the squad dataset for question answering.
RoBERTa-Base Fine-tuning
Hugging Face Optimum implementation for fine-tuning RoBERTa-Base on the squad_v2 dataset for question answering.
LXMERT Fine-tuning
Hugging Face Optimum implementation for fine-tuning LXMERT on the gqa-lxmert dataset for learning vision-and-language cross-modality representations.
DeBERTa Training
Hugging Face Optimum implementation for training DeBERTa - a transformer model that improves on BERT and RoBERTa using disentangled attention and an enhanced mask decoder.
LXMERT Fine-tuning
Hugging Face Optimum implementation for fine-tuning LXMERT on the vqa-lxmert dataset for learning vision-and-language cross-modality representations.
DeBERTa Inference
SQuAD and MNLI on IPUs using DeBERTa with Hugging Face - Inference
HuBERT Training
Hugging Face Optimum implementation for training HuBERT (Hidden-Unit BERT), a self-supervised speech representation learning approach.
BART Training
Hugging Face Optimum implementation for training BART - a transformer model for text generation and comprehension tasks.
GroupBERT Training
GroupBERT - an enhanced transformer architecture with efficient grouped structures in TensorFlow 1.
PackedBERT Training
New BERT packing algorithm that removes padding for more efficient training in PyTorch.
PackedBERT Training
New BERT packing algorithm that removes padding for more efficient training in PopART.
PackedBERT Fine-tuning
New BERT packing algorithm that removes padding for more efficient fine-tuning in Hugging Face.
PackedBERT Inference
New BERT packing algorithm that removes padding for more efficient inference in Hugging Face.
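The PackedBERT entries above all build on the same idea: concatenating several short sequences into one fixed-length slot so that compute is not wasted on padding tokens. The toy sketch below shows only the greedy bin-packing intuition; it is a hypothetical helper, not 91ƵAPP's published packing algorithm.

```python
# Toy illustration of sequence packing (hypothetical helper, not the
# published PackedBERT algorithm): greedily place variable-length
# sequences into fixed-size slots so little capacity is left as padding.
def pack_sequences(lengths, max_len=512):
    bins = []  # each bin holds sequence lengths summing to <= max_len
    for n in sorted(lengths, reverse=True):
        for b in bins:
            if sum(b) + n <= max_len:
                b.append(n)   # fits alongside existing sequences
                break
        else:
            bins.append([n])  # start a new slot
    return bins

print(pack_sequences([512, 48, 300, 180, 64, 410]))
# -> [[512], [410, 64], [300, 180], [48]]  (4 slots instead of 6 padded ones)
```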
Conformer-Medium Training
A variant of the Conformer model based on WeNet (not ESPnet), implemented in PyTorch, using a hybrid CTC/attention architecture with a Transformer or Conformer encoder.
CLIP Training
CLIP (Contrastive Language-Image Pre-Training) - a neural network trained on a variety of (image, text) pairs using PyTorch.
ViT (Vision Transformer) Fine-tuning
ViT (Vision Transformer) fine-tuning in PyTorch using Hugging Face transformers.
ViT (Vision Transformer) Pretraining
ViT (Vision Transformer) pretraining in PyTorch using Hugging Face transformers.
ViT (Vision Transformer) Fine-tuning
Hugging Face Optimum implementation for fine-tuning a ViT (Vision Transformer) model.
DINO Training
Self-supervised Vision Transformer model for training in PyTorch.
YOLOv3 Training
YOLOv3 - You Only Look Once - a convolutional neural network model that performs object detection tasks on IPUs using TensorFlow 1.
YOLOv3 Inference
YOLOv3 - You Only Look Once - a convolutional neural network model that performs object detection tasks on IPUs using TensorFlow 1.
YOLOv4 Inference
YOLOv4 - You Only Look Once - a convolutional neural network model that performs object detection tasks on IPUs using PyTorch.
ResNet-50 Training
Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with PyTorch.
ResNet-50 Inference
Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with PyTorch.
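The PyTorch CNN examples such as this one typically run through 91ƵAPP's PopTorch, which wraps a standard torch.nn.Module for execution on the IPU. A minimal sketch, assuming the poptorch package and an attached IPU; the torchvision model and weights are standard:

```python
# Minimal PopTorch inference sketch (assumes 91ƵAPP's Poplar SDK with
# `poptorch` installed and an IPU available; model is stock torchvision).
import torch
import torchvision
import poptorch

model = torchvision.models.resnet50(pretrained=True)
model.eval()

opts = poptorch.Options()          # default single-IPU options
ipu_model = poptorch.inferenceModel(model, options=opts)

x = torch.randn(1, 3, 224, 224)    # one ImageNet-sized input
logits = ipu_model(x)              # first call compiles, then runs on the IPU
print(logits.argmax(dim=-1))
```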
ResNet-50 Training
Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 2.
ResNet-50 Training
Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 1.
ResNet-50 Inference
Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 1.
EfficientNet-B4 Training
CNN (Convolutional Neural Network) image classification training on EfficientNet with PyTorch for IPU.
EfficientNet-B0/B4 Inference
CNN (Convolutional Neural Network) image classification inference on EfficientNet with PyTorch for IPU.
EfficientDet (D0-D4) Inference
Efficient object detection model for inference using TensorFlow 2 on the IPU.
EfficientNet-B4 Training
CNN (Convolutional Neural Network) image classification training on EfficientNet with TensorFlow 1 for IPU.
Reference Evapotranspiration (ET0) Inference
Spatial interpolation analysis and prediction calculation using TensorFlow 1 for weather forecasting, drought forecasting, and smart irrigation.
ResNeXt-101 Training
Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with TensorFlow 1.
ResNeXt-101 Inference
Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with PyTorch.
ResNeXt-101 Inference
Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with TensorFlow 1.
ResNeXt-101 Inference
Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with PopART.
Faster-RCNN Training
IPU implementation of Faster-RCNN detection framework using PopART.
Swin Pretraining
Swin: Hierarchical Vision Transformer model using Shifted Windows for pretraining in PyTorch.
MAE Training
Implementation of the MAE computer vision model in PyTorch for the IPU, based on the paper "Masked Autoencoders Are Scalable Vision Learners".
Frozen in Time Training
Implementation of Frozen in Time on the IPU in PyTorch - a joint video and image encoder for end-to-end retrieval.
Swin Fine-tuning
Swin: Hierarchical Vision Transformer model using Shifted Windows for fine-tuning in PyTorch.
UNet Medical Training
U-Net for biomedical image segmentation using TensorFlow 2 Keras for the IPU.
UNet Medical Inference
U-Net for biomedical image segmentation using TensorFlow 2 Keras for the IPU.
UNet Industrial Training
How to run a UNet Industrial training example with TensorFlow 1 for image segmentation.
Mini DALL-E Training
Mini DALL-E Text-to-Image Generation training example with PyTorch for the IPU.
TGN Training
TGN: Temporal Graph Networks is a dynamic GNN model for training on the IPU using PyG (PyTorch Geometric).
Neural Bellman-Ford Networks (NBFNet)
Neural Bellman-Ford Networks (NBFNet) is a GNN model for link prediction in homogeneous and heterogeneous graphs, implemented in PyG (PyTorch Geometric).
GIN Training
Graph Isomorphism Network (GIN) is used to perform graph classification for molecular property prediction using TensorFlow 2.
GIN Training
Graph Isomorphism Network (GIN) is used to perform graph classification for molecular property prediction using PyG (PyTorch Geometric).
Cluster-GCN Training
An efficient algorithm for training deep and large Graph Convolutional Networks using TensorFlow 2.
SchNet Training
GNN-based model in PyG (PyTorch Geometric) developed for modelling quantum interactions between atoms in a molecule.
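SchNet is available directly in PyG, which is what this example builds on. A minimal sketch of a SchNet forward pass, assuming plain PyG on the host; the IPU version adds fixed-size batching and PopTorch wrapping, omitted here, and the toy molecule is purely illustrative:

```python
import torch
from torch_geometric.nn import SchNet

# Minimal sketch assuming plain PyG; IPU-specific wrapping is omitted.
model = SchNet(hidden_channels=128, num_filters=128, num_interactions=6)

# Methane-like toy molecule: atomic numbers plus 3D coordinates.
z = torch.tensor([6, 1, 1, 1, 1])          # C, H, H, H, H
pos = torch.randn(5, 3)                    # random positions for illustration
batch = torch.zeros(5, dtype=torch.long)   # all five atoms belong to molecule 0

energy = model(z, pos, batch)              # one scalar property per molecule
print(energy.shape)                        # torch.Size([1, 1])
```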
Cluster-GCN Training
An efficient algorithm for training deep and large Graph Convolutional Networks using PyG (PyTorch Geometric).
Neural Image Fields Training
Training a neural network model for reconstructing/compressing images in TensorFlow 2.
Neural Image Fields Inference
Running inference on a neural network model for reconstructing/compressing images in TensorFlow 2.
MCMC Training
Markov Chain Monte Carlo (MCMC) training on IPUs using standard TensorFlow Probability.
Deep Voice 3 Training
Text-To-Speech training on IPUs with PopART using a Convolutional Sequence Learning technique.
FastSpeech2 Training
FastSpeech2: Fast and High-Quality End-to-End Text to Speech training on IPUs with TensorFlow 2.
FastSpeech2 Inference
FastSpeech2: Fast and High-Quality End-to-End Text to Speech inference on IPUs with TensorFlow 2.
FastPitch Training
FastPitch: Parallel Text-to-speech with Pitch Prediction using PyTorch.
Wav2Vec2 Training
Hugging Face Optimum implementation for training Wav2Vec2-Base - a speech recognition transformer model.
Wav2Vec2 Inference
Hugging Face Optimum implementation for Wav2Vec2-Base inference - a speech recognition transformer model.
DeepLOB-Seq2Seq Training
Multi-horizon Financial Forecasting on IPUs using DeepLOB-Seq2Seq - Training with TensorFlow 2
DeepLOB-Attention Training
Multi-horizon Financial Forecasting on IPUs using DeepLOB-Attention - Training with TensorFlow 2
Transformer Transducer (RNN-T) Training
IPU implementation of the Speech Recognition Model with Transformer Encoders and RNN-T Loss in PopART.
DIEN Training
DIEN (Deep Interest Evolution Network) training on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.
DIEN Inference
DIEN (Deep Interest Evolution Network) inference on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.
DIN Training
DIN (Deep Interest Network) training on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.
DIN Inference
DIN (Deep Interest Network) inference on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.
CosmoFlow Training
A deep learning model for calculating cosmological parameters in TensorFlow 1. The model primarily consists of 3D convolutions, pooling operations, and dense layers.
Approximate Bayesian Computation (ABC) COVID-19 Inference
A representative implementation of ABC for simulation-based statistical inference on observed COVID-19 infection data, using TensorFlow 2.
Deep Molecular Dynamics (DeePMD-kit) Training
DeePMD-kit - a deep learning package for many-body potential energy representation and molecular dynamics using TensorFlow 1.
Monte Carlo Ray Tracing Inference
Monte Carlo ray tracing application built in Poplar for neural rendering on the IPU.
MobileNetv3 Training
MobileNetv3 - Convolutional neural network training for classification, detection and segmentation using PyTorch.
MobileNetv2 Inference
MobileNetv2 - Convolutional neural network inference for classification, detection and segmentation using TensorFlow 1.
MobileNetv3 Inference
MobileNetv3 - Convolutional neural network inference for classification, detection and segmentation using PyTorch.
Autoencoder Training
Custom autoencoder model on the IPU using TensorFlow 1 to train collaborative filtering in recommender systems.
Autoencoder Inference
Custom autoencoder inference model on the IPU using TensorFlow 1 to perform collaborative filtering in recommender systems.
Contrastive Divergence VAE Training
Train a Variational Autoencoder / Markov Chain Monte Carlo hybrid model on IPUs with TensorFlow 1.
Reinforcement Learning Training
How to train a deep reinforcement learning model in TensorFlow 1 on multiple IPUs with synchronous data parallel training.
Sales Forecasting Training
How to train a sales forecasting machine learning model with TensorFlow 1 on 91ƵAPP's IPUs.