<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=145304570664993&amp;ev=PageView&amp;noscript=1">

91ƵAPP

91ƵAPP and PyTorch Lightning

Jul 27, 2021

Introducing the Initial Release of PyTorch Lightning for Graphcore IPUs

Written By:

David Norman


We are thrilled to announce that PyTorch Lightning now supports Graphcore IPUs. The team at PyTorch Lightning has been working heroically on building IPU integration over the last few months and is now making it available to the community with its 1.4 release. We really appreciate their close collaboration with the Graphcore team in our mission to make IPUs easier to use for developers. 

At Graphcore, we are hugely supportive of PyTorch Lightning’s mission to meet the growing demands from the AI research community for flexible and fast AI compute solutions. 

PyTorch Lightning liberates data scientists and deep learning practitioners from heavy engineering duty (data distribution, loop management, logging handling and much more) and allows them to focus on modelling and understanding their data, in other words, to spend more time on research.

This new integration means PyTorch Lightning users can now take PyTorch models which use our PopTorch framework, run them on the IPU with minimal code changes, and get the same high performance.

PopTorch is a set of extensions for PyTorch which enables PyTorch models to run directly on IPU hardware and is designed to require as few code changes as possible in order to run on the IPU.
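As a rough sketch of that workflow (the tiny model, shapes and hyperparameters here are illustrative, not taken from this post), a standard PyTorch module that returns its loss can be wrapped with poptorch.trainingModel and called as a compiled training step:

import torch
import poptorch

class ClassifierWithLoss(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(10, 2)
        self.loss = torch.nn.CrossEntropyLoss()

    def forward(self, x, labels=None):
        out = self.fc(x)
        if labels is None:                      # inference path
            return out
        return out, self.loss(out, labels)      # training path also returns the loss

opts = poptorch.Options()                       # IPU execution options
model = poptorch.trainingModel(ClassifierWithLoss(), options=opts)

x = torch.randn(16, 10)
labels = torch.randint(0, 2, (16,))
out, loss = model(x, labels)  # one compiled forward/backward/update step on the IPU

The first call compiles the model for the IPU; subsequent calls reuse the compiled executable, which is where the performance comes from.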

Why we love PyTorch Lightning

We love PyTorch Lightning for the same reason AI researchers do: it’s a simple wrapper which removes the complexity around training loops for PyTorch models and abstracts the underlying platform, so the user experience on IPUs is much closer to that on other platforms.

It removes a lot of boilerplate code, which leads to a cleaner, easier-to-use implementation.

PyTorch Lightning requires the user to specify a model function, just as we do with our PopTorch configuration of PyTorch models, so it’s a familiar user experience.

It doesn’t interfere with IPU-specific optimisations and decompositions, so PyTorch Lightning models on the IPU get similarly high performance to standard PyTorch models on the IPU.

How to get started

Install PyTorch Lightning:  

By default, PyTorch Lightning will install the latest version of PyTorch. To ensure that the version of PyTorch supported by PopTorch is installed, you should either pass --no-deps to pip3 install when installing PyTorch Lightning, or install the supported version of PyTorch afterwards:

pip3 install pytorch-lightning

pip3 uninstall torch

pip3 install torch==1.7.1+cpu -f https://download.pytorch.org/whl/torch_stable.html

PopTorch Installation

Install the Poplar SDK as described in the relevant “Getting Started” guide for your IPU system on the Graphcore documentation portal.

PopTorch comes packaged with the Poplar SDK as an installable wheel file. Full instructions for validating that it has been installed correctly can be found in the PopTorch User Guide. 

PopTorch currently uses PyTorch version 1.7.1, and we will move this forward over future releases. Some packages may install newer versions of PyTorch, in which case you may have to pip install the supported version again. See the version compatibility notes in the installation section of the PopTorch User Guide. 

Important 

pip >= 18.1 is required for PopTorch dependencies to be installed properly.
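If your environment ships an older pip, you can upgrade it in place first with the standard command:

pip3 install -U pip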

Running a basic example

The following code example shows how to run a training model on the IPU using PyTorch Lightning.
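Here is a minimal sketch of such a training script, assuming a simple MNIST classifier (the dataset, model size and hyperparameters are illustrative). The only IPU-specific line is the ipus=1 argument to the Trainer, introduced in Lightning 1.4:

import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Flatten(),
            torch.nn.Linear(28 * 28, 128),
            torch.nn.ReLU(),
            torch.nn.Linear(128, 10),
        )

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        # Lightning owns the loop; we only define the per-step loss.
        return F.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

if __name__ == "__main__":
    train_data = datasets.MNIST("data", train=True, download=True,
                                transform=transforms.ToTensor())
    trainer = pl.Trainer(ipus=1, max_epochs=1)  # request a single IPU
    trainer.fit(LitClassifier(), DataLoader(train_data, batch_size=32))

Under the hood, Lightning hands the model to PopTorch, which compiles and runs it on the IPU; no changes to the LightningModule itself are needed.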

How to Get Started on IPUs

University researchers can apply to Graphcore’s Academic Programme for the opportunity to access IPUs. The programme is designed to support academics conducting and publishing research using IPUs, or using them in their coursework or teaching. Researchers selected to participate will benefit from free access to Graphcore’s IPU compute platform in the cloud, as well as software tools and support.

To learn more about how to run a PyTorch Lightning model on the IPU with a single line of code, read our latest tutorial. 
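That single line is the Trainer construction; a minimal sketch of the change (flag names as in Lightning 1.4, with the rest of the script assumed unchanged):

trainer = pl.Trainer(gpus=8)   # before: training on GPUs
trainer = pl.Trainer(ipus=8)   # after: the same script now targets 8 IPUs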

Resources and Links