PyTorch Lightning team


Introducing Lightning Transformers, a new library that seamlessly integrates PyTorch Lightning, HuggingFace Transformers and Hydra, to scale up deep learning research across multiple modalities.


Machine learning metrics that make evaluating distributed PyTorch models clean and simple.

Figuring out which metrics you need to evaluate is key to deep learning. There are various metrics with which we can evaluate the performance of ML algorithms. TorchMetrics is a collection of PyTorch metric implementations, originally part of the PyTorch Lightning framework for high-performance deep learning. …
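To make the idea concrete, here is a minimal, dependency-free sketch of the update/compute pattern that TorchMetrics metrics follow: accumulate per-batch statistics with `update`, then reduce them to a final value with `compute`. (This is pure Python for illustration; the real library operates on torch tensors and additionally synchronizes state across distributed processes.)

```python
class Accuracy:
    """Toy accuracy metric mirroring the TorchMetrics update/compute pattern."""

    def __init__(self):
        self.correct = 0
        self.total = 0

    def update(self, preds, targets):
        # Accumulate statistics for one batch.
        self.correct += sum(p == t for p, t in zip(preds, targets))
        self.total += len(targets)

    def compute(self):
        # Reduce the accumulated state to the final metric value.
        return self.correct / self.total

    def reset(self):
        self.correct = 0
        self.total = 0


metric = Accuracy()
metric.update([0, 1, 1], [0, 1, 0])  # batch 1: 2 of 3 correct
metric.update([1, 0], [1, 0])        # batch 2: 2 of 2 correct
print(metric.compute())              # 0.8
```

Keeping state in the metric object, rather than recomputing over the full dataset, is what lets the same metric work unchanged across batches and devices.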


Lightning 1.4 Release adds TPU pods, IPU Hardware, DeepSpeed Infinity, Fully Sharded Data-Parallel and More.

Today we are excited to announce Lightning 1.4, introducing support for TPU pods, XLA profiling, IPUs, and new plugins to reach 10+ billion parameters, including DeepSpeed Infinity, Fully Sharded Data-Parallel and more!

TPU Pod Training


A guide to open-source tools for efficient dataset and model development and analysis for video understanding with FiftyOne, PyTorch Lightning, and PyTorch Video

A visualization of Lightning Flash and PyTorchVideo predictions in FiftyOne (Image by author)

Video understanding, while a widely popular and ever-growing field of computer vision, is often held back by the lack of video support in many tools. Hundreds of tools exist to expedite nearly all aspects of the computer vision lifecycle, but they generally only support image data.


8 New Flash Tasks

Lightning Flash is a library from the creators of PyTorch Lightning to enable quick baselining and experimentation with state-of-the-art models for popular Deep Learning tasks.

We are excited to announce the release of Flash v0.3 which has been primarily focused on the design of a modular API to make it…


PyTorch profiler integration, predict and validate trainer steps, and more

Today we are excited to announce Lightning 1.3, containing highly anticipated new features including a new Lightning CLI, improved TPU support, integrations such as PyTorch profiler, new early stopping strategies, predict and validate trainer routines, and more.

In addition, we are standardizing our release schedule. We will be launching a…


New release includes a full set of metrics for information retrieval and other metrics requested by the community

This post was co-written by Nicki Skafte Detlefsen and Luca Di Liello

TorchMetrics v0.3.0 includes 6 new metrics for evaluating information retrieval
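As a rough illustration of what such metrics measure, here is a small sketch of precision@k, one of the standard information-retrieval quantities (this is a generic hand-rolled example, not the TorchMetrics API; the library's retrieval metrics additionally group results by query and operate on tensors):

```python
def precision_at_k(relevance, k):
    """Fraction of the top-k ranked results that are relevant.

    relevance: 0/1 relevance labels for results, in ranked order.
    """
    top = relevance[:k]
    return sum(top) / k


# 2 of the top 3 ranked results are relevant.
print(precision_at_k([1, 0, 1, 1, 0], k=3))  # 0.666...
```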

We are happy to announce TorchMetrics v0.3.0 is now publicly…


Maximum parameter size when training the same MinGPT model on the same Lambda Labs A100 server with and without DeepSpeed, with less than 3 lines of code difference

TLDR; This post introduces the PyTorch Lightning and DeepSpeed integration demonstrating how to scale models to billions of parameters with just a few lines of code.

What is PyTorch Lightning?


New release including many new PyTorch integrations, DeepSpeed model parallelism, and more.

We are happy to announce that PyTorch Lightning v1.2.0 is now publicly available. It is packed with new integrations for many anticipated features.

Continue reading to learn more about what’s available. As always, feel free…


Flash is a collection of tasks for fast prototyping, baselining and fine-tuning scalable Deep Learning models, built on PyTorch Lightning.

Whether you are new to deep learning, or an experienced researcher, Flash offers a seamless experience from baseline experiments to state-of-the-art research. It allows you to build models without being…

PyTorch Lightning team

PyTorch Lightning is a deep learning research framework for running complex models without the boilerplate.
