Flaregun - A Tiny PyTorch Helper Library

Posted on March 20, 2023 • Tags: pytorch ml

flaregun is a tiny Python library (~50 lines of code) with a few helper functions that bring more visibility into your PyTorch code.

For example, you can query the current free/used/total memory of a specific GPU from within your Python script, or get the total number of parameters in a PyTorch model.

I primarily made it to learn how to publish a package to PyPI, but I'll try to add useful functions as they come up.

Installation

pip install flaregun

Usage

Flaregun provides real-time visibility into your GPUs and PyTorch models via a simple Python interface.

Get GPU Memory

####################
# Quickstart
####################

from flaregun import GPUStats
# Pretty print statistics for GPU #0
GPUStats(device=0).print()
> "GPU memory usage: 3061 / 32510 MB"

####################
# Other features
####################

device = 0  # GPU index

# Free GPU memory (in MB)
free_mem = GPUStats(device).free()

# Total GPU memory (in MB)
total_mem = GPUStats(device).total()

# Used GPU memory (in MB)
used_mem = GPUStats(device).used()

Note: simply loading PyTorch's CUDA kernels onto a GPU immediately takes up ~1225 MB of memory.
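
If you want to sanity-check these numbers without flaregun, plain PyTorch can report roughly the same information. The snippet below is just a sketch of an equivalent query (not flaregun's actual implementation); it uses torch.cuda.mem_get_info, which is available in recent PyTorch releases.

####################
# Plain-PyTorch equivalent (sketch)
####################

import torch

# Sketch only; not flaregun's actual implementation.
# torch.cuda.mem_get_info returns (free, total) in bytes for the given GPU.
free_bytes, total_bytes = torch.cuda.mem_get_info(0)
free_mb = free_bytes // 1024 ** 2
total_mb = total_bytes // 1024 ** 2
used_mb = total_mb - free_mb
print(f"GPU memory usage: {used_mb} / {total_mb} MB")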

Count Model Params

####################
# Quickstart
####################
from flaregun import ModelStats
from transformers import AutoModelForMaskedLM

# Get HuggingFace model
model = AutoModelForMaskedLM.from_pretrained("yikuan8/Clinical-Longformer")

# Pretty print model parameter count
ModelStats(model).print()
> "148711257 params (148711257 trainable | 0 non-trainable)"

####################
# Other features
####################

# Total params
total = ModelStats(model).total()

# Trainable params
trainable = ModelStats(model).trainable()

# Frozen (non-trainable) params
frozen = ModelStats(model).frozen()
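
For reference, the same counts can be computed directly from any PyTorch model. The snippet below is a rough sketch of equivalent logic (not necessarily how ModelStats is implemented), summing numel() over model.parameters().

####################
# Plain-PyTorch equivalent (sketch)
####################

# Sketch only; not necessarily how ModelStats is implemented.
# Reuses the `model` loaded above and counts parameters with numel().
total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
frozen = total - trainable
print(f"{total} params ({trainable} trainable | {frozen} non-trainable)")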