Basic Usage

Loading Kernels

Here is how you would use the activation kernels from the Hugging Face Hub:

import torch
from kernels import get_kernel

# Download optimized kernels from the Hugging Face hub
activation = get_kernel("kernels-community/activation", version=1)

# Create a random tensor
x = torch.randn((10, 10), dtype=torch.float16, device="cuda")

# Run the kernel
y = torch.empty_like(x)
activation.gelu_fast(y, x)

print(y)

This fetches version 1 of the kernels-community/activation kernel. Kernels are versioned by major version number; passing version=1 retrieves the latest kernel build from the v1 branch.

Kernels within a version branch must never break their API or remove builds for older PyTorch versions, so code pinned to a major version keeps working as new builds are published.
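
Because the API within a major version is stable, a common pattern is to resolve the versioned kernel once at import time and wrap it in a small helper so the version pin lives in one place. The sketch below follows that pattern; the gelu_fast wrapper is a hypothetical convenience function written for this example, not part of the kernels API.

import torch
from kernels import get_kernel

# Resolve the kernel once; within the v1 branch the gelu_fast API is stable.
_activation = get_kernel("kernels-community/activation", version=1)

def gelu_fast(x: torch.Tensor) -> torch.Tensor:
    # Hypothetical wrapper: allocates the output tensor and dispatches
    # to the Hub kernel, which writes the result into it.
    out = torch.empty_like(x)
    _activation.gelu_fast(out, x)
    return out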

Checking Kernel Availability

You can check whether a particular version of a kernel supports the environment your program is running on:

from kernels import has_kernel

# Check if kernel is available for current environment
is_available = has_kernel("kernels-community/activation", version=1)
print(f"Kernel available: {is_available}")