Open-source deep-learning framework for building, training and deploying neural networks on GPUs and CPUs.
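This line reads like PyTorch's tagline; assuming PyTorch, a minimal sketch of its define-by-run training loop (the model shape, data and hyperparameters are illustrative):

```python
import torch
from torch import nn

# Train a tiny regression MLP; runs on GPU when available, otherwise CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(64, 8, device=device)   # toy inputs
y = torch.randn(64, 1, device=device)   # toy targets
for step in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()                      # autograd computes gradients
    optimizer.step()
```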
End-to-end open-source machine-learning platform for building, training and deploying models across desktop, mobile, web and cloud.
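"End-to-end machine-learning platform" is TensorFlow's tagline; assuming TensorFlow, a minimal train-and-export sketch (synthetic data and layer sizes are placeholders):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = tf.random.normal((128, 8))
y = tf.random.normal((128, 1))
model.fit(x, y, epochs=2, verbose=0)

# SavedModel is the interchange format consumed by TF Serving and the
# mobile/web converters (TFLite, TensorFlow.js).
tf.saved_model.save(model, "export_dir")
```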
Open-source Python library from Hugging Face that supplies thousands of pretrained transformer models and utilities for text, vision, audio and multimodal tasks.
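A minimal usage sketch with the library's pipeline API, which downloads a default pretrained checkpoint on first use (the task string and input text are illustrative):

```python
from transformers import pipeline

# Fetches a default pretrained sentiment model from the Hugging Face Hub.
classifier = pipeline("sentiment-analysis")
print(classifier("Pretrained transformers make prototyping fast."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```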
High-level, multi-backend deep-learning API for building, training and deploying neural-network models.
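"Multi-backend" suggests Keras 3, which runs on TensorFlow, JAX or PyTorch; assuming Keras, a minimal model definition and compile step:

```python
import keras
from keras import layers

model = keras.Sequential([
    keras.Input(shape=(20,)),            # declare the input shape up front
    layers.Dense(64, activation="relu"),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```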
Lightweight PyTorch wrapper that separates research code from engineering, enabling fast, scalable and reproducible deep-learning workflows.
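The description matches PyTorch Lightning; assuming Lightning, a sketch of the research/engineering split: the LightningModule holds only research code, while the Trainer owns the loop, devices and checkpointing (module and data are toy placeholders):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class LitRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(10, 1)

    def training_step(self, batch, batch_idx):     # research code only
        x, y = batch
        return nn.functional.mse_loss(self.net(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

data = DataLoader(TensorDataset(torch.randn(64, 10), torch.randn(64, 1)),
                  batch_size=16)
pl.Trainer(max_epochs=1).fit(LitRegressor(), data)  # engineering handled here
```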
JAX is a high-performance Python library that brings just-in-time compilation, automatic differentiation and easy parallelism to NumPy-style array programming.
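A minimal sketch combining jit and grad on a NumPy-style loss (shapes and data are arbitrary):

```python
import jax
import jax.numpy as jnp

@jax.jit                       # XLA-compile the whole loss function
def loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)

grad_fn = jax.grad(loss)       # automatic differentiation w.r.t. w
w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
print(loss(w, x, y), grad_fn(w, x, y))
```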
Open-source deep-learning optimization library from Microsoft that scales PyTorch training and inference to trillions of parameters while reducing GPU memory use and communication overhead.
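The Microsoft library described here is DeepSpeed; a hedged sketch of wrapping a PyTorch model with deepspeed.initialize (the config values are illustrative, and the script is normally launched with the deepspeed CLI):

```python
import torch
import deepspeed

model = torch.nn.Linear(512, 512)  # stand-in for a large model
ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 2},  # partition optimizer state + gradients
}
# The returned engine handles ZeRO partitioning, mixed precision and
# distributed communication; train via engine(inputs) / engine.backward(loss).
engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)
```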
LightGBM is an open-source gradient-boosting framework that delivers fast, memory-efficient tree-based learning for classification, regression and ranking tasks.
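A minimal sketch with LightGBM's scikit-learn-style wrapper (synthetic data, illustrative hyperparameters):

```python
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = lgb.LGBMClassifier(n_estimators=200, learning_rate=0.05)
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```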
Ray is an open-source distributed compute engine that lets you scale Python and AI workloads—from data processing to model training and serving—without deep distributed-systems expertise.
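A minimal sketch of Ray tasks; the same @ray.remote decorator scales from a laptop to a cluster without code changes:

```python
import ray

ray.init()  # local mode here; pass an address to join an existing cluster

@ray.remote
def square(x):
    return x * x

# Launch tasks in parallel across available cores/nodes, then gather results.
futures = [square.remote(i) for i in range(8)]
print(ray.get(futures))  # [0, 1, 4, 9, 16, 25, 36, 49]
```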
Open-source, node-based workflow-automation platform for designing and running complex integrations and AI-powered flows.
NVIDIA’s model-parallel training library for GPT-like transformers at multi-billion-parameter scale.
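Megatron's core idea is tensor (intra-layer) model parallelism: each linear layer's weight matrix is sharded across GPUs and the partial results are combined. A toy single-process sketch of column parallelism in plain PyTorch, not Megatron-LM's actual API:

```python
import torch

# A Linear(8 -> 16) split into two column shards; on a real cluster each
# shard would live on its own GPU and the concat would be an all-gather.
full = torch.nn.Linear(8, 16, bias=False)
w0, w1 = full.weight.chunk(2, dim=0)          # shard the output dimension
x = torch.randn(4, 8)
sharded_out = torch.cat([x @ w0.t(), x @ w1.t()], dim=-1)
assert torch.allclose(sharded_out, full(x), atol=1e-6)
```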
A PyTorch-based system for large-scale model parallel training, memory optimization, and heterogeneous acceleration.
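"Heterogeneous acceleration" here means using CPU (and NVMe) memory alongside GPU memory to hold model state. A toy sketch of the idea in plain PyTorch, not the system's real API: parameters live in CPU RAM and each layer is streamed to the GPU only for its forward pass, trading transfer time for GPU memory:

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
layers = [torch.nn.Linear(1024, 1024) for _ in range(4)]  # resident on CPU

x = torch.randn(8, 1024).to(device)
for layer in layers:
    layer.to(device)           # stage this layer's weights onto the GPU
    x = torch.relu(layer(x))
    layer.to("cpu")            # evict so only one layer occupies GPU memory
```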