State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Transformers acts as the model-definition framework for state-of-the-art machine learning models in text, computer vision, audio, video, and multimodal domains, for both inference and training. It centralizes the model definition so that this definition is agreed upon across the ecosystem. transformers is the pivot across frameworks: if a model definition is supported, it will be compatible with the majority of training frameworks (Axolotl, Unsloth, DeepSpeed, FSDP, PyTorch-Lightning, ...), inference engines (vLLM, SGLang, TGI, ...), and adjacent modeling libraries (llama.cpp, mlx, ...) that leverage the model definition from transformers.
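As a brief illustration of the centralized model definitions described above, the library's `pipeline()` API bundles a model, its tokenizer, and pre/post-processing behind one call. A minimal sketch (the model name is one public example checkpoint; first use downloads weights from the Hugging Face Hub, so network access is assumed):

```python
from transformers import pipeline

# pipeline() resolves the model definition, tokenizer, and task-specific
# pre/post-processing from a single checkpoint identifier.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Returns one dict per input, each with a "label" and a "score".
result = classifier("Transformers centralizes model definitions.")
```

The same checkpoint identifier can then be loaded by the other tools listed above (vLLM, llama.cpp conversions, training frameworks) because they all consume the transformers model definition.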
$ pkg install py311-transformers

Origin:        misc/py-transformers
Size:          152MiB
License:       APACHE20
Maintainer:    yuri@FreeBSD.org
Dependencies:  32 packages
Required by:   2 packages
Dependencies (32)
python311, py311-uvicorn, py311-tqdm, py311-torchvision, py311-torchaudio,
py311-tokenizers, py311-timm, py311-tiktoken, py311-starlette,
py311-sentencepiece, py311-sagemaker, py311-safetensors, py311-requests,
py311-regex, py311-pyyaml, py311-pytorch, py311-pydantic2, py311-pillow,
py311-packaging, py311-optuna, py311-openai, py311-numpy, py311-num2words,
py311-natten, py311-librosa, py311-huggingface-hub, py311-ftfy,
py311-filelock, py311-fastapi, py311-codecarbon, py311-blobfile,
py311-accelerate