light-the-torch is a small utility that wraps pip to ease the installation process for PyTorch distributions like torch, torchvision, torchaudio, and so on, as well as third-party packages that depend on them. It auto-detects compatible CUDA versions from the local setup and installs the correct PyTorch binaries without user interference.
PyTorch distributions like torch, torchvision, torchaudio, and so on are fully pip install'able, but PyPI, the default pip search index, has some limitations:
PyTorch uses local version specifiers to indicate the computation backend a binary was compiled with, e.g. torch==1.11.0+cpu. Unfortunately, local specifiers are not allowed on PyPI. Thus, only the binaries compiled with one CUDA version are uploaded without an indication of the CUDA version. If you do not have a CUDA-capable GPU, downloading these is only a waste of bandwidth and disk capacity. If, on the other hand, your NVIDIA driver simply doesn't support the CUDA version the binary was compiled with, you can't use any of the GPU features.
To overcome this, PyTorch also hosts its binaries on its own indices, which pip can pull from through the --extra-index-url option, e.g. for CUDA 11.3:

pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
While this is certainly an improvement, it still has a few downsides:
1. You need to know which computation backend, e.g. CUDA 11.3 (cu113), is supported on your local machine. This can be quite challenging for new users and at least tedious for more experienced ones.
2. The index URL differs between the channels (stable, nightly, test, LTS), so you also need to know the correct one for the binaries you want.
3. Nightly and test binaries additionally require the --pre option (see the example below). Failing to pass it will pull the stable binary from PyPI even if the rest of the installation command is correct.
4. The LTS binaries carry a lower version number than the stable ones, and pip prefers newer releases from PyPI. Thus, it is not possible to automatically get the latest LTS release.
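For example, installing a nightly binary looks like this (an illustrative command; cu113 again stands in for your computation backend). Omitting --pre here silently falls back to the stable release from PyPI:

pip install --pre torch --extra-index-url https://download.pytorch.org/whl/nightly/cu113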
In case you only want to install PyTorch distributions, points 3 and 4 above can be resolved by using --index-url instead, completely disabling installing from PyPI. But of course this means it is not possible to install any package that is not hosted by PyTorch, even one that only depends on it.
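Sticking to the CUDA 11.3 example from above, that approach would look like this:

pip install torch --index-url https://download.pytorch.org/whl/cu113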
If dealing with any of these points doesn't sound appealing to you, and you just want to have the same pip install user experience for PyTorch distributions, light-the-torch was made for you.
Installing light-the-torch is as easy as
pip install light-the-torch
Since light-the-torch depends on pip, and pip might be upgraded during installation, Windows users should install it with
py -m pip install light-the-torch
After light-the-torch is installed, you can use its CLI interface ltt as a drop-in replacement for pip:
ltt install torch
In fact, ltt is pip with a few added options:
By default, ltt uses the local NVIDIA driver version to select the correct binary for you. You can pass the --pytorch-computation-backend option to manually specify the computation backend you want to use:
ltt install --pytorch-computation-backend=cu102 torch torchvision torchaudio
Borrowing from the mutex packages that PyTorch provides for conda installations, --cpuonly is available as a shorthand for --pytorch-computation-backend=cpu.
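For example, the following two commands should thus be equivalent:

ltt install --cpuonly torch
ltt install --pytorch-computation-backend=cpu torch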
In addition, the computation backend to be installed can also be set through the LTT_PYTORCH_COMPUTATION_BACKEND environment variable. It is only honored if no CLI option for the computation backend is specified.
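For example, assuming a POSIX shell for the inline environment variable syntax:

LTT_PYTORCH_COMPUTATION_BACKEND=cu102 ltt install torch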
By default, ltt installs stable PyTorch binaries. To install binaries from the nightly, test, or LTS channels, pass the --pytorch-channel option:
ltt install --pytorch-channel=nightly torch torchvision torchaudio
If --pytorch-channel is not passed, using pip's builtin --pre option will install PyTorch test binaries.
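In other words, the following two commands should behave the same:

ltt install --pre torch
ltt install --pytorch-channel=test torch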
Of course, you are not limited to installing only PyTorch distributions. Everything shown above also works if you install packages that depend on PyTorch:
ltt install --pytorch-computation-backend=cpu --pytorch-channel=nightly pystiche
The authors of pip do not condone the use of pip internals, as they might break without warning. As a result of this, pip has no capability for plugins to hook into specific tasks. Thus, light-the-torch works by monkey-patching pip internals at runtime:
- light-the-torch replaces the default search index with an official PyTorch download link. This is equivalent to calling pip install with the --extra-index-url option only for PyTorch distributions.
- light-the-torch culls binaries incompatible with the hardware.
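Concretely, on a machine whose driver supports CUDA 11.3, for example,

ltt install torch

roughly corresponds to

pip install torch --extra-index-url https://download.pytorch.org/whl/cu113

except that the PyTorch index is only consulted for PyTorch distributions and incompatible binaries are filtered out.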
Thanks a lot for your interest in contributing to
light-the-torch! All contributions are
appreciated, be it code or not. Especially in a project like this, we rely on user
reports for edge cases we didn't anticipate. Please feel free to
open an issue if you encounter
anything that you think should be working but doesn't.
If you want to contribute code, check out our contributing guidelines to learn more about the workflow.