CollaborativeRoboticsLab/pytorch_vendor
pytorch_vendor

This package vendors the C++ PyTorch library (LibTorch) into a ROS 2 workspace and exposes it as a reusable CMake target.

It supports both x86_64 (desktop/server) and aarch64 (Jetson) platforms:

  • On x86_64 it downloads an official LibTorch archive from pytorch.org (CPU or CUDA, depending on nvcc).
  • On Jetson (aarch64) it can reuse the LibTorch that comes with NVIDIA’s Python torch wheel, or fall back to a downloaded CPU-only LibTorch if that is not available.

Configuration

All options are standard CMake cache variables/options and can be set via colcon build --cmake-args -D....

LibTorch version (x86_64 and fallback)

  • PYTORCH_VERSION (string, default 2.7.0)
    • Version of LibTorch to download from pytorch.org when using the vendored archive.

    • Example:

       colcon build --packages-select pytorch_vendor \
       	--cmake-args -DPYTORCH_VERSION=2.3.1

LibTorch root override

  • LIBTORCH_DIR (path, default ${CMAKE_CURRENT_SOURCE_DIR}/external/libtorch)
    • Root directory that contains include/ and lib/ subdirectories for LibTorch.
    • If ${LIBTORCH_DIR}/lib/libtorch.so already exists, no download is performed and that tree is used.
    • You can point this at an existing LibTorch installation if you don’t want the package to download anything.
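For example, to reuse an already-extracted LibTorch tree instead of downloading one (the path below is illustrative; it must contain include/ and lib/libtorch.so):

```shell
# Point pytorch_vendor at an existing LibTorch installation.
# /opt/libtorch is an example path, not a requirement of the package.
colcon build --packages-select pytorch_vendor \
  --cmake-args -DLIBTORCH_DIR=/opt/libtorch
```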

Jetson / aarch64 behavior

On Jetson devices (where CMAKE_SYSTEM_PROCESSOR == aarch64) the package prefers the LibTorch that ships with NVIDIA’s PyTorch Python wheel if possible.

  • PYTORCH_USE_SYSTEM_TORCH (BOOL, default ON on aarch64)
    • When ON, CMake tries to locate the Python torch installation:

      1. Runs:

         python3 -c "import torch, os; print(os.path.dirname(torch.__file__))"

      2. If that path contains lib/libtorch.so, LIBTORCH_DIR is set to that directory and used as the LibTorch root.
      3. If not found, a warning is printed and the code falls back to the download flow into ${CMAKE_CURRENT_SOURCE_DIR}/external/libtorch.
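The detection flow above corresponds roughly to the following CMake logic (a simplified sketch, not the package's actual code; variable names are illustrative):

```cmake
# Sketch: prefer the LibTorch shipped with the Python torch wheel.
execute_process(
  COMMAND python3 -c "import torch, os; print(os.path.dirname(torch.__file__))"
  OUTPUT_VARIABLE TORCH_PY_DIR
  OUTPUT_STRIP_TRAILING_WHITESPACE
  RESULT_VARIABLE TORCH_PY_RESULT
)
if(TORCH_PY_RESULT EQUAL 0 AND EXISTS "${TORCH_PY_DIR}/lib/libtorch.so")
  set(LIBTORCH_DIR "${TORCH_PY_DIR}")
else()
  message(WARNING "Python torch not usable; falling back to downloaded LibTorch")
  # ... download flow into ${CMAKE_CURRENT_SOURCE_DIR}/external/libtorch ...
endif()
```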

Prerequisites on Jetson:

  • NVIDIA's PyTorch wheel must be installed, i.e. python3 -c "import torch" must succeed in the same environment where you run colcon build.

Note: this package does not call find_package(Torch). Instead it treats LibTorch as a plain include/lib tree and exports it via the pytorch_vendor CMake target.


What gets installed

After building pytorch_vendor with colcon build:

  • Headers from ${LIBTORCH_DIR}/include/ are installed to:
    • <workspace>/install/pytorch_vendor/include
  • Libraries from ${LIBTORCH_DIR}/lib/ are installed to:
    • <workspace>/install/pytorch_vendor/lib
  • A CMake export file is installed:
    • <workspace>/install/pytorch_vendor/share/pytorch_vendor/cmake/pytorch_vendorTargets.cmake

The package also declares and exports the interface library target:

  • CMake target: pytorch_vendor (INTERFACE)
    • Adds ${LIBTORCH_DIR}/include to the include path.
    • Adds ${LIBTORCH_DIR}/lib to the link directories.
    • Links torch and c10 transitively.
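Conceptually, the exported interface target behaves as if it were declared like this (a sketch of the idea, not the package's exact CMakeLists.txt):

```cmake
# Sketch: an INTERFACE target that forwards LibTorch to consumers.
# Requires CMake >= 3.13 for target_link_directories.
add_library(pytorch_vendor INTERFACE)
target_include_directories(pytorch_vendor INTERFACE "${LIBTORCH_DIR}/include")
target_link_directories(pytorch_vendor INTERFACE "${LIBTORCH_DIR}/lib")
target_link_libraries(pytorch_vendor INTERFACE torch c10)
```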

Using pytorch_vendor from another ROS 2 package

1. package.xml

Add a dependency on pytorch_vendor:

<depend>pytorch_vendor</depend>

2. CMakeLists.txt

Typical consumption pattern in a downstream package (executable target):

cmake_minimum_required(VERSION 3.8)
project(my_pytorch_node)

find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(pytorch_vendor REQUIRED)

add_executable(my_pytorch_node
	src/my_pytorch_node.cpp
)

# LibTorch include paths, link directories, and the torch/c10 libraries
# are wired in transitively via the pytorch_vendor interface target;
# no find_package(Torch) or TORCH_* variables are needed.
ament_target_dependencies(my_pytorch_node
	rclcpp
	pytorch_vendor
)

install(TARGETS my_pytorch_node
	DESTINATION lib/${PROJECT_NAME}
)

ament_package()

For a shared library / plugin target the pattern is the same:

cmake_minimum_required(VERSION 3.8)
project(my_pytorch_library)

find_package(ament_cmake REQUIRED)
find_package(pluginlib REQUIRED)
find_package(pytorch_vendor REQUIRED)

add_library(${PROJECT_NAME} SHARED
	src/my_pytorch_library.cpp
)

# LibTorch is provided transitively by the pytorch_vendor target.
ament_target_dependencies(${PROJECT_NAME}
	pluginlib
	pytorch_vendor
)

install(TARGETS ${PROJECT_NAME}
	ARCHIVE DESTINATION lib
	LIBRARY DESTINATION lib
	RUNTIME DESTINATION bin
)

ament_package()

Key points:

  • Consumer packages should only call find_package(pytorch_vendor REQUIRED).
  • You normally do not need:
    • find_package(Torch)
    • manual set(Torch_DIR ...) calls
    • manual use of TORCH_LIBRARIES or TORCH_INCLUDE_DIRS variables.
  • Using ament_target_dependencies(... pytorch_vendor ...) or target_link_libraries(... pytorch_vendor):
    • Adds the LibTorch include directory.
    • Adds the LibTorch library directory.
    • Links torch and c10 transitively.

In your C++ code you can then use LibTorch headers normally, for example:

#include <torch/torch.h>
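As a quick smoke test, a node's source can exercise a basic tensor op to confirm that the headers and libraries are wired up (this snippet is illustrative, not part of the package, and assumes the workspace is built and sourced so the vendored LibTorch is found):

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  // Create small tensors and run a basic op to confirm linking works.
  torch::Tensor a = torch::rand({2, 3});
  torch::Tensor b = torch::ones({2, 3});
  std::cout << (a + b).sum().item<float>() << std::endl;
  return 0;
}
```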

If you really need to use find_package(Torch) manually (advanced / non-ROS use cases), make sure Torch_DIR points at the TorchConfig.cmake that lives under the LibTorch tree used by pytorch_vendor, for example:

set(Torch_DIR "${CMAKE_CURRENT_SOURCE_DIR}/../../pytorch_vendor/external/libtorch/share/cmake/Torch")
find_package(Torch REQUIRED)

However, inside a normal ROS 2 workspace this is discouraged; prefer depending only on pytorch_vendor, as shown above, to avoid mismatches between different LibTorch installations.


Examples

x86_64 desktop, CUDA available

colcon build --packages-select pytorch_vendor

  • nvcc is detected.
  • The CUDA version is inferred from nvcc --version.
  • A matching CUDA LibTorch archive is downloaded and used.

x86_64 desktop, CPU-only

colcon build --packages-select pytorch_vendor \
	--cmake-args -DUSE_CUDA=OFF

or simply build on a system without nvcc in PATH; the package will automatically select the CPU-only LibTorch archive.

Jetson (aarch64) using NVIDIA PyTorch wheel

# On the Jetson, after installing NVIDIA's torch wheel
colcon build --packages-select pytorch_vendor

  • CMake will detect CMAKE_SYSTEM_PROCESSOR == aarch64.
  • It will try to locate python3’s torch module and use that LibTorch installation.
  • If that fails, it prints a warning and downloads the CPU-only aarch64 LibTorch archive as a fallback.

If you want to force using the downloaded LibTorch instead of system torch on Jetson:

colcon build --packages-select pytorch_vendor \
	--cmake-args -DPYTORCH_USE_SYSTEM_TORCH=OFF

Troubleshooting

  • Link errors for torch or c10:
    • Make sure your consumer target either:
      • uses ament_target_dependencies(... pytorch_vendor ...), or
      • links pytorch_vendor explicitly via target_link_libraries.
  • Runtime errors about missing shared libraries:
    • Ensure you are sourcing the workspace’s install/setup.bash before running your nodes, so the runtime loader can find libtorch.so under install/pytorch_vendor/lib.
  • Jetson: CMake cannot import Python torch:
    • Verify that python3 -c "import torch" works in the same environment where you run colcon build.
    • If you want to bypass the Python wheel entirely, set PYTORCH_USE_SYSTEM_TORCH=OFF so the package downloads and uses its own LibTorch.
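When diagnosing missing-library errors, the loader's view can be checked by hand. The commands below assume a standard colcon workspace layout and a hypothetical consumer package named my_pytorch_node; adjust the paths to your own package:

```shell
# From the workspace root, after building:
source install/setup.bash

# The vendored library should be present here:
ls install/pytorch_vendor/lib/libtorch.so

# Your node's binary should resolve torch/c10 (no "not found" lines):
ldd install/my_pytorch_node/lib/my_pytorch_node/my_pytorch_node | grep -E 'torch|c10'
```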
