updated docs
dusty-nv committed Mar 28, 2023
1 parent f890da0 commit e2e4adb
Showing 1 changed file with 4 additions and 11 deletions: README.md
@@ -1,5 +1,5 @@
-# Deep Learning Nodes for ROS/ROS2
-This repo contains deep learning inference nodes and camera/video streaming nodes for ROS/ROS2 with support for NVIDIA **[Jetson Nano / TX1 / TX2 / Xavier / Orin](https://developer.nvidia.com/embedded-computing)** devices and TensorRT.
+# DNN Inference Nodes for ROS/ROS2
+This repo contains DNN inference nodes and camera/video streaming nodes for ROS/ROS2 with support for NVIDIA **[Jetson Nano / TX1 / TX2 / Xavier / Orin](https://developer.nvidia.com/embedded-computing)** devices and TensorRT.

The nodes use the image recognition, object detection, and semantic segmentation DNN's from the [`jetson-inference`](https://github.com/dusty-nv/jetson-inference) library and NVIDIA [Hello AI World](https://github.com/dusty-nv/jetson-inference#hello-ai-world) tutorial, which come with several built-in pretrained networks for classification, detection, and segmentation and the ability to load customized user-trained models.

@@ -18,9 +18,6 @@ Various distributions of ROS are supported either from source or through containers
### Table of Contents

* [Installation](#installation)
-  * [jetson-inference](#jetson-inference)
-  * [ROS/ROS2](#rosros2)
-  * [ros_deep_learning](#ros_deep_learning-1)
* [Testing](#testing)
* [Video Viewer](#video-viewer)
* [imagenet Node](#imagenet-node)
@@ -35,25 +32,21 @@ Various distributions of ROS are supported either from source or through containers

## Installation

-The easiest way to get up and running is by running one of the containers that is already built with ROS/ROS2, jetson-inference, and the ros_deep_learning nodes already included:
+The easiest way to get up and running is by running one of the containers that is already built with ROS/ROS2, jetson-inference, and the ros_deep_learning package pre-installed:

``` bash
git clone https://github.com/dusty-nv/ros_deep_learning
cd ros_deep_learning
docker/run.sh --ros=foxy
```

-The `--ros` argument to the [`docker/run.sh`](docker/run.sh) script selects the ROS distro to use (the default is Foxy). Containers with ros_deep_learning are available for Noetic, Foxy, Galactic, and Humble. They use the `ros:$ROS_DISTRO-pytorch-l4t-*` images built from [jetson-containers](https://github.com/dusty-nv/jetson-containers).
+The `--ros` argument to the [`docker/run.sh`](docker/run.sh) script selects the ROS distro to use (the default is Foxy). Containers with ros_deep_learning are available for Noetic, Foxy, Galactic, and Humble. They use the `ros:$ROS_DISTRO-pytorch-l4t-*` container images built from [jetson-containers](https://github.com/dusty-nv/jetson-containers).
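For example, to launch the Humble variant instead of the default Foxy container, pass the distro name to the same script (this assumes the corresponding image can be pulled or has already been built):

``` bash
# Launch the ROS2 Humble container instead of the default (Foxy).
# Per the docs above, valid values are noetic, foxy, galactic, and humble.
docker/run.sh --ros=humble
```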

For the previous instructions on building the ros_deep_learning package against an uncontainerized ROS installation, expand the section below (the parts about installing ROS may require adapting for the particular version of ROS/ROS2 that you want to install).

<details>
<summary>Legacy Install Instructions</summary>

-First, install the latest version of [JetPack](https://developer.nvidia.com/embedded/jetpack) on your Jetson.

-Then, follow the steps below to install the needed components on your Jetson.

### jetson-inference

These ROS nodes use the DNN objects from the [`jetson-inference`](https://github.com/dusty-nv/jetson-inference) project (aka Hello AI World). To build and install jetson-inference, see [this page](https://github.com/dusty-nv/jetson-inference/blob/master/docs/building-repo-2.md) or run the commands below:
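The commands hidden behind the fold follow the standard jetson-inference source build described on the linked page; as a sketch (exact dependency package names may vary by JetPack release):

``` bash
# Install build dependencies (names may vary by JetPack version)
sudo apt-get update
sudo apt-get install -y git cmake libpython3-dev python3-numpy

# Clone jetson-inference with its submodules and build with CMake
git clone --recursive --depth=1 https://github.com/dusty-nv/jetson-inference
cd jetson-inference
mkdir build && cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig
```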

0 comments on commit e2e4adb
