dusty-nv / jetson-inference

Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.

Home Page: https://developer.nvidia.com/embedded/twodaystoademo


Build a GPU-accelerated Docker container with jetson-inference, Python 3.10, and ROS 2 Humble for the Jetson Nano 4G

fatemeh-mohseni-AI opened this issue · comments

Hello.
I have a Jetson Nano 4G and must use its GPU.
I need Python 3.8 or newer, and ROS 2 Humble or Foxy.
I know these cannot be installed natively on JetPack 4.6.1, so I thought I might run them in a Docker container. I also need to use
the jetson-inference package inside the container.
Is that scenario possible?

(Note that ROS Foxy can be installed on Ubuntu 20.04, but Humble needs Ubuntu 22.04. It would be great if I could use Humble.)

Can I have a Docker container that uses the GPU and has Python 3.10, ROS Humble or Foxy, and jetson-inference?
This is a vital question for me, and I would appreciate any help with it.
Thanks

I think you can use the container mentioned here, by running this:

docker/run.sh -c dustynv/ros:humble-pytorch-l4t-r35.3.1
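If you prefer plain `docker` over the `run.sh` wrapper, a minimal sketch of the equivalent launch (shown with `echo` so you can inspect the command before running it on the Jetson; `--runtime nvidia` is the standard flag that exposes the GPU to the container, assuming the NVIDIA container runtime is installed, as it is by default on JetPack):

```shell
# Sketch: launching the suggested ROS+PyTorch image with plain docker.
IMAGE="dustynv/ros:humble-pytorch-l4t-r35.3.1"
# --runtime nvidia : GPU access via the NVIDIA container runtime
# --network host   : convenient for ROS 2 DDS discovery
CMD="docker run --runtime nvidia -it --rm --network host ${IMAGE}"
echo "$CMD"
```

Run the printed command on the Jetson itself; the image tag is the one suggested above.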

Thanks for your reply.
There is one more thing you might know:
JetPack 4.6.1 comes with CUDA 10.2.
If the CUDA version of that image differed from 10.2, would it be a problem or not?

Hi, jetson-inference has a suitable container for JetPack 4.6.1 as well; check here.

Just add the matching container tag when running the Docker container. Hope it helps.
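As a rough sketch, the tag can be derived from the installed L4T release, assuming `/etc/nv_tegra_release` has its usual format (the fallback string below is only for illustration on a non-Jetson machine, and the resulting `dustynv/jetson-inference` tag is an assumption to verify against the published tags):

```shell
# Read the L4T release string, e.g. "# R32 (release), REVISION: 7.1, GCID: ..."
L4T_LINE=$(head -n1 /etc/nv_tegra_release 2>/dev/null || echo "# R32 (release), REVISION: 7.1, GCID: 0")
L4T_MAJOR=$(echo "$L4T_LINE" | sed -n 's/^# R\([0-9]*\) .*/\1/p')
L4T_MINOR=$(echo "$L4T_LINE" | sed -n 's/.*REVISION: \([0-9.]*\),.*/\1/p')
TAG="r${L4T_MAJOR}.${L4T_MINOR}"
# On JetPack 4.6.1 (L4T R32.7.1) this prints a tag like r32.7.1
echo "docker/run.sh -c dustynv/jetson-inference:${TAG}"
```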


Thanks. I checked there, but L4T R32.7.1 (based on JetPack 4.6.1) has Ubuntu 18.04 and Python 3.6.
It takes time to change the Python 3 version and build ROS 2 against Python 3.8+.
In the end I found this repository on Docker Hub; it may help others.

I have used image: timongentzsch/l4t-ubuntu20-ros2-desktop

It has Ubuntu 20.04, Python 3.8, and ROS Foxy, and also supports the GPU via CUDA 10.2.

After all I did, I couldn't build jetson-inference in the container, but at least that image supports the rest of the requirements.
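For anyone trying that image, a quick sanity check of those claims from inside the container might look like this (a sketch only; it assumes ROS Foxy's standard install prefix `/opt/ros/foxy` and degrades gracefully where a tool is missing):

```shell
# Sketch: verify Python version, ROS Foxy, and the CUDA compiler inside the container.
PYVER=$(python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])' 2>/dev/null || echo "none")
echo "python3: $PYVER"
if [ -f /opt/ros/foxy/setup.bash ]; then echo "ROS Foxy: found"; else echo "ROS Foxy: not found"; fi
if command -v nvcc >/dev/null 2>&1; then nvcc --version | tail -n1; else echo "nvcc: not on PATH"; fi
```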

Did it work in the end? I plan to use mine the same way and would like to keep GPU support.