Deepseek on a Jetson Orin Nano
5. 2. 2025
https://www.youtube.com/watch?v=e-EG3B5Uj78
AI summary:
The video covers the NVIDIA Jetson Orin Nano Super, an edge computing device that can run deep learning models directly on the hardware. With 1024 CUDA cores, 32 Tensor cores, and 8 GB of RAM, the Orin Nano is built for efficient AI workloads. The focus is on DeepSeek R1, a conversational AI model with explicit reasoning steps that can be self-hosted, offering advantages in data privacy, cost control, and reduced latency. Dave walks through the setup process using Ollama, a tool that simplifies deploying large language models and lets users run queries locally once the models are downloaded. The video also discusses DeepSeek R1's reasoning capabilities, highlighting how it works through complex queries using deductive, inductive, and abductive reasoning. Finally, Dave compares the Jetson's performance with high-end hardware, demonstrating the trade-offs of running AI at home versus in the cloud.
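
A minimal sketch (not from the video) of what running such a local query might look like in Python once Ollama is installed and a DeepSeek R1 model has been pulled. It talks to Ollama's local HTTP API on its default port 11434; the model tag "deepseek-r1:7b" is an assumption, and on an 8 GB Jetson a smaller quantized variant may be the realistic choice.

    # Sketch: query a locally running Ollama server over its HTTP API.
    # Assumes "ollama serve" is running and the model has been pulled,
    # e.g. with: ollama pull deepseek-r1:7b  (model tag is an assumption).
    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

    def ask(prompt: str, model: str = "deepseek-r1:7b") -> str:
        # stream=False returns the whole completion as a single JSON object
        resp = requests.post(
            OLLAMA_URL,
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=300,  # small boards like the Jetson can take a while to generate
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ask("Explain deductive vs. inductive reasoning in two sentences."))

Because everything runs against localhost, the query never leaves the device, which is the privacy and latency argument the video makes for self-hosting.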