What Is NVIDIA’s JetPack SDK for Jetson Modules, and What’s New in the Latest JetPack 6.2 Release?



The release of NVIDIA’s JetPack 6.2 SDK marks another leap forward for developers and businesses building AI-powered edge solutions. As the industry standard for NVIDIA Jetson platforms, JetPack 6.2 introduces critical upgrades, including the groundbreaking "Super" performance mode, that unlock new potential for real-time AI inferencing, computer vision, and robotics. 

Premio has fully upgraded our JCO Series NVIDIA Jetson AI edge computers to JetPack 6.2, delivering cutting-edge performance for industrial automation, autonomous machines, and smart city applications. 

 

What Is the JetPack SDK? 

JetPack is NVIDIA’s comprehensive software development kit (SDK) for Jetson modules. It consists of three main components: 

Jetson Linux 

Jetson Linux is the board support package (BSP) for Jetson, including the bootloader, Linux kernel, Ubuntu desktop environment, NVIDIA drivers, toolchain, and more. It also includes security and Over-The-Air (OTA) update features.

Jetson AI Stack 

A CUDA-accelerated AI stack that includes a complete set of libraries for GPU computing, multimedia, graphics, and computer vision acceleration. It supports application frameworks such as Metropolis for vision AI applications, Isaac for robotics applications, and Holoscan for high-performance computing (HPC) applications with real-time insights and sensor processing capabilities from edge to cloud. 

Jetson Platform Services 

A collection of ready-to-use services that accelerate AI application development on Jetson, featuring 15+ services, including generative AI services, foundation services, and more. 

Premio adopts NVIDIA Jetson Orin hardware and the Jetson software stack of the latest JetPack 6.2 SDK. 

Jetson software stack | Image source: NVIDIA

JetPack is the backbone for deploying AI at the edge: JetPack 6.2 provides a full production-ready environment for industrial, automotive, medical, and commercial AI edge computing. 

 

What Is JetPack 6? 

JetPack 6 brings the flexibility to run any Linux kernel and offers a wider selection of Linux-based distributions from Jetson ecosystem partners. It also adds the flexibility to update the Jetson AI Stack without updating the Jetson Linux BSP. 

What Is JetPack 6.2, and What Is New? 

JetPack 6.2 is the latest production release of JetPack 6 from NVIDIA. This release includes Jetson Linux 36.4.3, featuring Linux kernel 5.15 and an Ubuntu 22.04-based root file system. The Jetson AI stack packaged with JetPack 6.2 includes CUDA 12.6, TensorRT 10.3, cuDNN 9.3, VPI 3.2, DLA 3.1, and DLFW 24.0. Moreover, JetPack 6.2 supports a new high-power Super Mode for NVIDIA Jetson Orin Nano and Jetson Orin NX production modules, delivering up to 2x higher generative AI inference performance. 
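On a Jetson device, you can confirm which Jetson Linux and JetPack component versions are actually installed from the command line. A minimal sketch, assuming the standard `nvidia-jetpack` apt metapackage from NVIDIA's repositories is present (these commands only run on a Jetson):

```shell
# Print the Jetson Linux (L4T) release string,
# e.g. "# R36 (release), REVISION: 4.3" on JetPack 6.2
cat /etc/nv_tegra_release

# Show the installed JetPack metapackage version and its component list
apt-cache show nvidia-jetpack

# Confirm the CUDA toolkit version (12.6 on JetPack 6.2);
# assumes /usr/local/cuda/bin is on PATH
nvcc --version
```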

Here’s what’s new in JetPack 6.2. 

“Super” Performance Mode: A Game-Changer

JetPack 6.2 debuts a dedicated high-performance mode ("Super") for NVIDIA Jetson Orin Nano and Jetson Orin NX modules that dynamically optimizes: 

  • CPU/GPU clock speeds for sustained peak throughput.
  • AI Inference: With the introduction of Super Mode in JetPack 6.2, the Jetson Orin Nano and Jetson Orin NX modules deliver up to a 2x inference performance boost on LLMs, VLMs, and vision transformers. 

Learn More about Super Mode >>

 

Latest Linux OS and Software Flexibility 

JetPack 6.2 includes Jetson Linux 36.4.3, the latest Jetson Linux release.

The inclusion of the Linux Kernel 5.15 and an Ubuntu 22.04-based root file system ensures compatibility with the latest software and security updates in various applications such as computer vision, robotics, and AI inference.  

Expanded Linux Distro Options

JetPack 6.2 offers more choices for Linux distributions on Jetson devices, including the ability to run any upstream Linux kernel newer than 5.14.

 

Enhanced Libraries, APIs & Developer Tools

  • TensorRT 10.3: Better model optimization (support for newer PyTorch/TF graphs) with a deep learning inference optimizer and runtime, delivering low latency and high throughput. 
  • CUDA 12.6 & cuDNN 9.3: Accelerated math ops for transformer models. 
  • VPI 3.2: Improved stereo depth perception for computer vision. 
  • Multimedia API: Application deployment-ready APIs such as Camera application API and Sensor driver API for camera and sensor integration.  
  • Key SDKs Supported: DeepStream SDK for multi-sensor processing, Isaac™ ROS for high-performance robotics applications, and Holoscan for streamlining AI and HPC applications with real-time AI insights. 
  • New developer tool releases: NVIDIA Nsight Systems 2024.5 for application profiling across GPU and CPU, and NVIDIA Nsight Graphics 2024.2 for graphics application debugging and profiling. 

 

Security & Camera Updates 

  • Security: Support for Firmware-based Trusted Platform Module (fTPM) on the T234 platform; Fixes for known security vulnerabilities.
  • Camera: Enhanced the Argus library, reducing CPU utilization by up to 40%. 

NVIDIA JetPack Roadmap | Source: NVIDIA

 

JetPack 6.2 vs. 6.1: Key Improvements 

| Category | JetPack 6.1 | JetPack 6.2 | Impact |
|---|---|---|---|
| Power Efficiency | Orin Nano: up to 15W; Orin NX: up to 25W | Orin Nano Super: up to 25W & MAXN; Orin NX Super: up to 40W & MAXN | Optimized power estimator tool; options from low power to maximized computational throughput |
| Memory Bandwidth* | Orin Nano 4GB: 34 GB/s; Orin Nano 8GB: 68 GB/s; Orin NX 8GB/16GB: 102 GB/s | Orin Nano 4GB Super: 51 GB/s; Orin Nano 8GB Super: 102 GB/s; Orin NX 8GB/16GB Super: 102 GB/s | 50% bandwidth improvement on Orin Nano; faster multi-modal AI (vision + language) |
| Generative AI | Improvements to the AI compute stack and other features | Introduces "Super Mode" for Jetson Orin Nano and Orin NX modules | Up to 2x generative AI inference speed |

*Memory bandwidth specifications sourced from NVIDIA’s JetPack 6.2 announcement.


Super Mode: Unleashing Peak Performance in JetPack 6.2 

JetPack 6.2 introduces support for a new high-power Super Mode for NVIDIA Jetson Orin Nano and Jetson Orin NX production modules that delivers: 

  • Up to a 70% increase in AI TOPS for the Jetson Orin NX series, and a comparable AI TOPS improvement on the Jetson Orin Nano series along with a 50% boost in its memory bandwidth.
  • Up to 2x higher generative AI inference performance on Jetson Orin NX & Nano modules for the most popular large language models (LLMs), vision language models (VLMs) and vision transformers (ViTs). 
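These headline gains can be sanity-checked with a bit of shell arithmetic against the per-module TOPS and bandwidth figures NVIDIA publishes for these modules (integer math, so the results round down):

```shell
# Orin NX 8GB: 70 -> 117 TOPS in Super Mode
echo "$(( (117 - 70) * 100 / 70 ))%"    # prints 67%

# Orin Nano 8GB: 40 -> 67 TOPS in Super Mode
echo "$(( (67 - 40) * 100 / 40 ))%"     # prints 67%

# Orin Nano 8GB memory bandwidth: 68 -> 102 GB/s
echo "$(( (102 - 68) * 100 / 68 ))%"    # prints 50%
```

The TOPS uplift lands just under 70% for both series, and the Orin Nano bandwidth gain is exactly 50%, matching the claims above.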

New Power Modes and AI Performance: 

| Module | Existing Modes Power | Super Modes Power (JetPack 6.2) | Existing Modes AI TOPS | Super Modes AI TOPS |
|---|---|---|---|---|
| Orin Nano 4GB | 7W, 10W | 7W, 25W, MAXN SUPER | 20 TOPS | 34 TOPS |
| Orin Nano 8GB | 7W, 15W | 15W, 25W, MAXN SUPER | 40 TOPS | 67 TOPS |
| Orin NX 8GB | 10W, 15W, 20W, MAXN | 10W, 15W, 20W, 40W, MAXN SUPER | 70 TOPS | 117 TOPS |
| Orin NX 16GB | 10W, 15W, 25W, MAXN | 10W, 15W, 25W, 40W, MAXN SUPER | 100 TOPS | 157 TOPS |

Note: MAXN SUPER removes power caps but may throttle under thermal limits. Custom power modes are recommended for optimal balance. 
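Switching between these power modes uses NVIDIA's standard `nvpmodel` utility on the device. A minimal sketch, assuming a JetPack 6.2 system; the numeric index for MAXN SUPER varies by module and is defined in the board's nvpmodel configuration, so query the current mode before setting one:

```shell
# Show the currently active power mode
sudo nvpmodel -q

# Switch to a mode by index (the index here is an example;
# check your module's nvpmodel configuration for the MAXN SUPER index)
sudo nvpmodel -m 2

# Optionally pin clocks to their maximum for the active mode
sudo jetson_clocks
```

For production deployments under thermal constraints, a custom power mode tuned to the enclosure is usually preferable to running MAXN SUPER uncapped.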

Premio’s JCO Series is ready to deploy Super Mode on its JCO-3000-ORN and JCO-1000-ORN Series, delivering stable performance for real-time AI workloads in demanding environments.  

 

Premio Products with JetPack 6.2 


 

JCO-6000-ORN High-Performance AI Edge Computer 

Powerful edge AI computer with NVIDIA Jetson AGX Orin modules (up to 275 TOPS) for real-time AI inference applications such as robotics, industrial automation, security & surveillance, and medical imaging. 

JCO-3000-ORN Mid-Range AI Edge Computer 

Power-Efficient Edge AI Computer with NVIDIA Jetson Orin NX and Orin Nano Super Modules (up to 100 TOPS). Equipped with extended I/O, modular expansion, and industrial-grade durability, it’s ideal for robotics, machine vision, and real-time AI workloads. 

JCO-1000-ORN Fanless Mini AI Edge Computer 

Lite-Performance Edge AI Computer with NVIDIA Jetson Orin NX Super & Orin Nano Super Modules (up to 157 TOPS of AI Performance). It is designed for space-sensitive deployments like AMRs, vision sensors, and embedded AI systems where thermal management and size matter.