What Is Super Mode on NVIDIA Jetson Orin Nano and NX in the JetPack 6.2 SDK Release?

NVIDIA has announced an exciting upgrade for Jetson edge developers: Super Mode is now officially supported on Jetson Orin Nano and Jetson Orin NX production modules through the latest JetPack 6.2 release. 

This enhancement was first introduced with the Jetson Orin Nano Developer Kit, where it delivered up to a 1.7x performance boost, enabling small edge devices to run generative AI models faster than ever before. With full production support now available, Super Mode brings even greater performance benefits to commercial deployments, delivering up to 2x acceleration for AI workloads. 

 

What is Super Mode? 

Super Mode is a high-performance operating configuration available on NVIDIA Jetson Orin Nano and Orin NX modules. By increasing the module's power budget (up to 25W for Orin Nano and 40W for Orin NX), Super Mode allows the CPU and GPU to operate at higher frequencies, delivering a significant boost in AI inference performance. 

This enhanced mode is designed to accelerate demanding workloads such as computer vision, robotics, and generative AI, where low latency and real-time processing are critical. With the recent release of JetPack 6.2, Super Mode is now officially supported for production deployments, enabling up to 2x higher performance on popular AI models compared to standard power configurations. 

For existing users of the Jetson Orin Nano Developer Kit, this performance upgrade is available through a simple software update, making it easier than ever to unlock advanced generative AI capabilities without any changes to the hardware setup. 
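For readers who want to try this on their own hardware, the sketch below shows one way to query and switch power modes from Python by wrapping NVIDIA's nvpmodel command-line tool. This is a minimal sketch rather than official tooling: the numeric ID that corresponds to MAXN SUPER differs between modules, so it is treated as an input instead of a hard-coded value.

```python
import subprocess


def current_power_mode() -> str:
    """Return the output of 'nvpmodel -q', which reports the active power mode."""
    result = subprocess.run(["nvpmodel", "-q"], capture_output=True, text=True, check=True)
    return result.stdout.strip()


def set_power_mode(mode_id: int) -> None:
    """Switch to the power mode with the given numeric ID via nvpmodel.

    The ID that maps to MAXN SUPER differs between Orin Nano and Orin NX;
    check /etc/nvpmodel.conf on the target module before calling this.
    """
    subprocess.run(["sudo", "nvpmodel", "-m", str(mode_id)], check=True)


if __name__ == "__main__":
    print(current_power_mode())
    # set_power_mode(2)  # hypothetical ID for MAXN SUPER on a given module
```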

 

What Are the Power Modes for Jetson Orin Nano Super and NX Super? 

 

Source image: NVIDIA

 

With the release of JetPack 6.2, NVIDIA has introduced new high-performance power modes for the Jetson Orin Nano and Jetson Orin NX modules. These include a 25W mode and MAXN SUPER for Orin Nano, and a 40W mode and MAXN SUPER for Orin NX. 

The MAXN SUPER mode unlocks the highest available clock frequencies across CPU, GPU, DLA, and SOC engines. It delivers peak performance but may trigger thermal throttling if power exceeds the module’s TDP.
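Because MAXN SUPER can push a module toward its thermal limits, it is worth watching temperatures while a sustained workload runs. The sketch below reads the standard Linux thermal sysfs interface, which Jetson systems expose; zone names and throttle points vary by module and carrier board, so treat the output as a rough health check rather than a calibrated measurement.

```python
import glob
import time


def read_thermal_zones() -> dict[str, float]:
    """Return {zone_name: temperature_C} from /sys/class/thermal."""
    temps = {}
    for zone in glob.glob("/sys/class/thermal/thermal_zone*"):
        try:
            with open(f"{zone}/type") as f:
                name = f.read().strip()
            with open(f"{zone}/temp") as f:
                temps[name] = int(f.read().strip()) / 1000.0  # values are millidegrees C
        except OSError:
            continue
    return temps


if __name__ == "__main__":
    # Poll once per second while a workload runs under MAXN SUPER and watch
    # whether any zone climbs toward its throttle point.
    for _ in range(60):
        print(read_thermal_zones())
        time.sleep(1)
```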

 Source image: NVIDIA

 

Super Mode offers a substantial leap in AI computing capability, delivering 50–70% more performance than the original power modes. For developers and integrators, this means the same hardware platform can now handle higher-resolution video streams, achieve faster inference, or support more complex AI models, simply by updating to JetPack 6.2 and enabling Super Mode. 
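Which power modes are actually available on a given module can be confirmed on the device itself. As a minimal sketch, the snippet below parses /etc/nvpmodel.conf, assuming the mode headers follow the `< POWER_MODEL ID=... NAME=... >` layout that JetPack releases have typically used; confirm the exact format on your target before relying on it.

```python
import re

NVPMODEL_CONF = "/etc/nvpmodel.conf"  # standard location on JetPack systems


def list_power_modes(path: str = NVPMODEL_CONF) -> list[tuple[int, str]]:
    """Parse (id, name) pairs from nvpmodel.conf.

    Assumes mode headers such as '< POWER_MODEL ID=2 NAME=MAXN_SUPER >';
    the format should be verified on the target device.
    """
    pattern = re.compile(r"<\s*POWER_MODEL\s+ID=(\d+)\s+NAME=(\S+)\s*>")
    with open(path) as f:
        return [(int(m.group(1)), m.group(2)) for m in pattern.finditer(f.read())]


if __name__ == "__main__":
    for mode_id, name in list_power_modes():
        print(f"{mode_id}: {name}")
```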

 

Super Mode vs. Non-Super Mode: Performance Gains in Generative AI Models  

With the introduction of Super Mode in JetPack 6.2, Jetson Orin Nano and Orin NX modules can achieve up to 2x inference performance over previous configurations, according to NVIDIA's benchmarks. This performance boost significantly expands what's possible at the edge, enabling more compute-intensive AI applications without requiring hardware changes. 

Based on NVIDIA's benchmarks, Super Mode can accelerate a wide range of models, including large language models (LLMs), vision-language models (VLMs), and vision transformers (ViTs), all of which are becoming increasingly important in industrial AI, robotics, and computer vision systems. 
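Published benchmark figures are a useful guide, but the most reliable number is the one measured on your own workload. The sketch below is a simple, hypothetical harness that times an arbitrary benchmark command under two nvpmodel modes and reports the ratio; the script name and mode IDs are placeholders, not part of NVIDIA's tooling.

```python
import subprocess
import time


def run_and_time(cmd: list[str]) -> float:
    """Run a command and return its wall-clock duration in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start


def compare_modes(cmd: list[str], baseline_id: int, super_id: int) -> float:
    """Return the speedup of `super_id` over `baseline_id` for the same command."""
    subprocess.run(["sudo", "nvpmodel", "-m", str(baseline_id)], check=True)
    baseline = run_and_time(cmd)
    subprocess.run(["sudo", "nvpmodel", "-m", str(super_id)], check=True)
    boosted = run_and_time(cmd)
    return baseline / boosted


if __name__ == "__main__":
    # 'my_llm_benchmark.py' and the mode IDs are placeholders for illustration only.
    speedup = compare_modes(["python3", "my_llm_benchmark.py"], baseline_id=0, super_id=2)
    print(f"Measured speedup: {speedup:.2f}x")
```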

 

Large Language Models 

  • Jetson Orin Nano 4GB and 8GB show the most significant gains, with up to 1.64× speedup on models like Gemma 2B and SmolLM2.
  • Orin NX 8GB also benefits from 1.4–1.5× improvements across the board.
  • Even Orin NX 16GB, already a high-performing module, sees consistent performance boosts of 1.1–1.26×, depending on model complexity. 

*DNR (Did Not Run) means the module did not have sufficient memory to run the specific model. Additionally, throttling behavior can affect model performance when power or thermal limits are reached. 

Figure 1. Performance improvements for LLMs using Super Mode 

Source image: NVIDIA
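A quick pre-flight memory check can help predict DNR cases before a long model load is attempted. The snippet below reads MemAvailable from /proc/meminfo, a standard Linux interface; the required-memory threshold shown is purely illustrative and would need to be set per model and runtime.

```python
def mem_available_gb(path: str = "/proc/meminfo") -> float:
    """Read MemAvailable from the standard Linux meminfo interface, in GB."""
    with open(path) as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) / (1024 * 1024)  # value is reported in kB
    raise RuntimeError("MemAvailable not found in /proc/meminfo")


if __name__ == "__main__":
    # Illustrative threshold only; the real footprint depends on the model and runtime.
    required_gb = 3.5
    if mem_available_gb() < required_gb:
        print("Likely DNR: not enough free memory for this model.")
    else:
        print("Enough memory appears to be available.")
```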

 

Vision Language Models 

NVIDIA's benchmark results for vision-language models are summarized in Figure 2 below. As with the other model families, the gains come from the higher CPU and GPU clock frequencies enabled by Super Mode, with no changes required to the models or the hardware. 

Figure 2. Performance improvements of VLMs when run using Super Mode 

Source image: NVIDIA

 

Vision Transformers 

  • Jetson Orin Nano 8GB and Orin NX 8GB show the most significant benefits, with up to 2.01× speedups on models like CLIP (ViT-B/16) and SAM2 base, demonstrating Super Mode's ability to accelerate complex ViT workloads.
  • Jetson Orin Nano 4GB also sees strong improvements (up to 1.78×), though it encounters DNR (Did Not Run) results on certain models, such as Grounding DINO and ViT-B/32, due to limited memory capacity.
  • Jetson Orin NX 16GB, while already high-performing, achieves more modest improvements (typically around 1.1× to 1.25×), as it may already be closer to its thermal or memory throughput limits even in standard mode. 

 Figure 3. Performance improvements of ViTs when run using Super Mode 

Source image: NVIDIA

 

JCO Series: Jetson Orin Modules with Super Mode   

Our Jetson Orin-based AI computers are now available with Super Mode support, enabling higher AI performance through elevated power modes: 25W for Orin Nano and up to 40W for Orin NX. Built with robust thermal engineering, wide temperature tolerance, and industrial-grade reliability, our JCO Series is ready to deploy Super Mode, delivering stable performance for real-time AI workloads in demanding environments. 

JCO-1000-ORN


A fanless mini AI edge computer with support for both Jetson Orin Nano Super and Jetson Orin NX Super modules. It is designed for space-sensitive deployments like AMRs, vision sensors, and embedded AI systems where thermal management and size matter. 

Specifications 

  • Max Power Mode: 25W (Jetson Orin Nano Super) / 40W (Jetson Orin NX Super)
  • AI Performance: 20-67 TOPS (Orin Nano Super) / 70-157 TOPS (Orin NX Super)
  • Operating Temperature: 15W: -20°C to 60°C; 25W: -20°C to 55°C; 40W: -20°C to 40°C
  • OS: Linux Ubuntu 22.04 with JetPack 6.2 SDK 

 

Available Power Modes for JCO-1000-ORN 

  • Jetson Orin Nano 4GB: 10W, 25W, MAXN SUPER
  • Jetson Orin Nano 8GB: 10W, 25W, MAXN SUPER
  • Jetson Orin NX 8GB: 10W, 15W, 20W, 40W, MAXN SUPER
  • Jetson Orin NX 16GB: 10W, 15W, 25W, 40W, MAXN SUPER 

 Explore JCO-1000-ORN >>

JCO-3000-ORN 


A small-form-factor, rugged edge AI computer supporting Jetson Orin Nano Super and standard Jetson Orin NX modules. Equipped with extended I/O and industrial-grade durability, it's ideal for robotics, machine vision, and real-time AI workloads. 

Specifications 

  • Max Power Mode: 25W (Jetson Orin Nano Super) / 25W (Jetson Orin NX)
  • AI Performance: 20-67 TOPS (Orin Nano Super) / 70-100 TOPS (Orin NX)
  • Operating Temperature: 15W: -20°C to 60°C; 25W: -20°C to 50°C
  • OS: Linux Ubuntu 22.04 with JetPack 6.2 SDK 

 

Available Power Modes for JCO-3000-ORN 

  • Jetson Orin Nano 4GB: 10W, 25W, MAXN SUPER
  • Jetson Orin Nano 8GB: 10W, 25W, MAXN SUPER
  • Jetson Orin NX 8GB: 10W, 15W, 20W
  • Jetson Orin NX 16GB: 10W, 15W, 25W


Explore JCO-3000-ORN >>