Aetina AI Inference / DeviceEdge Jetson Platform (AGX Orin 32GB)

SKU: AIB-MX13-1-A1

8x AI performance for low-latency AI inference
The NVIDIA Jetson AGX Orin module offers up to 275 TOPS, roughly 8 times the AI performance of the NVIDIA® Jetson AGX Xavier™ 32GB module. With up to 2048 NVIDIA® CUDA® cores and 64 Tensor Cores, the Jetson AGX Orin module enables server-class AI inference at the edge with low latency.
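As an illustration only (not part of the Aetina datasheet), the sketch below uses the standard CUDA runtime API, available once JetPack is installed, to report the on-module GPU. The core counts are derived from the SM count under the assumption of Orin's Ampere-class GPU (128 CUDA cores and 4 Tensor Cores per SM).

// query_gpu.cu — minimal sketch; build with: nvcc query_gpu.cu -o query_gpu
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        std::fprintf(stderr, "No CUDA-capable device found\n");
        return 1;
    }
    // Assumed Ampere-class ratios for Orin: 128 CUDA cores and 4 Tensor Cores per SM,
    // so 16 SMs correspond to 2048 CUDA cores and 64 Tensor Cores.
    std::printf("GPU: %s (compute capability %d.%d)\n", prop.name, prop.major, prop.minor);
    std::printf("SMs: %d -> ~%d CUDA cores, ~%d Tensor Cores\n",
                prop.multiProcessorCount,
                prop.multiProcessorCount * 128,
                prop.multiProcessorCount * 4);
    return 0;
}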

Comprehensive M.2 expansion and on-board 10GbE port
The AIB-MX13/23 is a highly expandable platform, providing 1x M.2 B-Key slot for LTE/5G, 1x M.2 M-Key slot for storage, and 1x M.2 E-Key slot for Wi-Fi/Bluetooth/GPS. For networking, the on-board 10GbE port delivers data rates up to 10 gigabits per second, 10 times faster than the traditional GbE standard.

Wide input power and operating temperature range for various embedded applications
To serve a wide range of embedded applications, especially the challenging environments of smart factories, smart cities, and smart transportation, the AIB-MX13/23 supports operating temperatures from -25°C to 80°C and a wide input power range of 9 to 36 VDC.
