Aetina AI Inference / DeviceEdge Jetson Platform (AGX Orin 64GB)

SKU: AIB-MX23-1-A1

8x the AI performance for low-latency AI inference
The NVIDIA Jetson AGX Orin module delivers up to 275 TOPS, 8 times the performance of the NVIDIA® Jetson AGX Xavier™ 32GB module. With up to 2048 NVIDIA® CUDA® cores and 64 Tensor Cores, the Jetson AGX Orin module enables server-class AI inference at the edge with low latency.
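As a rough sanity check of the 8x figure (assuming the commonly cited 32 TOPS rating for the Jetson AGX Xavier 32GB module, which is not stated above):

```python
# 275 TOPS is the Jetson AGX Orin figure quoted above;
# 32 TOPS for Jetson AGX Xavier 32GB is an assumed reference value.
ORIN_TOPS = 275
XAVIER_TOPS = 32

speedup = ORIN_TOPS / XAVIER_TOPS
print(f"Orin vs. Xavier: {speedup:.1f}x")  # about 8.6x, i.e. roughly 8x
```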

Comprehensive M.2 expansion and 1x 10GbE port
The AIB-MX13/23 is a highly expandable platform, with 1x M.2 B-Key for LTE/5G, 1x M.2 M-Key for storage, and 1x M.2 E-Key for Wi-Fi/Bluetooth/GPS. For networking, the on-board 10GbE port delivers data rates of up to 10 gigabits per second, 10 times faster than the traditional GbE standard.

Wide input power and operating temperature ranges for varied embedded applications
To serve a wide range of embedded applications, especially the challenging environments of smart factories, smart cities, and smart transportation, the AIB-MX13/23 supports an operating temperature range of -25°C to 80°C and a wide input power range of 9 to 36 VDC.

Related systems:
AIE-KN42-1-A1: Aetina AI Inference / DeviceEdge Jetson System (Orin NX 16GB)
AIE-KN32-1-A1: Aetina AI Inference / DeviceEdge Jetson System (Orin NX 8GB)
AIE-KO32-1-A1: Aetina AI Inference / DeviceEdge Jetson System (Orin Nano 8GB)