Distributed edge computing solutions are pushing further into remote, mobile, and unstable conditions to generate new insights from millions of smart, connected sensors. Devices dispatched to the "rugged edge" require versatile engineering to support real-time processing and inference in conditions prohibitive to traditional PC designs. Specialized inference computers deployed at the edge must endure temperature extremes, accommodate unreliable power sources, and mitigate shock and vibration as they process large volumes of data through a wide variety of I/O ports. Rich wireless connectivity permits uninterrupted communication, monitoring, data transfer, and automation to meet the demands of rugged edge computing.
Inference at the Source
Secure and Private
Industrial Grade Design
How will the age of AI and IoT shape the world of business intelligence?
Premio is a global provider of highly reliable, world-class computing solutions that support innovative businesses and help them attain new insights. Our trusted rugged edge computers are deployed to manage the most complex machine learning workloads for inference analysis and computer vision.
Embedded Processors for Edge Performance
AI Edge Inference computers leverage the latest advancements in x86 silicon from Intel to deliver the processing power needed to analyze data and run intelligent workloads. A key building block for complex edge workloads is reliable processing close to where data is created: at the edge, across an array of IoT sensor endpoints. Our computing designs balance performance and power efficiency for the most mission-critical IoT deployments. Processor options at 35W and 65W help system integrators scale with confidence and long-term reliability. Premio's unique modular design for its AI edge computers isolates the CPU in a fanless design proven to withstand the rigors of industrial and mobile edge deployments.
Dedicated GPU for Inference and Machine Learning
The next generation of edge computing pushes proven machine learning algorithms from the cloud to the edge. These pre-trained algorithms are now deployed on rugged edge computers, running artificial intelligence models for inference analysis and object detection in real time. Our AI Edge Inference computers provide the computing architecture to support graphics processing units (GPUs) for parallel processing with speed and accuracy. These purpose-built edge inference computers deliver the performance benchmarks necessary for millisecond decision-making, a major processing requirement for edge computing workloads.
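In production, a pre-trained model would run on the GPU through a framework such as TensorRT or ONNX Runtime. The pure-Python sketch below only illustrates the shape of a real-time inference loop on an edge device: score one sensor reading against fixed pre-trained parameters and return a decision. The weights, bias, and threshold are hypothetical stand-ins, not values from any real model.

```python
import time

# Hypothetical pre-trained parameters for a tiny linear classifier
# (stand-ins for a real model trained in the cloud and shipped to the edge).
WEIGHTS = [0.8, -0.5, 0.3]
BIAS = 0.1
THRESHOLD = 0.5

def infer(features):
    """Score one sensor reading against the pre-trained weights
    and return a binary decision (e.g. 'object detected')."""
    score = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return score > THRESHOLD

# Time a single decision, as an edge workload would when it must
# guarantee millisecond-scale latency.
start = time.perf_counter()
decision = infer([1.0, 0.2, 0.4])
latency_ms = (time.perf_counter() - start) * 1000
print(decision, f"{latency_ms:.3f} ms")
```

The same loop structure applies when the call inside `infer` is replaced by a GPU-backed model session; only the per-call latency budget changes.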
More Localized Storage: NVMe and SATA Capacities
AI Edge Inference computers take a new approach to high-performance storage by supporting both high-speed NVMe and traditional SATA drives. As rugged edge computing solutions shift into more mobile and remote environments, local storage is paramount for capturing data from IoT sensors at the edge. The next generation of localized edge computing solutions can manage and offload mission-critical data for microsecond-scale parsing and manipulation in edge computing workloads.
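A common pattern for capturing sensor data on local NVMe or SATA storage is to buffer readings in memory and flush them to disk in batches. The sketch below is a minimal illustration of that pattern; the batch size and file layout are arbitrary choices for the example, and a temporary directory stands in for the local drive's mount point.

```python
import json
import os
import tempfile

BATCH_SIZE = 4
# Hypothetical mount point for the local NVMe/SATA drive;
# a temp dir is used here so the sketch runs anywhere.
STORE_DIR = tempfile.mkdtemp()

buffer = []

def capture(reading, store_dir=STORE_DIR):
    """Buffer one sensor reading; flush a full batch to local storage.
    Returns the written file path when a batch is flushed, else None."""
    buffer.append(reading)
    if len(buffer) >= BATCH_SIZE:
        path = os.path.join(store_dir, f"batch_{len(os.listdir(store_dir))}.json")
        with open(path, "w") as f:
            json.dump(buffer, f)
        buffer.clear()
        return path
    return None

# Simulate five incoming readings: one batch of four is flushed,
# one reading remains buffered.
for i in range(5):
    capture({"sensor": "temp", "value": 20 + i})
```

Batching amortizes write overhead, which matters when many sensors stream data to the same local drive.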
Ruggedized Design for Harsh Edge Environments
AI Edge Inference computers pair software advancements with decades of hardware engineering, providing the rugged, high-performance systems that enable reliable deployment in the most severe physical settings. From hardened enclosure heatsinks to industrial-grade internal components, every element of a rugged edge computer is purpose-built, combining mechanical and thermal engineering to address environmental stresses such as strong vibration and shock, severe temperatures, and exposure to moisture or dirt.
Mobility Matters: Power Ignition Management with Wide Range Voltage Input
AI Edge Inference computers push new boundaries in the most mobile and remote applications, where power input varies. For in-vehicle deployments in particular, these ruggedized edge computers are engineered with a wide-range voltage input to ensure zero-downtime processing and immediate data access. The power ignition management feature allows power input to be connected directly to a 24VDC vehicle battery and offers programmable delayed-shutdown options to protect valuable data at the edge.
Wireless Connectivity at the Edge
AI Edge Inference computers consolidate processing and wireless connectivity for the most demanding edge workloads to ensure reliable data telemetry. The need for wireless connectivity in remote and mobile deployments is clear: next-generation technologies such as WiFi 6, 4G/LTE, and 5G enable the higher bandwidths needed to send and receive data.
Hardware-Level Security with TPM
TPM technology secures connected systems at the hardware level. Premio's embedded and rugged edge computers can be deployed with TPM microcontrollers that protect both systems and data. This enterprise-level security extends to connected IoT devices through authentication and key management.
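Key-based device authentication of the kind a TPM anchors can be illustrated with a challenge-response handshake: the host sends a random challenge, the device signs it with a secret key, and the host verifies the signature. In a real TPM deployment the key never leaves the hardware module; in this runnable sketch a plain in-memory key stands in for it, so only the handshake logic is shown.

```python
import hashlib
import hmac
import os

# Stand-in for a key that, in practice, would be sealed inside the TPM
# and never exposed to software.
DEVICE_KEY = b"hypothetical-device-key"

def sign_challenge(key, challenge):
    """Device side: sign the host's random challenge with the secret key."""
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def authenticate(challenge, response, key=DEVICE_KEY):
    """Host side: verify the device's response in constant time."""
    expected = sign_challenge(key, challenge)
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)          # fresh random challenge per handshake
response = sign_challenge(DEVICE_KEY, challenge)
assert authenticate(challenge, response)
```

Because the challenge is random each time, a captured response cannot be replayed, which is the core property hardware-backed key management provides to connected IoT endpoints.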
Variety of I/O and I/O Expansion
AI Edge Computers act as central hubs that consolidate processing workloads from an array of IoT sensors and endpoints, in both analog and digital signals. Because these AI Edge computers are deployed across a variety of IoT applications streamlined for automation control, flexible I/O port options ensure device connectivity at the edge. In addition to ruggedization, our AI Edge Computers support a variety of I/O ports: serial (RS-232/422/485), USB 2.0/3.2 Gen 2 (10Gbps), video display (VGA, DVI, DP), Ethernet RJ45/M12 locking connectors (LAN/PoE), and DIO/GPIO for digital signals.
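Serial endpoints such as RS-232/422/485 sensors typically stream short ASCII frames that the edge computer must parse before processing. The sketch below parses a hypothetical `$ID,value,unit` frame; the format is invented for illustration (it is not a standard), and in a real deployment the bytes would be read from the port with a library such as pyserial.

```python
def parse_frame(line):
    """Parse a hypothetical '$ID,value,unit' ASCII frame, as a serial
    sensor might emit over RS-232. The frame format is illustrative only."""
    if not line.startswith("$"):
        raise ValueError("missing start-of-frame marker")
    sensor_id, value, unit = line[1:].strip().split(",")
    return {"id": sensor_id, "value": float(value), "unit": unit}

# Example frame from a hypothetical temperature sensor:
reading = parse_frame("$TEMP,23.5,C")
```

Keeping the parser separate from the transport makes the same code usable whether the frame arrives over RS-232, RS-485, or a DIO-triggered poll.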