Economical Workstation and Deployable Edge Computing Solutions for AI Inference

NextComputing Company News

This new whitepaper from AMD explores use cases, as well as the AMD EPYC processor advancements and performance gains that benefit AI inference.

While AI training is extremely data- and processing-intensive, AI inference, where the trained model processes incoming data in real time, can be performed economically on off-the-shelf servers equipped with AMD EPYC™ processors. With up to 96 cores per 4th Gen AMD EPYC processor, these servers can accelerate a range of data center and edge applications, including customer support, retail, automotive, financial services, medical, and manufacturing workloads.

NextComputing deployable servers featuring AMD EPYC processors offer a highly economical solution for AI inference close to your data, whether at the core or at the edge.

Download Whitepaper (PDF)