Microsoft is moving to help developers run artificial intelligence (AI) workloads via its new hardware architecture, Project Brainwave, in a bid to lure users to Azure Cloud.
Project Brainwave will allow developers to tap Microsoft’s data centres, which use field-programmable gate arrays (FPGAs), and has been built with three main layers:
- A high-performance, distributed system architecture
- A hardware DNN engine synthesized onto FPGAs
- A compiler and runtime for low-friction deployment of trained models.
According to Microsoft’s blog post, attaching high-performance FPGAs directly to its data centre network lets Microsoft serve DNNs as hardware microservices, where a DNN can be mapped to a pool of remote FPGAs and called by a server with no software in the loop.
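To make the hardware-microservice idea concrete, here is a minimal sketch in Python. It is purely illustrative and is not Microsoft’s actual API: the class names, endpoint addresses, and the echoed "score" are all hypothetical stand-ins for a compiled DNN being served from a pool of interchangeable remote FPGAs.

```python
import random

# Hypothetical illustration (not Microsoft's API): a DNN served as a
# "hardware microservice" is mapped to a pool of remote FPGA endpoints;
# any caller can hit any endpoint, since each FPGA hosts the same model.

class FpgaPool:
    """A pool of remote FPGA endpoints all serving the same compiled DNN."""

    def __init__(self, endpoints):
        self.endpoints = list(endpoints)

    def infer(self, inputs):
        # Pick any endpoint in the pool; every FPGA serves the same model,
        # so the caller does not care which one answers.
        endpoint = random.choice(self.endpoints)
        return self._call_fpga(endpoint, inputs)

    @staticmethod
    def _call_fpga(endpoint, inputs):
        # Stand-in for the direct network call to the FPGA; here we just
        # compute a deterministic mean so the sketch runs end to end.
        return {"endpoint": endpoint, "score": sum(inputs) / len(inputs)}

pool = FpgaPool(["fpga-01:9000", "fpga-02:9000", "fpga-03:9000"])
result = pool.infer([1, 2, 3])
print(result["score"])  # → 2.0
```

The point of the sketch is the dispatch pattern: the server-side caller talks to the FPGA pool directly, with no intermediate software service brokering each inference call.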
Microsoft also revealed in its AI blog post: “Project Brainwave is a hardware architecture designed to accelerate real-time AI calculations.
“The Project Brainwave architecture is deployed on a type of computer chip from Intel called a field programmable gate array, or FPGA, to make real-time AI calculations at a competitive cost and with the industry’s lowest latency, or lag time.
“This is based on internal performance measurements and comparisons to other organisations’ publicly posted information.”
A preview of Project Brainwave integrated with Azure Machine Learning is being shown today and tomorrow at Microsoft’s Build developer conference in Seattle.
Written by Leah Alger