It’s part of Microsoft’s ongoing work to make Azure friendlier to developers of high-performance applications, and it also shows the value of the company’s ongoing deployment of field-programmable gate arrays (FPGAs) inside its datacentres. Those chips, which can be programmed to perform specific tasks faster than general-purpose processors, help drive the performance gains that Microsoft is touting.
The Accelerated Networking feature is available for most general-purpose and compute-optimized virtual machine instances with four or more vCPUs. (Instances that support hyper-threading require eight or more vCPUs.) It’s also limited by operating system compatibility: right now, customers can only enable it on instances running compatible versions of Windows Server, Ubuntu, SUSE Linux Enterprise Server, Red Hat Enterprise Linux, and CentOS.
Enabling Accelerated Networking doesn’t cost users anything extra, though it does take some work to get all of the SDN constructs set up properly.
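For customers who want to script that setup, the flag can be toggled programmatically. The snippet below is a minimal sketch using the azure-mgmt-network Python SDK; the subscription ID, resource group, and NIC names are placeholders, and in practice the attached VM generally has to be deallocated before the setting can be changed on an existing NIC.

```python
# Sketch: enabling Accelerated Networking on an existing NIC via the
# azure-mgmt-network Python SDK. Resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

credential = DefaultAzureCredential()
network_client = NetworkManagementClient(credential, subscription_id="<subscription-id>")

# Fetch the existing NIC definition (the VM it is attached to should be
# stopped/deallocated before changing this setting).
nic = network_client.network_interfaces.get("my-resource-group", "my-vm-nic")

# Flip the accelerated-networking flag and push the updated definition back.
nic.enable_accelerated_networking = True
poller = network_client.network_interfaces.begin_create_or_update(
    "my-resource-group", "my-vm-nic", nic
)
poller.result()
```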
The same hardware that’s powering Accelerated Networking also provides the foundation for Project Brainwave, a system the company has developed for quickly running machine learning computations on top of a fleet of FPGAs. That means Microsoft can use some of its FPGA fleet for Accelerated Networking tasks and the rest for other projects.