
Communications of the ACM

ACM TechNews

Inside Facebook's Artificial Intelligence Engine Room



Facebook is using new, high-powered servers in its Prineville, OR, complex to research machine learning.

Credit: Facebook

A complex in Prineville, OR, houses servers that process information from Facebook's hundreds of millions of users, and it was recently expanded with high-powered servers designed to accelerate research into machine learning.

Each Big Sur server is designed around eight high-powered graphics processing units (GPUs) retasked for machine learning and artificial intelligence (AI) research. Facebook engineer Kevin Lee says the servers' speed lets researchers train software on more data in less time.

Facebook must pack these servers less densely than other types of servers in the data center, because the GPUs consume so much power that they risk creating hot spots. Facebook has open-sourced the server designs and the plans for the Prineville data center through the nonprofit Open Compute Project, to encourage computing companies to collaborate on low-cost, high-efficiency data center hardware.

Facebook AI research director Yann LeCun thinks making Big Sur's designs available could expedite progress by enabling more organizations to build powerful machine-learning infrastructure.

Multiple companies are developing new chip designs that are more customized to the mathematics of deep learning than GPUs. Lee says Facebook is "looking into" designing its own custom chips.

From Technology Review
View Full Article

 

Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA


 
