Microsoft Unveils Maia, Cobalt Chips for AI Computing at Ignite 2023



Microsoft on Wednesday introduced a pair of custom-designed computing chips, joining other big tech companies that – faced with the high cost of delivering artificial intelligence services – are bringing key technologies in-house.

Microsoft said it does not plan to sell the chips but instead will use them to power its own subscription software offerings and as part of its Azure cloud computing service.

At its Ignite developer conference in Seattle, Microsoft introduced a new chip, called Maia, to speed up AI computing tasks and provide a foundation for its $30-a-month “Copilot” service for business software users, as well as for developers who want to build custom AI services.

The Maia chip was designed to run large language models, a type of AI software that underpins Microsoft’s Azure OpenAI service and is a product of Microsoft’s collaboration with ChatGPT creator OpenAI.

Microsoft and other tech giants such as Alphabet are grappling with the high cost of delivering AI services, which can be 10 times greater than for traditional services such as search engines.

Microsoft executives have said they plan to tackle those costs by routing nearly all of the company’s sprawling efforts to put AI in its products through a common set of foundational AI models. The Maia chip, they said, is optimized for that work.

“We think this gives us a way that we can provide better solutions to our customers that are faster and lower cost and higher quality,” said Scott Guthrie, the executive vice president of Microsoft’s cloud and AI group.

Microsoft also said that next year it will offer its Azure customers cloud services that run on the newest flagship chips from Nvidia and Advanced Micro Devices. Microsoft said it is testing GPT-4 – OpenAI’s most advanced model – on AMD’s chips.

“This is not something that’s displacing Nvidia,” said Ben Bajarin, chief executive of analyst firm Creative Strategies.

He said the Maia chip would allow Microsoft to sell AI services in the cloud until personal computers and phones are powerful enough to handle them.

“Microsoft has a very different kind of core opportunity here because they’re making a lot of money per user for the services,” Bajarin said.

Microsoft’s second chip announced Tuesday is designed to be both an internal cost saver and an answer to Microsoft’s chief cloud rival, Amazon Web Services.

Named Cobalt, the new chip is a central processing unit (CPU) built with technology from Arm Holdings. Microsoft disclosed on Wednesday that it has already been testing Cobalt to power Teams, its business messaging tool.

But Microsoft’s Guthrie said his company also wants to sell direct access to Cobalt to compete with the “Graviton” series of in-house chips offered by Amazon Web Services (AWS).

“We are designing our Cobalt solution to ensure that we are very competitive both in terms of performance as well as price-to-performance (compared with Amazon’s chips),” Guthrie said.

AWS will hold its own developer conference later this month, and a spokesman said its Graviton chip now has 50,000 customers.

“AWS will continue to innovate to deliver future generations of AWS-designed chips to deliver even better price-performance for whatever customer workloads require,” the spokesman said after Microsoft announced its chips.

Microsoft gave few technical details that would allow gauging the chips’ competitiveness versus those of traditional chipmakers. Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, said both are made with 5-nanometer manufacturing technology from Taiwan Semiconductor Manufacturing Co.

She added that the Maia chip would be strung together with standard Ethernet network cabling, rather than the more expensive custom Nvidia networking technology that Microsoft used in the supercomputers it built for OpenAI.

“You will see us going a lot more the standardization route,” Borkar told Reuters.

© Thomson Reuters 2023
