
Meta Explores Cutting-Edge CPU Technology for AI and Machine Learning in Datacenters, Posing a Threat to Nvidia’s Dominance


(Image credit: Unsplash)


Meta’s Plan to Develop Custom Artificial Intelligence Chips

Recent reports indicate that Meta Platforms, the parent company of Facebook, is preparing to deploy its own custom-designed artificial intelligence chips, known as Artemis, in its data centers. Even so, Meta will continue to use Nvidia H100 GPUs alongside its new chips, at least for the time being.

According to The Register, Meta has posted job listings for ASIC engineers with expertise in architecture, design, and testing in Bangalore, India, and Sunnyvale, California — a sign that the company is moving toward developing its own AI hardware.

Roles posted on LinkedIn emphasize the need for professionals to assist in the architecture of cutting-edge machine learning accelerators and the design of complex SoCs and IPs for data center applications. Some of these positions were initially advertised in late December 2023 and reposted recently, with salaries in Sunnyvale reaching nearly $200,000.

Artificial General Intelligence and Meta’s Ambitions

While the specifics of Meta's project remain undisclosed, it is likely connected to the Meta Training and Inference Accelerator (MTIA) chips slated for release later this year. The move aligns with Meta's ambitions in artificial general intelligence, which may require specialized silicon.

Given the growing demand for AI technology and Nvidia's difficulty in meeting it, Meta's decision to develop its own hardware is a strategic move to secure an edge in a fiercely contested market.

Reports suggest that the Indian government is likely to view Meta’s expansion into Bangalore positively, as the country aims to establish itself as a major player in the global semiconductor industry. Additionally, rumors hint at Microsoft reducing its reliance on Nvidia by developing a server networking card to enhance machine learning workload efficiency. This trend indicates that Nvidia’s key competitors are seeking ways to decrease dependency on the company’s highly sought-after hardware.

TechRadar Pro Insights

For more insights and news on technology advancements, subscribe to the TechRadar Pro newsletter. Stay informed with the latest news, opinions, features, and guidance necessary for your business success.

About the Author: Wayne Williams

Wayne Williams is an experienced freelancer contributing news articles for TechRadar Pro. With a writing career spanning over 30 years, he has covered topics ranging from computers and technology to the internet. Williams has been a prominent writer for various PC magazines in the UK and has even launched and managed some publications in the industry.

Related Articles:

  • Meta’s Shift Towards Custom AI Chips Raises Concerns for Nvidia and AMD
  • Meta’s Future Plans Include Integration of AI Chips in Servers by 2024
  • Intel Introduces a New Competitor to Nvidia’s Popular H100 AI GPU

