Meta Platforms has hired an Oslo-based team that until late last year was building artificial-intelligence networking technology at British chip unicorn Graphcore.
A Meta spokesperson confirmed the hires in response to a request for comment, after Reuters identified 10 people whose LinkedIn profiles said they worked at Graphcore until December 2022 or January 2023 and subsequently joined Meta in February or March of this year.
“We recently welcomed a number of highly-specialized engineers in Oslo to our infrastructure team at Meta. They bring deep expertise in the design and development of supercomputing systems to support AI and machine learning at scale in Meta’s data centers,” said Jon Carvill, the Meta spokesperson.
The move brings extra muscle to the social media giant’s bid to improve how its data centers handle AI work, as it races to meet demand for AI-oriented infrastructure from teams across the company looking to build new features.
Meta, which owns Facebook and Instagram, has become increasingly reliant on AI technology to target advertising, select posts for its apps’ feeds and purge banned content from its platforms.
On top of that, it is now rushing to join rivals like Microsoft and Alphabet’s Google in releasing generative AI products capable of creating human-like writing, art and other content, which investors see as the next big growth area for tech companies.
The 10 employees’ job descriptions on LinkedIn indicated the team had worked on AI-specific networking technology at Graphcore, which develops computer chips and systems optimized for AI work.
Carvill declined to say what they would be working on at Meta.
Graphcore closed its Oslo office as part of a broader restructuring announced in October last year, a spokesperson for the startup said, as it struggled to make inroads against US-based firms like Nvidia and Advanced Micro Devices, which dominate the market for AI chips.
Meta already has an in-house unit designing several kinds of chips aimed at speeding up and maximizing efficiency for its AI work, including a network chip that performs a kind of air traffic control function for servers, two sources told Reuters.
Efficient networking is especially useful for modern AI systems like those behind the chatbot ChatGPT or the image-generation tool Dall-E, which are far too large to fit onto a single computing chip and must instead be split across many chips strung together.
A new class of network chip has emerged to help keep data moving smoothly within those computing clusters. Nvidia, AMD and Intel all make such network chips.
In addition to its network chip, Meta is also designing a complex computing chip to both train AI models and perform inference, a process in which the trained models make judgments and generate responses to prompts, although it does not expect that chip to be ready until around 2025.
Graphcore, one of the UK’s most valuable tech startups, was once seen by investors like Microsoft and venture capital firm Sequoia as a promising potential challenger to Nvidia’s commanding lead in the market for AI chip systems.
However, it suffered a setback in 2020 when Microsoft scrapped an early deal to buy Graphcore’s chips for its Azure cloud computing platform, according to a report by UK newspaper The Times. Microsoft instead used Nvidia’s GPUs to build the massive infrastructure powering ChatGPT developer OpenAI, which Microsoft also backs.
Sequoia has since written down its investment in Graphcore to zero, although it remains on the company’s board, according to a source familiar with the relationship. The write-down was first reported by Insider in October.
The Graphcore spokesperson confirmed the setbacks, but said the company was “perfectly positioned” to benefit from accelerating commercial adoption of AI.
© Thomson Reuters 2023