Meta Poaches Apple’s AI Executive Who Was Developing ChatGPT-Like Web Search For Siri

Oct 15, 2025 at 10:31pm EDT
[Image: Apple logo next to a glowing AI chip graphic with circuitry.]

In a material blow to Apple's still-nebulous AI efforts, Meta has reportedly poached a key executive who was spearheading the Cupertino giant's push to make Siri a far more capable assistant.

Ke Yang Is Reportedly Leaving For Meta Just Weeks After Being Appointed The Head Of Apple's Answers, Knowledge and Information (AKI) Team

Bloomberg's Mark Gurman reports that Ke Yang, who was appointed head of Apple's Answers, Knowledge and Information (AKI) team only weeks ago, is leaving for a lucrative stint at Meta Platforms Inc.

Critically, the AKI team has been working on equipping Apple's bespoke voice assistant, Siri, with the ability to pull user-requested information directly from the web, akin to the web-grounded answers that OpenAI's ChatGPT now delivers as a matter of routine.

While Yang's exit is an unwelcome setback, the Cupertino giant has come a long way from October 2024, when it first rolled out some key AI capabilities, sans the most eagerly awaited ones: in-app actions and personal context awareness.

Apple has done a lot in recent months to overcome its AI-related deficits:

  1. Partnered with OpenAI to roll out ChatGPT and its underlying Large Language Models (LLMs) within the ambit of Apple Intelligence.
  2. Acquired three startups - including TrueMeeting and WhyLabs - to bolster its in-house AI expertise.
  3. Pivoted towards privacy and user data safety as a key differentiator in a market where user data is increasingly commoditized: relatively simple AI tasks run on the device's own computational resources, while more complex tasks are offloaded to Apple's Private Cloud Compute servers, which process requests over encrypted, stateless connections.
  4. Built its own 3-billion-parameter AI model that runs efficiently on iPhones and iPads.
  5. Developed a server-based LLM to tackle more complex AI tasks, as well as a diffusion model for image generation and a dedicated coding model that assists developers inside Apple's bespoke developer tool, Xcode.
  6. Opened up its in-house models to third-party app developers via the Foundation Models framework to increase cross-app AI productivity (see the Swift sketch after this list).
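
For a sense of what that last point looks like in practice, here is a minimal Swift sketch of querying the on-device model through the Foundation Models framework. It assumes the API surface Apple introduced at WWDC 2025 (SystemLanguageModel, LanguageModelSession, respond(to:)); the summarize helper and its prompt are illustrative, not Apple's code:

```swift
import Foundation
import FoundationModels

// Illustrative helper, not an Apple API: asks the on-device model
// for a one-paragraph summary of the supplied text.
func summarize(_ text: String) async throws -> String {
    // Apple Intelligence models only exist on supported hardware,
    // so check availability before opening a session.
    guard case .available = SystemLanguageModel.default.availability else {
        throw CocoaError(.featureUnsupported)
    }

    // A session carries instructions and conversation state across turns.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant. Answer in one paragraph."
    )

    // respond(to:) runs the prompt on-device and returns the generated text.
    let response = try await session.respond(to: "Summarize this: \(text)")
    return response.content
}
```

Because the roughly 3-billion-parameter model runs locally, such calls work offline and the prompt never leaves the device, which ties back to Apple's privacy pitch above.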

Moreover, the Cupertino giant is expected to launch the following features in the next few months:

  1. Third-party AI integrations
    • Apple would soon allow users to ask Siri to leverage a particular LLM - OpenAI's GPT or Google's Gemini, for instance - when performing a given AI-related task.
  2. In-app Actions
    • Siri would soon be able to carry out context-based tasks within supported apps via voice commands, such as adding an item to a grocery list, sending a message, or playing music. These actions are exposed to Siri through Apple's App Intents framework (see the sketch after this list).
  3. Personal Context Awareness 
    • Siri would soon be able to leverage personal data to offer tailored services such as scouring the Messages app to find a specific podcast mentioned in a text conversation.
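
To illustrate the plumbing behind such in-app actions, here is a minimal sketch using Apple's App Intents framework, which is how apps expose actions to Siri. AddGroceryItemIntent and GroceryStore are hypothetical names invented for this example; AppIntent, @Parameter, and .result(dialog:) are the framework's actual surface:

```swift
import AppIntents

// Hypothetical in-memory store so the sketch is self-contained;
// a real app would persist the list.
final class GroceryStore {
    static let shared = GroceryStore()
    private(set) var items: [String] = []
    func add(_ item: String) { items.append(item) }
}

// Hypothetical intent exposing "add to grocery list" to Siri.
struct AddGroceryItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Grocery Item"

    @Parameter(title: "Item")
    var item: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        GroceryStore.shared.add(item)
        // The dialog is what Siri speaks back after running the action.
        return .result(dialog: "Added \(item) to your grocery list.")
    }
}
```

Once an app ships an intent like this, Siri and Shortcuts can discover and invoke it by voice.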
