
Apple has an “Apple Intelligence Challenge” of 8 million lei for developers

Apple is inviting security researchers to investigate its Private Cloud Compute (PCC) system, which powers the Apple Intelligence feature. To demonstrate the security and privacy of its AI infrastructure, the tech giant is offering rewards of up to $1 million for vulnerabilities discovered in the PCC system.
While Apple points out that many Apple Intelligence functions run on-device, more demanding tasks are processed on the PCC servers. These servers, built with Apple Silicon and a new operating system, are designed to support Apple's commitment to user privacy. To ensure transparency and independent verification, Apple is opening PCC to review by security researchers.

How much developers can get for finding bugs in Apple Intelligence

In a blog post, Apple confirmed that researchers who discover flaws that compromise PCC security and privacy can now earn rewards comparable to those offered for iOS vulnerabilities, with the highest payouts reserved for vulnerabilities that expose user data outside of PCC's secure environment. Here are the reward amounts:

Remote attack on request data
- Arbitrary code execution with arbitrary entitlements: up to $1,000,000
- Access to a user's request data or sensitive information about the user's requests outside the trust boundary: up to $250,000

Attack on request data from a privileged position on the network
- Access to a user's request data or other sensitive information about the user outside the trust boundary: up to $150,000
- Ability to execute unattested code: up to $100,000

Accidental or unexpected data disclosure
- Disclosure of data due to a deployment or configuration issue: up to $50,000
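As a rough sketch, the published tiers can be captured in a small lookup table. The category labels are paraphrased from the list above, and the helper function is illustrative only, not part of any Apple API:

```python
# Illustrative sketch of the published PCC bounty tiers (amounts in USD).
# Category labels are paraphrased; the lookup helper is our own.
PCC_BOUNTY_TIERS = {
    "remote arbitrary code execution": 1_000_000,
    "remote access to request data": 250_000,
    "privileged-network access to request data": 150_000,
    "unattested code execution": 100_000,
    "accidental data disclosure": 50_000,
}

def max_reward(category: str) -> int:
    """Return the maximum published reward for a tier, or 0 if unlisted."""
    return PCC_BOUNTY_TIERS.get(category, 0)

print(max_reward("remote arbitrary code execution"))  # 1000000
```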

Source: Apple
The new reward categories focus on critical threats such as accidental data disclosure, external compromise through user requests, and physical or internal access vulnerabilities. This initiative underscores Apple’s commitment to the security and privacy of its AI infrastructure and encourages researchers to help identify and address potential vulnerabilities.
In addition, Apple will also consider any security issue that has a significant impact on PCC for an Apple Security Bounty, even if it does not fit a published category. The company will “evaluate each report based on the quality of what is presented, the proof of what can be exploited, and the impact on users.”

Apple is giving developers new tools to analyze PCC security

In its blog post, Apple confirmed the creation of a Virtual Research Environment (VRE) for security researchers to analyze the Private Cloud Compute (PCC) system used for Apple Intelligence. This will enable independent verification of Apple’s privacy claims and in-depth investigation of PCC software, including the Secure Enclave Processor, through tools for version inspection, log checking, booting into a virtual environment, and debugging.
To access the Private Cloud Compute Virtual Research Environment, developers will need a Mac with Apple silicon and at least 16 GB of unified memory, running the macOS Sequoia 15.1 Developer Preview. Detailed instructions on how to get started are available on Apple's developer website.
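The hardware and OS requirements above can be sketched as a simple pre-flight check. This is our own illustrative helper, not an Apple tool, and it assumes the stated minimums (Apple silicon, 16 GB unified memory, macOS 15.1):

```python
# Hedged sketch: check whether a Mac meets the stated PCC VRE requirements.
# The function and thresholds mirror Apple's published minimums; the helper
# itself is hypothetical, not part of Apple's tooling.
MIN_MEMORY_BYTES = 16 * 2**30  # 16 GB of unified memory

def meets_vre_requirements(chip: str, mem_bytes: int,
                           macos_version: tuple[int, int]) -> bool:
    """True if the machine satisfies the published VRE minimums."""
    return (chip.startswith("Apple")            # Apple silicon required
            and mem_bytes >= MIN_MEMORY_BYTES   # at least 16 GB
            and macos_version >= (15, 1))       # macOS Sequoia 15.1+

print(meets_vre_requirements("Apple M2", 16 * 2**30, (15, 1)))  # True
print(meets_vre_requirements("Intel Core i7", 32 * 2**30, (15, 1)))  # False
```

On a real machine the inputs could come from `sysctl hw.memsize`, `machdep.cpu.brand_string`, and `sw_vers -productVersion`.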