

Apple is daring anyone to find a problem with its beloved AI, offering a $1 million reward

If you find something wrong with Apple Intelligence’s Private Cloud Compute, you could earn between $50,000 and $1 million.

Apple is very proud of the privacy apparatus around Apple Intelligence — so proud that it’s offering princely sums to anyone who finds a privacy issue or attack vector in its code. Apple’s first bug bounty program for its AI offers a hefty $50,000 to anyone who finds an accidental data leak, but the real prize is $1 million for a remote attack on Apple’s new cloud processing.

Apple first announced Private Cloud Compute in June, while also detailing all the new AI features coming to iOS, iPadOS, and eventually macOS. The most important aspect of Apple’s AI was the revamped Siri, which is able to work across apps. As presented, Siri could go into your messages to find information about a cousin’s birthday that your mom sent you, then pull additional details from your emails to create a calendar event. Doing so requires data processing on Apple’s internal cloud servers — meaning Apple would be handling a treasure trove of user data that most people would like to keep private.

In keeping with its privacy-minded reputation, Apple says Private Cloud Compute is an additional layer of software and hardware security. Simply put, Apple claims that your data will be secure and that it will not and cannot retain it.

Which brings us to the security rewards program. In a Thursday blog post, Apple’s security team said it invites “all security researchers — or anyone interested and with a technical curiosity … (to) conduct their own independent verification of our claims.”

Until now, Apple says, only third-party auditors have been allowed to look into the system; this is the first time it has been opened up to the public. The company provides a security guide and access to a Virtual Research Environment for reviewing PCC in the macOS Sequoia 15.1 Developer Preview. You’ll need a Mac with an M-series chip and at least 16GB of RAM to access it. The Cupertino company also provides the cloud compute source code in a GitHub repository.

Beyond calling all the hackers and script kiddies to the table, Apple offers a wide range of payouts for any bugs or security issues. The $50,000 base is only for “accidental or unexpected disclosure of data,” but you could get up to $250,000 for “access to user-requested data or sensitive user-requested information.” The top $1 million bounty is for “arbitrary code execution with arbitrary rights.”

It’s an indication of how confident Apple is in this system, and at the very least the open invitation could let more people get under the hood of Apple’s cloud processes. The initial release of iOS 18.1 is set to hit iPhones on October 28. There is already a beta for iOS 18.2, which gives users access to ChatGPT integration. Apple requires users to grant permission before ChatGPT can see any of their requests or interact with Siri. The OpenAI chatbot is just a stopgap until Apple gets a chance to fully field its own AI.

Apple touts its strong record on privacy, although it has a penchant for tracking users within its own software ecosystem. In the case of PCC, Apple claims it won’t have any ability to check your logs or your requests to Siri. Perhaps those with access to the source code can fact-check the tech giant on its privacy claims before Siri finally gets its upgrade, probably sometime in 2025.