Apple is offering up to $1 million to anyone who can break into the servers behind its new AI features.
The initiative coincides with the launch of Apple Intelligence on supported iPhones, scheduled for October 28. It arrives with iOS 18.1, marking the debut of the iPhone's AI features, including an enhanced version of Siri.
In the ongoing competition over security, particularly private AI, Apple has positioned itself ahead of Google and Samsung. That lead rests largely on Apple Intelligence, which processes as much data as possible directly on the device.
To thoroughly assess its readiness for public use, Apple has called on researchers to test the security of ‘Private Cloud Compute’ (PCC).
These servers receive and process Apple Intelligence requests that exceed what the device can handle on its own.
In response to privacy concerns, Apple designed Private Cloud Compute to erase user requests once a task is completed.
The servers also use end-to-end encryption, so Apple cannot access requests made through Apple Intelligence even though it hosts and manages the hardware.
With privacy a top priority for the public, Apple is keen to prove that Private Cloud Compute lives up to its claims.
Initially, Apple invited a select group of researchers to attempt to breach the system, but it has since opened the challenge to anyone interested.
To facilitate this, Apple is providing access to the source code for key components of Private Cloud Compute, so researchers can analyse the software before mounting an attack.
Apple has also built a virtual research environment for macOS that runs the Private Cloud Compute software, and published a security guide detailing the server system behind Apple Intelligence.
The company stated: “To further encourage your research in Private Cloud Compute, we’re expanding Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental security and privacy guarantees of PCC.”
Rewards include $250,000 for remotely breaching Private Cloud Compute and exposing a user's request data.
The top prize of $1 million goes to anyone who can remotely execute arbitrary code with privileges on the servers.
Apple also said it may reward reported vulnerabilities that fall outside the published categories.
They added: “We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time.”