3 Reasons I Trust Apple Intelligence More Than Microsoft Recall

Taylor Bell

Published on Jul 12, 2024

Key Takeaways

  • Always be skeptical: Apple Intelligence seems more trustworthy than Microsoft Recall, but trusting any company wholeheartedly is a mistake.
  • Apple’s transparency in allowing security researchers to review Private Cloud Compute software is a major step forward.
  • Effort and transparency are crucial: Apple’s secure approach with PCC contrasts Microsoft’s negligent practices with Recall.

There’s more debate than ever about what is a feature and what’s a security nightmare, and the line is becoming increasingly blurred. Apple announced Apple Intelligence, a suite of AI features that relies on cloud computing and optional outsourcing to OpenAI for ChatGPT integration. Microsoft previewed Recall, a “memory” of sorts for your computer, but that feature was delayed indefinitely after security researchers found it was storing data in plain-text logs. There’s also a Recall-esque feature rumored to be in the works at Google. So, which of these intrusive software features should you trust, if any?

You should be skeptical of everything, but if we’re ranking these services, I’d trust Apple Intelligence way more than Microsoft Recall. My colleague Adam Conway recently explained why he thinks Recall is easier to trust than Apple Intelligence, and it has to do with how everything Recall stores is kept on-device. It’s a completely fair argument, but in my opinion, on-device processing isn’t inherently safer than cloud compute. Microsoft made the choice to neglect security and privacy by storing information in plain text, whereas Apple developed an entirely new Private Cloud Compute operating system to ensure your data is safe — wherever it’s stored.

I’ll take the company putting real effort into keeping my data safe over one that isn’t, every single time. The disaster that was the announcement and unraveling of Recall proves the technical benefits of on-device processing can’t overcome the drawbacks of human (or corporate) incompetence. To Apple’s credit, I think it’s trying. After seeing that Microsoft intended to store personal and private data in plain-text logs, I can’t say the same about it.

3 It’s open and reviewable

Apple isn’t hiding anything — it’ll pay you if you crack Private Cloud Compute


Apple historically keeps its proprietary software close to its chest and private, but it knows that Apple Intelligence won’t be accepted with blind trust alone. That’s why it made a pretty significant promise to make Private Cloud Compute — the mode of Apple Intelligence that is off-device and runs on a custom, secure OS on Apple silicon servers — verifiable and transparent. As the company explains in a security blog post, Apple will make every production build of PCC available for review by security researchers:

When we launch Private Cloud Compute, we’ll take the extraordinary step of making software images of every production build of PCC publicly available for security research. This promise, too, is an enforceable guarantee: user devices will be willing to send data only to PCC nodes that can cryptographically attest to running publicly listed software. We want to ensure that security and privacy researchers can inspect Private Cloud Compute software, verify its functionality, and help identify issues — just like they can with Apple devices.

This isn’t something a lot of companies do, especially for server-side software that users won’t directly interact with. It’s proprietary to Apple, and even includes a custom hybrid built on the foundations of iOS and macOS. The move to make PCC software available for review demonstrates two things. It shows that Apple is confident enough in its security practices to outline its goals for privacy and let researchers verify that it’s meeting these self-imposed goals. It’s also evidence that Apple cares enough about security to let its proprietary software — which is usually kept internal — out into the wild for review.
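To make the attestation promise concrete, here’s a minimal sketch in Python of the general technique Apple describes: a device refuses to send data unless the server presents a software measurement that appears in a public transparency log. Everything here (the function names, the log, the sample build identifiers) is a hypothetical illustration, not Apple’s actual protocol, which also involves hardware-rooted certificate chains.

```python
import hashlib

# Hypothetical transparency log: hashes of every publicly listed
# production build the operator has published for researchers.
PUBLISHED_MEASUREMENTS = {
    hashlib.sha256(b"pcc-build-2024.07.1").hexdigest(),
    hashlib.sha256(b"pcc-build-2024.07.2").hexdigest(),
}

class AttestationError(Exception):
    """Raised when a node can't prove it runs publicly listed software."""

def verify_node(attested_measurement: str) -> None:
    # A real protocol would first verify a hardware-rooted certificate
    # chain before trusting the measurement; this sketch models only
    # the final membership check against the public log.
    if attested_measurement not in PUBLISHED_MEASUREMENTS:
        raise AttestationError("node is not running publicly listed software")

def send_request(attested_measurement: str, payload: bytes) -> None:
    verify_node(attested_measurement)  # refuse to send data otherwise
    print(f"sending {len(payload)} bytes to attested node")

# A node attesting to a known build is accepted...
send_request(hashlib.sha256(b"pcc-build-2024.07.1").hexdigest(), b"user query")

# ...while an unlisted build is rejected before any data leaves the device.
try:
    send_request(hashlib.sha256(b"unreviewed-build").hexdigest(), b"user query")
except AttestationError as e:
    print("refused:", e)
```

The interesting design property is that the check happens client-side: the operator can’t quietly swap in unreviewed software, because devices simply won’t talk to it.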

For what it’s worth, Apple is putting its money where its mouth is. “The Apple Security Bounty will reward research findings in the entire Private Cloud Compute software stack — with especially significant payouts for any issues that undermine our privacy claims,” the company writes on the security blog. In other words, Apple is saying this: if you prove us wrong, we’ll pay you well. Security bounties aren’t new, but this kind of commitment is yet another indicator that Apple thinks Private Cloud Compute is the most secure cloud system we’ve ever seen.

I may not take Apple’s word for it that Apple Intelligence and PCC are secure, but I will take the word of the independent, proven security researchers who will surely try their hardest to dismantle PCC before it debuts. Those same researchers are the ones who exposed the flaws in Microsoft’s Recall. It’s possible they’ll find problems with Apple Intelligence and PCC, but until that happens, I’ll trust Apple Intelligence more.

2 It’s transparent and full of choice

Don’t want to use ChatGPT integration? You don’t have to


Apple isn’t hiding anything when it comes to security, and it’ll be up to the user whether they want to trust Apple Intelligence or ship their requests off to OpenAI for ChatGPT’s answer. Unfortunately, users won’t know whether their Apple Intelligence request is being handled on-device or by PCC, but they will be explicitly asked whether they want to use ChatGPT. Apple also clearly states what kind of processes use certain security practices, like end-to-end encryption. You can read up on how data is secured, how it’s transferred, and more, and then make your decision on whether to use it.


There’s no way to completely opt out of Apple Intelligence — at least that we know of now — although you can opt out of ChatGPT integration. Of course, users can always just not use Apple Intelligence-supported features if they aren’t comfortable with them. In some ways, Recall was set to give you more control than Apple Intelligence. You’d be able to opt out entirely, block certain websites and apps, and disable Recall when needed.

However, transparency took a hit when it was revealed that Recall stored data in plain text. That means any app or user with access to your computer’s storage could view private data, and these possibilities weren’t referenced at all when Recall was first announced. Microsoft said that your data would be stored on-device. What it neglected to mention is that everything on your device could gain access to Recall data. This omission suggests a lack of transparency from Microsoft, as it obviously knew — or at least, should have known — the risks with its data storage methods.

1 Apple is actually trying

Effort and intent matter, and we have to stop pretending they don’t


Again, it’s possible that security researchers will find ways to thwart Private Cloud Compute. Even if that happens, it’s impossible to ignore the effort Apple put into securing PCC. Microsoft’s approach to keeping Recall data on-device was lazy and insecure. There is more to data security than simply considering whether data is kept on-device or off-device. The former is more secure in theory, but the latter can sometimes be more secure in practice, and I’ll explain why.

Imagine you left your Social Security card — a very important and private document for U.S. citizens — on your dinner table, in your locked house. The card never leaves your house, and it’s protected by a lock. But it still isn’t very secure. You might willingly let people into your house whom you don’t want seeing your Social Security number. You also probably don’t want to make it easy for a would-be thief to steal your identity by leaving the card right on the table.


Now let’s rethink this scenario in the context of Recall. It stores your data on-device, and tracks your every move by default. That data is locked behind a computer password, but is stored in plain text for anyone who has access to your computer to see. Would you want someone you let borrow your laptop for a second to be able to trace your every move with Recall data? Would you want someone hacking your PC to be able to easily find a record of your activity, revealing potentially sensitive information? I’d guess the answer would be no.

The point of this analogy is to show that security efforts don’t stop when you keep data on-device — there’s still work to be done, and Microsoft didn’t do it. Storing this kind of information in plain-text logs is so unheard of that the researcher who discovered it said it “deliberately set[s] privacy back a decade.” It’s not that Microsoft made a mistake; it’s that the company didn’t even try.
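To illustrate how low a bar plain-text storage is, here’s a minimal sketch in Python (using the third-party cryptography package) contrasting activity data written in the clear with the same data encrypted at rest. The file names and log entry are made up, and real at-rest protection on Windows would involve OS facilities like BitLocker or DPAPI; this only shows what a casual reader of the disk recovers in each case.

```python
from cryptography.fernet import Fernet

activity = b"2024-07-12 09:14 visited bank.example.com, typed account number"

# Plain text at rest: any process or user that can read the file
# sees the data directly -- essentially what researchers found with
# Recall's on-device storage in the preview builds.
with open("recall_plain.log", "wb") as f:
    f.write(activity)
print(open("recall_plain.log", "rb").read())  # fully readable

# Encrypted at rest: without the key, the file is opaque bytes.
key = Fernet.generate_key()
with open("recall_encrypted.log", "wb") as f:
    f.write(Fernet(key).encrypt(activity))
print(open("recall_encrypted.log", "rb").read())  # ciphertext only
```

Even this naive version raises the bar: an attacker or nosy app now needs the key, not just file access. Key management is its own hard problem, but skipping encryption entirely means there’s no bar at all.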

Meanwhile, Apple’s security blog details plenty of the work it did to make Private Cloud Compute as secure as possible. One example is stateless computation: personal data sent to PCC is used only to fulfill the current request and isn’t retained afterward.

I’m no developer or engineer, but I know it’s quite difficult to build something like PCC without server management tools or remote shells built in. Apple went to great lengths to make this thing secure, and that effort is worthy of my trust. I wholeheartedly believe that if Apple Intelligence and Recall launched today, Apple Intelligence would be more secure. It’s much easier to breach a Windows laptop than it would be to breach a PCC server.

It would be incredibly premature to say that Apple Intelligence and PCC are impenetrable, and I won’t make that claim. However, I do think the effort Apple put into building a secure environment for PCC from the start foreshadows the effort it would put into fortifying it if a flaw were found. By comparison, Microsoft couldn’t be bothered to store Recall data in anything other than plain text. It makes me think Microsoft never really cared about making Recall as secure as it could be, which is why I won’t trust it.

My advice is to always be skeptical

I was skeptical of Microsoft’s Recall when it was first announced, well before researchers dissected its poor security practices. I have the same skepticism toward Apple Intelligence, but I believe what I can see, and Apple is releasing everything there is to know about Private Cloud Compute before it launches. Security experts can review every part of Private Cloud Compute, right down to the bootloader. If a white-hat hacker finds a flaw in PCC, Apple will pay them. Researchers have the tools and the incentive to tear PCC apart — if there’s a vulnerability, we’ll find out about it.

Here’s the bottom line: you shouldn’t wholly believe the promises of any company, nor should you take its statements at face value. Blindly trusting Apple or Microsoft because you like their products, respect their track records, or believe their marketing claims would be a huge mistake. Every company has a great track record with user privacy and security until it doesn’t. If you wait until a company has its first security and privacy incident — like Meta and the Cambridge Analytica scandal — it’ll be too late. Always be skeptical.
