Key Takeaways
- Recall earns more trust than Apple Intelligence because its processing stays entirely on-device.
- Apple’s Private Cloud Compute still requires trust in the cloud for data handling.
- Storing and processing AI data on-device is always more secure than a cloud-based solution.
When faced with a choice between Apple Intelligence and Microsoft’s Recall, I can’t believe I’m saying this, but I actually trust Recall more. There’s something to be said for on-device processing, and while Recall needs work, it’s still inherently safer than anything that involves the cloud, regardless of whether Apple is involved.
To be clear, I’m not saying which feature is more useful. I think that both have their benefits, and I think that Apple’s Private Cloud Compute can enable some interesting use cases that Microsoft can’t offer with just on-device processing. With that said, Apple has basically unveiled a very similar feature, but people don’t seem to be worried about it in the same way that they were about Recall.
Apple’s Private Cloud Compute still requires you to trust the cloud
No matter what happens, your data isn’t on your device anymore
Whether you trust Apple more than Microsoft or vice versa, one thing is for sure: your data will leave your device at some point when you use Apple’s services, and you essentially have to have faith in Apple to handle it correctly. The company has an exemplary track record when it comes to privacy and managing user data, but that doesn’t make it immune to compromise. In fact, a data request from the U.S. government could compel Apple to hand over whatever it can, and it’s not clear exactly what that data would entail.
In contrast, Microsoft’s Recall has its own problems, but Microsoft has already committed to fixing them. The biggest was that the data was reportedly stored in plaintext in a user-accessible folder, meaning any spyware that managed to get onto a consumer’s PC could read everything in it. That includes data captured before the spyware ever arrived on the PC, such as records of you logging into your bank or other sensitive online platforms.
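To illustrate why that matters, here’s a minimal sketch of the threat model, assuming (purely for illustration) a plaintext SQLite database at a made-up path. The point is that any code running as the logged-in user can simply open such a file and read it, with no exploit required:

```python
import sqlite3
from pathlib import Path

# Hypothetical path for illustration only; this is not the real Recall store layout.
DB_PATH = Path.home() / "AppData" / "Local" / "ExampleAIStore" / "snapshots.db"

def dump_plaintext_store(db_path: Path) -> None:
    """Read a plaintext, user-readable database the way any process
    running under the same user account could."""
    conn = sqlite3.connect(str(db_path))
    try:
        tables = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
        for (table,) in tables:
            rows = conn.execute(f"SELECT * FROM {table} LIMIT 5").fetchall()
            print(table, rows)
    finally:
        conn.close()

if __name__ == "__main__":
    if DB_PATH.exists():
        dump_plaintext_store(DB_PATH)
```

Encrypting the store at rest and gating access behind user authentication, which Microsoft has said it will do, closes off exactly this kind of trivial read.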
No matter what, an on-device solution for storing your AI data (which includes personal data) will always be more secure than a cloud solution. Apple has long been committed to user privacy, but part of the problem is that Apple needs to make it so that not even Apple can access user data. Otherwise, a subpoena can compel the company to hand information over, regardless of how the company feels about privacy.
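To be concrete about what “not even Apple can access it” would mean in practice, here’s a minimal sketch of client-side encryption where the key never leaves the device. This is not how Private Cloud Compute actually works (Apple’s servers do process requests); it simply illustrates the stricter guarantee: if only the device holds the key, a subpoena served on the cloud operator yields nothing readable.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key is generated and kept on the device; it is never uploaded.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

# Example of the kind of personal data an assistant might index.
snapshot = b"calendar: dentist appointment, 2024-06-18 09:00"

ciphertext = cipher.encrypt(snapshot)   # this is all a cloud service would ever hold
recovered = cipher.decrypt(ciphertext)  # only possible with the on-device key
assert recovered == snapshot
```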
On-device processing is always superior
It’s all about implementation
Without getting into the specifics of either service, on-device processing is always superior, and that’s why both Apple and Qualcomm tout their NPUs. Microsoft has made an on-device NPU capable of 40 TOPS a requirement for Copilot+, which means a lot of AI processing can happen without needing the cloud. While Apple has bigger dreams than that (and NPU TOPS are a weird metric to lean on anyway), data that never leaves your device is always better than data going to the cloud.
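For a sense of why TOPS is a fuzzy yardstick, here’s a rough back-of-the-envelope calculation with made-up figures (not any vendor’s real specification): the headline number depends on how many multiply-accumulate units you count, the clock speed, and, crucially, the numeric precision you quote it at.

```python
# Rough illustration of why a raw TOPS figure is fuzzy. All numbers below
# are hypothetical example values, not real hardware specifications.
mac_units = 16_384   # hypothetical multiply-accumulate units in the NPU
clock_hz = 1.25e9    # hypothetical NPU clock speed

# One MAC counts as two operations (a multiply and an add).
int8_tops = 2 * mac_units * clock_hz / 1e12

# Quoting at a lower precision (e.g. INT4) can double the headline number
# on hardware that packs two INT4 operations per INT8 lane.
int4_tops = int8_tops * 2

print(f"INT8 peak: {int8_tops:.0f} TOPS, INT4 peak: {int4_tops:.0f} TOPS")
```

The same silicon can be marketed as roughly 41 TOPS or 82 TOPS depending on which precision the vendor chooses to quote, which is why the number says little on its own.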
As it stands, with all of the heat currently on Microsoft, I actually trust Recall to put user privacy at the heart of the feature more than I trust Apple to do the same. Apple has said it will have features that collect data in a way similar to Recall (and arguably more invasive), but it’s not clear how Apple will store that data. Is it stored in plaintext, too? Admittedly, I doubt it, especially given what has been happening with Microsoft, but the point is that Apple can get things wrong too, and it wouldn’t be the first time.
I’m cautiously optimistic that both rollouts will be good, but I suspect that Microsoft, in the long term, will get more right. Apple has a track record of standing up for consumer privacy, but the promises the company has made are just that: promises. They amount to a series of “verifiable guarantees,” yet some proprietary pieces of the system haven’t had their inner workings disclosed. Plus, it’s still a cloud system that could have its own vulnerabilities; the attack surface becomes both your laptop and a cloud server, rather than just your laptop.
I’m not against either company trying these things, and I’m excited to hopefully test and compare both Copilot+ and Apple Intelligence when they reach consumers in their entirety. Until then, though, I’m going to remain skeptical of both services, keeping a close eye on Copilot+ and how features like Recall eventually reach consumers.