Google is making it easier for AI developers to keep users’ data private

The mechanics of differential privacy are somewhat complex, but in essence it is a mathematical technique that adds calibrated noise to computations over user data, so that an AI model trained on that data cannot memorize or reveal any individual's personally identifiable information. It's a common way to safeguard the personal information needed to create AI models: Apple introduced it for its AI services with iOS 10, and Google uses it for a number of its own AI features, like Gmail's Smart Reply.
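The core idea can be illustrated with a toy example. The sketch below (an assumption for illustration, not any company's actual implementation) shows the classic Laplace mechanism: a count over user data is released with random noise scaled to 1/ε, so no single user's presence or absence meaningfully changes the published result.

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via the inverse-CDF transform."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon):
    """Differentially private count of records matching a predicate.

    A counting query has sensitivity 1 (adding or removing one user
    changes the true count by at most 1), so Laplace noise with
    scale 1/epsilon gives epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical usage: count users who clicked a suggested reply,
# without letting the released number pin down any one user.
clicks = [1, 0, 1, 1, 0, 1]
noisy = dp_count(clicks, lambda c: c == 1, epsilon=0.5)
```

Smaller ε means more noise and stronger privacy; larger ε means a more accurate but less private answer. Real systems combine mechanisms like this with careful accounting of how much privacy budget each query consumes.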
