
Facebook recently open-sourced Opacus, a library for training PyTorch models with differential privacy that’s ostensibly more scalable than existing methods.

With the release of Opacus, Facebook says it hopes to provide an easier path for engineers to adopt differential privacy in AI and to accelerate in-the-field differential privacy research.

Typically, differential privacy entails injecting a carefully calibrated amount of noise into the data before it is used to train a machine learning model, making it difficult for malicious actors to recover any individual's original records from the trained model. An algorithm can be considered differentially private if an observer seeing its output cannot tell whether a particular individual's information was used in the computation.
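The noise-injection idea can be illustrated with the classic Laplace mechanism for a simple counting query. This is a minimal sketch, not Opacus code (Opacus instead adds noise to clipped gradients during training); the function names `laplace_noise` and `private_count` are hypothetical and chosen for illustration:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample Laplace(0, scale) noise via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one
    # individual changes the true count by at most 1, so Laplace noise
    # with scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical usage: count people aged 30 or older, privately.
ages = [25, 31, 47, 52, 18]
noisy = private_count(ages, lambda a: a >= 30, epsilon=1.0)
```

Because the noisy answer is statistically almost as likely with or without any one person's record, an observer cannot confidently infer whether that individual was in the dataset, matching the definition above.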

