Facebook recently open-sourced Opacus, a library for training PyTorch models with differential privacy that’s ostensibly more scalable than existing methods.

With the release of Opacus, Facebook says it hopes to provide an easier path for engineers to adopt differential privacy in AI and to accelerate in-the-field differential privacy research.

Typically, differential privacy entails injecting a small amount of noise into the raw data before it’s fed into a local machine learning model, thus making it difficult for malicious actors to recover the original data from the trained model. An algorithm can be considered differentially private if an observer seeing its output cannot tell whether a particular individual’s information was used in the computation.
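To make the idea concrete, here is a minimal sketch of the classic Laplace mechanism, the textbook way to release a statistic with differential privacy by adding noise scaled to the query’s sensitivity divided by epsilon. This is an illustrative stand-in, not Opacus’s actual training method, and all names here (`laplace_noise`, `laplace_mechanism`, the toy `ages` data) are invented for the example:

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon.

    Smaller epsilon means more noise and a stronger privacy guarantee:
    an observer cannot reliably tell whether any one person's record
    was included in the computation.
    """
    return true_value + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(0)
ages = [34, 29, 41, 52, 38]                      # toy dataset
true_count = sum(a > 30 for a in ages)           # counting query; sensitivity 1
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5, rng=rng)
print(noisy_count)
```

The noisy count is what gets published; any single person joining or leaving the dataset changes the true count by at most 1 (the sensitivity), and the noise masks that difference. Opacus applies the same principle differently, adding calibrated noise during model training rather than to a released statistic.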

CNBC takes a look at what’s next for the workspace based on what the big tech companies are doing.

Tech offices, from Apple’s 2.8 million square-foot “spaceship” campus, to Facebook’s Menlo Park headquarters complete with a botanical garden, have always pushed the envelope of office space. But coronavirus may make this type of work environment a thing of the past, at least for the near future, as companies try to balance communal work with safety. Here’s a look at how tech companies are changing their offices and work policies as they ease into reopening. 

Yannic Kilcher retraces his first reading of Facebook AI’s DETR paper and explains his process of understanding it.

OUTLINE:

  • 0:00 – Introduction
  • 1:25 – Title
  • 4:10 – Authors
  • 5:55 – Affiliation
  • 7:40 – Abstract
  • 13:50 – Pictures
  • 20:30 – Introduction
  • 22:00 – Related Work
  • 24:00 – Model
  • 30:00 – Experiments
  • 41:50 – Conclusions & Abstract
  • 42:40 – Final Remarks

Original Video about DETR: https://youtu.be/T35ba_VXkMY