Project Aria Open Ecosystem

EgoBlur: a new AI model from Meta to preserve privacy by detecting and blurring PII in images

EgoBlur helps researchers innovate responsibly

EgoBlur is used by Project Aria, Meta’s research tool for accelerating AI and machine perception.

As part of a continued effort toward responsible innovation, Meta has developed an advanced face and license plate anonymization system that has been used internally since the launch of the Project Aria program in 2020.

Privacy has always been a top priority for Project Aria, Meta’s research tool for accelerating AI and ML technology. As such, we have been committed to following best practices for responsible AI.

We believe the external research community can also benefit from this state-of-the-art anonymization system, so we have decided to open source both the face and license plate anonymization models under the Apache 2.0 license for both commercial and non-commercial use. This means that researchers working to accelerate AI and ML research can do so while maintaining the privacy of those around them.

What is it?

A Faster R-CNN-based detector for faces and vehicle license plates

Designed for blurring faces and license plates, and optimized for devices with a first-person perspective.

Faces

EgoBlur detects faces in both color and greyscale images, so that personally identifiable information may be removed from captured data.

The model performs consistently across the full range of ‘responsible AI labels’, as defined by the CCV2 dataset, to help ensure privacy is respected for everyone.

License plates

In addition to faces, EgoBlur also provides strong performance for obfuscating license plates.

EgoBlur provides performance comparable to or better than alternative state-of-the-art systems.

Optimized for devices with an egocentric perspective

At the time of release, EgoBlur sets a high standard of performance, comparable to or better than other publicly available methods for face and license plate detection on cameras that capture a first-person perspective, such as AR and VR devices.

However, EgoBlur also provides strong and reliable face and license plate detection across a diverse range of camera types and perspectives.

A researcher wearing Aria glasses.
How was the EgoBlur face detection model trained?

23M images, 790M bounding boxes

EgoBlur’s advanced capabilities are the result of training on millions of images and masks collected through weakly supervised learning. Additional techniques, such as data augmentation, were also used to improve the model’s performance on greyscale images.
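
For illustration, greyscale-oriented augmentation of this kind can be done with purely photometric transforms; the sketch below uses torchvision, and the specific transforms and probabilities are assumptions rather than the published EgoBlur training recipe.

# Hypothetical augmentation sketch: photometric transforms only, so bounding-box
# annotations do not need to be adjusted. Probabilities are illustrative assumptions.
import torchvision.transforms as T

color_augment = T.Compose([
    T.RandomGrayscale(p=0.3),                     # sometimes train on greyscale versions of RGB images
    T.ColorJitter(brightness=0.2, contrast=0.2),  # mild photometric jitter
])

# Usage (PIL image in, PIL image out): augmented = color_augment(pil_image)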

How is EgoBlur evaluated?

Evaluated on real-world data with responsible AI attributes

To evaluate performance and reduce bias, EgoBlur is benchmarked against the Aria Pilot Dataset and the CCV2 dataset.

Self-reported ‘responsible AI labels’ from the CCV2 dataset are used to evaluate EgoBlur across a number of attributes such as skin tone, self-identified gender, age, and country. This helps ensure EgoBlur works consistently for everyone.
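
As a sketch of how such per-attribute consistency checks can be computed, the snippet below aggregates recall separately for each self-reported label; the dictionary keys are hypothetical and do not reflect the actual CCV2 schema or evaluation code.

from collections import defaultdict

def recall_by_attribute(samples):
    # `samples` is a hypothetical list of dicts:
    #   {"attribute": "<skin tone / gender / age / country bucket>",
    #    "num_gt": <ground-truth faces>, "num_detected": <correctly detected faces>}
    gt = defaultdict(int)
    detected = defaultdict(int)
    for s in samples:
        gt[s["attribute"]] += s["num_gt"]
        detected[s["attribute"]] += s["num_detected"]
    return {attr: detected[attr] / gt[attr] for attr in gt if gt[attr] > 0}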

Read the accompanying EgoBlur Research Paper

For more information about the EgoBlur model, read our paper on arXiv.

A screenshot from the EgoBlur research paper.

BibTeX Citation

If you use the EgoBlur Model in your research, please cite the following:

@misc{raina2023aria,
      title={EgoBlur Model},
      author={Nikhil Raina and Guruprasad Somasundaram and Kang Zheng and Sagar Miglani and Steve Saarinen and Jeff Meissner and Mark Schwesinger and Luis Pesqueira and Ishita Prasad and Edward Miller and Prince Gupta and Mingfei Yan and Richard Newcombe and Carl Ren and Omkar Parkhi},
      year={2023},
      eprint={2308.13093},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Access EgoBlur Models & Tools

If you are an AI or ML researcher, access the EgoBlur models and accompanying tools here.

By submitting your email and accessing the EgoBlur model, you agree to abide by the model license agreement and to receive emails in relation to the model.

Frequently Asked Questions

Can the EgoBlur models be used commercially?

Yes. Both the face and license plate models are licensed under Apache 2.0, meaning the models are available for both research and industry applications.

How large are the EgoBlur models?

Both the EgoBlur face and license plate models are approximately 400 MB and have ~104 million parameters.

How long does it take to train the EgoBlur models?

The EgoBlur face model takes approximately 7 days to train on 4 machines with 8 NVIDIA V100 GPUs. The license plate model takes about a day to train on a similar configuration.

Do the EgoBlur models track or identify individuals?

No. Like other open-source models, such as RetinaFace, the EgoBlur models are trained only to locate the position of faces and vehicle license plates within color or greyscale images. The models are not used to track or identify individual faces or license plates.

Can EgoBlur be used on non-egocentric data?

Yes, in addition to egocentric data from Project Aria, the EgoBlur models are trained on non-egocentric data from the CCV2 dataset. The models should give comparable performance to state-of-the-art models such as RetinaFace on such non-egocentric data.

What architecture does EgoBlur use?

EgoBlur is based on the Faster R-CNN model with a ResNeXt backbone. The models are trained using Meta’s publicly available Detectron2 and D2Go libraries.
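
For reference, a comparable (but generic, COCO-pretrained) Faster R-CNN detector with a ResNeXt-101 backbone can be assembled from Detectron2’s model zoo. The config name and score threshold below are illustrative assumptions and do not reproduce the EgoBlur weights or training setup.

# Illustrative sketch only: a stock Detectron2 Faster R-CNN + ResNeXt-101 detector,
# not the released EgoBlur model.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file(
    model_zoo.get_config_file("COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml")
)
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml"
)
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # assumed confidence threshold

predictor = DefaultPredictor(cfg)
# outputs = predictor(bgr_image)             # bgr_image: an OpenCV-style numpy array
# boxes = outputs["instances"].pred_boxes    # rectangular detections (x1, y1, x2, y2)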

How fast does EgoBlur run?

For a 2 MP RGB image, both the EgoBlur face detection and license plate models run in approximately 0.5 seconds on a GPU, and 8 seconds on a CPU.

For a 0.3 MP greyscale image, both the face detection and license plate models run in approximately 0.3 seconds on a GPU, and 1.8 seconds on a CPU.

Do the EgoBlur models output segmentation masks or labels?

No, the EgoBlur face and license plate models are for detection only. The models output rectangular bounding boxes, not masks or labels.
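
Because the output is a set of rectangular boxes, blurring is a simple post-processing step. A minimal sketch, assuming detections are given as (x1, y1, x2, y2) pixel coordinates and using OpenCV’s Gaussian blur (the kernel size is an arbitrary choice):

import cv2

def blur_boxes(image, boxes, kernel=(51, 51)):
    # `boxes` is assumed to be an iterable of (x1, y1, x2, y2) pixel coordinates.
    for x1, y1, x2, y2 in boxes:
        x1, y1, x2, y2 = int(x1), int(y1), int(x2), int(y2)
        roi = image[y1:y2, x1:x2]
        if roi.size:                                   # skip empty or degenerate boxes
            image[y1:y2, x1:x2] = cv2.GaussianBlur(roi, kernel, 0)
    return image

# Usage: blurred = blur_boxes(cv2.imread("frame.jpg"), detections)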

Does EgoBlur work on videos and VRS files as well as images?

Yes, the EgoBlur models work on images, videos, and VRS files.
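
For video, the same per-image pipeline can be run frame by frame. The sketch below uses OpenCV with hypothetical detect() and blur_boxes() helpers standing in for the model call and the blurring step; VRS recordings would likewise be processed frame by frame (for example via the projectaria_tools data provider).

import cv2

def anonymize_video(src_path, dst_path, detect, blur_boxes):
    # `detect` and `blur_boxes` are hypothetical helpers: detect(frame) -> boxes,
    # blur_boxes(frame, boxes) -> frame with the boxed regions blurred.
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)), int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out.write(blur_boxes(frame, detect(frame)))
    cap.release()
    out.release()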

How do I report bugs or ask questions?

Please email projectaria@meta.com to report any bugs, or if you have any further queries about the EgoBlur models.

Subscribe to Project Aria Updates

Stay in the loop with the latest news from Project Aria.

By providing your email, you agree to receive marketing related electronic communications from Meta, including news, events, updates, and promotional emails related to Project Aria. You may withdraw your consent and unsubscribe from these at any time, for example, by clicking the unsubscribe link included on our emails. For more information about how Meta handles your data please read our Data Policy.