Face Blurring in OpenSpace

Protect the identities of people who may appear in your captures


Face Blurring is designed to protect the identities of people who may appear in your 360 Video or 360 Photo Captures. When it is enabled, every detected face is pixelated after the capture is uploaded and published to your project.


Face Blurring Overview

This privacy feature is powered by a machine learning algorithm, and the accuracy of its results depends on a variety of factors. When the feature is disabled, the option will not be offered during upload. Face Blurring is currently available for Enterprise clients and in select international regions. If you are interested, please contact your Customer Success Manager or Support.

Note: This feature is not available for Field Notes or Lidar.

Face Blurring as it relates to GDPR

OpenSpace takes data protection and people's privacy seriously, and we are committed to complying with data protection laws. This note describes OpenSpace's face blurring feature and how, in partnership with our customers, we comply with the General Data Protection Regulation (GDPR), which establishes consistent data protection rules across the EU.

Face blurring and GDPR compliance details:

  • Shared responsibility: GDPR compliance principles are a shared responsibility between the data controller (the customer) and OpenSpace (the data processor).

  • OpenSpace de-identification, or face blurring: To help reduce the collection of personally identifiable information (PII), customers may contact OpenSpace Support to turn on the face blurring feature. Once activated, the feature detects faces in all future captures. When it identifies a recognizable face, the face and the rest of the body are blurred in the image, rendering the person's likeness statistically unrecognizable to a reasonable average viewer. While this process is highly accurate, the machine learning algorithms detect the likeness of a generic person rather than identifying specific faces or individuals, and the false-negative rate is not zero. The purpose of de-identification is to reduce the attack surface, not to eliminate all in-scope data from the system.

  • Possibility of false negatives: If a face is not blurred, that usually means the face detector could not make out a face due to conditions in the image, such as distortion, pixelation, surgical masks, or other obstructions. This is a normal result of the machine learning algorithms we use to detect a person. We tune our models to reduce both false negatives and false positives in order to deliver a superior service, but some likenesses, although statistically unrecognizable, may still appear unblurred and recognizable to viewers who have knowledge of the subject or the environment.

  • Possibility of false positives: If parts of the image are blurred where no face exists, that usually means the face detector incorrectly identified a face where there was not one. This is also a normal result of the machine learning algorithms we use to detect a person. We tune our models to reduce both false positives and false negatives, but some objects or parts of an image may still be blurred incorrectly. In these cases, we recommend stepping forward or backward a single frame in your capture: adjacent frames, only about half a second apart, usually do not exhibit the same false positive.

  • Customer responsibility and best practices: The customer is responsible for obtaining permission from data subjects. For example, the customer can post a notice that 360° photo documentation will be taking place and that efforts will be made to de-identify (blur) faces. The customer can also take steps while capturing to prevent or limit the collection of data from data subjects, such as capturing during non-work hours when fewer people are likely to be on the job site.
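To illustrate the detect-then-pixelate process described above, here is a minimal sketch in Python. It is not OpenSpace's actual pipeline (which is not public); the `pixelate` helper and the `(x, y, w, h)` bounding boxes standing in for a face detector's output are hypothetical, for illustration only.

```python
import numpy as np

def pixelate(region: np.ndarray, block: int = 8) -> np.ndarray:
    """Replace each block x block tile with its mean colour, making the
    region unrecognizable while preserving its size and rough layout."""
    h, w = region.shape[:2]
    out = region.copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = region[y:y + block, x:x + block]
            # Broadcasting fills the whole tile with its per-channel mean.
            out[y:y + block, x:x + block] = tile.mean(axis=(0, 1))
    return out

def blur_faces(frame: np.ndarray, boxes, block: int = 8) -> np.ndarray:
    """Pixelate each detected region in-place.

    `boxes` is a list of hypothetical (x, y, w, h) rectangles, the shape
    of output a typical face detector would produce for one frame.
    """
    for (x, y, w, h) in boxes:
        frame[y:y + h, x:x + w] = pixelate(frame[y:y + h, x:x + w], block)
    return frame
```

A false positive in this sketch would simply be a wrong rectangle in `boxes`; stepping to an adjacent frame gives the detector a fresh image and, as noted above, usually a different (correct) result.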

We hope this information answers your questions and addresses any concerns about what may be perceived as an error in our face blurring feature but is in fact caused by the factors listed above.

If you have any additional questions, please reach out to the OpenSpace support team at support@openspace.ai.

Related Article

OpenSpace Privacy Policy
