Artificial Intelligence (AI) is a hot topic in today’s world, simplifying the way we do everything from finding the shortest commute to solving complex problems. As applications of AI continue to grow, a new issue has been raised for the organizations deploying such technology: How do we leverage the power of AI in an ethical way?
This question is especially essential as it relates to AI in video security. AI can offer a number of security advantages in both public and private safety applications. Solutions such as facial recognition, for example, employ AI to help locate known criminals captured in video streams. However, such solutions constantly walk the line between ethics and crime deterrence. When a machine consistently gives different outputs for one group of people than for another, this is known as bias in AI, and it is more common than you’d think.
So, while AI can offer exciting security and business intelligence opportunities for both consumers and businesses, the historical data on which these algorithms are built often carries inherent biases. In the case of facial recognition, the machine may have seen more Caucasian faces in its training data, and so it has learned to associate those faces with normality. This leads to biased outcomes, especially for minority populations and women: facial recognition technologies have been shown to detect Caucasian faces more reliably than those of people with darker skin complexions, and men’s faces more reliably than women’s.
Risks of Bias When Using AI in Video Security
The risks of such bias in AI systems are high for the people the technology is applied to. Examples include false arrests due to mistaken identity and racial profiling in airports and other public venues. For deploying organizations, such ‘mistakes’ can be costly, ranging from financial penalties to damaging public scrutiny and legal judgments. But those who want to apply AI to their video security systems and take advantage of its security benefits are not without options.
Ethical Use of AI in Video Security
Unlike conventional video analytics, ASTRA’s anomaly detection doesn’t require any rules and is not subject to bias or profiling. That sounds like a big claim, given all the broken promises from conventional video analytics solutions over the past few years. But this AI-powered anomaly detection is different: it employs advanced machine learning algorithms that constantly adapt to the events in a specific video stream.
Let’s say a person falls or is lying on the ground in an office where this type of activity is statistically uncommon. The software identifies this as an atypical event and, in real time, automatically notifies designated personnel or triggers a Standard Operating Procedure (SOP) indicating a potential problem.
ASTRA’s anomaly detection is completely unbiased, with no profiling or infringement on personal privacy, because it doesn’t record or store any images or personal information as a point of reference. All anomalies are identified from statistical data captured by the software in the surveilled area, without any human judgment. This distinguishes it from other kinds of anomaly detection: some software may classify male faces as “normal” while assessing female faces as “anomalies”. ASTRA doesn’t use complexion, gait, or perceived gender as the baseline for normality. Its anomaly detection focuses on detecting anomalous conduct and behavior, not personally identifiable information.
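To make the idea concrete, here is a minimal sketch of how a statistical behavior baseline can flag atypical events without storing any images or identity features. This is purely illustrative: the class name, window size, threshold, and the bounding-box aspect-ratio feature are assumptions for the example, not ASTRA’s actual implementation.

```python
from collections import deque
import math

class AnomalyBaseline:
    """Toy statistical baseline: flags observations that deviate sharply
    from recent history. Only numeric behavior features are kept in a
    rolling window -- no images or personal information are stored."""

    def __init__(self, window=500, threshold=3.0):
        self.window = deque(maxlen=window)  # recent "normal" observations
        self.threshold = threshold          # z-score cutoff for anomalies

    def is_anomalous(self, value):
        if len(self.window) < 30:           # warm-up: learn "normal" first
            self.window.append(value)
            return False
        mean = sum(self.window) / len(self.window)
        var = sum((v - mean) ** 2 for v in self.window) / len(self.window)
        std = math.sqrt(var) or 1e-9        # avoid division by zero
        z = abs(value - mean) / std
        self.window.append(value)
        return z > self.threshold

# Hypothetical feature: height/width ratio of a detected person's
# bounding box. Upright people cluster near one value; a fall shifts it.
detector = AnomalyBaseline()
for frame_ratio in [2.5] * 100:             # upright posture, frame by frame
    detector.is_anomalous(frame_ratio)
print(detector.is_anomalous(0.4))           # person on the ground -> True
```

The key point of the sketch is that the baseline is defined entirely by what has recently happened in that one scene, so what counts as “anomalous” in a quiet office differs from a busy lobby, and nothing about who a person is ever enters the calculation.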
This makes ASTRA’s anomaly detection ideal for use in nearly any public or private environment, while removing the potential for personal privacy issues and liabilities. ASTRA allows an organization to confidently walk the line between ethics and deterrence, and between privacy and security, without fear that its system is unjustly biased.
Learn more about ASTRA or get a free 30-day proof of value trial to see it in action for yourself.