By Reza Rassool, CTO, RealNetworks

SAFR: Consistency and Fairness

WIRED recently highlighted unacceptable levels of bias in facial recognition in the article “The Best Algorithms Struggle to Recognize Black Faces Equally.” It cited the poor test scores of leading facial recognition vendors, as reported by the National Institute of Standards and Technology (NIST) in its July 2019 results. WIRED specifically called out Idemia but generalized its concerns. Here’s what the article said:

“The NIST test challenged algorithms to verify that two photos showed the same face, similar to how a border agent would check passports. At sensitivity settings where Idemia’s algorithms falsely matched different white women’s faces at a rate of one in 10,000, it falsely matched black women’s faces about once in 1,000 — 10 times more frequently. A one in 10,000 false match rate is often used to evaluate facial recognition systems.”

“The agency’s July report covered tests on code from more than 50 companies. Many top performers in that report show similar performance gaps to Idemia’s 10-fold difference in error rate for black and white women.”

What the WIRED article didn’t point out is that some algorithms perform much better with respect to bias. RealNetworks’ SAFR algorithm falsely matched black women’s faces about once in 3,162, roughly 3 times more frequently than white women’s and well below Idemia’s 10-fold gap. While that compares favorably, RealNetworks is committed to reducing bias further. SAFR’s low bias has been consistent across the last three NIST reports.
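
To make the comparison concrete, here is the simple arithmetic behind those multiples, using only the false match rates quoted above. The short Python snippet is purely illustrative; it is not NIST tooling or SAFR code.

```python
# Illustrative arithmetic only: how a gap in false match rates becomes an
# "N times more frequently" figure. Rates are the figures cited above.

def fmr_gap(fmr_group_a: float, fmr_group_b: float) -> float:
    """How many times more frequent false matches are for group A than group B."""
    return fmr_group_a / fmr_group_b

# Idemia (as reported by WIRED): black women ~1 in 1,000 vs. white women ~1 in 10,000
print(f"Idemia gap: {fmr_gap(1 / 1_000, 1 / 10_000):.1f}x")   # ~10.0x

# SAFR: black women ~1 in 3,162 vs. white women ~1 in 10,000
print(f"SAFR gap:   {fmr_gap(1 / 3_162, 1 / 10_000):.1f}x")   # ~3.2x
```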

At RealNetworks, we set NIST-measured design criteria for accuracy, speed, size, and low bias (a simple pass/fail check against these thresholds is sketched after the list):

  • Accuracy – FNMR < 3.5% (@ FMR 1:10K)
  • Speed – > 3 recognitions per second (to keep up with live video)
  • Size – model size < 100 MB (for better embeddability)
  • Bias/Fairness – accuracy variance across skin tone and gender < 0.25% (@ FMR 1:1K)
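
As a minimal sketch of how these thresholds translate into a pass/fail check, the Python below compares a model’s measured metrics against the four criteria. The ModelMetrics fields, the meets_safr_criteria helper, and the example numbers are our own placeholders for illustration; they are not SAFR code or NIST measurements.

```python
from dataclasses import dataclass

@dataclass
class ModelMetrics:
    fnmr_at_fmr_1e4: float           # false non-match rate at FMR 1:10,000
    recognitions_per_sec: float      # throughput on live video
    model_size_mb: float             # on-disk model size
    bias_variance_at_fmr_1e3: float  # accuracy variance across skin tone/gender at FMR 1:1,000

def meets_safr_criteria(m: ModelMetrics) -> dict:
    """Check measured metrics against the four design thresholds listed above."""
    return {
        "accuracy": m.fnmr_at_fmr_1e4 < 0.035,
        "speed": m.recognitions_per_sec > 3,
        "size": m.model_size_mb < 100,
        "bias": m.bias_variance_at_fmr_1e3 < 0.0025,
    }

# Placeholder numbers for illustration only -- not NIST results.
example = ModelMetrics(fnmr_at_fmr_1e4=0.030, recognitions_per_sec=4.2,
                       model_size_mb=85.0, bias_variance_at_fmr_1e3=0.0019)
print(meets_safr_criteria(example))
# {'accuracy': True, 'speed': True, 'size': True, 'bias': True}
```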

How Facial Recognition Companies Measure Up to SAFR Design Criteria

Accuracy, Speed, Size, and Bias/Fairness as measured by NIST’s July 2019 testing

Vendors compared: SAFR, AnyVision, Idemia, and Ever.ai.

  • Accuracy – FNMR < 3.5% (@ FMR 1:10K): met by SAFR and two of the other three vendors
  • Speed – > 3 recognitions per second: met only by SAFR
  • Size – model size < 100 MB: met only by SAFR
  • Bias (Fairness) – accuracy variance < 0.25% (@ FMR 1:1K): met only by SAFR. Measured variance, with the multiple of SAFR’s variance in parentheses: SAFR 0.19% (1x), AnyVision 1.01% (5.3x), Idemia 2.14% (11.3x), Ever.ai 0.73% (3.8x)

Idemia defended its bias results, citing differences between its NIST-tested algorithm and the one it offers commercially. Why the discrepancy? In an effort to achieve the highest possible NIST test scores, a number of companies don’t actually ship the algorithms they submit to NIST in their commercial facial recognition products. These NIST-tested algorithms may prove low in bias or high in accuracy, but too slow or too large for practical use. RealNetworks, on the other hand, submits the same SAFR algorithm to NIST that ships in its commercial applications, so the low-bias rates measured by NIST are the same low-bias rates experienced by users in the real world.

Low bias is a foundational element for trust and excellence in facial recognition. RealNetworks is a U.S. company committed to identifying and eradicating bias in our algorithms and providing technology that performs equally well with any skin tone or gender — ensuring facial recognition works for everyone in our society.

Learn more about how we reduced bias in SAFR.

– – – – –

Results shown from NIST do not constitute an endorsement of any particular system, product, service, or company by NIST.