January 17, 2020
Pindrop for Amazon Connect | A Balancing Act
Voice biometrics create an impenetrable obstacle for fraudsters. Fact or fiction?
The answer may seem obvious – fiction. Yet claims that voice biometrics software can tell identical twins apart, or detect a professional voice imitator, persist even though both have been disproven.
In May of 2017, a BBC researcher found that a large bank’s voice ID system still allowed customers to attempt to access their account after deliberately failing verification on 20 separate occasions within 12 minutes. Fraudsters can be persistent, but so can a frustrated customer who has been locked out of their account.
If they are unable to call in to fix this problem, what is their recourse?
The report went on to ask, “If you have to prove it wasn’t you who accessed your account – that it was either a mimic or computer software – then how are you going to prove that someone else was verified as you?”
This leaves unanswered questions about voice biometrics and voice recognition in general. Security professionals agree: relying on a single factor creates a single point of failure that can break the entire system. While voice biometrics may not be bulletproof, it can be quite effective as part of a larger, multi-factor authentication scheme. As a point solution, it leaves undesirable gaps.
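The multi-factor idea can be sketched as a simple decision rule that refuses to trust any one signal on its own. This is a minimal, hypothetical illustration: the factor names, weights, and thresholds are assumptions for the sake of the example, not Pindrop's or any vendor's actual implementation.

```python
def authenticate(voice_score: float, device_score: float, otp_passed: bool) -> bool:
    """Grant access only when at least two independent factors pass.

    All thresholds and factor names here are illustrative assumptions.
    """
    factors_passed = sum([
        voice_score >= 0.90,   # voice biometric match confidence
        device_score >= 0.80,  # device/call-metadata reputation score
        otp_passed,            # one-time passcode (possession factor)
    ])
    # A single spoofed factor (e.g. a convincing voice imitation)
    # is not enough to break the whole system.
    return factors_passed >= 2
```

With this rule, a fraudster who defeats the voice check alone (`authenticate(0.95, 0.30, False)`) is still denied, while a caller who also passes a second factor (`authenticate(0.95, 0.85, False)`) gets through.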
Still have questions about voice biometrics? Get your answers here.