Police and security forces worldwide are testing automated facial recognition systems to identify criminals and terrorists.
However, the technology has raised serious privacy concerns.
A recent report found that the technology used by police forces in Britain misidentifies people 81% of the time.
"It's being rolled out by police in the name of safety," Elizabeth Farries of the Irish Council of Civil Liberties explained. "But safety is something that is important to us in terms of our right to privacy.
"Facial recognition technology as it stands destroys our privacy rights.
"It scans and stores deeply private data "That's a gross violation of privacy," she added.