Facial recognition technology was used on the crowd at a Beyoncé concert in Cardiff to scan for potential terrorists and paedophiles.
According to the BBC, South Wales Police and Crime Commissioner Alun Michael said scanning crowds at gigs in the UK had become a normal occurrence following the 2017 terrorist attack at an Ariana Grande concert in Manchester.
Appearing before MPs on the Welsh Affairs Committee, the police commissioner said using cameras to scan the faces in the crowd was 'entirely sensible'.
He was giving evidence to the government committee as part of an inquiry into how police forces are fighting crime, and told MPs that in addition to terrorists, they were also scanning faces for possible paedophiles due to the 'very large numbers of young girls attending that concert'.
A facial recognition camera pointed at the audience compares the faces it spots in the crowd with a police watchlist of terrorists and paedophiles to pick out potential threats.
"There's been a lot of misunderstanding thinking that images are captured and kept - they're not," Michael told the politicians about how the face scanning technology worked.
"The only image that is retained is of an individual who's identified as being one of the people you're looking for.
"When there is a live facial recognition deployment I am informed in advance and told what the watchlist is.
"It's an operational decision which I am, in live time, able to review and check."
Michael cited the Beyoncé concert in Cardiff on 17 May as an example of how the technology was deployed.
There, faces were scanned against 'two sets of individuals': paedophiles and terrorists, chosen because of past terrorist attacks at gigs and the young age of much of the audience.
However, facial recognition cameras have been criticised by human rights campaigners.
While the CCTV footage captured by the camera can be stored for up to 31 days, police insist that the biometric data of anyone who does not match the watchlist is immediately deleted.
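The match-then-delete process the police describe can be illustrated with a minimal sketch. This is not how any real police system is implemented; the `Face` type, the embedding tuples, the similarity function, and the threshold are all assumptions made for illustration.

```python
# Hypothetical sketch of the retention logic described above:
# compare each detected face against a watchlist, keep only the
# matches, and discard the biometric data of everyone else.

from dataclasses import dataclass

@dataclass
class Face:
    embedding: tuple  # toy biometric template extracted from a frame

def similarity(a: Face, b: Face) -> float:
    # Toy cosine-style similarity between two embeddings.
    dot = sum(x * y for x, y in zip(a.embedding, b.embedding))
    na = sum(x * x for x in a.embedding) ** 0.5
    nb = sum(y * y for y in b.embedding) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def scan_crowd(detected: list, watchlist: list, threshold: float = 0.9) -> list:
    """Return only the faces that match the watchlist; non-matching
    faces fall out of scope and their data is not retained."""
    retained = []
    for face in detected:
        if any(similarity(face, target) >= threshold for target in watchlist):
            retained.append(face)  # kept for officers to review
    return retained
```

The key point the commissioner made is reflected in `scan_crowd`: only faces that cross the match threshold are kept, while everything else is discarded rather than stored.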
Critics have warned that more work needs to be done to check for bias in the use of the technology, with the campaign group Liberty saying it 'doesn't make people safer' and instead 'entrenches patterns of discrimination in policing'.