October 5, 2021
As technology develops and improves, its use in the law has increased. A variety of software and devices are now used to search for and present evidence, as well as to identify both victims and perpetrators. This article looks at some of the most interesting technology used in legal cases in recent years.

Fighting Online Child Sexual Abuse

2017

Thorn is an international but US-based anti-human trafficking organisation which seeks to prevent the sexual exploitation of children. The non-profit uses technological innovation to combat the predatory behaviour of adults towards children, rescue the victims of their crimes and protect the vulnerable from sexual abuse.

It is estimated that more than 100,000 escort ads are posted in the US each day, with a substantial number featuring minors or written by them. Some 63% of American underage sex trafficking victims report having been advertised or sold online.

Thorn partnered with Digital Reasoning, using the latter’s identification software ‘Spotlight’ to speed up investigations. Since 2017, Spotlight has reduced investigation times by 44% by leveraging machine learning algorithms. Sex traffickers intentionally conceal their tracks by continually removing, updating and reposting ads under false names, ages and pictures; Spotlight’s cognitive computing capabilities can connect these disparate data sources. The technology has assisted in more than 8,300 investigations and is now used by 1,200 law enforcement agencies across Canada and all 50 US states. Spotlight has been credited with identifying 17,092 victims and has been endorsed by investigators at the US Department of Homeland Security.

Without Thorn and Spotlight, investigations into the online commercial sex market would take far longer, and fewer victims would be rescued.

Wearable Technology Used as Evidence

2018

In 2018, data from an Apple Watch was used to help solve a murder case in Australia, and it has since been hailed as key evidence for the prosecution.

Myrna Nilsson was murdered in Adelaide in 2016. Her daughter-in-law claimed that the killing was carried out by a group of men following a 20-minute road rage encounter. However, prosecutors engaged a forensic expert to analyse data from Myrna’s Apple Watch, which indicated that only seven minutes passed between the attack and her death. Furthermore, the daughter-in-law claimed she had been tied up as part of the attack, yet evidence from her phone showed that she accessed eBay 11 minutes after Myrna’s heart rate tracking stopped.

Whilst the facts of the case are brutal, the use of an Apple Watch to build a timeline shows real progress. Not so long ago, such a timeline could only be pieced together tentatively from witness testimony. Now, the devices we wear keep detailed records of our steps, activity and heart rate, providing vital evidence.

The Use of Automated Facial Recognition Technology

2020

In 2020, the Court of Appeal ruled that the use of automated facial recognition technology (AFR) by South Wales Police breached privacy rights; the force did not appeal the decision. The landmark judgement, welcomed by the UK Information Commissioner, has since set out the judicial position on the use of AFR by public bodies.

The case arose when privacy and civil rights group Liberty challenged South Wales Police’s practice of scanning members of the public in public places using facial recognition linked to CCTV cameras. The Court of Appeal ruled that the use of AFR breached Article 8 of the European Convention on Human Rights (the right to privacy), as well as the Data Protection Act 2018.

Interestingly, the judgement discussed the potential for indirect racial or gender discrimination. Although there was no evidence of this in the South Wales Police case, the court called for an examination of any bias that might exist in the algorithm, indicating that the use of AFR will need to be reconsidered in future cases.
