Why is data analysis such a challenge for UK police?
In June of this year, following the London Bridge attacks, Assistant Commissioner Mark Rowley revealed that UK police were already undertaking 500 investigations involving some 3,000 individuals who posed major threats. Another 20,000 individuals of concern could become major threats very quickly.
Each of these individuals represents a vast amount of data. The average person in the UK has seven social networking profiles, many of which serve as logins to dozens of other online services. These profiles stack atop bank accounts, mobile phone data, CCTV data, Internet browsing history – and the interviews and witness reports generated when investigations intensify.
All this information is a powerful resource for investigation and crime prevention – but the sheer volume of it means it’s becoming increasingly difficult to manage with conventional, manual processing. Smart data analysis – the management and interpretation of investigation data – represents a huge opportunity for UK policing. But with data fragmented between forces, and outdated, largely incompatible systems across the country’s police forces, it’s virtually impossible.
A different approach is required to get the most out of all this rich but disparate information, and it’s down to technology companies, policy makers, police forces and procurement teams to work together to find an answer. If policing doesn’t crack this, it will hand ever more abundant opportunities to the terrorist, the fraudster and the organised crime group.
Why is smart data analysis such a challenge for policing?
In a word – interoperability, or rather a lack thereof. A recent Royal United Services Institute (RUSI) report confirms that smart data analysis is simply impossible in UK policing at the present time: our 43 forces currently operate 2,000 outdated systems which aren’t compatible with each other. Some forces have modernised, some have innovated, but all too often they have only been working with data from their own force or region, which limits the potential of their efforts.
In November of this year, HMIC claimed police forces suffered a ‘significant gap’ in digital skills, from the basic ‘recovering data from a mobile phone’ level upwards. Recruiting and training in-house specialists is costly and time-consuming. Policing Vision 2025 will come round far quicker than you think, so upskilling in-house in such a short space of time is untenable. Outsourcing expertise to third parties, therefore, is essential.
And outsourcing is nothing new for the police – the likes of Sopra Steria, IBM and MASS have all won major contracts in the past few years. The problem is rather the lack of a joined-up approach, which leaves us with a hodgepodge of systems and data that work independently but not as a unified whole. This isn’t the fault of the police; technology partners and third parties have to take some responsibility too. They should be working collaboratively to provide a joined-up approach and solution that benefits all.
Too often, however, companies with commercial interests at heart are unwilling to approach projects collaboratively, and are guarded when it comes to intellectual property (IP) rights. The trouble is, if these disparate systems can’t ‘talk’ then smart analysis is impossible.
Procurement is a further problem. Every force has different replacement and procurement cycles, making standardisation across forces almost impossible (as the 2012 attempt to establish a nationwide procurement hub demonstrates). Great strides have been made, but agility in procurement is still one of the burning issues for a police force trying to ingrain digital into its DNA.
We agree with the Police ICT Company, the Home Office, the NPCC and other bodies responsible for the 2025 Digital Policing Vision, that the keys to success for modernising police ICT are open standards and interoperability. It should be simple to overlay surveillance data with communications data, witness statements and number plate recognition – and the system should immediately identify any links across all the data.
The answer is not to have one national system that everyone uses. An ecosystem of different technologies and software products will always exist, but if they can all interoperate with each other and follow common standards, the data can be available for analysis by any force and at any level.
This isn’t a quick fix, but the people at the top are working hard to define the standards, while programmes such as the National Law Enforcement Data Programme (NLEDP) are working through this. But this shouldn’t be a challenge solely for the governing bodies.
Technology providers should be mandated to follow open standards and provide APIs (the interface gateways that allow data to flow freely from one system to another) free of charge. Each third party product provides a specialist capability, but if they’re programmed to be compatible, they can exchange data easily with each other and actually help investigators rather than frustrating their work.
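What a common standard buys can be illustrated with a deliberately simplified sketch (all system names, field names and records below are invented for illustration, not drawn from any real police system): once two suppliers agree a shared record schema, each only needs a thin adapter, and cross-force matching becomes routine rather than a bespoke integration project.

```python
# Hypothetical: two forces' systems export the same kind of person
# record under different field names. A shared schema plus one thin
# adapter per system lets any analysis tool link records across forces.

def from_system_a(record: dict) -> dict:
    """Map the invented 'System A' export format onto a shared schema."""
    return {
        "surname": record["LastName"].upper(),
        "date_of_birth": record["DOB"],        # assumed ISO 8601
        "force_id": record["Force"],
    }

def from_system_b(record: dict) -> dict:
    """Map the invented 'System B' export format onto the same schema."""
    return {
        "surname": record["family_name"].upper(),
        "date_of_birth": record["birth_date"],
        "force_id": record["force_code"],
    }

def find_links(records: list) -> list:
    """Flag pairs of records that look like the same person in two forces."""
    links = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            if (a["surname"] == b["surname"]
                    and a["date_of_birth"] == b["date_of_birth"]
                    and a["force_id"] != b["force_id"]):
                links.append((a, b))
    return links

# Once normalised, the match falls out automatically.
met = from_system_a({"LastName": "Smith", "DOB": "1984-03-12",
                     "Force": "MET"})
gmp = from_system_b({"family_name": "smith", "birth_date": "1984-03-12",
                     "force_code": "GMP"})
matches = find_links([met, gmp])
```

The adapters themselves are trivial; the point is that without an agreed schema, every pairing of the 2,000 systems in use would need its own bespoke translation layer, which is exactly the integration burden open standards remove.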
The good news is that some technology providers in this space are already working together to establish standards for interoperating with each other right now. Creating an environment where willing and collaborative suppliers contribute to the discussion and the iterative process of improving the landscape will benefit everyone.
The truth is, technology is not the problem. Data analysis tools are nothing new, the technology has existed for more than a decade and in that time has become more nuanced, accurate and effective. The issue is alignment, agreement and a collaborative focus on what’s best for British policing, from internal policy makers to external suppliers and beyond. Only then will we see the true value of smart data analytics.