Artificial Intelligence (1131A/23)
Please can you send me details of any AI tool you are using which potentially falls under the Cabinet Office’s Algorithmic Transparency Reporting Standard? To be clear, that is any algorithmic tool which:
- Has a significant influence on a decision-making process with direct or indirect public effect
- Directly interacts with the general public
The definition of the standard is listed here, along with examples of projects that have already been reported:
Potential public effects include:
- If the tool materially affects individuals, organisations or groups
- If it has a legal, economic, or similar impact on individuals, organisations or groups
- If it affects procedural or substantive rights
- If it impacts eligibility for, receipt of, or denial of a programme
Please find enclosed our response.
There are currently three machine learning models in use that provide information by making predictions:
1. Short-term knife crime (knife used, causing injury, affecting young people), which provides a category of relative likelihood of a knife crime occurring by geography (a 1 square kilometre grid) over a four-week period.
2. Theft of motor vehicle, similar to (1).
3. Seasonality planner, which provides forecasts for 12 months of broad crime types (by local policing area).
Two projects in beta testing are:
- Integrated offender management model – this defines a measure of harm caused by individuals and then uses a machine learning model to estimate the probability that an individual will become a high-harm offender.
- Force Contact demand forecasting – this forecasts the likely number of calls coming into Force Contact over various time horizons.
Further details of these projects are available at: