Artificial intelligence in the police: VeRA under criticism!
Find out about the use of artificial intelligence to combat crime in Hesse and other German states, and the legal challenges it raises.

On September 24, 2024, the Baden-Württemberg cabinet published a notable security and measures package entitled “Strengthening Security, Organizing Migration, Preventing Radicalization”. A central element of this package is the planned use of artificial intelligence to analyze collected data with the analysis platform known as VeRA. This analysis software is intended to combine data from different sources in order to make crime fighting more efficient. Palantir's software, which has been used in Hesse since 2017 under the name Hessen-Data, is considered promising in this context, but its use at the federal level was initially stopped by the Federal Ministry of the Interior because the necessary legal basis for its use by the Federal Police and the Federal Criminal Police Office could not be created.
As the BDK reports, an extension of the state's police law is required before VeRA can be used in Baden-Württemberg. However, the hearing process for amending the law is still pending, which makes the whole discussion a contentious topic within the green-black coalition. The Association of German Criminal Police Officers Baden-Württemberg (BDK BW) expects that, when VeRA is introduced, its use will have to be backed by extensive legal powers.
Pilot projects and legal challenges
A Europe-wide tender for an analysis product has already been carried out in Bavaria, with Palantir emerging as the only suitable provider. There is nevertheless uncertainty, because the federal states take different views on the issue. While North Rhine-Westphalia, for example, has had a legal basis for using Palantir's “DAR” software since 2022, Hamburg is still hesitant about automated data analysis: its legal basis, introduced in 2019, was declared void by a ruling of the Federal Constitutional Court.
In February 2023, the Federal Constitutional Court set clear limits on the use of automated data analysis tools in order to prevent data protection violations. According to the Süddeutsche Zeitung, these tools, despite their potential to yield useful information, can become dangerous because they create extensive personal profiles and can draw uninvolved people into police measures. This happens, among other things, when such people unintentionally end up in the analysis systems through techniques such as radio cell queries.
Diversity of opinions and next steps
The prospect that Germany could take major steps toward automated data analysis this year is met with both support and resistance. Some interior ministries are already asking about the possibility of using Palantir software or similar products. Federal states such as Saxony and Saxony-Anhalt are currently not making any moves in this direction, while Berlin and Baden-Württemberg are considering following the Bavarian approach.
The discussion about using artificial intelligence to combat crime raises not only technical but also ethical questions. It is important that the fundamental right to informational self-determination is preserved when such technologies are developed and implemented. Netzpolitik emphasizes that comprehensive rules on data use and on the registration of analysis tools must be guaranteed in order to prevent misuse and data protection violations.
Overall, it remains to be seen how the discussion about VeRA, Palantir and the broader development of automated data analysis in Germany will continue. Legislators will need a sure hand to keep both security and civil rights in view.