🔬 Research Summary by Eticas Foundation, a non-profit organization that addresses a very specific problem: today’s AI tools are developed without transparency, awareness of their impact, or legal compliance, and this often leads to discrimination and causes harm, especially among the most vulnerable.
[Original report by Eticas Foundation]
Overview: VioGén is an algorithm that determines the level of risk faced by a victim of gender-based violence and establishes her protection measures in Spain. It is the largest risk assessment system in the world, with more than 3 million registered cases.
Introduction
Gender-based violence is a serious problem in Spain. 1,126 women were killed by their (ex-)intimate partners between 2003 and 2021, and 32.4% of women over 16 (approximately 6.6 million) have suffered physical, sexual, and/or psychological violence at some point in their lives.
VioGén was launched in 2007 with the objectives of bringing together all public institutions with competence in the area, predicting risk, and monitoring and protecting victims of gender-based violence.
Since 2018, Eticas has reached out to the Spanish Ministry several times and offered a confidential, pro-bono internal audit of the VioGén system. While this suggestion was well received, no action was taken. In 2021, Eticas decided to conduct an external audit of this high-impact system in collaboration with the Ana Bella Foundation, using reverse engineering.
One of the most concerning findings is that, even though the VioGén system’s risk assessment was designed as a recommendation system whose results can be modified, police officers keep the automatic outcome in 95% of cases. The protection measures a victim receives depend on that result. In practice, this decision is being left to an algorithm.
Key Insights
VioGén is not accountable, nor transparent
VioGén’s algorithm uses classical statistical models to perform a risk evaluation based on the weighted sum of all the responses, with pre-set weights for each variable. It is designed as a recommendation system, but even though police officers are able to increase the automatically assigned risk score, they keep it unchanged in 95% of cases. In effect, it is the system that decides which police protection measures are assigned to each victim. In 45% of cases, the assigned risk level is “unappreciated”. The consequences are clear: between 2003 and 2021, 71 women who had previously filed a report were murdered without having been assigned a risk level that entails police protection. The system only issues as many “extreme” risk scores (which indicate risk of homicide) as it can afford, so funding cuts have a direct and quantifiable impact on a woman’s chances of receiving effective protection after turning to the police.
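To make the mechanism concrete, the kind of weighted-sum scoring the audit describes can be sketched as follows. This is an illustrative mock-up only: the indicator names, weights, and thresholds below are hypothetical, since VioGén’s actual parameters are not public.

```python
# Illustrative sketch of a weighted-sum risk assessment of the kind the
# audit describes. All indicators, weights, and thresholds are HYPOTHETICAL,
# not VioGén's actual (non-public) parameters.

# Hypothetical binary indicators (1 = "present", 0 = "not present")
INDICATOR_WEIGHTS = {
    "previous_physical_violence": 3,
    "use_of_weapons": 4,
    "threats_of_death": 4,
    "escalation_of_incidents": 2,
    "psychological_violence": 1,  # the audit argues such items are under-weighted
}

# Hypothetical thresholds mapping the weighted sum to a risk level
RISK_LEVELS = [
    (0, "unappreciated"),
    (3, "low"),
    (6, "medium"),
    (9, "high"),
    (12, "extreme"),
]

def risk_score(answers: dict) -> int:
    """Weighted sum of the binary questionnaire answers."""
    return sum(INDICATOR_WEIGHTS[item] * int(present)
               for item, present in answers.items())

def risk_level(score: int) -> str:
    """Map a numeric score to the highest threshold it reaches."""
    level = "unappreciated"
    for threshold, name in RISK_LEVELS:
        if score >= threshold:
            level = name
    return level

answers = {
    "previous_physical_violence": 1,
    "use_of_weapons": 0,
    "threats_of_death": 1,
    "escalation_of_incidents": 1,
    "psychological_violence": 1,
}
score = risk_score(answers)
print(score, risk_level(score))  # 3 + 4 + 2 + 1 = 10 -> "high"
```

Note how, in a model of this shape, every design choice the audit criticizes is just a number: lowering the weight on `psychological_violence` or raising the “extreme” threshold directly reduces how many women qualify for protection.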
In terms of transparency, most studies of VioGén have been conducted by the same researchers who contributed to its development. This reinforces the argument for independent oversight of the system.
Over 80% of women interviewed reported different issues with VioGén
VioGén is only activated when the victim officially reports her aggressor to the police. The system works through two questionnaires in which a police officer marks each item as “present” or “not present”. Once the form is completed, the system assigns a gender-violence risk score at one of five levels: “unappreciated”, “low”, “medium”, “high”, or “extreme”.
The VioGén system is built on the assumption that women suffering from gender violence understand and respond clearly to all 35 risk indicators in the Police Risk Assessment (VPR, from its Spanish acronym) form, and that police officers objectively transform women’s statements into binary answers. In reality, the process rarely works in this idealized way: at the moment of reporting their aggressor, victims are under intense emotional strain and are often in a state of shock that makes it difficult for them to understand the questions and express themselves clearly. Over 80% of the women interviewed reported problems with the VioGén questionnaire. This means that the quality of the data fed into the algorithmic system could be compromised at the input-generation stage, introducing possible sources of bias and misrepresentation. It is urgent to revise the conditions under which women access the system and answer the questionnaire.
It is also worrying that only 35% of the women we interviewed were informed of their VioGén risk score.
Transparency, independent oversight, accountability, and end-user engagement need to be reviewed
Eticas is concerned by the number of cases that the VioGén system “discards” by giving them an “unappreciated” risk score. As a result, in 2021, only 1 out of 7 women who reached out to the police for protection actually received it. This is a consequence of the algorithm emphasizing physical violence while under-valuing psychological violence and newer forms of non-physical violence. Consequently, only 3% of women who are victims of gender violence receive a risk score of “medium” or above, and thus effective police protection. Women who were killed by their partners and did not have children were systematically assigned lower risk scores than those who did.
Between the lines
Keeping all these findings in mind, Eticas recommends:
- Removing access barriers at individual, group-based, and institutional levels.
- Increasing the number of officers specialized in gender violence.
- Providing early legal and psychological support to victims to ensure their understanding of the process and their well-being when responding to the VioGén questionnaire.
- Accompanying the VioGén score with a written justification from the police officer that provides an additional professional opinion.
- Seeking regular feedback from the victims and other stakeholders when evaluating and updating the system.
- Using the historical data already in the system to infer patterns of gender violence, and making these evaluations available to the general public to foster transparency and trust in the system.
- Promoting a public debate on the benefits and risks of incorporating machine learning techniques into VioGén.
The results of this external audit show that if Eticas managed to get this far without any access to the relevant data, much more could be done with access to it. With this work, we want to raise awareness of the lack of transparency and accountability in these extremely sensitive and impactful algorithmic systems.