
Coded gender discrimination
Artificial Intelligence reproduces inequalities against women.
11.12.2025
Even in the most widely used generative AI tools, such as ChatGPT, forms of gender bias against women have been observed.
The data on which AI systems are trained reproduce existing inequalities in the real world.
Even a search engine like Google, which we use in our daily lives, can algorithmically reproduce deep gender discrimination.
The public, decision-makers and users need a better understanding of how AI systems work: how they are built, what data they are trained on, what risks they pose and how discrimination in their output can be identified.

Kaltrina Shala
Kaltrina Shala is a lawyer specializing in technology law, AI and the regulation of digital platforms. She lives in Berlin, where she provides expertise on digitalization policies, the legal implementation of technological innovations and the regulation of algorithmic systems.
DISCLAIMER: The views of the writer do not necessarily reflect the views of Kosovo 2.0.
This story was originally written in Albanian.
Want to support our journalism? Join "HIVE" or consider a donation.