
‘If the videos are posted, I will lose everything’



Victims of image-based sexual abuse face their abusers alone.

28/08/2023

Five years ago, Albana, then in her early 20s, met a man in his 30s during a seaside vacation. They got talking, took a liking to each other and kept in touch. They chatted often, mostly online, since Albana lived in Kosovo and he lived by the sea.

Around two years later, the relationship started to fade. Albana felt they had no future together and decided to end it.

She called her boyfriend to tell him. “If you don’t want to be with me anymore, I’ll send the videos to your father. I’ll post the videos,” he replied. He was threatening to publicly share intimate photos and videos that he had taken, often without her consent, while they were together.

“It crushed me,” Albana (a pseudonym) said about the impact this had on her. Now nearing her 30s, Albana spoke with K2.0 in June, three years after her ex-boyfriend first threatened her. Since then, he has blackmailed her into sending him money, stalked her and occasionally coerced her into meeting him. For three years she has tried to end the abuse, including by seeking help from the police. She has been unsuccessful.

At the beginning of August, K2.0 talked to Merita (a pseudonym). Three years ago, when Merita was 21 years old, she was getting ready for a party with her friends. The party turned into a nightmare when a video was sent to her on Snapchat with the caption “Aren’t you ashamed?” She opened the video. It showed a couple having sex, and Merita’s name and Snapchat username were written in the video’s description. The girl looked a lot like Merita, but it wasn’t her.

“The video was sent to my whole family, my cousins, my sisters and their husbands. It was sent to everyone,” said Merita, adding that she didn’t make a big deal out of it at first, believing that it was just a tasteless joke.

Within six hours, 1,000 people had added her on Snapchat. As the messages started to pour in, she realized that this wasn’t a prank.

Before long, things spread offline. “People started messaging me everywhere,” said Merita. “When I went out into the city, everyone stared at me. People said: ‘look, it’s that whore.'” Although the video was sent to her by an unknown profile, Merita believes that the perpetrator was someone who had wanted to have a relationship with her, but whom she had turned down.

The video spread across social media in a matter of hours. The impact it had stays with Merita to this day. “It was discussed for a whole year. Now I know that the video has been removed, but there are people who still believe that I was part of it,” she said.

These abusers try to control their victims’ lives by using private and intimate images as leverage to demand sexual favors or to harass, humiliate, isolate or financially exploit them. In Kosovo, instances of this type of abuse often make the headlines, but it is rarely called by its name and almost never recognized as abuse. The abuse has taken new forms in recent years as technology and social media have evolved.

In 2022, model Arta Nitaj and fitness trainer and model Juliana Nura were in the headlines in two cases that garnered significant attention and controversy. The public response showed the prevailing attitudes toward victims of sexual abuse.

In July 2022, a man named Teddy Gray published what he claimed were private messages between a man and Nitaj. The man wrote “I want another night with you or I will post the photo.” The messages continued “that night, you took 5,000 euros from me.” Later, Gray published an intimate photo of a woman he claimed was Nitaj.

Many online headlines and social media posts focused on the claim that Nitaj was escorting for 5,000 euros a night, calling it “scandalous.” Unmentioned was the fact that Nitaj was a victim of image-based sexual abuse. Many media outlets republished the photo, multiplying its presence online. This led to an online hate campaign against Nitaj.

A little later that year, in December, the first Kosovar edition of the reality show Big Brother VIP aired. One of the houseguests, Juliana Nura, spoke about her experience as a victim of image-based sexual abuse. Nura said she wanted to use her participation in the show as a way of advocating for other victims of this abuse. 

Nura said that in 2010 she received an email containing an intimate video of her. The sender had asked for money in exchange for not posting the video. Nura paid the money out of fear. However, a few days later the blackmailer asked for a second, larger sum of money. When Nura was unable to pay, the video was posted on YouTube. “In 30 seconds, you can lose everything,” Nura said in an interview, adding that the experience left her humiliated and ashamed. She said that everything she had achieved in life up to that point had been lost.

After Nura shared her story on the reality show, many, including some of the other contestants, blamed and harassed her. They suggested that she was to blame and not whoever released the video. In the comment sections of some online media outlets, the images that had forced Nura to leave the public eye for 12 years started to reappear.

What Albana, Merita, Nitaj and Nura experienced is often known as revenge porn. However, the Cyber Rights Organization (CRO), an online rights organization based in the Netherlands, recommends avoiding the use of this term. CRO explains that the term revenge porn can imply that the victim deserves to have their intimate content shared as a response to something and that using the term contributes to the culture of victim blaming. They also argue that referring to intimate content created for private relationships as pornography can harm victims, as it makes them seem like consenting adult actors.

The increasingly used terminology is the “sharing of intimate images without consent” or “image-based sexual abuse.”

Image-based sexual abuse refers to the sharing of sexually explicit images of an individual without their consent. Images are distributed online, such as through email, social networks, porn sites and messaging platforms, as well as offline. The sharing of images that have been photoshopped, edited or altered can also be considered image-based sexual abuse. According to End Cyber Abuse, a global collective of lawyers and activists working against technology-facilitated gender-based violence, consent is required on two levels: before the image is captured and before it is shared with a third party.

The Council of Europe, in their 2021 report “Protection of women and girls from violence in the digital age,” recognizes image-based sexual abuse as one of the most common forms of technology-facilitated violence against women. According to CRO statistics, over 90% of victims of image-based sexual abuse are women.

Far from the public eye and separated from support networks, many women and girls in Kosovo face this abuse alone.

In addition to talking with Albana and Merita, in February 2023 K2.0 released a questionnaire to learn more about victims of image-based sexual abuse. This allowed victims to share their stories.

The questionnaire asked when the abuse took place, the impact it has had on the victim, the reaction of their friends and family, whether it was reported and dealt with by institutions and victims’ access to helpful resources.

The questionnaire received responses from 20 victims: 18 women and two men. In their answers, the victims described the abuse and the lack of support from families, friends and institutions. The abuse they described dates back as far as two decades, when the primary means of distribution were email and text messages, and continues up to the present day. The advent of new communication platforms has since increased opportunities for this type of abuse.

Ever-shifting abuses

In response to our questionnaire, a woman from Prishtina, now 35, recounted something that happened to her in 2004. When she was 16 she broke up with her emotionally abusive boyfriend, but instead of freeing her from the abuse, the breakup made it worse. Soon, various people in her circle, including family members, started telling her that an intimate video of her was circulating through emails and messages.

The video, which her ex-boyfriend recorded without her consent, was circulated online from 2004 until around 2010, five or six years after the breakup.

Fearing the potential consequences if the video was circulated further, at a time when social media use was increasing, she was forced to leave Kosovo. “During that time, social networks began to be used more and I became terrified that the video would be shared again,” wrote the 35-year-old. “I have never confronted the person who recorded the video. I left the country for several years.”

Three of the women who responded to the K2.0 questionnaire said that they were minors when they experienced image-based sexual abuse.

A 24-year-old woman from Gjakova wrote that she broke up with her boyfriend five years ago. After she told him that she wanted to end their relationship, he spent the next year threatening to publish an intimate video of them, which he had already shown to others in person. In addition to filming a sexual act without her consent, the victim said that her ex-boyfriend had drugged her.

Subjecting their victims to constant anxiety and the threat of releasing these images gives the perpetrator the freedom to continually change their demands. One of the most common demands is for the victim to engage in sexual intercourse and to produce more material that the perpetrator can later use for further threats.

This is what Albana is experiencing. The man she met on the beach forces her to send him more videos. “He kept video calling me and forced me to do stuff. Being afraid, I sent him videos and did anything he asked — just trying to keep him away from me,” said Albana. She said that he has forced her to go to a hotel with him several times.

Stalking her for many years, he has become like her shadow. When he cannot follow her physically, he asks her to send him photos to prove where she is, limiting her movement and interaction with others on a daily basis.

“He was threatening me, he said ‘okay, we will not talk anymore, we will see each other another six times and I will leave you alone.’ I met with him once after that. He would come to my city and every time he came, he would pressure me. ‘I’m at your door, I’m here, come out,'” said Albana.

When Albana was set to get married to someone else, she told her abuser the news in the hope that he would finally leave her alone. Instead, he changed his style of blackmail. Now, he asks Albana for money. “Without my family being aware, I have sent him 8,000 euros. And this was kind of an agreement that he would stop,” said Albana. In order to pay her abuser she was forced to take out loans and is now in debt. “I don’t even have enough money for myself. I am at zero, in the negative even,” she said. She is now worrying about how she is going to pay the 4,000 euros that he recently demanded.

Albana has unsuccessfully appealed to her abuser’s family and friends, but she fears he may try to involve her husband. He continues to open fake social media profiles to harass her, mainly on Instagram and TikTok.

Responses to our questionnaire indicate that these photos are being shared on Facebook, Instagram, TikTok, Snapchat, Viber and WhatsApp. Most of these social networks and communication apps have policies and guidelines against this phenomenon, but users are usually required to report the content before it is considered for deletion.

Although social networks have begun to deal with this type of abuse, the digital space continues to be unsafe for women.

In recent years, the development of artificial intelligence (AI), especially deepfakes — an image or recording that has been altered and manipulated to misrepresent someone as doing or saying something that was not actually done or said — has also increased opportunities for abuse.

In 2019, Deeptrace, a company whose mission is to protect individuals and organizations from the harmful effects of AI-generated content, published an analysis of 14,678 deepfake videos from across the internet. Based on their analysis, 96% of these videos contained pornographic content, and all of those pornographic videos involved manipulated images of women. “Deepfake pornography is a phenomenon that exclusively targets and harms women,” the analysis states.

This use of technology seems to imitate the injustices that women experience offline, leaving them alone and with few resources to fight online abuse.

Many countries have failed to establish legislation and frameworks to address and ban AI-generated deepfake pornography. In April 2021, the European Commission proposed a regulatory framework for AI, which would be the first of its kind.

Through this proposal, the European Union aims to ensure that AI systems in Europe are safe and non-discriminatory and to create a system for victims when their rights are violated. The legislation is still under review by EU lawmakers.

Victims are shamed and alone

In Kosovo, victims face image-based sexual abuse in a context of shaming and victim-blaming. Despite being widespread, cases of image-based sexual abuse are often not given proper attention due to prejudices. Instead of addressing the abuser’s actions, the victim tends to be asked why they took the photo or video.

The Kosovo Women’s Network (KWN), a network that advocates for gender justice, said that the patriarchal mindset — men exercising power over women’s bodies and lives — paves the way for image-based sexual abuse.

KWN has observed a correlation between gender-based violence and image-based sexual abuse. “Victims do not explicitly mention revenge pornography. They don’t always talk about it or report it due to prejudices. In addition to describing years of violence, victims also mention threats and blackmailing involving videos and/or photos,” said a KWN representative.

The fear of being judged has made Merita, Albana and the women who answered the K2.0 questionnaire choose to remain silent. They try to navigate this situation alone, not even daring to confide in their friends or family.

In fact, it is a common goal of abusers to isolate their victims, destroy their interpersonal relationships and limit their social and professional life. In Kosovo, victim-blaming enables abusers to achieve their goal.

At one point Albana gathered the courage to talk to a relative about her situation, hoping that they would support her. Instead, they stopped speaking to her and she had to keep quiet about her abuse in order to rebuild the relationship. She has not dared to ask for help from her family again, fearing that they would cut her off completely. 

“I know that if the videos are made public, I would lose everything,” said Albana, who would rather fight her abuser alone than risk losing her family.

This loneliness has changed Albana. “I am very unhappy. I cannot concentrate. I have no will to do or pursue anything. I have zero motivation,” she said.

The impact of this abuse stays with the victims for a long time. “The last time I saw him was on a plane. I was concerned, or maybe even anxious. I realized that I’m still traumatized from that period, and that maybe I need to seek professional help,” wrote a 35-year-old woman from Prishtina in the K2.0 questionnaire, almost two decades after her video was shared online.

According to CRO, victims are scared of potential offline, physical attacks from their abusers, which contributes to their isolation.

Prishtina-based psychologist Kaltrina Ajeti, who has treated a victim of image-based sexual abuse for over a year, said that shame and embarrassment are instrumental in maintaining a society that tries to sweep the experiences of these women under the rug. “Stigma plays a fundamental role in this phenomenon. Knowing this, abusers threaten to post the material. The victims are constantly psychologically abused, hoping that the material won’t be published and they won’t be stigmatized by society,” she said.

Ajeti emphasizes the importance of support from family and friends in dealing with image-based sexual abuse. However, the victim’s families and friends often stop them from reporting their abuse. “They oppressed me emotionally. They didn’t allow me to report it because it would be shameful,” said a 20-year-old in the K2.0 questionnaire, adding that her family blamed and isolated her.

Other women responded similarly. Victims noted being excluded by their families and looked at with disdain after opening up about their abuse.

Victims of image-based sexual abuse are not just afraid of the reactions of their friends and families. Albana, Merita and the majority of those who responded to the K2.0 questionnaire said that they did not report the case due to a lack of trust in public institutions, especially in the Kosovo Police, which should be the institution where the cycle of violence ends.

‘Even the police judge you’

When making an official complaint about image-based sexual abuse, victims tend to turn first to the police. Afterward, other parties such as the Center for Social Work (CSW) and the prosecutor’s office may get involved.

It took time for Albana to overcome her distrust of the police. “If he goes to jail, I know he will become crazier. And then, what will happen when he’s released? I don’t know what the police can offer me. I know many cases that have been reported to the police and then [the perpetrators] went and killed women,” she said. Despite her misgivings, in 2022 she picked up the phone and called the police out of desperation, because the situation was getting worse.

“I called them once and asked them how they could help me. They told me: ‘you need to come to the station, or we have to come to your home in order to report the case,’ but I didn’t want to meet them at home, or at the police station. What if someone saw me at the police station? I just said that I could not,” she said.

The police did not offer her another option.

Albana did not pursue reporting the case to the police. Their response had reaffirmed the lack of faith she had in them. “In our country, even the police judge you. Even the police ask ‘what were you doing with him?'” she said.

Merita has not called the police. She was scared that if she reported her abuse, her family and her abuser’s family would fight. Most of the women who responded to the K2.0 questionnaire have the same fear as Merita.

A 24-year-old girl, whose ex-boyfriend shared an intimate video, wrote, “I was afraid to report it because I thought the police would spread the word, because they know my family. I was also ashamed that this happened to me and I kept it a secret.”

According to the answers in the questionnaire, there have been cases in which reporting to the police only created new opportunities for the victims to be mocked or disregarded. When asked how she was treated by the police after making a report, a 21-year-old woman from a town near Prishtina wrote: “As a joke. They made fun of it, between the staff,” and indicated that such treatment pushed her to abandon the case.

According to the Kosovar Center for Gender Studies — which critically examines police actions — prejudice, lack of confidentiality, stigmatization and victim-blaming are some of the factors which affect victims’ confidence in reporting to the police.

K2.0 asked the Kosovo Police about the procedures they follow when dealing with image-based sexual abuse, but has not received an answer.

Image-based sexual abuse is not recognized as a criminal offense and is not specifically regulated. When K2.0 asked the police for data on the number of cases involving this type of abuse, they only provided general data. The Criminal Code covers different aspects of sexual abuse, such as Article 202, “Unauthorized photographing and other recording,” Article 230, “Degradation of sexual integrity,” and Article 329, “Blackmail.”

This offense is also covered by point three of Article 232, “Abuse of children in pornography.”

According to declassified police data, in 2022 there was one registered case of sexual degradation, one case of providing pornographic material to someone under the age of 16, 13 cases of abuse of children in pornography and 115 cases of unspecified forms of blackmail. Between January and February 2023, two cases of abuse of children in pornography and 27 cases of blackmail were registered.

K2.0 has followed police incident reports since the beginning of 2023. Based on these, there have been around 10 registered cases characteristic of image-based sexual abuse. Two of these cases were classified as harassment, two as blackmail, one as rape and five as abuse of children in pornography.

Image-based sexual abuse is not mentioned in Kosovo’s existing legislation, such as the Law on the Prevention and Fighting of Cyber Crime, the Law on Protection against Domestic Violence and the Law on Gender Equality. KWN said the lack of recognition of image-based sexual abuse in Kosovo’s legislation is a problem. KWN also indicated that it has recommended that the draft Law on Protection from Domestic Violence and Gender-Based Violence cover this abuse. While the law is still being drafted, KWN is monitoring whether image-based sexual abuse will be included.

However, even if it is included in Kosovo’s legislation, KWN remains concerned that the institutions have neither the practical experience nor the legal framework to deal with this complex type of abuse.

Nevertheless, in 2022 the Chief State Prosecutor’s Office appointed coordinators for sexual criminal offenses in every prosecutor’s office.

Shpejtim Peci, coordinator for Mitrovica, said that there are many cases of rape that happen under the threat of image-based sexual abuse and other cases which contain elements of this crime. According to him, framing this abuse specifically as a criminal offense would facilitate the work of the prosecutor. As it stands, it is difficult to distinguish such cases when they are not recognized as a separate criminal offense.

K2.0 asked the Ministry of Justice if they plan to make image-based sexual abuse a separate criminal offense, but has not received an answer.

Besides the police, the Center for Social Work plays an essential role in supporting victims. In Kosovo, there are a total of 38 CSWs, which are mandated to provide services for 47 categories of people, including victims of sexual crimes. 

Vebi Mujku, former director of the CSW in Prishtina, told K2.0 that often the victims are not aware that they can report the case to the CSW, adding that the police do not inform the victims that they can receive help from CSWs. Mujku also noted the lack of psychologists at CSWs, which prevents them from offering the full range of services they are tasked with.

Albana, meanwhile, sees no chance of safely ending her abuse through state institutions. An ongoing victim of image-based sexual abuse, she says her future seems distant and elusive.

“I’ve reached the point where I cannot even dream anymore,” she said through tears. She told her story in the hope that when other women face what she is going through, they will know they are not alone.

Feature Image: Dina Hajrullahu / K2.0.

The content of this article is the sole responsibility of K2.0.
