European Schoolnet Podcast Episode 3 - The role of AI in Gender-Based Violence.

European Schoolnet
25 Jun 2024 · 39:45

Summary

TL;DR: The European Schoolnet podcast episode discusses the critical role of AI in addressing online gender-based violence. Postdoctoral researcher Sylvia Simine, an expert in technology-facilitated gender-based violence, highlights the risks of AI misuse, such as deep fakes and algorithmic bias, and their impact on women. She emphasizes the importance of ethical AI development that includes marginalized communities, and stresses the need for societal and cultural change alongside legal measures to combat this form of digital violence.

Takeaways

  • 🌐 The podcast episode discusses the role of AI in addressing and potentially facilitating gender-based violence online, highlighting the transformative potential and risks of AI, such as data privacy, ethical implications, and bias.
  • πŸ” AI systems can be misused to create and distribute deep fakes and manipulated content, which often targets women, leading to significant psychological and social impacts.
  • 🧐 The intersection of AI and gender-based violence is nuanced and ever-changing, with risks including online harassment, defamation campaigns, and the reinforcement of harmful gender stereotypes.
  • πŸ’» The misuse of AI tools often stems from a desire to target and harm women, reflecting broader societal issues of misogyny and the need for more inclusive and ethical technology development.
  • πŸ“ˆ The growth of online misogyny and the accessibility of AI tools contribute to the rise in gender-based violence, with young people potentially using these tools out of curiosity or a lack of understanding of the harm they cause.
  • πŸ“š The importance of sexual and digital education is emphasized to equip young people with the knowledge to navigate the digital world safely and understand the implications of technology on gender relations.
  • 🌐 The campaign for the criminalization of non-consensual image sharing in Italy demonstrates the power of collective action and political engagement in driving legal and societal change.
  • πŸ”‘ The role of intersectional feminism in guiding technological development is highlighted, emphasizing the need for diverse and inclusive perspectives to prevent the perpetuation of harmful biases.
  • πŸ”’ The discussion underscores the non-neutrality of data and algorithms, and the necessity of transparency and ethical considerations in AI development to avoid reinforcing societal biases.
  • πŸ›‘οΈ The need for a cultural and ethical shift alongside technological advancements is stressed, rather than relying solely on technical solutions to combat gender-based violence.
  • 🌟 The episode concludes with a call to action for continued efforts in education, political will, and cultural change to create a safer digital environment and address gender-based violence effectively.

Q & A

  • What is the main focus of the European Schoolnet podcast series episode featuring Sylvia Simine?

    -The episode focuses on the intersection of artificial intelligence and gender-based violence online, discussing the role of AI in facilitating or addressing such violence.

  • What are some examples of how AI can be misused to facilitate gender-based violence online?

    -Examples include deep fake technology for creating manipulated explicit content, AI algorithms used for online harassment or doxing, and AI chatbots programmed with biases that reinforce gender stereotypes.

  • What percentage of deep fake content online is sexually explicit, and who is the majority of this content targeting?

    -96% of deep fake content online is sexually explicit, and 99% of this content targets women.

  • How can generative AI algorithms contribute to gender-based violence?

    -Generative AI algorithms can be used for defamation campaigns, automating online harassment, or escalating violence through the spread of hateful or threatening messages.

  • What is the potential issue with AI-driven chatbots and virtual assistants?

    -They can be programmed with biases that contribute to reinforcing harmful gender stereotypes.

  • What is the term used to describe the immersive reality experiences where online violence can be more vividly experienced, and what is an example of such violence?

    -The term is 'metaverse', and an example of violence in this context is 'Meta rape'.

  • What is the role of algorithmic bias in AI systems?

    -Algorithmic bias can unintentionally perpetuate and amplify societal biases present in the training data, leading to discriminatory outcomes.

  • What was the main finding from Sylvia Simine's research on non-consensual dissemination of intimate images on Telegram?

    -The research found that these channels were used by men to construct masculinity by using women's bodies as trade currencies, normalizing non-consensual sharing of intimate images.

  • How do AI systems reinforce gender stereotypes, according to Sylvia Simine's discussion on social media platforms?

    -AI systems on social media platforms can reinforce gender stereotypes through biased algorithms that decide what content is seen, often encoding discriminatory social norms into their operations.

  • What are some of the issues with AI systems in terms of data collection and privacy?

    -AI systems are often trained on data scraped from the internet without consent, raising ethical concerns about the use of personal data.

  • What is the importance of including marginalized communities in the development of AI technology?

    -Including marginalized communities ensures a more inclusive and diverse development of technology, leading to better results and innovations that avoid perpetuating existing biases and discrimination.

  • What was the outcome of Sylvia Simine's campaign against non-consensual image sharing in Italy?

    -The campaign led to the criminalization of non-consensual image sharing in Italy, although the law did not initially extend responsibility to digital platforms.

  • What is the role of education in addressing the issues discussed in the podcast?

    -Education plays a crucial role in raising awareness about the risks and ethical implications of AI, promoting a better understanding of consent, privacy, and the impact of technology on society.

  • How can young people be empowered to navigate the digital world safely?

    -By providing them with digital and sexual education, fostering an understanding of technology's impact, and promoting critical thinking to avoid falling into traps set by AI systems or malicious online behaviors.

Outlines

00:00

πŸ€– AI and Gender-Based Violence: The Transformative and Risky Potential

The podcast episode begins with a discussion on the role of artificial intelligence (AI) in addressing and potentially facilitating gender-based violence online. The host, Ena, introduces the topic by highlighting the transformative potential of AI and its associated risks, such as data privacy, ethical implications, and AI bias. The guest, Sylvia Simine, a researcher and international activist, shares her expertise on technology-facilitated gender-based violence. The conversation touches on the misuse of AI systems, including deep fakes and manipulated content that can perpetuate harmful gender stereotypes and facilitate online violence.

05:03

πŸ”Ž The Nuances of Online Gender-Based Violence and AI

Sylvia provides an in-depth look at the types of online gender-based violence risks that young people and children may encounter with the use of AI. She discusses the ever-evolving nature of these risks, starting with deep fake technology, which is predominantly used to create sexually explicit content targeting women. Sylvia also mentions other AI-related issues such as defamation campaigns, online harassment, doxing, and the reinforcement of gender stereotypes by AI-driven chatbots and virtual assistants. The conversation emphasizes the importance of understanding the intersection of AI and gender-based violence as a complex and dynamic phenomenon.

10:03

🚫 The Misuse of AI Tools and the Predators' Motivations

The discussion delves into the reasons and motivations behind the misuse of AI tools, particularly by predators and offenders. Sylvia explains that the creation of deep fakes and manipulated content can stem from various motivations, including targeting women to silence their voices online and reinforcing traditional masculine values. She also touches on the accessibility of deep fake tools and the potential for misuse by individuals who may not fully understand the harmful consequences of their actions. The conversation highlights the broader social issues related to misogyny and the need for comprehensive sexual education to address the underlying problems.

15:05

πŸ“š Research Insights on Non-Consensual Image Sharing and Masculinity

Sylvia shares her research findings on the non-consensual dissemination of intimate images, specifically within Telegram groups. Her study, conducted with colleague Lucha Botti, explored the social reasons behind the formation of these misogynistic channels and the role of masculinity in such behaviors. The research revealed a 'homo-socialization' dynamic where men use women's bodies as a currency to construct their masculinity. Sylvia discusses the shocking findings, including the existence of an archive referred to as 'the Bible,' and the implications of these behaviors on societal norms and values.

20:07

πŸ”„ Algorithmic Bias and the Perpetuation of Gender Stereotypes

The conversation shifts to the role of AI algorithms in reinforcing gender stereotypes. Sylvia provides examples of how AI systems, including those on social media platforms, can perpetuate discriminatory norms and biases. She discusses the opaque nature of these algorithms, which often operate as 'black boxes,' making it difficult to identify and address the biases they propagate. Sylvia emphasizes the need for transparency and the inclusion of diverse perspectives in the development of AI to prevent the perpetuation of harmful stereotypes.

25:08

🌐 AI Data Collection and the Ethical Implications of Generative AI

Sylvia explains the technical aspects of how AI systems collect and use data, focusing on the ethical implications of generative AI. She discusses the non-consensual scraping of data from the internet to train AI models, which can lead to the propagation of existing biases. Sylvia stresses the importance of considering data neutrality and the need for an ethical approach to AI development, particularly with regards to intersectional issues such as gender, race, class, and sexual orientation.

30:11

πŸ” The Impact of AI on Sexual Education and Youth

The discussion addresses the impact of AI on sexual education, particularly in the context of platforms like Pornhub. Sylvia shares her research findings on how these platforms track users and personalize content based on collected data, often reinforcing the heterosexual male gaze. She argues for the importance of comprehensive sexual education to counteract the potential negative influences of AI-driven content on young people's understanding of sex and relationships.

35:12

πŸ›‘οΈ The Role of Digital and Sexual Education in Navigating AI Risks

Sylvia discusses the importance of digital and sexual education in helping young people navigate the risks associated with AI, such as automated harassment and the manipulation of emotions by AI chatbots. She emphasizes the need for a collective understanding of privacy, intimacy, and consent, rather than placing the burden on individuals. Sylvia also highlights the role of political will in implementing ethical guidelines for AI development and the importance of cultural change alongside technological advancements.

πŸ›οΈ Legal and Cultural Shifts in Addressing Non-Consensual Image Sharing

Sylvia shares her experience with the campaign that led to the criminalization of non-consensual image sharing in Italy. She describes the process from its online inception through a petition to the eventual legal changes. Sylvia emphasizes the importance of continued advocacy, both nationally and internationally, to address this transnational issue. She concludes by stressing the power of collective action and the necessity of legal and cultural shifts to achieve justice for survivors of online gender-based violence.

Keywords

πŸ’‘European School Net

European School Net is an organization that collaborates with various projects to enhance education through digital means. In the context of the video, it is the host of the podcast series discussing the intersection of AI and gender-based violence. The organization's role is to facilitate discussions on online safety, digital rights, and media literacy.

πŸ’‘Gender-based violence

Gender-based violence refers to harmful acts perpetrated against individuals based on their gender. The video discusses how AI can both facilitate and address this issue, with examples such as deep fakes and algorithmic biases that disproportionately affect women. The term is central to the podcast's theme, highlighting the need for awareness and action against such violence in digital spaces.

πŸ’‘Artificial Intelligence (AI)

Artificial Intelligence, or AI, is the development of computer systems to perform tasks that typically require human intelligence, such as learning, problem-solving, and pattern recognition. The video explores the transformative potential of AI but also its risks, particularly in the context of gender-based violence online, emphasizing the need for ethical considerations in AI development.

πŸ’‘Deep fakes

Deep fakes are AI-generated synthetic media in which a person's likeness is swapped with another's in a video or image. The script mentions that 96% of deep fake content online is sexually explicit, with 99% targeting women, illustrating how this technology can be misused to facilitate gender-based violence.

πŸ’‘Algorithmic bias

Algorithmic bias refers to the systemic bias that occurs in AI systems due to the use of biased training data or algorithms that favor certain outcomes. In the video, it is discussed how AI systems can unintentionally perpetuate and amplify societal biases, affecting areas such as hiring processes or even risk assessment in cases of domestic violence.
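To make the amplification mechanism concrete, here is a minimal, hypothetical sketch (not from the episode) of the dynamic described above: a model that simply optimizes accuracy on a skewed dataset does not merely reproduce a 70/30 bias in its training data, it hardens it into a 100% biased output.

```python
from collections import Counter

# Hypothetical toy dataset standing in for biased web-scraped training
# data: 70% of "engineer" examples are associated with "he".
training_data = [("engineer", "he")] * 70 + [("engineer", "she")] * 30

def majority_baseline(data):
    """Predict the most frequent label seen in training -- a stand-in
    for any model that maximizes raw accuracy on skewed data."""
    counts = Counter(label for _, label in data)
    most_common, _ = counts.most_common(1)[0]
    return most_common

# The data is 70% "he", but the model answers "he" for every future
# "engineer" query: the societal skew is amplified, not just preserved.
prediction = majority_baseline(training_data)
print(prediction)
```

Real recommendation and generative systems are vastly more complex, but the underlying failure mode is the same: without deliberate auditing and debiasing, optimizing for accuracy on biased data rewards the model for encoding the bias.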

πŸ’‘Digital citizenship

Digital citizenship encompasses the rights, responsibilities, and behaviors expected of individuals in the digital environment. The speaker, from European Schoolnet's digital citizenship department, discusses the importance of addressing online safety, digital rights, and the skills needed to navigate the digital world safely and responsibly.

πŸ’‘Online harassment

Online harassment is the use of digital platforms to threaten, intimidate, or harm others. The video script mentions generative AI algorithms being used to automate online harassment, indicating the intersection of technology and negative social behaviors that need to be addressed to ensure a safe digital environment.

πŸ’‘Ethical implications

Ethical implications relate to the moral and principled consequences of actions, particularly in the development and use of technology. The video discusses the ethical concerns surrounding AI, such as data privacy and the potential for misuse, emphasizing the need for ethical guidelines in AI development and deployment.

πŸ’‘Sexual education

Sexual education is the teaching of issues related to human sexuality, including emotional, social, and physical aspects. The video highlights the importance of providing comprehensive sexual education to young people, particularly in the context of AI and digital platforms that can influence their understanding of sexuality and relationships.

πŸ’‘Data privacy

Data privacy refers to the protection of personal information from unauthorized access or disclosure. The script discusses the risks associated with AI systems, such as the non-consensual scraping of data from the internet for training AI models, which raises significant concerns about individual privacy rights.

πŸ’‘Feminist activism

Feminist activism is the advocacy and action promoting women's rights, gender equality, and the end of sexism. The video features a postdoctoral researcher and activist who has contributed to the criminalization of non-consensual sharing of intimate images in Italy, showcasing the impact of feminist activism in shaping laws and societal attitudes.

Highlights

The podcast discusses the role of AI in addressing and potentially facilitating gender-based violence online.

AI systems can be misused to create deep fakes and reinforce harmful gender stereotypes.

96% of deep fake content online is sexually explicit, with 99% targeting women, indicating a gendered risk.

AI-driven chatbots and virtual assistants can be programmed with biases that perpetuate gender stereotypes.

Algorithmic bias in AI systems can unintentionally amplify societal biases present in training data.

The importance of addressing online misogyny and its impact on the creation of deep fake content.

The role of sexual education in helping young people understand the violence behind the misuse of AI applications.

A study on the nonconsensual dissemination of intimate images on Telegram, revealing a social reason behind the sharing of such content.

The campaign for the criminalization of non-consensual image sharing in Italy and its impact.

The need for transparency in how platforms like Pornhub track users and the implications for privacy.

AI systems often lack inclusivity, with teams developing AI being predominantly composed of white men.

The potential for feminist activism and approaches to guide the development of more ethical AI technologies.

The importance of understanding AI as a cultural and social issue, not just a technical one.

The role of intersectional feminism in addressing biases in AI and promoting more inclusive technology.

The potential for AI to be used for good, as demonstrated by feminist hackers and activists.

The challenge of educating young people about the risks of AI chatbots and the importance of understanding consent and privacy.

The MANABO toolbox, an interactive game aimed at empowering digital gender dialogue and challenging stereotypes.

Transcripts

[00:06] Ena: Welcome to the third episode of the European Schoolnet podcast series, which today we are running together with the MANABO project, which aims to tackle online gender-based violence. Today we're talking about the intersection, the role that AI plays in addressing gender-based violence. Artificial intelligence is one of the most discussed topics nowadays, mostly because of its transformative potential, but also because of the risks that it poses, particularly concerning data privacy, ethical implications and AI bias. For instance, AI systems can be misused and can facilitate gender-based violence online, from deep fakes and manipulated content to AI algorithms that reinforce harmful gender stereotypes.

[00:55] My name is Ena and I'm working in the digital citizenship department of European Schoolnet, where we tackle topics related to online safety, digital rights, digital skills and media literacy, and everything related to digital environments and the use of technology. Today I'm here with an amazing guest, Sylvia Simine, who is a postdoctoral researcher whose expertise focuses on technology-facilitated gender-based violence online. She is also an international activist, and thanks to her contribution the non-consensual sharing of intimate images is now criminalized in Italy. So I'm very much looking forward to speaking with you today, Sylvia. I have a lot of questions, so thank you so much for joining us.

[01:39] Sylvia: Thank you, Ena, for the invitation, and to MANABO for inviting me. I'm very happy to be here and chatting.

[01:45] Ena: Perfect, so let's dive into it. My first question would be: could you give us a general overview of what type of risks we are talking about? When young people and children go online, what type of online gender-based violence risks can they face and encounter with the use of AI?

[02:05] Sylvia: Of course. I would start from the very beginning of the whole question. As you have rightly mentioned, AI right now is a buzzword, so we're seeing it everywhere. In the past year everyone has been discussing generative AIs and their harms and potential risks, and also, possibly, what they could do for good. But in the conversation around gender-based violence, particularly against women and young people, this is very interesting, because AI is showing us how gender-based violence online is a very nuanced phenomenon, and also that it is ever-changing together with technology. So it's very difficult to give an exhaustive overview right now, because we have to take into account that this is always changing and evolving.

What we can see is that in the past year, at least, there has been a growth of the risks embedded in online gender-based violence and its relationship with AI. The first example I'm thinking of, and I'm sure you must also be familiar with it, is what is known as deep fake technology: basically the use of AI for creating manipulated content, manipulated pictures, videos, audio. 96% of this content online is sexually explicit, meaning that most of this deep fake content is abuse, and 99% of it targets women. This shows us the gendered side of this particular risk, which can have a big detrimental impact on the reputation and on the psychological and social life of women and girls.

But let's not just focus on deep fake technology, because this is just one of the problems that AI can cause. I'm going to list these, and then, if you want, we can go into more depth. For example, generative AI algorithms can also be used against women and girls for defamation campaigns, or for automating what is called online harassment, or doxing. This is already happening, and it is also very important, for example, in light of political elections or crucial political moments, where women are of course the most targeted with hateful or threatening messages, which are now facilitated by generative AI, which can escalate this kind of violence. Then I also want to mention AI-driven chatbots and virtual assistants, which can be programmed with certain biases that contribute to reinforcing gender stereotypes. Then we have virtual reality, augmented reality, the metaverse, which are very immersive realities where online violence can also be lived in a more immersive way; we have what is called 'meta rape', for example, already experienced in the metaverse. And finally, maybe the most common, and something that has been around for a very long time now, is the algorithmic bias that can unintentionally perpetuate and amplify societal biases present in the training data. That's just an overview.

[05:31] Ena: Yeah, a very comprehensive overview. I think we'll need to go into details for each issue in particular. Am I right to say that some issues are related to the behaviour of users around AI tools and AI systems, while others are of course based on the AI algorithms themselves? If we're looking at the issue of misusing AI tools, maybe you've done some work in your research that gives us an understanding of this behaviour. What reasons do predators and offenders follow, what incentives do they have to act like this, to create deep fakes and manipulated content? What triggers them?

[06:15] Sylvia: It's very complicated; there isn't a simple answer to this question, I think, because the reasons can be very different depending on which level of creating deep fakes you are at. There are people who are at the very beginning of the implementation of deep fakes, so they can code, and they can openly decide to create deep fakes against women, for example. And then there are also users who do it in a more naive way, somehow, because these deep fake bots and applications are very easy to use and to find; they are free, they are very quick. So I would say it highly depends. We have to take into account that right now we are witnessing a moment of big growth in online misogyny. It means that the so-called manosphere, which we always thought was just a closed environment of the internet, is popping; the bubble is popping, so it's going everywhere, and we have a big monoculture, I would say, on the internet. So part of the reason, I would say, is often to target women and to harm them, to make sure their voices stay out of the internet, which has traditionally been very masculine, masculinized. This is specifically the case for those who openly decide to create this kind of content.

But I think that in general deep fake content is also understood as less malicious because of this idea of it being fake, which means we underestimate the actual harms that women and girls can experience when they go through deep fake abuse. As I was saying, 99% of this abuse targets women; these AIs are trained on women's bodies. So we have a huge social problem here, I think, especially with young men and boys who are using these kinds of tools out of, I think, a sort of curiosity, or thinking that this is not harmful. Again, they don't have a sexual education that makes them understand the violence behind using these kinds of applications and platforms. And I also have this feeling that much of it has to do with the so-called pornification of society, meaning we are exposed to a lot of porn images or sexualized images, and we also tend to look for more extreme content, for what is felt to be more authentic, more intriguing somehow. So deep fake technology can also provide this idea of having more authentic content: they want to target women that they know, for example, feeling that this is more exciting somehow. And this has a lot to do with masculinities as well.

[09:16] Ena: Very interesting. I think it's interesting to see how accessible these platforms are; to be honest, I was shocked when I discovered that, and also the fact that it's highly customizable, so you can use your imagination and fantasy and allow it to go in the directions you want it to, which can pose some risks. So my question to you, actually, in regards to masculinity: you've done some amazing work and a study on the Telegram case, if I'm not mistaken, on the non-consensual dissemination of intimate images. Could you tell us more about it?

[09:53] Sylvia: Yes. I think this research is very connected with what I was saying, also because, imagine, already back in 2020, when the research took place, deep fake porn was already there, meaning there were already bots for deep faking women and girls. Very briefly, what I did together with a colleague, Lucha Botti, was access 50 groups and channels; we created a sample of these abusive and misogynist channels on Telegram which are specifically created for sharing non-consensual sexual material of women and girls, just women and girls, and they are populated mostly by men. Our research had a question at its very basis, which was: why? Why is this happening? Why are these men meeting to share this type of content? What is happening, what is wrong, what is the social reason behind this?

Beyond finding material that was really shocking, because it was not just, of course, the sexually explicit content that we found non-consensually shared, we even found an archive called 'the Bible', where all this material was collected and encrypted to make sure that if the channel got closed all the material would be saved. So they have this almost sacred idea of this non-consensual material; that's why it's called the Bible. And there was also material of women who are already dead, like Tiziana Cantone, the Italian surv... well, not survivor, the Italian victim, whose case then also opened the discussion about the non-consensual dissemination of intimate images in Italy.

To answer our question and go back to masculinity: we understood that the problem was mostly about how these men and boys, of every age and from every kind of social class, were creating their social relations, their peer relations. It's what we called the homosocialization of peers, the heteronormative homosocialization, meaning that women's bodies are used as trade currencies to construct masculinity, to make other men and other boys tell you: you are a good man, you are a true man, you're a real man, because you desire and you like women and you like sex. In this sense, consent is negotiated or even hidden; women are just objects. That was, I think, the most shocking result, and it also tells us so much about what we are missing right now in society to limit this kind of problem.

[12:42] Ena: Wow, it's really shocking, and a little bit sad as well, to hear that stuff like this exists out there. Thank you so much for sharing, Sylvia. I would like to now move a little bit towards algorithms and gender stereotypes and the intersection between them. Could you give us some examples of how AI systems have reinforced gender stereotypes?

[13:05] Sylvia: Of course. There are many, many examples in this case. As I said, some of them have already been out there for a very long while, thinking for example of social media platforms. You know, sometimes it makes me smile that now we're discussing generative AI, and artificial intelligence at large, as if it were something new, something that has just been popping up in society. But social media platforms have been using artificial intelligence forever, for example to track ads, or to decide which content we can see, whose voices should be listened to or not. So this means that, for example, in this case of

play13:46

social media platforms one of the most

play13:49

um interesting examples and maybe the

play13:52

most known is censorship right so how

play13:55

these algorithms can decide with which

play13:59

bodies can be seen uh how women bodies

play14:02

for example should or should not be seen

play14:04

so it means that they have um basically

play14:09

encoded rules social norms that are very

play14:12

discriminating into writted Norms online

play14:16

so for example you know we cannot show

play14:18

our nipples online uh but men can and

play14:21

this is just arbitrary you know I don't

play14:23

think we never had a discussion on this

play14:26

but then if we move to other more

play14:28

complex system that are also already out

play14:31

there and I'm thinking for example of um

play14:34

systems algorithmic system that decide

play14:36

who to H so this was an example in

play14:39

Amazon um deciding who was uh the best

play14:44

candidate to cover a very high position

play14:47

in computer science inside Amazon and

play14:50

not surprising this algorithm was

play14:53

excluding women from um from the process

play14:56

because it was fed again with

play14:59

discriminatory data and you know we have

play15:01

a problem of women into uh Tech so the

play15:05

algorith was just reinforcing this
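[Editor's note] The failure mode described here — a screener trained on skewed historical decisions learning to penalise a keyword that merely correlates with gender — can be sketched in a few lines. This is a minimal, invented illustration: the data, keywords, and scoring rule below are made up and are not Amazon's actual system.

```python
from collections import defaultdict

# Invented historical screening decisions: resumes reduced to keyword
# sets, labelled with past outcomes. Past hiring skewed male, so a
# keyword like "womens" (e.g. "women's chess club captain") co-occurs
# with rejections even though it says nothing about ability.
history = [
    ({"python", "leadership"}, "hire"),
    ({"python", "databases"}, "hire"),
    ({"java", "leadership"}, "hire"),
    ({"python", "womens"}, "reject"),
    ({"databases", "womens"}, "reject"),
    ({"java"}, "reject"),
]

# Count how often each keyword appeared with each outcome.
counts = defaultdict(lambda: {"hire": 0, "reject": 0})
for keywords, outcome in history:
    for kw in keywords:
        counts[kw][outcome] += 1

def score(keywords):
    """Sum per-keyword (hire - reject) evidence; higher looks 'better'."""
    return sum(counts[kw]["hire"] - counts[kw]["reject"] for kw in keywords)

# Two equally qualified candidates; one resume mentions a women's group.
print(score({"python", "databases"}))            # 1
print(score({"python", "databases", "womens"}))  # -1
```

Nothing in the scoring rule mentions gender explicitly: the discrimination rides in on a correlated feature, which is exactly why auditing the training data, and not just the code, matters.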

play15:07

Another example that comes to my mind is from Spain, where the government decided to implement an algorithmic system used to classify the risk of violence a woman faced, based, for example, on her experience of stalking or previous violence. These algorithms, again, were fed with insufficient or biased data, and the result has killed women who were classified as low risk, you know, in Spain. What happens with many of these systems — and this is something I want to highlight — is that they are often very opaque; they are black boxes, where the data becomes very difficult to check. If we don't open these black boxes, we cannot confront the biases that are implemented within these systems. So the big problem right now with artificial intelligence and bias is that women are not included most of the time, and marginalized communities are not included. Often these AI teams and councils inside platforms, or inside these systems, are basically populated by white men — so imagine where this bias can come from. Sometimes we think that it's the technology, that it's just a bit of a flaw and then we can fix it, but I think these kinds of examples show us that it's actually a social problem, a cultural problem — it's not technical. It's culturally embedded already, and then we just train AI to continue this pattern.

Thank you so much, Silvia. Actually, this brings me to the question: how could you explain to us how AI collects data on us? I think it has become pretty evident that it's following us across all platforms, all networks, and all devices that we connect to, but could you explain in more technical detail, if possible, what types of technologies are used to collect this data?

Well, first of all I want to highlight that I'm not a technologist — I'm a sociologist. I study digital technology, so I know how these systems work, but if I say something that is not technically correct, please forgive me. However, how this technology works is basically by training the system with millions and millions of data points — I'm talking now about generative AI — which are often taken and scraped from the internet, and this often happens non-consensually. Actually, we have a huge ethical discussion around generative AI in art, in culture, in journalism, in information, because the data used to create the beautiful images we see out there come from images by artists that are already on the internet, which are scraped, put into these systems, and then used in training to give you an output that is a fabricated output. So the big problem with how these machines are trained is precisely data, because data are not neutral, algorithms are not neutral, and outputs will never be neutral, you know.
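[Editor's note] The point that skewed data produce skewed outputs can be shown with a toy model: a tiny co-occurrence "model" trained on an invented, deliberately stereotyped corpus simply hands the stereotype back. The corpus and code are purely illustrative.

```python
from collections import Counter

# A deliberately skewed, invented training corpus.
sentences = [
    "the nurse said she was tired",
    "the nurse said she was busy",
    "the engineer said he was busy",
    "the engineer said he was late",
]

def pronoun_for(profession):
    """Return the pronoun most often co-occurring with a profession.

    A miniature of how statistical models 'learn' associations:
    they can only count what the corpus already contains.
    """
    counts = Counter()
    for sentence in sentences:
        words = sentence.split()
        if profession in words:
            counts.update(w for w in words if w in ("he", "she"))
    return counts.most_common(1)[0][0]

print(pronoun_for("nurse"))     # she
print(pronoun_for("engineer"))  # he
```

The model regurgitates the bias of its corpus by construction — scaled up, the same mechanism is how large models trained on scraped web text reproduce societal stereotypes.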

play18:32

So we have to discuss data, we have to think about data, because if we don't, these algorithmic systems will only regurgitate the same biases that humans have trained them to adopt. I think this is the crucial issue we are facing right now with artificial intelligence. And from an intersectional point of view this is very relevant, because it's not just about gender — it's also about race, class, disability, sexual orientation, religion, and how all these dimensions intersect among themselves and with gender. It means that the people who will be most discriminated against by these systems — which now seem to be the revolution, which we think will make our lives better — will be the ones who suffer most from the consequences of having these kinds of technologies put into society without a prior discussion of their ethics.

Okay, very interesting — thanks, Silvia. I would like to come back to the issue of gender stereotypes, and I would like to ask you to elaborate a little more on the study you did on PornHub, about the heteronormativity that was prioritized on the platform. Could you tell us more about this study?

Of course. The study was also connected with the topic I was just commenting on, about algorithms and the so-called platform affordances that platforms offer to decide, again, which type of content you will consume. What we wanted to see on PornHub — since it's the most used platform for consuming pornographic content — was how PornHub was tracing users and personalizing their home page according to the data it was collecting. The first result, which I think is very interesting, is that at the time of the research PornHub was not openly telling users that they were being tracked. That was a violation of the GDPR, and it also opened a strategic litigation against PornHub for violating the rules — for not telling users that they would be traced. This is particularly important on pornographic platforms, because, you know, they are getting to know your sexual preferences and your sexual orientation, and they can send this to companies or use it against you. So we have to be transparent, again, about our data and how they are being used. The second thing, which I think is the most interesting result — and it's connected with what you were saying about heteronormativity and the male gaze on online platforms — is that although the platform was tracing users, and although we saw that data were actually being collected, PornHub had decided not to personalize the home page, which is pretty crazy if you think of how the rest of social media platforms work, since they are highly personalized. In the case of PornHub, it didn't matter whether you were a hetero woman, a lesbian woman, or a transgender woman or man: the content you were offered was always the content for a hetero man. This is exactly the result showing how pornographic platforms, too, embed this heterosexual male gaze. And since pornographic platforms are still used by so many young people to understand sex and inform themselves about sex, I think this is particularly crucial, because it tells us that without sexual education these platforms will be the ones telling young boys and girls what to expect their sexual relations to be — and we know that these platforms also host very extreme content, and a lot of non-consensual content. So I don't think we should leave sexual education to pornographic platforms.

Okay, fair enough. Actually, now I have a question about sexual education and technology. I know, Silvia, that you also run seminars and workshops in schools, trying to pass this knowledge on to young people. Could you give us a little insight into how that happens and what kind of response you get from young people? It's quite a sensitive topic, so I imagine it's very difficult to approach young people with it.

I mean, young people, I think, are very eager to learn. I have to say that in the past years I've done sexual education mostly through an NGO that I co-founded in Italy, Virgin & Martyr, which was born with the aim of doing sexual education and digital education. And I have to say it was often the young people themselves calling us to their schools. They were like: "Hi, can you please come? We have this autonomous day, an independent day where we can decide who to invite. Can you come? We would like to know more about, I don't know, gender identities, or even porn, bodies, you know, or digital sex." For us it was always very intriguing to see how much they wanted to receive sexual education — but in a country like Italy they can't, because we have a political willingness to limit and censor discourses around sexualities, and this creates a problem. So, again, I think this is really about politics. It's really about deciding whether we want to give young people the tools to understand what is happening around them, both in the digital and the offline dimensions, and in the in-between that both share — because otherwise we are leaving them alone, and I don't think that is fair. And I have to say that every time I went to a school, I came out with a very positive feeling, telling myself: even if I convinced just one person today about positive or inclusive discourses, it was worth it. There was always somebody raising their hand and saying, "Thank you for saying this — I am non-binary," or, "I had an experience of online sexual violence, so thank you for saying this." And I felt that saved them, you know. So: what if we had this every day? That is what I keep thinking, because two hours per year is not enough, and we are still having this problem all around Europe. We should have pathways of sexual education and digital education to make sure young people can feel safe online and offline.

Yeah, I agree. Actually, my next question is mostly about digital education, and it concerns AI chatbots, because I think it's a really worrying issue right now — especially since they can be trained for automated harassment, or can mimic the language and behavior of a trusted adult. Sometimes it's also extremely scary to realize that they can leverage your emotions: they can read when you feel lonely or sad and play on that to make you feel comfortable and secure, and so on. So my question is: how do we educate young people to converse with AI chatbots, to avoid these kinds of traps, and simply to be aware of them? What can they do and know?

I mean, again, this is complex, because I think we shouldn't assume that adults, in this case, can educate young people — often, if you go to people who are older than us, they are possibly even more fragile in front of these traps than young people, who live inside this kind of technology. So of course we all need to be educated, and the way to do it at a societal level is, first of all, to understand the risk — to understand what is happening, because things are moving at a huge speed, and having political responses in this sense is difficult, because you know that political institutions are always slower than technology. But what we need, again, is to understand the concepts of privacy, of intimacy, and, again, of consent — all of these concepts — as collective issues, not just individual issues. Otherwise the response will always be placed on the individual instead of on society as a collective, which, in my opinion, avoids providing political answers. So of course we need to do education, as I said, but I also want to highlight once again that this is about the political will to limit the implementation of certain malicious applications of AI. Because right now, when you discuss AI, especially in political institutions, it's all about innovation, it's all about letting go — "we cannot stop innovation", "we cannot limit technology", blah blah blah. And this is false. It is false because we have to put ethics at the core of innovation and technological development; otherwise it will always have bad consequences for society, and also for young people, who are of course more fragile. But what I want to say, Ina, is that I will never tell young people not to use technology, because that's impossible — they will always use it. Yet this is what they are often told, at school, by governments, by their parents, and this is not the right way to do it. It's more, again, about understanding together with them what is going on and what we could do about changing it.

Yeah, I agree. We cannot avoid technology: it's there, AI is there, and it will stay there, so we need to learn, and we need to help young people become more aware of how to use it and leverage its full potential, instead of just avoiding it — that is impossible. Okay, that's a very interesting insight, Silvia. So my question here is this: we know the complexity of the risks related to AI. Why can we not train AI — and I know some efforts have already been made — to detect predators and offenders and flag them, and to detect AI-generated content online? Can we trust those tools? Are they reliable enough? And if not, do you believe that in the near future we will be able to use AI to prevent these things from happening, instead of allowing and facilitating them?

So, again, I think my answer in this case will also be nuanced, in the sense that, first of all, I think technology could and should be more inclusive and more feminist. We need to use feminism as the core concept for developing technology, meaning we need to include more women and more marginalized communities in the development of technology to make sure we can have better results and better innovation. In this case, for example, feminist hacktivism — feminist hackers — shows us that there are implementations of technology, and even of AI, that can be used for good. But this, again, requires a very deep political understanding of the problems. When we understand the problems and we understand discrimination, we can think of alternatives. So, again, I think intersectional feminism should guide the change in this sense; otherwise there won't be any change at all. The problem is that most of these alternatives remain on the sidelines: they are not massified, and they are not used by the broad, general public — because you also know that the internet is not a democratic place; there are bubbles that are bigger, and there are platforms that are much more used than others, and those are not inclusive. On the other hand, my answer also connects with the idea that — since we have discussed so far how much this is a social and cultural problem — I don't think we necessarily need a technical response to this problem. I think technology will change together with society and culture. We don't need more technology; I think we need more culture, we need more ethics, again, and we need a broader understanding of consent. We also need to recognize that gender-based violence exists and is causing so much harm that we cannot just ignore it. And this is so important, because we always think that these technologies are just technical, just mathematics, and we forget how much human choice there always is in the systems we use. Even in platform moderation: we think moderation is done purely by algorithms online — it's not. There is always a human being deciding whether content stays or goes. So, again, I think we should work more on the human side of technology and less on the mathematics, and then we may well have better results in the end.

That sounds good, and quite positive — we still have some power, and we can make a change. And speaking of change, could you tell us more about the campaign you contributed to that led to the criminalization of non-consensual image sharing in Italy? I think it's very impressive, and it's also an example of how we can change the situation we are in right now. Silvia, please share it with us.

Thank you so much — yes, of course, I'll be happy to share it with you. I have to say it's a very long story, so I will try to summarize. It started back in 2017, when I began to work on online gender-based violence and image-based sexual abuse. I have to say that, as with most of us women feminists who start to work on these kinds of topics, it was due to a sort of abusive experience — and that was also my case, unfortunately. This is how we start. I like to say this mostly because sometimes we think that feminists become feminists because they had some illumination along the way, and started to be passionate about the topic out of nothing — and that's not true. We experience violence; our lives are embedded in violence. In my case it was also online — not just online; I had several experiences of it — and at that moment I was also doing a PhD, so I wanted to give a name to what I had experienced. I wanted to understand how many other women it was happening to, and this is how everything started. I began to collect data, I began to work with NGOs like Amnesty International in Italy, and we started to create this framework for online gender-based violence, which did not exist back in 2017. This slowly took me into a political engagement that was completely unexpected at the very beginning, and it grew exponentially over the years, until, at the end of 2018 and beginning of 2019, I launched this campaign — online at first, through a petition. The hashtag was #intimitàviolata, meaning "violated intimacy", and with it I was asking the Italian institutions to do something about the non-consensual dissemination of intimate images: to recognize it as a gendered problem, to recognize it as a violation of consent, and also to recognize the responsibility of all the digital platforms that were facilitating that content, not removing it, and not replying to survivors. The campaign started online, but it then became a media campaign and also took shape offline, through schools again and through events on the ground, and we started to work on a bill, because I began collaborating with a politician in Italy — until the law was presented and approved in July 2019. So it was a very quick campaign in the end: within six months we had the law approved. Unfortunately, it was not approved as I wanted it — the part on digital platforms was dropped, for example. Yet the law did recognize the gendered aspect, because it was introduced within the "Red Code" bill, which deals with gender-based violence. The last things I want to say about this campaign: first of all, it shows that we can always make a change if we want to — if we get angry for real, start collaborating with other people, and try to make ourselves understood, we can really make a difference. I truly believe that, and it is empowering, because sometimes we think that in front of technology and violence there is nothing we can do — but that is not true; that is what they want us to think. On the other hand, I think that also in this case the legal response is necessary but not sufficient. We need to do more, because this is a transnational problem, not just a national one. We need to work on the European side of it, because platforms are not national, they are global. So the work is not finished, you know, and there is much more we have to do — and that we are doing. I'm working with other survivors and activists around Europe and around the world, and hopefully this will go further and further until we get justice for real, because right now that is not happening — and we will insist until it is done.

Yeah, that sounds very motivating and inspiring — thank you, Silvia. I think on this positive note we'll draw this podcast to an end, although there are so many more questions I would like to ask you — probably next time, when we meet again, hopefully. So thank you so much, Silvia, for making yourself available and for sharing your expertise today, and the amazing work that you do.

Thank you — it was an amazing conversation. I'm very happy; thank you so much.

I would just like to take the opportunity to also share the outcomes of the menABLE project that we are currently working on, tackling gender-based violence. The menABLE toolbox was recently launched: it's an interactive game containing quizzes, dilemmas, and challenges that young people can play with, to foster dialogue on digital gender issues and gender stereotypes in the classroom or at home — together with their teachers, parents, and caregivers, but also by themselves. It can be found, along with the other menABLE initiatives, at menable.eu, so please go ahead and visit. You can also subscribe to the menABLE project's social media channels on TikTok and Instagram — and please subscribe to the European Schoolnet podcast channel as well, to follow the next episodes, which will deal with technology, innovation, and education topics. Thank you so much for listening to us today, and I wish you a beautiful day. Thank you.

[Music]

AI EthicsGender ViolenceOnline SafetyDigital CitizenshipCultural ImpactFeminism in TechDeepfakesAlgorithmic BiasSexual EducationYouth Empowerment