This is the third conversation of the 100+ Conversations to Inspire Our New Direction (#OKFN100) project.

Starting January 2023, we are meeting 100+ people to discuss the future of open knowledge, shaped by a diverse set of visions from artists, activists, academics, archivists, thinkers, policymakers, data scientists, educators and community leaders from all over the world. 

The Open Knowledge Foundation’s team wants to identify and debate issues that are sensitive to our movement, and use this effort to constantly shape our actions and business strategies so that we deliver, in the best possible way, what the community expects from us and from our Network, a pioneering organization that has been defining the standards of the open movement for two decades.

Another objective is to include the perspectives of people from diverse backgrounds, especially those from marginalised communities, from dissident identities, and whose geographic location is outside the world’s major financial powers.

How can openness speed up and strengthen the fight against the complex challenges of our times? This is the key question behind conversations like the one you can read below.

*

Prepared especially for International Women’s Day, today’s conversation is with Thais Ruiz de Alda, founder and executive director of DigitalFems in Barcelona, Spain.

Thais is a lawyer with doctoral studies in Political Theory and Public Policies who has led digital teams and projects since 1999, in the private and public sectors as well as at universities such as the Universitat Oberta de Catalunya. Since she founded DigitalFems, her focus has been on giving visibility to women in the world of technology through public policies, programmes, training, and product development. One of Thais’ most impactful projects is GenderDataLab.org, an open-source data laboratory implementing CKAN with a gender perspective.

We spoke to Thais a week ago, shortly before she flew to New York City to attend UN Women’s sixty-seventh session of the Commission on the Status of Women (#CSW67). The priority theme this year is “innovation and technological change, and education in the digital age for achieving gender equality and the empowerment of all women and girls” – that’s the topic that guided our conversation.

With each conversation, more people are joining the project. This time, we had the pleasure of being joined by Carol Matos, OKFN’s Partnerships Lead based in Brazil, Shashi Gharti, OKFN’s Software Developer based in Nepal, and Lucas Pretti, OKFN’s Communications Lead based in Spain.

We hope you enjoy the read.

*

Thais Ruiz de Alda: Can I introduce myself before we start?

Lucas Pretti: Well, we have researched a lot about you. But of course! Please feel free to do it in your own words.

Thais Ruiz de Alda: Great! I’m an internet utopist. I have believed in the Republic of the Internet since someone explained to me what the TCP/IP protocol was, back in the last century, I think around 1998 or 1999. I believe in the digital commons and that we have a responsibility to build the internet that should have been built. Now we actually have two tasks: to struggle against those who do not want the internet to be a free space, and to get things done.

Lucas Pretti: Super. I totally related to that.

Carol Matos: Yeah, getting things done is really a match with the Open Knowledge Foundation. So you’re good here. 

Lucas Pretti: So here’s a first provocative question to kickstart. How would you describe technology from a feminist perspective?

Thais Ruiz de Alda: Yesterday I was in Barcelona in a meeting with a very interesting group of people. They were Palestinian female developers, and we were talking about this precise question: what is technology under a feminist perspective? 

To me, the purpose of technology is to help us achieve justice, gender equality, and impact in a good sense. Feminist technology is a tool built towards a gender equality goal. This means that technology can be designed taking into account what gender equality is, and not just what the technology should do. Today most technology is actually generating the opposite effect: it is amplifying gender gaps. We need to stop this, rethink, and take action. We need to tell technologists that tech is not a white man’s “brogrammer” thing.

Technology is a power structure, and as with all power structures, there’s a need to limit the ownership of such power with regulations. Today technology is like the Wild West, a private infrastructure, and no one is putting limits on it. We still don’t know how to translate the offline rules to the online world. That’s why we need policymakers to believe in us activists and in feminist technologies that can enable women and other genders to achieve equality, and not the opposite.

Carol Matos: Women are the main caregivers in our society, and we recently suffered a great disadvantage during the pandemic because of the need for social distancing and the urgent digital transformation. In that context, how do you see women’s political participation in government spheres, and in society as a whole, and its relationship to digital technology? Do you think digital technologies make it more inclusive? What are the barriers that women face, and how can they be overcome?

Thais Ruiz de Alda: We at DigitalFems recently released a report about that. The full data and the survey answers are published in a CKAN repository we run called Gender Data Lab. We asked this big question to hundreds of women developers in Spain: what are the barriers you are finding in the tech industry? The main answer is about the “brogrammer” culture, which is very toxic. It generates a bias issue that takes different forms. One of the most present is performance bias. When a woman technologist goes through a yearly review, she often has the chaotic feeling of not being evaluated on what she has done and the potential she has, but the other way around: she is evaluated on what she has not achieved. This implicit bias generates a contradictory urge to step back a little, and the imposter syndrome comes out. Instead of feeling empowered, this male corporate culture makes many of them feel they are not good enough. Sometimes even the reviewer feels the same thing. And that happens because the situation is toxic and the context is not favourable. This was a common pattern in the report. More than 50% of women developers were not promoted at the same pace as male developers. So this is one big barrier: if the tech industry is not inclusive, how are we going to build inclusive technology? It’s a chicken-and-egg situation.

On the other hand, we at DigitalFems also work with the public sector. In one specific city hall, we worked with the gender department to build a law, a very specific affirmative political action framework for women in tech. What we need is a big agreement around such a framework involving the public sector, academia, civil society organisations, and the private sector.

Carol Matos: What about a broader context beyond the tech industry? How did women adapt to being digitally available during the pandemic as caregivers? What are the effects today? Do you think it led to more inclusion?

Thais Ruiz de Alda: The pandemic was somehow useful to make very explicit, globally, what was going on with care work. During the lockdown, women went through the same thing in every household, every apartment, and every street of every single city in the world. Suddenly there were data facts, because everyone was in the same situation. Before that, it was more difficult to make the care problem visible. But then, how do we stop and change this?

We are currently working with an academic group called Dimmons (Digital Commons Research Group) from the Universitat Oberta de Catalunya (UOC) on planning a datathon for the 17th of March, within the VIII Congress of Feminist Economy. Some of the guiding questions are: what is the role of women participating in the economy and in politics? Why is caring not a value taken into account to explain gender equality? Why is reproductive work not considered an economic indicator? Those are the big questions for me. Participation in the economy and in politics needs to be reframed, taking into account other ways of understanding our societies.

Carol Matos: Yes, I agree that there was a positive side, because now there’s data to prove it, as you said. A lot of women started working from home, and for a caregiver, say a working mom, being available in the digital environment is good. But do you think this can inform governments? Do you think there’s a change happening there? If you look at the open movement, there are three big organisations with women CEOs right now (Open Knowledge Foundation, Creative Commons and the Wikimedia Foundation). Building on my previous question, how do you see our open movement helping the rest of the world acknowledge the issues we are facing as women?

Thais Ruiz de Alda: I think that the open movement is a big ally because you are able to impact big audiences. I mean, you’re important on the internet. So this is key. Here’s an example of such importance. In Catalonia there is a local Wikimedia association, which belongs to the global Wikimedia movement, where there are strong arguments about how to name professional women in the Catalan version of Wikipedia – with the masculine used as the neutral gender, i.e., the feminine form is currently not accepted in the biographies of women lawyers, doctors, etc. Of course, in English and other languages this problem does not exist, but in languages such as Spanish or Catalan it does. In Spanish we say “doctores” and “doctoras” for male and female doctors, “jueces” and “juezas” for male and female judges. Feminists in the open movement are struggling to change the male point of view about how professionals should be named.

With that, I’m trying to argue that even in the open movement there’s an issue of representation, and that’s a shame. It’s definitely a political statement to see women in CEO positions, as you highlighted, but the rate is still very low, especially in some tech frameworks. It is still difficult to find women among open source developers, members of boards, or speakers at big conferences. There has been a big effort from some of the open source movements to bring more diversity, but there are still big issues that have not been resolved – code of conduct and sexual harassment are examples of that. And it becomes a vicious circle again. 

Last week I read about a girl who suffered sexual harassment at a very big Python conference in the US. At least the open communities around the world made a big echo about this, something that wouldn’t have happened ten years ago. But it also shows that we still have a lot to achieve, and a lot of things to be done.

Shashi Gharti: Do you think this bias against women in the tech industry is intentional? Is it a situation of “this is our space and you can’t come”?

Thais Ruiz de Alda: I’m a person of good faith, so I don’t think it is intentional. We all have biases in understanding the world; we need to acknowledge and accept it. If you are not a white male developer in the tech industry, you need to know that you will face bias – if you don’t accept that, it will be harder to confront and overcome the situation. Today there is a huge global perception that women developers are not as good as men developers. Such a perception is really global and happens the same way in Silicon Valley, China, India, Brazil, Spain, and everywhere. Our role as the open movement is to generate tools to help people identify their own biases. Once this has been accepted, people in the tech industry will more likely understand and think: “Before I say anything, I need to ask myself if this is biased or not”.

This is like learning a new framework for doing things. In the end, it’s also a personal choice. A humble attitude would be the first requirement to start asking questions. “What can I do to improve? What can I do to change my behaviour? What can I read? Where can I learn?” Then it’s important to realise that many voices, links, banners and campaigns around us are repeating the same message.

In the case of women, the imposter syndrome does not help, as I said before. We step back when we see that our work is not being appreciated. So women also need to learn how to be empowered. At DigitalFems we design free workshops for women developers to help them learn how to feel empowered in negotiations. This is a skill that requires practice and someone training us. And that means having resources, time, and money.

That’s my message to big corporations. We can design and develop public policies, but it will be useless if big corporations are not involved or don’t understand. The process becomes long and painful. Instead of the policymaking path, there’s also a shortcut, which is exposing companies that are doing pinkwashing. They need to stop. In such cases, we do gender assessments for free so they can understand the problems behind it.

Lucas Pretti: I have a question for Shashi. Do you have experience working in the tech industry as a developer? Maybe you could add your personal perspective to this conversation.

Shashi Gharti: I do. And I can relate to the imposter syndrome. Every time I give a presentation I have to prove each and every word that I say. I don’t think my male counterparts have to face that kind of questioning. But, in my case, everything I say about technology I have to prove and reiterate: “I’ve worked in this area and I’m knowledgeable about it, etc”. When that happens, I usually get frustrated and step back. Then the imposter syndrome comes and I think: “Maybe I don’t know all the things that I should know”. Then I talk less, and so on. Yes, I have been through that.

Carol Matos: This is also a reality in academia and other fields. But with the growth of the digital sphere and the popularisation of remote work, I believe that there’s an opportunity. Being behind the screen is good for avoiding male-heavy environments, as we get to put ourselves out there in other ways. So it’s really important to change things at the moment when we are thinking about new products or platforms, and make sure they are really inclusive. I think Thais touched on the right point, which is talking to corporations directly. Because if they don’t change, we can change the policy and still nothing’s gonna happen.

For example, nowadays there are algorithms based on gender deciding what we want to see and what information a woman should be getting. This is almost like consenting to something you never agreed to, and it relates a lot to the feminist cause, because we don’t get to choose how women are projected into the digital world. If you are a woman, it’s very likely that you are going to get ads for makeup and other things that maybe you’re not even interested in. That takes me to the discussions about body image, which is a huge issue on the internet. I think we need to talk about this.

Thais Ruiz de Alda: Yes, I totally agree. I’ll take this opportunity to make something public: my daughter is going through an eating disorder. She is one of the thousands of teenage girls who, during and since the pandemic, have been exposed to shitty content. I can’t understand how social media platforms and algorithms don’t understand how harmful this is, and why there has been no limit on making money on top of that.

DigitalFems belongs to some interesting networks of women tech activists. In one of the working groups, Autodefensa.online, we are having this huge conversation. How can we litigate and regulate the content issue directly? During the 80s, the struggles led to limits on the amount and type of content that appeared on TV. The pop culture of today is social media – so why is there no interest today in limiting content on the platforms teenagers use? The only possibility today is to flag specific pieces of content, but nothing else. It’s useless. You can report and say whatever you want, but the advertisements will still be there.

“Ad tech” has destroyed the internet as we know it – that is my point of view. Two decades ago, I collaborated with an association trying to establish standards and limits for advertising on the internet, and we failed massively. Looking at this over time, I can say it was an epic fail! People working for that association ended up stepping back because of the strength of Big Tech companies. At that time, it was like the Pareto principle: 80% of the total amount of money circulating on the internet was managed by 20% of the companies. Today we would probably have to reduce this percentage even more, maybe to 5% of the companies. This is one of the big topics that feminists need to take action on strongly.

I know that United Nations working groups are starting to take this seriously, but it’s still not enough. The pandemic happened almost three years ago, and my daughter still receives these ads in her timeline. Do I need to go to court? There’s a legal mechanism in Spain that allows people to start litigation collectively. Maybe this is the only way. Many mothers like me are suffering, and this is a matter of public health, with public infrastructure being used to deal with this problem. We can prove it with data.

To me, this is gender violence. Full stop. 

Carol Matos: It’s very harmful, yes. And it’s getting exponentially more harmful to women with AI applications now. The longer we wait to act on this, the harder it will be to revert it afterwards. This is the time to talk about this.

Thais Ruiz de Alda: Exactly! But we need to say it loudly and clearly: “Big Tech companies are generating this cost to public health”. This is gender-based violence. Not to mention all the other violent situations that women are facing on social platforms. The political agenda needs to be reviewed, and regulatory authorities and the force of the law should be applied more often.

These days I was discussing with a woman tech developer how to improve dating apps. And here is a very simple example: where is the feature allowing users to remove photos of dicks or other explicit content as profile pictures? I’m sure that such a thing would exist if there were women in positions of power. So it’s not only a discussion about representation but also about giving women the power to decide. And here we get back to the beginning of the conversation, on the importance of having more women developers in the industry. Otherwise, it’s more of the same.

Lucas Pretti: What you are saying is very connected to the concept of platformisation and user-generated content, which are the business and legal models widely adopted on the internet for the last 15 years. This concept is now being challenged in jurisdictions like the US, with the recent Gonzalez v. Google case, for example. Do you think it can create a precedent so that we stop dealing with platforms as if they were stores, and start seeing them as publishing companies?

And a second question related to that: do you think the word “open” is worn out? These two words, “open” and “platform”, were extensively used together, and now it seems to me that there was a process of “openwashing” in the last decade, led by Big Tech players. Do you agree?

Thais Ruiz de Alda: When I first read the news about the Gonzalez v. Google case, my first reaction was anger. I can’t believe that questioning the responsibility of platforms has reached the headlines only because of terrorism. Really? All that again: security, state police, and military principles are the drivers of the conversation about what the internet should be. This is an epic fail too. But anyway, it’s good we are having this conversation. Let’s see what the court says. It would be very positive if this jurisprudence started to lead us towards a new framework for digital platforms.

That takes me to your next question. Being open was the big thing about the internet! That was the revolutionary component. But making this profitable has been an epic fail too. Maybe a different direction for the internet today would be monetising different values – for example, we could pay $10 to make sure our data is ours, that no one is spying on our accounts, etc. Another alternative could be a public social fund for the open internet, guaranteeing that data is not being extracted the way it is today.

I don’t know if there was washing of the concept of openness, but it’s a very interesting question, as it makes us discuss what our understanding of an open internet is. To me, the open internet is a space in which human rights are respected, and that enables connections between people and knowledge to generate social good – not shitty surveillance, spying or making money. How can we regulate the internet in a way that there are clear authorities and limits, and democracy is a value? That would make the internet a public common good, and not a private infrastructure to make people rich. We will never be strong if we do not limit the way money can be made on the internet. To me, that’s the point.

Carol Matos: This whole North American model that digital platforms adopted as a pattern leverages freedom, but without accountability. And that’s the exact opposite of the open movement’s values. We are free and open, but also accountable. That’s what’s missing in the policies. It’s a different understanding of freedom.

Shashi Gharti: I’d like to follow up on something you said before, but changing the subject to artificial intelligence (AI). My son is 12 years old and I have a niece who is 14, and I do not allow them to use mobile phones because of the ads. I’m really worried about what kind of videos they would watch or the content they would see. All the data extracted by big platforms is now being used to develop AI applications. What is needed to create some regulations and policies? What is right and wrong?

How should a separate body operate, with policies to prevent companies from spreading content that is not good for society?

Thais Ruiz de Alda: I think it’s related to what Carol has just said. Having a body like this means being accountable.

DigitalFems is also connected to a group of organisations in Spain under an initiative called Around AI. That’s exactly what we are discussing there: which public bodies are going to supervise all the AI that is being created? Who is going to be accountable for what, and what would be the consequences of not doing things correctly? Who’s making money out of girls who are trying to be slimmer or look prettier because of AI filters? Today there is no public body supervising even algorithms and their effectiveness.

I think there are two sides to this story. On one side, the accountable, open, and democratic movements trying to legislate and regulate. On the other, the super neoliberal Big Tech corporations extracting dollars from the Wild West internet. This needs to end. I’m a lawyer, so of course I think this would end with norms. Norms and regulations are a social need under democracy.