In celebration of Pride Month, more and more tech companies are showing support for the LGBTQ community, reminding us that there are important social issues technology can help solve. To get a grasp of the role tech can play in improving social inclusion, The Recursive talked to Andra Bria, founder of Craft Product School and co-organizer of the Feminist Futures Hackathon.
Social impact has been a topic close to Andra's heart since early in her journey. During her Master's degree at the Political Science Faculty in Romania, she studied Equality Policies, diving into topics such as gender equality, feminist studies, human rights, gender mainstreaming, and social exclusion.
In parallel, Andra has developed as a digital product designer and technology aficionado. In 2017, she put on the entrepreneur's hat, launching MeetReplika, an AI-augmented conversational interface that could automate the process of business introductions. She then took on the challenge of leading a team to create a new tech product for the education sector during a three-month challenge at UnternehmerTUM in Munich.
Andra's passion for inclusive technology is best seen in her entrepreneurial work. In 2019, she founded the Craft Product School to help people create more ethical and meaningful technology products through design courses. Here, technology becomes an activist tool. Recently, they co-organized the first edition of the Feminist Futures Hackathon together with Feminist Design Lab. The event hosted inspirational discussions and a three-day hackathon aimed at finding tech solutions against systemic inequalities and oppression of all kinds.
In this interview, Andra sheds more light on the blind spots pervasive in tech and business that hinder equality, as well as how to use technology's potential to promote inclusivity.
The Recursive: How inclusive is the technology sector in 2021?
Andra Bria: In 2021, the technology space is still very far from being an inclusive space, especially for women and marginalized groups: people of color, LGBTQIA+, people of different races, ethnicities, and abilities.
The virtual environment can capture, reproduce and multiply many of the inequalities and discriminations that we encounter in real space.
That's why it's important that the builders of technology represent the diversity of the real world, and that they are aware of the blind spots that can occur in the development of their products. We can encounter some of these blind spots, for example, in machine-learning-powered tools like Google Translate, when the data used to train the ML model reinforces a cultural bias. The tool then makes gender assumptions – for example, that pilots are men and flight attendants are women.
Another example is when the data used to train machine learning models does not represent the diversity of the customer base: web cameras used to track user movements may only work well for white users because the initial training data excluded other skin tones.
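One practical first step against this kind of blind spot is a simple representation check on the training data before a model is built. The sketch below is purely illustrative (the function name, labels, and 10% threshold are assumptions for the example, not from any specific tool): it flags demographic groups whose share of the dataset falls below a minimum.

```python
from collections import Counter

def representation_report(samples, threshold=0.1):
    """Flag groups whose share of the training data falls below `threshold`.

    `samples` is a list of group labels (e.g. self-reported skin tone)
    attached to each training example; the labels here are illustrative.
    """
    counts = Counter(samples)
    total = sum(counts.values())
    return {
        group: {
            "share": round(n / total, 3),
            "underrepresented": n / total < threshold,
        }
        for group, n in counts.items()
    }

# A skewed toy dataset: 80% "light", 15% "medium", 5% "dark".
report = representation_report(
    ["light"] * 80 + ["medium"] * 15 + ["dark"] * 5, threshold=0.1
)
```

A low share for a group – as with "dark" in this toy dataset – is a signal to collect more data before shipping a model trained on it.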
What is missing in the ecosystem that leads to inequality of opportunity for vulnerable groups?
We need gender mainstreaming education alongside computer science education. We need more diverse builders of technology. And we need to acknowledge our biases and assess our blind spots.
In the virtual space, we face a new type of "evil": cyberstalking, cyberharassment, online dating violence, deepfakes, non-consensual pornography. Why is it harder to be a woman or a minority of any kind (non-binary, LGBTQIA+, BIPOC, etc.) in the online environment, and why do men need to take that seriously?
Technology is a mirror of society because it highlights all its divisions. The new question is how can we use technology’s potential to create safety mechanisms in these cases for users?
For this, we need to act both in terms of product and service design and in terms of leadership and strategy. We need to think more about what we ask from technology, ask it to solve better problems, and empower people to think critically about data usage, harmful content, and inclusivity.
How can we better use technology’s potential to promote inclusivity?
First, by critically looking at what’s missing, whose concerns are not being articulated, whose interests aren’t being represented, and whose truths aren’t being told. Secondly, by using technology to tell future stories of equality and inclusion.
Yes, AI can help spread discrimination and racist content (see Microsoft's Tay chatbot), but a properly managed AI can also correct and filter out harmful language and improve the quality of public discourse. Algorithms can be used for good as well; it depends on us what we teach them.
During Craft Product School's course, Designing for Equality, participants learn how to develop products, services, and experiences with equality and inclusion as first principles – in the professional space, the domestic space, or the digital space.
Some of the products developed during Design for Equality include a Chrome extension that replaces harmful language on the Internet, a Slack chatbot that detects sexist language in conversations and proposes a replacement, and an Alexa skill that warns when men are taking up too much space in work meetings.
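At the core of tools like these is a mapping from flagged terms to more inclusive alternatives. Here is a minimal sketch of the idea (the word list is a toy example chosen for illustration; real products rely on curated, context-aware lexicons rather than blind substitution):

```python
import re

# Illustrative mapping; a production tool would use a much richer,
# context-aware lexicon and let users accept or reject each suggestion.
NEUTRAL_TERMS = {
    "chairman": "chairperson",
    "manpower": "workforce",
    "guys": "everyone",
}

def suggest_inclusive(text):
    """Replace flagged terms with neutral alternatives, leaving the rest intact."""
    pattern = re.compile(
        r"\b(" + "|".join(NEUTRAL_TERMS) + r")\b", re.IGNORECASE
    )
    return pattern.sub(lambda m: NEUTRAL_TERMS[m.group(0).lower()], text)

print(suggest_inclusive("Hey guys, the chairman needs more manpower."))
# prints "Hey everyone, the chairperson needs more workforce."
```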
At the same time, words matter and are incredibly powerful when it comes to creating the best possible user experience in a digital product or service. Whenever we are designing forms, we have to pay attention to the language we use in their description and microcopy.
Offering inclusive gender options (trans, bi-gender, non-binary) and asking which pronouns your users prefer (she, he, they) is one way to start with inclusivity. Facebook offers 58 options when it comes to gender identity. Letting users pick as many labels as they want is also desirable. You can also offer an open-ended field to allow for self-expression, and a "Prefer not to say" option.
Be transparent, explain why exactly you are asking, and how it will benefit your users. Reassure them that your company strives to be inclusive of everyone so they can feel welcome and protected while disclosing their information.
If you need gender information for market segmentation, ask yourself first whether your audience’s attitudes are really different based on gender.
The same applies when asking about ethnicity. The most commonly used options are White, Hispanic, Asian, and Black/African American. When requesting ethnicity, ask the question in a mindful way ("With which racial or ethnic group(s) do you most identify?"), allow multiple selections, and offer the option of not picking any of the labels provided.
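These guidelines translate directly into how a form field can be specified. The sketch below is an illustrative encoding (the field names and option lists are assumptions for the example, not a prescribed standard) of multi-select labels, an open-ended field, and a decline option:

```python
# Illustrative schema for an inclusive demographics form.
GENDER_FIELD = {
    "label": "How do you describe your gender identity?",
    "multi_select": True,  # let users pick as many labels as they want
    "options": ["Woman", "Man", "Non-binary", "Trans", "Bi-gender"],
    "open_ended": True,    # free-text field for self-expression
    "allow_decline": "Prefer not to say",
}

ETHNICITY_FIELD = {
    "label": "With which racial or ethnic group(s) do you most identify?",
    "multi_select": True,
    "options": ["White", "Hispanic", "Asian", "Black/African American"],
    "open_ended": True,
    "allow_decline": "Prefer not to say",
}

def validate_response(field, selected):
    """Check that every selected label is a listed option or the decline choice."""
    allowed = set(field["options"]) | {field["allow_decline"]}
    return all(choice in allowed for choice in selected)
```

Note that any combination of labels is valid, including declining entirely; the open-ended field is accepted as-is, so users are never forced into a predefined box.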
What can we expect from you in the future?
Alongside Design for Equality and Design for Degrowth, Craft Product School offers a No-Code course for beginners and people who want to enter the technology space, in order to make it more inclusive of people from different backgrounds and education levels. The school will be expanding its range of product courses soon, so keep an eye on them and register.