Stereotypes – whether of gender or race – reduce complex human beings to a few defining traits, effectively shaping and streamlining how we interact with one another. They are such a persistent feature of society that it is not surprising to find these stereotypes – gender stereotypes in particular – still perpetuated in the artificial intelligence bots produced over the years. These gender stereotypes play out in the roles the bots are assigned within an industry and in their overall personalities. Female bots typically perform administrative and secretarial roles such as helping to complete routine tasks, scheduling meetings and handling customer service. Male bots, on the other hand, often perform more analytical roles such as providing financial advice and paralegal services.
Recently, a growing number of companies have started to buck this trend by creating gender-neutral bots instead, sparking discussions in the tech industry about the necessity and consequences of assigning gender – and, with it, stereotypical traits – to bots in the first place.
In our increasingly inclusive society, where equality and diversity are celebrated and stereotypes are often disrupted, the tech industry now has the responsibility of addressing some very hard questions: Should bots be assigned a gender? Or should gender be taken out of the equation?
Gender stereotypes in bots: not a myth
According to a 2016 Maxus survey, although 56% of gendered bots are female, 100% of law bots and a majority of finance bots are male. Conversely, virtual personal assistants are usually female – look at Apple’s Siri, Amazon’s Alexa and Microsoft’s Cortana. They come armed with flirtatious personalities that simultaneously encourage and brush off the advances of curious users. Sure, tweaks have been made over the years (you can now choose from a range of languages, accents and genders for Siri, and Cortana is feistier at fending off unwanted advances), but most of these feminising qualities still ship as the default setting.
On the other hand, more “serious business” is assigned to male bots. According to the American Banker, Goldman Sachs is working on “Marcus”, a bank bot that will dispense financial loan advice – the latest in a string of analytical bots with male names introduced into the fintech and banking industry. Bots with female names are not entirely excluded from the industry, of course, but they are mostly found in customer service (e.g. handling information requests). In the legal industry, “Ross” is a lawyer bot built on IBM’s Watson and employed by firms such as Baker & Hostetler to ease the research workload often delegated to paralegals.
Bots are designed by people, and therefore reflect the disproportionate gender representation in the tech industry, where women remain a minority in development and design discussions. To disrupt gender stereotypes in the design of bots, all genders need to be part of the conversation.
Gender-neutral bots: the way of the future?
At Kasisto (a startup that builds artificial-intelligence software for banking), a team that included Jacqueline Feldman and was headed by self-proclaimed feminist Dror Oren decided that designing a gender-neutral banking bot, KAI, was the only logical answer. In a recent interview with Refinery29, Dror Oren, the co-founder and VP of Product at Kasisto, explained: “We wanted to do a genderless bot, not an assistant that was a continuation of what was already out there. As a company, we had a choice here. We could have KAI respond with flirts and funny jokes, or we could choose to answer with a funny joke, but not a flirt.”
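To make that design choice concrete, here is a minimal, purely hypothetical sketch (not Kasisto’s actual code) of how a bot might route flirtatious input to a neutral, humorous deflection rather than a gendered, flirty reply. The keyword list, phrases and function names are all illustrative assumptions.

```python
# Hypothetical sketch: route flirtatious messages to a neutral deflection
# instead of a flirty, gendered reply. All names and phrases are illustrative.

FLIRT_KEYWORDS = {"date", "marry", "love you", "cute", "single"}

NEUTRAL_DEFLECTIONS = [
    "I'm flattered, but I'm strictly business. Want to check a balance?",
    "Let's keep this professional -- I'm better with numbers than romance.",
]


def classify(message: str) -> str:
    """Very rough intent check: flag messages containing flirtatious phrases."""
    text = message.lower()
    if any(keyword in text for keyword in FLIRT_KEYWORDS):
        return "flirt"
    return "banking"


def respond(message: str) -> str:
    """Answer banking questions normally; deflect flirtation with a neutral joke."""
    if classify(message) == "flirt":
        # Deterministic pick keeps the example reproducible.
        return NEUTRAL_DEFLECTIONS[len(message) % len(NEUTRAL_DEFLECTIONS)]
    return "Sure -- which account would you like to look at?"


if __name__ == "__main__":
    print(respond("Are you single?"))          # deflected with a neutral joke
    print(respond("Show me my savings account"))  # handled as a banking request
```

In a real product this crude keyword check would be replaced by a trained intent classifier, but the design decision Oren describes – deflect rather than flirt, regardless of persona – lives in exactly this kind of response-routing layer.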
In this regard, bigger tech companies such as Apple, Google and Microsoft are lagging behind. Google’s “Google Assistant”, for example, is nominally genderless, yet it still speaks with a female voice by default – a feature its developers are reportedly looking to expand with additional options.
It is clear that the use of bots to complete routine tasks (banking, information-gathering, scheduling and so on) is a growing phenomenon that will continue to pervade people’s daily lives. Because people learn largely through modelling and representation, we now have a very real opportunity to shape society through the disruption or reinforcement of gender stereotypes in the way bots are designed and the ideologies they inevitably represent. It is up to us to take this challenge head on, or to allow the “community of bots” to be a direct reflection of the sexist flaws of our society and perpetuate the status quo.