Emily Meyers

A Digital History Portfolio

Class Experiences: Clio

Module 8: Ethics, Biases, and Diversity in a Digital World

I will say from the start that this is a topic in which I have a deep interest, but this was a week when I had little mental motivation, so I may fold some of this into my post next week. All I can say is that mid-semester catches up with you at some point!

In academia, there has long been a discussion of how historians and archives save only what they deem important, forgetting the parts of a culture they did not understand. This erasure, and lack of understanding, happens in the digital world as well. From social media to academia, content creators who are people of color tend to have a much harder time sharing their work or getting it noticed. There is an ongoing conversation about how this may stem from algorithms built with the inherent biases of their creators, whether white privilege or male privilege. Sharon Block investigates this further in “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories,” where she points to faults even in the Library of Congress classification system. As Block explains, the issue is not a recent one; it simply migrated onto the internet as Google and other companies designed their classification systems similarly. One example she cites highlights the issue quite well: Jeffrey Daniels noted in 2015 that an Ex Libris discovery tool returned sexist results, as “a search on stress in the workplace returned only a Wikipedia article on ‘Women in the workforce’, implying that women and stress were the same thing.” Robyn Caplan and others have also written “Algorithmic Accountability: A Primer” to trace the issue of algorithms back to its roots and explain everything from the basics. It thankfully starts by explaining what algorithms are and how they function before moving on to more complex pieces of the issue.
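The feedback loop these readings describe can be sketched in a few lines of code. This is a hypothetical toy model, not any real platform’s ranking system: it only shows how a ranker that sorts purely by past engagement turns a small initial head start into a large, compounding one, which is one mechanism by which bias against marginalized creators can snowball.

```python
# Toy sketch (purely illustrative, not any platform's real algorithm):
# a ranker that orders creators by accumulated engagement, where
# higher-ranked slots receive most of the new exposure each round.

def rank(creators):
    """Order creators by accumulated engagement, highest first."""
    return sorted(creators, key=lambda c: c["engagement"], reverse=True)

def run_rounds(creators, rounds=10):
    for _ in range(rounds):
        ranked = rank(creators)
        # Crude exposure model: position 1 gets 100 new views,
        # position 2 gets 50, position 3 gets 33, and so on.
        for position, creator in enumerate(ranked):
            creator["engagement"] += 100 // (position + 1)
    return rank(creators)

creators = [
    {"name": "A", "engagement": 105},  # starts with a tiny head start
    {"name": "B", "engagement": 100},
]
final = run_rounds(creators)
gap = final[0]["engagement"] - final[1]["engagement"]
print(final[0]["name"], gap)  # the initial 5-point gap has grown to 505
```

Nothing in the code mentions race or gender, yet whoever starts ahead stays ahead and pulls further away every round; if the initial disparity comes from biased data or biased audiences, the "neutral" ranker faithfully amplifies it.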

When diving into the issue on a personal research level, I found YouTuber Khadija Mbowe, who makes video essays on her channel about many different racial issues. Her video titled “Algorithms & skin tone bias (colorism), to be dark on the internet/‘breadtube’” does a great job of breaking down how algorithms affect creators. She mostly focuses on social media, but that still affects people and their research. Another creator, Tee Noir, works in a different style of video essay. I enjoy her essays because they focus more on how stereotyping a group can affect that group as a whole for a long time. In a video titled “TikTok vs Black Creators: If You Hate Us Just Say Dat,” she talks about how the algorithms can be changed at any time to avoid paying minority creators equally, which lessens views for those groups. This issue is just another form of racism and capitalism, and Tee offers a good theory about why: minority creators often don’t have the money to produce the highest-quality work, so exposure goes to the majority that already has that money anyway.

This is where I feel comfortable ending this rant and will most likely bring this back up in the next post.


5 comments

  1. Hi Emily. I really liked the video. I think it is very important to raise awareness about how YouTube as an industry works, and how algorithms impact the outcomes of creators from minorities. Usually we tend to naturalise our relationships with content platforms, and part of the conversation about ethics, biases, and diversity in a digital world is about remaining attentive. It would be interesting to find out how the same happens on Netflix, Spotify, and Disney+. It may even be a problem in e-commerce in general: do vendors from marginalized communities get fewer views because of an algorithm that inadvertently privileges some groups over others on those platforms?

  2. Hi Emily! Really enjoyed reading your post. You make a very valid point regarding exposure in social media for marginalized groups. As someone who is into content creation, I see it from a personal standpoint, where minorities get shadowbanned by algorithms across all social media platforms. I find it troubling that these algorithms are also very non-inclusive in academic tools like JSTOR. I think it’s a problem that will continue to affect research until new regulations are put in place.

  3. I enjoyed reading your post and thank you for sharing these videos. It is really troubling to see how these systems of oppression have been reproduced in the digital world and, as you point out, affect those inside academia and out. It is nice that more people are starting to become aware of these issues, and if I put on my optimistic cap, maybe that means we are on the slow road to change. But I cannot help but wonder whether whatever comes after, be it a solution or a new system working towards equality and equity on these platforms, will be better, or whether it will be more invisible and harder to fight against.

  4. With companies taking pointers from one another by designing similar algorithms, the transmission of unintentional (or possibly intentional) racism spreads like wildfire. Advancing ideas without weighing the full reasoning and moral responsibility of each possible outcome leads to methods and software that become popular while spreading unintended harm to marginalized communities. This is why digital historians must approach ideas from every possible angle; the digitization and spread of unintended racism through a project or software can lead to major repercussions down the line!

  5. Your point about the TikTok algorithm only further emphasizes in my mind the need for some sort of oversight on these algorithmic search databases. That was certainly one of my key takeaways from our readings this week, and your example here elevates this point even more. All of the potential good that can come from an algorithm being used for a search engine or social media is completely eliminated if the people who create these algorithms are not held accountable and if there is no transparency into what criteria they used to create their algorithm.
