Howdy y’all! Welcome back to my blog. This week’s topic is a reflection on Node 1, part of which I explored through my last post.
Node 1 covered a fair amount of ground, including basic coding with dictionaries and the nature of algorithms. I was surprised at how broad some of these topics can be: I have long known that algorithms shape my interactions with media, but I have never dwelled on how complacent I am about it. I sit through five or more advertisements in a row when listening to Spotify, and while it irritates me, I take it. (And stubbornly don’t invest in Premium. Though is there a point to ‘rebelling’ against it?)
Maybe it’s because I got used to watching Charlie the Unicorn and almost every other video without ads growing up.

Because of this unit, we can now consider ourselves more digitally fluent: I can tell you what some functions are for and how they work (especially thanks to my basic Arduino background). The most interesting parts to me were when we dipped into digital citizenship, which requires ethical thinking. We got to try to answer the question: “What should we do with or about it?”
And while there is never one right answer, trying to answer these questions (emphasis on trying) has pushed me to become more digitally fluent. According to Douglas Rushkoff in his talk Program or Be Programmed, we can’t tell whether the bias inherent in a program comes from the media or from the programmer unless we understand the technology ourselves. Only with that understanding can both the making and the use of technology be intentional.
“The more humans become involved in their design, the more humanely inspired these tools will end up behaving.” – Douglas Rushkoff
I felt as if we could start to see more ‘humanity’ in our code with Code Practice #2. Obviously, the very act of quantifying traits can’t capture the whole human condition, but it is a start. I made sure to incorporate different values that represent various parts of myself: my general physicality, my needs, and my mindset.
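To give a flavor of what that kind of quantification looks like, here is a minimal sketch with a dictionary. The trait names and numbers are hypothetical stand-ins, not my actual practice code:

```python
# Hypothetical self-portrait as a dictionary: each trait is reduced
# to a number on a 1-10 scale (names and values are illustrative).
me = {
    "energy": 7,        # general physicality
    "sleep_needed": 8,  # needs
    "optimism": 6,      # mindset
}

# Even a simple average flattens three distinct parts of a person
# into a single "score" -- exactly the reduction described above.
score = sum(me.values()) / len(me)
print(score)  # 7.0
```

The moment you average those values, three very different parts of a person collapse into one number, which is both the power and the problem.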

In the podcast “Biased Algorithms, Biased World,” Cathy O’Neil calls the process of quantifying different aspects of something in order to measure it (e.g., success) a proxy. One of Google’s definitions of a proxy is “a figure that can be used to represent the value of something in a calculation.”
Sure, pretty straightforward. But the real key to that definition is the word represent. Your interests, demographics, and even your face are represented by values as part of a calculation. O’Neil further points out that no one can truly determine whether an algorithm is working: there is no standard for them. Now, I’m not trying to paint algorithms in a negative light. I just want to express how all these things I have taken for granted mean a little something more to me now.
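A proxy in code can be as simple as a weighted sum. In this sketch, “engagement” stands in for “interest” – and every name, number, and weight here is made up for illustration:

```python
# A proxy stands in for something we can't measure directly.
# Here, engagement metrics are used as a stand-in for "interest".
profile = {"clicks": 42, "minutes_watched": 130}

def interest_proxy(profile):
    # Arbitrary weights: the designer chose them, and -- as O'Neil
    # notes -- there is no external standard to check them against.
    return 0.3 * profile["clicks"] + 0.1 * profile["minutes_watched"]

print(interest_proxy(profile))  # roughly 25.6
```

Notice that nothing in the code says why clicks should count three times as much as minutes watched. That choice is the designer’s bias, baked right into the proxy.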
My favorite point from “Biased Algorithms, Biased World” is that the creator of an algorithm can define success in a way that is the opposite of how its targets would define it. This was probably the most thought-provoking part for me. Consider where that applies in your own life.
This can be applied to Season 4, Episode 4 of the TV show Black Mirror, titled “Hang the DJ”. Take a peek at my pseudocode from Code Practice #3:

This episode is about a program that tries to find one’s “perfect” match (with a 99.8% success rate!™), and this example focuses on two of Frank’s partners: Amy and Nicola. The AI, named Coach, collects data from its users’ interactions with each assigned partner. However, as Amy speculates, part of the system seems designed to wear its users down through perpetual re-pairing, to the point where they treat s*x as data too – just something to get over with. Of course, this contrasts with the interest of the users, which is to find someone who matches the values they already exhibit.
Coach may consider intimacy and humor to be prominent traits and assign them values like those seen in the practice. You can discern Frank’s ‘preferences’ (as determined by Coach) in this example.
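For fun, here is a toy version of how a matcher like Coach might score compatibility from trait values. The traits, numbers, and scoring formula are all hypothetical – this is not the show’s actual logic or my practice pseudocode:

```python
# Toy compatibility scoring: each person is a dictionary of traits.
# Names and values are invented for illustration.
frank = {"intimacy": 8, "humor": 9}
amy = {"intimacy": 7, "humor": 9}
nicola = {"intimacy": 3, "humor": 2}

def compatibility(a, b):
    # Smaller total trait distance -> higher compatibility score.
    distance = sum(abs(a[t] - b[t]) for t in a)
    return 1 / (1 + distance)

# Amy's traits sit closer to Frank's than Nicola's do, so she scores higher.
print(compatibility(frank, amy) > compatibility(frank, nicola))  # True
```

Even this tiny sketch shows the episode’s tension: the formula decides what “closer” means, and the users never get a vote on that definition.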
A fun little tidbit: we got to try StoryFace and its almost-parody of what it means to be authentic. I won’t spoil anything, but I got to keep trying to date Pam Beesly from The Office. See the results below (but try it for yourself).
Thanks for checking out my blog! Even if you didn’t learn something brand-new, I hope you will think critically about how algorithms present content in your everyday life, and the limits of information that can be discerned about your being.
Sincerely,
Sterling