Aurora Turek

  • Section 1
  • Student

Activity Feed

On April 15, 2020, Aurora Turek commented on Video Interviews with AI: Better for gender equality? Or worst? :

Thanks for sharing this article! I can definitely understand your hesitation to rely on video interviews instead of traditional live interviews, and I think we should be careful to evaluate how and when they should be used.

I think the interview experience is generally stressful, whether in person or over video, so having to account for nerves isn't an issue unique to the video interview process. Someone might be really nervous in person as well, and could still be negatively evaluated for that by their interviewer. If an algorithm can be trained to ignore someone's gender, perhaps it could also be trained to look for facial/speech cues of nervousness and take them into consideration in the evaluation. So maybe algorithms could be even better for individuals who get really nervous during interviews!
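To make the "trained to ignore someone's gender" idea concrete, here's a minimal sketch of one common approach: simply excluding protected attributes from the features a scoring model is allowed to see. The candidate data, weights, and attribute names are all made up for illustration (and in practice excluding an attribute doesn't fully remove its influence, since other features can correlate with it):

```python
# Hypothetical example: score candidates only on features the model
# is allowed to use, stripping protected attributes first.

PROTECTED = {"gender", "age"}  # attributes excluded from scoring

def strip_protected(features):
    """Return a copy of the feature dict without protected attributes."""
    return {k: v for k, v in features.items() if k not in PROTECTED}

def score(features, weights):
    """Simple weighted-sum score over the remaining features."""
    return sum(weights.get(k, 0) * v for k, v in features.items())

# Made-up candidate and weights, purely for illustration.
candidate = {"gender": 1, "experience_years": 4, "test_score": 0.8}
weights = {"experience_years": 0.5, "test_score": 2.0}

cleaned = strip_protected(candidate)
print(score(cleaned, weights))  # gender never enters the score
```

The same pattern could in principle extend to nervousness: rather than dropping a "nervousness" feature, an evaluator could deliberately down-weight it so anxious candidates aren't penalized.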

On April 15, 2020, Aurora Turek commented on Locked in by Algorithms? :

Very interesting post, Paula! There’s an economics professor from the University of Chicago, Jens Ludwig, who studies this very topic. He recently gave a guest lecture in one of my classes and he discussed how biases can be baked into algorithms. However, despite their potential to replicate the same biases as humans, he’s still a proponent of using algorithms because they are more transparent than human decisions and thus, can help us to reduce discrimination. Here’s a link to an article about it if you’re interested: https://review.chicagobooth.edu/economics/2019/article/how-making-algorithms-transparent-can-promote-equity

Considering this perspective, I definitely support algorithmic assessment along with you! I think it’s better than relying solely on human decisions, which we don’t have much insight into. By using algorithms, we can at least understand how decisions are made and adjust for systematic biases baked into them, while we also work to develop better algorithms that don’t exhibit bias to begin with.
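As a toy illustration of the "understand how decisions are made and adjust for systematic biases" point, a transparent algorithm's outputs can be audited directly, for example by comparing approval rates across groups. The decision records below are entirely made up, and a real audit would use a proper fairness metric rather than a raw rate difference:

```python
# Hedged sketch with fabricated data: auditing an algorithm's decisions
# for a systematic gap in approval rates between two groups.

def approval_rate(decisions, group):
    """Fraction of applicants in `group` who were approved."""
    members = [d for d in decisions if d["group"] == group]
    return sum(d["approved"] for d in members) / len(members)

decisions = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

gap = approval_rate(decisions, "A") - approval_rate(decisions, "B")
print(round(gap, 3))  # a nonzero gap flags the algorithm for review
```

This kind of audit is exactly what's hard to run on opaque human judgments, which is Ludwig's point about transparency.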

On April 15, 2020, Aurora Turek commented on The Ethics of People Analytics :

Very interesting article!

I couldn’t agree with you more on your stance on the use of algorithms – it definitely seems like a mistake to think that algorithms could never replace human intuition in moral decision making when we know so much about how human decision-making can be flawed. Your example of the judicial hearings makes apparent how even important human decisions can be influenced by outside factors that aren’t related to the decision at hand. Algorithms could definitely be leveraged to aid in these decisions.

On the point about transparency and trust, I also agree that the two are not necessarily synonymous. However, I do think that being transparent is the first step in gaining the trust of employees. Not only do organizations need to let their employees know their data is being tracked, but they should also be clear about the analyses and outcomes the data is being used for. It might also be helpful to repeatedly assure employees that their data isn’t being used for any other purposes, that it’s truly anonymous, and that it won’t be used to single out any employees. By being transparent about the entire process and how the data will be used, I do think that it’s possible for organizations to collect data from their employees and still retain their trust.