12:35:22 From Kevin Moody : I thought the Final Four is April 3 and 5
12:35:49 From Makenna Girard (She/Her) : What time Weds evening?
12:48:47 From Michael Finch : I don’t like the term Silicon Valley “refugee”. The term refugee makes me think of people who have escaped terrible lives and war zones and things. What happens in Silicon Valley? “OH noooo I got a six-figure salary and got to shop at Whole Foods ooooooo”
12:49:04 From Dillon Shipley : ^
12:49:35 From Makenna Girard (She/Her) : shows very little global awareness or perspective
12:52:31 From Kevin Moody : Siri is male by default in the UK
12:52:48 From Makenna Girard (She/Her) : Would the women and people of color be listened to?
12:54:50 From Tristan Call : Having written a paper on defaults for ethics, I can confidently say people stick with most defaults the vast majority of the time
12:55:04 From Tristan Call : So the defaults have effects
13:00:13 From Mia Brasil : Apparently Siri only defaults to a male voice in four languages: Arabic, French, Dutch, and British English.
13:03:34 From Daniel Kar : I found an article about why AI assistant voices are female; it says that when Google first built “text to speech” technologies, it was much easier to work with higher-pitched female voices.
13:03:36 From Tristan Call : According to one source I looked up for my paper, researchers have recorded at least 467 different ways of saying 'there isn't a statistically significant difference, but we're going to heavily imply that there is an effect anyway because we want an interesting result.'
13:10:44 From Tristan Call : There's also the potential for ethical training to be used to excuse immoral behavior a company was going to do anyway. For instance, the corporate ethicist who is hired not to police the company's AI ethics, but to justify its AI policies
13:11:23 From Tristan Call : Using whatever ethical loopholes necessary
13:17:17 From Tristan Call : One aspect of this is that there are 'standard learning libraries' that are widely used because they are free. So if one of those is biased, literally every algorithm that uses it for data will encode that bias in itself.
13:18:15 From Nate Remcho : Arthur slaps… everybody that you meet has an original point of view
13:24:01 From Makenna Girard (She/Her) : They had their hand raised*
13:24:05 From Dillon Shipley : mb
13:27:24 From Fisher Ng : I feel like there are so many ways to solve a problem beyond technology. In some communities I am connected with around the world, like Papua New Guinea or Kiribati, people have learned to see the act of committing a crime as its own worst punishment, because it is both embarrassing and prevents them from becoming the best person they can be. When concepts are socialized through transforming culture, something like facial recognition for surveillance is unnecessary.
13:31:16 From Megan Rice (she/her) : & how people judge merit is variable
13:34:09 From Paloma Whitworth (she/her) : Systemic issues contribute to apparent ‘merit’ and ‘lack of merit.’ I think merit should be evaluated holistically rather than by previously defined metrics.
13:35:24 From Makenna Girard (She/Her) : Also interesting how people who have less access to “merit” achievements might not know how to conform their resume or personal statement to get through resume-screening algorithms, so it's another barrier to entry
13:36:07 From Sophia Whitworth : Also, workplace culture in STEM is not always very welcoming to women and minorities - this makes it really hard for some people in these groups to stay in a job
13:38:15 From Megan Rice (she/her) : Speeding is more ethical than going the speed limit. Other cars need to get out of my way