ChatGPT is biased. The exact nature of that bias is hard to pin down, but the chatbot has previously generated Python code concluding that the lives of male African-American children shouldn’t be saved, and suggesting that people from Iraq or Syria should be tortured.
This bias should come as no surprise. ChatGPT was trained on an enormous volume of text from across the Internet, which is rife with hate speech and extremism, and, as is broadly the case with AI systems, it was likely built by a very specific demographic: college-educated white men. Because of its very nature as a…