The annual conference of the American Indian Science and Engineering Society hosted a workshop in which students created metadata to train a photo recognition algorithm to understand an image's cultural significance.
The students tagged images, including ceremonial sage in a seashell and a 19th-century photograph of Native American children outside a boarding school, with words carrying indigenous connotations.
The researchers then compared the algorithm's responses to those generated by a major image recognition application.
Microsoft engineer Tracy Montieth said the commercial app performed poorly because it lacked appropriate training data, demonstrating that such data dictates the performance of artificial intelligence (AI) and, in this case, was biased against marginalized cultures.
Florida International University's W. Victor H. Yarlott said more accurate data makes AI systems more representative of human intelligence.
From The New York Times