AI systems can be designed to analyze data and make predictions about future events or outcomes based on past patterns and trends. They can be particularly effective for prediction and planning in situations where a large amount of data is available and where the relationships between variables are complex and not easily understood by humans. These predictions can then inform planning and decision-making processes.
However, as Kelly Foster of Whose Knowledge?, one of the speakers, discussed, data politics are often biased against marginalized communities: information about them is not properly recorded or stored. This inevitably leads AI systems to reproduce that bias against the same groups, repeating the very patterns and trends that enforced discriminatory policies in the first place.
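To make the mechanism concrete, here is a minimal, purely hypothetical sketch (not from the conference itself): two groups with identical true outcomes, but one group is severely under-recorded in the data. Any model estimating per-group rates from such records will be far less reliable for the under-recorded group, even with no malicious intent in the algorithm.

```python
import random

random.seed(42)

# Hypothetical toy data: both groups have the same true approval rate (70%),
# but group "B" is under-recorded, mirroring the data gaps described above.
def make_records(group, n, approval_rate):
    return [(group, random.random() < approval_rate) for _ in range(n)]

records = make_records("A", 1000, 0.7) + make_records("B", 10, 0.7)

# A naive "model": estimate each group's approval rate straight from
# the observed records.
def observed_rate(group):
    outcomes = [approved for g, approved in records if g == group]
    return sum(outcomes) / len(outcomes)

# With 1000 records, the estimate for A will be close to the true 70%;
# with only 10 records, the estimate for B can swing wildly, so any
# decision threshold built on it treats group B unreliably.
print(f"A: rate {observed_rate('A'):.2f} from 1000 records")
print(f"B: rate {observed_rate('B'):.2f} from 10 records")
```

The point of the sketch is that the bias enters through the records, not the arithmetic: the same estimator applied to both groups yields a trustworthy answer only for the group the data system bothered to register.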
Read the full conference report here.