UK exam results debacle reveals a higher-level problem for Ireland

Algorithms are not the problem, inequality is

19 August 2020

Criticism of the Department of Education for the late release of Leaving Certificate results should be muted after the proposed model of teacher assessment and algorithmic validation failed so badly in the UK.

The short version of how the model was to work: teacher-submitted predicted grades, screened by a machine learning algorithm to create a spread of results at national level broadly reflecting year-on-year trends. This year's Leaving Certificate results would look like any other year's, though they would arrive significantly later.

However, across the Irish Sea the machine learning element had to be ditched entirely in favour of teacher-assessed grades after students were routinely downgraded. So what went wrong, and how?

The ‘what’ is that diligent pupils saw their prospects of getting into their chosen college demolished when their predicted grades were lowered. The ‘how’ is another example of AI acting as an identifier of prejudice and inequality, as grades were brought into line with historical trends. If private school A has a history of sending more students to third level than state school B, then the calculated results leaned towards preserving that trend rather than recognising the individual effort of each student, regardless of where they came from. It's a great way to preserve a class system. A meritocracy? Not so much.
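
To see how trend-matching can punish an individual, consider a minimal sketch of rank-based standardisation. This is not the actual Ofqual model; the schools, pupils, scores and historical grade shares below are all invented for illustration.

```python
# Hypothetical illustration, not the Ofqual algorithm: assign grades by rank
# within each school so the school's grade distribution matches its
# historical one. Every name and number here is invented.

def fit_to_history(pupils, historical_dist):
    """pupils: list of (name, teacher_predicted_score).
    historical_dist: list of (grade, share of pupils), best grade first,
    with shares summing to 1.0."""
    ranked = sorted(pupils, key=lambda p: p[1], reverse=True)
    n = len(ranked)
    results, i = [], 0
    for grade, share in historical_dist:
        take = round(share * n)
        for name, score in ranked[i:i + take]:
            results.append((name, score, grade))
        i += take
    for name, score in ranked[i:]:  # rounding remainder, if any
        results.append((name, score, historical_dist[-1][0]))
    return results

# Private school A has historically awarded plenty of top grades...
print(fit_to_history(
    [("A1", 90), ("A2", 85), ("A3", 70), ("A4", 60)],
    [("H1", 0.5), ("H2", 0.5)]))
# ...state school B has not, so its strongest pupil is pushed down to an H2
# despite the same predicted score of 90 as pupil A1, who keeps an H1.
print(fit_to_history(
    [("B1", 90), ("B2", 55), ("B3", 50), ("B4", 45)],
    [("H1", 0.0), ("H2", 0.25), ("H3", 0.75)]))
```

The strongest pupil at school B is capped by the school's history rather than by their own work, which is the pattern the UK results reproduced at national scale.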

This is not the first time we have seen the predictive power of AI laid bare. Amazon, for example, had to scrap a hiring tool that used machine learning and filtered out women candidates on the basis that most of its existing workforce was male. Another case of AI projecting negative trends from the past to the detriment of future success.

The BMJ cites a similar case in which black people had a lower incidence of melanoma but a higher mortality rate. A close inspection of the data revealed that, because most of the diagnostic images of melanoma showed white skin, AIs would struggle to identify the disease on darker skin. A restricted data set was leading to incorrect diagnoses.

Numbers don’t lie

The issue of poor data – either at collection level or viewed without context – has yet to be addressed. Dr Nicolai Baldin, CEO and founder of Synthesized, has nearly a decade of experience in AI. According to the former University of Cambridge researcher, there will be reluctance to try an algorithm-driven solution to any project of such scale for some time.

“The use of a similar algorithm again will be considered long and hard, but this situation has also sped up the likelihood of the introduction of legislation regulating their use,” he says. “This can only be a good thing, with the potential regulations focusing heavily on the issue of fairness.”

Baldin agrees that the issue is not the algorithm but the data fed into it. Expecting a machine to deliver a dispassionate result is acceptable only if the raw materials it is given are free of bias.

“To avoid flawed data causing flawed decisions, a different approach is needed and this is where synthesised data offers real hope. It is generated in a simulated environment and can be heavily stress-tested to meet specific needs or conditions that are not in existing original data that has imbalances or under-representation,” he says.
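
As a rough sketch of the rebalancing idea only (not Synthesized's product, and with invented numbers throughout), synthetic records can be drawn from a simple model fitted to an under-represented group until the training set is balanced:

```python
# Hypothetical sketch: fit per-feature Gaussians to an under-represented
# group and sample new records until both groups are equally represented.
# All feature values and group sizes are invented for illustration.

import random
from statistics import mean, stdev

def synthesise(rows, n_new):
    """Draw n_new synthetic rows from per-column Gaussians fitted to rows."""
    params = [(mean(col), stdev(col)) for col in zip(*rows)]
    return [tuple(random.gauss(m, s) for m, s in params) for _ in range(n_new)]

random.seed(0)
majority = [(random.gauss(60, 10), random.gauss(170, 8)) for _ in range(1000)]
minority = [(random.gauss(55, 9), random.gauss(160, 7)) for _ in range(50)]

# Top up the minority group before training so the imbalance in the
# original data is not learned as a rule.
minority += synthesise(minority, len(majority) - len(minority))
print(len(majority), len(minority))  # 1000 1000
```

Real tools do far more than this (preserving correlations between features, stress-testing against fairness constraints), but the principle of correcting under-representation before training is the same.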

September 7 is a long way off for Irish school leavers, but if the black boxes are ditched in favour of the professional opinions of our educators, it will be worth the wait.
