A ReEnTrust student is conducting a research study on how explanations can improve fairness and transparency in job sourcing platforms from the applicant's perspective.
For the past two decades, researchers have debated how to balance transparency against accuracy in recommender systems. Given the high stakes of e-recruitment, some papers argue that accuracy should be sacrificed to ensure job seekers fully understand how an algorithm produces its recommendations, on the grounds that this makes it easier for both job seekers and system designers to recognise unfair results.
This student project therefore compares four different types of explanation, to find out whether they achieve the goal of allowing job seekers to identify and contest biases, and which type of explanation is most effective at doing so.
So far the project has received responses from 100 survey participants and 5 interview participants, with some exciting new findings. Watch this space for further updates!