One of my favourite movies of recent times is ‘The Imitation Game.’ Among the many questions it raises is one that maps, in a way, onto our current discussion of the ethical dilemmas autonomous systems face.
After the historic breaking of the Enigma, Alan Turing and his team design an algorithm to decide which vessels to save and which to sacrifice, based on a preset priority list. This was to ensure that the Nazis would not learn the code had been broken and therefore would not change it. Whatever the historical discrepancies in this retelling of the event that changed the course of the war, the question remains: who, if anyone at all, decides whom to save and whom not to?