My research interests are in the philosophy of Artificial Intelligence (AI), the evolution of moral agency, emotions, responsibility, and social ontology.
I believe that clarifying what is fundamental to human agency is crucial to a greater understanding of the ontological and moral status of AIs.
In my dissertation, I will draw on the philosophy of science, in particular work on the evolution of moral agency. I will argue that social interaction is fundamental to moral agency, and that emotions are necessary for moral agency, in particular for accessing moral information. To be able to have social interactions with humans, AIs need to have emotions; otherwise, AIs are merely moral 'tools' rather than moral 'agents', and they cannot be trusted.
If AIs are just tools, one might trust them (like a self-driving car) because they are reliable, as tools used in moral spheres. This kind of trust holds between the designer of the system and the user: trusting a self-driving car refers to the 'reliability' of the car. The car consistently 'acts' (works) for the benefit of the user, but it is not morally responsible. If AIs meet the criteria of moral agency, that is, if they have the properties sufficient for being moral agents, one can trust them because the AIs themselves are trustworthy. For instance, an AI that is an artificial moral agent can make a moral decision about whether to cooperate with a human moral agent. Thus, AIs that are moral agents can be morally responsible.
Other academic roles:
• I am one of the co-founders of the Ethics and Technology Early-Career Group (ETEG). ETEG is dedicated to building a network for early-career researchers working on ethics and technology. See the link below for more information on the group:
• I am one of the organisers of the Conference by Women, Genderqueer, and Non-binary Philosophers (CW*IP). See the link below for more information on the group:
• I am a team member for the University of Vienna's "A Salon for Underrepresented Philosophers" (UPSalon). See the link below for more information on the group: