When we try to mechanize our society, build robots, and program artificial intelligence as intelligently as possible, all in a way that humans will accept, the focus is primarily on scientific and psychological research. However, it should not be forgotten that people's trust in safe coexistence is not strengthened by a trustworthy appearance of the technology alone, but also by the knowledge that "everything is already somehow settled".
Rules and laws are primarily intended to regulate and facilitate the harmonious coexistence of a large community. Above all, they often protect those who find themselves in the supposedly weaker position.
When we speak of hybrid societies (societies composed of humans and embodied digital technologies – EDTs), it is no longer just humans interacting with each other, but EDTs with humans, or everyone with everyone. The question we legal experts ask ourselves in these scenarios is: Can existing legislation be applied unchanged to these new situations? After all, one thing remains certain: even in these new encounter situations, humans want a predetermined structure, the certainty that everything will work, and the trust that the situation is somehow regulated. With regard to our technical "fellow human beings", too, it is a matter of dealing with each other responsibly and trustingly: laws provide security, strengthen trust in the other party, and ensure that people treat each other responsibly.
A simple but illustrative example of the legal problem: In a sales contract, the seller is obligated to hand over the purchased item to the buyer, and the buyer has to pay in return. As a rule, these transactions are carried out by humans. What happens if a smart refrigerator notices that the orange juice has run out and therefore orders new juice on its own? Under our legal system, only a human can actually make a declaration of purchase (more precisely: the declaration has to be traceable to a human being) – so does the owner of the refrigerator have to accept and pay for that orange juice? A layperson may now think: "Well, obviously, who else?" That is entirely understandable; after all, the refrigerator cannot pay for itself, and you can hardly call on the programmer of the refrigerator's algorithm to pay instead. But what if, due to an error, the refrigerator orders 50 bottles of juice instead of one?
This simple example is already the starting point for legal research: can a smart kitchen appliance make a legally binding declaration on behalf of its owner? For that to work, the law – tailored to humans – would have to be transferable to the device. Is this possible as it stands, or does it require a reinterpretation or adaptation of certain legal terms, or perhaps even an entirely new regulation?
Yet here we are only dealing with a simple (merely intelligent) kitchen appliance. In the case of independently acting entities, which we address in our subproject, such a purely abstract legal view is not always sufficient. The point is that rights and rules are designed for human beings. For this reason, in subproject E02 we investigate the construct of responsibility from two angles. From a psychological point of view, we first have to ask which characteristics define responsibility at all. Only then, in a second step, can we ask whether the laws are also applicable to entities that may be morally capable of acting only to a limited extent, or whether there must be new rules and, above all, a different view of the construct of responsibility for technologized "fellow humans".
With the assurance that legislation takes into account the findings of science and the needs of users revealed by psychological studies, confidence in responsible coexistence between humans and EDTs will be strengthened from the outset.
Thus, with a clear conscience: legal research creates trust.
Find out more on the topic of responsibility and EDTs in our podcast.