The project “Good Governance and Compliance: AI to support good corporate governance” is part of the “Use Case Economy” within the Manchot Research Group. Four subprojects examine the possibilities for, and conditions of, applying AI in companies.
The first subproject uses AI both to identify various activities within a company and to forecast business-critical corporate events such as insolvencies, takeovers, or patent applications. The focus is on developing and optimizing the project’s own algorithms.
The second subproject, by contrast, examines the use of existing AI algorithms in human resource management. One goal is to identify measures that improve employees’ acceptance of a company’s use of AI. In particular, the subproject examines factors such as transparency regarding the use of AI, participation in AI-based decision-making processes, and improved opportunities for digital participation, as well as their effect on employees’ acceptance of AI in human resource management.
Finally, two further subprojects investigate cross-cutting issues that affect all of the research group’s use cases. Their goal is to understand the gap between the provision of AI as a decision-support technology and its actual use, a gap for which validated empirical findings are still lacking. One subproject takes a behavioral approach, the other a neuropsychological one. Extensive experimental studies (behavioral economics) and neuropsychological measurement procedures (neuroeconomics) are intended to identify the mechanisms underlying the acceptance or rejection of AI. The studies on AI acceptance and how to enhance it focus on the comprehensibility, traceability, and explainability of decisions made by AI. Both subprojects aim to provide concrete recommendations for designing the interaction between users and AI, helping to realize AI’s considerable potential across various business areas.