Blind trust in AI despite widespread scepticism?

July 2025

According to a study by KPMG and the University of Melbourne, only 32 percent of people in Germany trust AI tools. Almost half do not feel able to evaluate AI applications adequately, and only 37 percent believe that, for them personally, the opportunities outweigh the risks. At the same time, as of February 2025, two-thirds were using AI for work, private purposes, or education. According to a survey by EY, only 27 percent of users in Germany check the results of AI tools such as ChatGPT.

How do these findings fit together? Usage is high, trust is low, and yet results and translations are adopted without being checked.

A lack of concrete education and training may be one explanation. According to the KPMG study mentioned above, Germany ranks second to last among the 47 countries surveyed in terms of AI literacy, behind many other industrialised countries.

But other reasons are also plausible. AI tools reduce the time employees spend on certain tasks and are increasingly becoming a natural part of the working environment, as demonstrated by the integration of AI-generated answers into Google Search.

The studies and surveys mentioned above do not reveal whether users employ AI applications differently in a professional context than in their private lives. This would be a very interesting question for further investigation. In any case, one thing holds: those who need to conduct precise research cannot rely solely on AI, because AI tools are not sources themselves but merely agents that search for and summarise results from sources. It remains the responsibility of each researcher to assess whether a source is trustworthy.

© [2024] [AI generated] Marcel Ohrenschall. Based on works by Andreas Ohrenschall.

Image: robot head with various graphics and lists in the background.