Martin-Luther-Universität Halle-Wittenberg

Photo: Alexander Bondarenko (Uni Halle / Markus Scholz)

Media contact

Tom Leonhardt
Science editor
Telephone: +49 345 55-21438

Contact for this press release

Alexander Bondarenko
Institute of Computer Science at MLU
Telephone: +49 345 55-24779

When it comes to health information, search engines often get it wrong

Number 141/2021 from November 1, 2021
Google and the Russian search engine Yandex are not necessarily reliable sources of health-related information. Often the small text snippets that appear as previews of search results contain inaccurate or insufficient information. Information on home remedies and so-called alternative treatments is particularly problematic, according to researchers from the Martin Luther University Halle-Wittenberg (MLU) in Germany and the Ural Federal University in Russia. The scientists are therefore advocating for clearer warnings about possible health risks.

As a starting point, the German-Russian team used an archive of around 1.5 billion queries submitted to Yandex, the most popular search engine in Russia. Using medical terms from ICD-10, the World Health Organization's International Classification of Diseases, and Wikidata as knowledge bases, the scientists identified 1.2 million queries containing symptoms, diseases and so-called alternative treatment options. In those queries, people searched for around 4,400 different diseases and symptoms and 1,000 medicinally used plants or other home remedies. "Usually people were looking for information about private, everyday matters like pregnancy or sexually transmitted diseases. In general, treatments for acne or cellulite were more popular than treatment options for cancer," says Alexander Bondarenko, a computer scientist and part of the team at MLU. Most queries phrased as questions fell into one of two categories: either people wanted to know whether a particular remedy helps against a specific disease, or they were looking for specific instructions on how to use a certain remedy to treat a disease. "The latter assumes that people already believe that the remedy works, even though most of the time there is absolutely no evidence," explains Dr Pavel Braslavski, a senior researcher and lecturer at the Ural Federal University.
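To illustrate the kind of filtering and classification described above, here is a minimal sketch in Python. It is not the authors' actual pipeline: the term lists stand in for the ICD-10 and Wikidata vocabularies, and the question cues are hypothetical placeholders.

```python
# Illustrative sketch only: the study matched ~1.5 billion Yandex queries
# against ICD-10 terms and Wikidata entries; the vocabularies and question
# cues below are hypothetical stand-ins.

# Hypothetical vocabularies (stand-ins for ICD-10 / Wikidata term lists)
DISEASE_TERMS = {"acne", "cellulite", "pregnancy", "cancer"}
REMEDY_TERMS = {"chamomile", "baking soda", "aloe vera"}

# Simple cue words for the two question types described in the press release
EFFICACY_CUES = ("does", "can", "is", "will")     # "Does X help against Y?"
HOWTO_CUES = ("how to", "how do i", "how much")   # "How do I use X to treat Y?"


def is_health_query(query: str) -> bool:
    """Keep a query if it mentions both a remedy and a disease/symptom term."""
    q = query.lower()
    return any(t in q for t in REMEDY_TERMS) and any(t in q for t in DISEASE_TERMS)


def classify_question(query: str) -> str:
    """Sort a question query into 'efficacy', 'how-to', or 'other'."""
    q = query.lower().strip()
    if q.startswith(HOWTO_CUES):
        return "how-to"    # presumes the remedy works, asks for instructions
    if q.startswith(EFFICACY_CUES):
        return "efficacy"  # asks whether the remedy helps at all
    return "other"


if __name__ == "__main__":
    queries = [
        "does chamomile help against acne",
        "how to use baking soda for cellulite",
        "weather in halle",
    ]
    for q in queries:
        if is_health_query(q):
            print(q, "->", classify_question(q))
```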

Next, the team checked how Yandex and Google responded to the 30 most frequently asked questions, analysing the first ten result snippets for each question. Snippets are short segments of text that a search engine displays as a preview of a search result. The team examined whether the snippets answered the questions accurately and whether they contained warnings about possible health risks. To establish the evidence, the team's medical professional searched the study databases "Cochrane", "PubMed" and "BioMed Explorer" for all the diseases and proposed remedies.
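As a rough illustration of how such per-snippet judgements could be aggregated into the figures reported below, consider the following sketch. The data structure and field names are assumptions for illustration, not the study's actual annotation code.

```python
# Hypothetical aggregation of snippet annotations, mirroring the evaluation
# described above: top-10 snippets per question, each judged for whether it
# confirms an unsupported claim and whether it warns about health risks.

from dataclasses import dataclass


@dataclass
class SnippetJudgement:
    engine: str                   # "Yandex" or "Google"
    confirms_unsupported: bool    # snippet claims the remedy works despite no evidence
    has_warning: bool             # snippet warns about possible health risks


def summarise(judgements: list[SnippetJudgement], engine: str) -> tuple[float, float]:
    """Return (share confirming unsupported claims, share containing warnings)."""
    subset = [j for j in judgements if j.engine == engine]
    n = len(subset)
    if n == 0:
        return 0.0, 0.0
    confirm_rate = sum(j.confirms_unsupported for j in subset) / n
    warning_rate = sum(j.has_warning for j in subset) / n
    return confirm_rate, warning_rate


if __name__ == "__main__":
    # Toy data: two judged snippets per engine
    data = [
        SnippetJudgement("Yandex", True, False),
        SnippetJudgement("Yandex", False, True),
        SnippetJudgement("Google", True, False),
        SnippetJudgement("Google", False, False),
    ]
    for engine in ("Yandex", "Google"):
        c, w = summarise(data, engine)
        print(f"{engine}: {c:.0%} confirm unsupported claims, {w:.0%} carry warnings")
```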

In 44 per cent of cases, Yandex falsely stated that a specific remedy worked against a certain disease even though there was no scientific evidence to support this. For Google, this occurred in about a third of all cases. Moreover, the team found warnings about potentially toxic substances in only 13 per cent (Yandex) and 10 per cent (Google) of the cases. "The information given in the snippets tends to confirm existing biases or misbeliefs and far too rarely provides adequate warnings about possible health risks," says Bondarenko. According to him, this is particularly problematic since previous studies have shown that people tend to believe in the healing powers of certain remedies even though, more often than not, there is no scientific evidence for them. The researchers therefore argue that search engine results for medical questions should come with clearer warnings about possible health risks.

The team will present the study at the "30th ACM International Conference on Information and Knowledge Management (CIKM 2021)", which will take place online, November 1 - 5, 2021. 

The study was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) and Russia’s Ministry of Science and Higher Education. 

Study: Bondarenko A. et al. Misbeliefs and Biases in Health-Related Searches. Proceedings of the 30th ACM International Conference on Information and Knowledge Management (2021). doi: 10.1145/3459637.3482141

 
