Submitted: 25 Aug 2025
Revision: 01 Sep 2025
Accepted: 14 Sep 2025
ePublished: 14 Sep 2025
Depiction of Health. In press.
doi: 10.34172/doh.2025.17

AI in Health Information Management

Editorial

AI-Chatbots as an Alternative for Humans in Interviews of Qualitative Studies

Vahideh Zarea Gavgani 1*

1 Tabriz Health Services Management Research Center, Tabriz University of Medical Sciences, Tabriz, Iran
*Corresponding Author: Email: vgavgani@gmail.com

Abstract

Today, artificial intelligence (AI)-based research assistants are used in various stages of qualitative studies, including methodology, data collection, group interviews, writing, editing, and qualitative data analysis (1). However, it seems that chatbots can also be used as a data source in human-computer interaction (2).

One of the key elements in qualitative research is reaching theoretical saturation: the stage at which data collection yields no new information and the researcher considers further interviewing unnecessary (3). Perhaps at this stage, conversations with chatbots could serve as a complementary, or even alternative, data source for qualitative interviews. Obviously, all considerations related to inclusion and exclusion criteria, such as the interviewee's prior experiences and cultural background, which are very important in an interview, must still be observed. AI may be able to draw on diverse data from a wide range of sources to produce conceptually rich, relevant material and offer new perspectives.
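As a minimal illustration of the stopping rule behind theoretical saturation, the logic can be sketched in code. This is a hypothetical toy example, not part of the editorial's method: it assumes each interview round (with a human or a chatbot) has already been reduced to a set of thematic codes, and stops once no new codes have emerged for a given number of consecutive rounds.

```python
# Sketch of a theoretical-saturation stopping rule (illustrative only).
# Assumes each interview round yields a set of thematic codes; in a real
# study these would be derived by coding the interview transcripts.

def reached_saturation(rounds, window=2):
    """Return the 1-based round index at which no new codes have
    appeared for `window` consecutive rounds, or None if never."""
    seen = set()
    stale = 0
    for i, codes in enumerate(rounds, start=1):
        new = set(codes) - seen
        if new:
            seen |= new
            stale = 0
        else:
            stale += 1
            if stale >= window:
                return i
    return None

# Toy data: codes extracted from five successive interview rounds.
rounds = [
    {"cost", "access"},
    {"access", "privacy"},
    {"privacy"},          # no new code
    {"cost"},             # no new code -> saturation at round 4
    {"trust"},
]
print(reached_saturation(rounds))  # → 4
```

The sketch also makes the editorial's caveat concrete: a chatbot can keep generating responses indefinitely, so "no new codes" from a chatbot need not mean the phenomenon itself has been saturated.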

Research integrity must still be respected, although not necessarily in the same way as in human studies. For a chatbot, we cannot define and verify specific inclusion criteria such as real work experience in an organization, years of lived experience with a disease, or the cultural and ideological background of a participant. Consequently, interviewing chatbots may not yield genuine theoretical saturation and may produce incomplete and artificial results. Thus, beyond transparency in research and data collection, it is also necessary to define a framework for the ethical and correct use of chatbots in place of humans in interviews.

This editorial highlights a new perspective on the use of AI-based chatbots in qualitative research, where the chatbot serves as a data source rather than as an analyst, methodologist, or assistant writer.

Although AI provides opportunities for qualitative research, it also poses challenges that reviewers and authors should be aware of until the necessary technology matures. Some of the opportunities and challenges of using AI chatbots in qualitative research include the following:

The use of AI and recommender systems in qualitative interviews helps reduce the cost and time of research, creates a sense of security and greater comfort for the interviewee, and allows them to share information without worry or bias (4), which deepens the data. AI chatbots trained for specific purposes can also be used to reach people who are geographically remote or groups that are otherwise hard to access.

However, the challenges ahead must also be recognized and addressed with appropriate policies. Among the most important is the depth of human feelings and emotions as they arise in specific situations, which machines cannot yet capture. Informed consent, information security, and privacy are likewise serious ethical challenges when chatbots stand in for humans (5). Finally, chatbots are subject to a variety of errors arising not from human error but from the data available to the AI, from language limitations when translating data into the researcher's language, and, in countries like Iran where access to and use of IP addresses from other countries are restricted, from unavoidable technological constraints.

Therefore, journal editors and authors should be cautious when using chatbots for various purposes, including as a substitute for, or complement to, human interviews as a source of data collection in qualitative studies.
