When many internet-savvy people fall sick, their first instinct is to search the web for their symptoms. There is a running joke that services like WebMD and the Mayo Clinic will diagnose cancer or impending death for the slightest headache. With ChatGPT becoming a popular alternative to search engines, it’s no surprise that people want to use ChatGPT for self-diagnosis. Furthermore, ChatGPT recently passed the US Medical Licensing Exam.
ChatGPT is a natural language processing tool: an artificial intelligence chatbot that gives straightforward answers to almost any question. It has also passed law and business exams.
The AI tool has become the fastest-growing app in history, and its maker, OpenAI, has launched a subscription-based advanced version alongside the free one. Given how quickly the service is advancing, it’s natural to wonder whether the platform will become the latest self-diagnosis tool.
Is Dr ChatGPT the future of virtual healthcare?
Research shows that online self-diagnosis will always be part of human behaviour. However, it can cause anxiety and, more often than not, overwhelm patients with information, and relying on online platforms for a diagnosis can easily lead to medical errors.
ChatGPT isn’t like Google, a search engine that crawls countless websites and returns ranked results. Instead, the AI was trained by its developers on a limited body of information stored on its servers. It uses that stored knowledge to predict the next word of its reply after a user submits a prompt, which is how it manages such a high accuracy rate. It can also highlight its own shortcomings, admit mistakes or refuse to answer inappropriate questions.
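To make that idea of "predicting the next word" concrete, here is a deliberately tiny sketch in Python that does it with simple word-pair counts over a toy corpus. This is only an illustration of the principle: the real ChatGPT uses a large neural network trained on vast amounts of text, not counts like these, and nothing below reflects OpenAI’s actual implementation.

```python
from collections import Counter, defaultdict

# Toy training text standing in for the model's stored knowledge.
corpus = (
    "a headache can be caused by stress "
    "a headache can be caused by dehydration "
    "a headache can sometimes signal something serious"
).split()

# Count which word tends to follow each word (a simple bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word: str) -> str:
    """Return the most likely next word seen during 'training'."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

# Build a short reply one predicted word at a time, the way a chatbot extends a prompt.
reply = ["a"]
for _ in range(6):
    reply.append(predict_next(reply[-1]))
print(" ".join(reply))  # e.g. "a headache can be caused by stress"
```

The toy model can only ever repeat patterns it has already seen, which is also why the quality of a tool like ChatGPT depends entirely on the text it was trained on.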
Medical journals are testing the platform to see where it can be helpful and where it falls short. When you use Google to get a diagnosis, you are overwhelmed with possibilities and have to narrow them down yourself, usually by adding more specific and unusual symptoms, or you may end up talking to a medical website’s chatbot instead.
ChatGPT self-diagnosis, by contrast, can give a correct answer about 60% of the time and provides further explanations within seconds.
What ChatGPT means for medical publishing
Guidelines have been issued saying that ChatGPT and other AI tools can’t be listed as authors on a published study because they can’t carry accountability for the work. However, ChatGPT has written a paper arguing that it can replace a human medical writer for tasks such as writing study reports, preparing patient documents, and translating medical data.
Unfortunately, ChatGPT can fabricate data, which is a significant liability in self-diagnosis. AI is only as accurate as the information it has access to, and some conditions are surrounded by so much misinformation that ChatGPT can’t flag it. Furthermore, ChatGPT self-diagnosis can be inconclusive because the tool doesn’t show you how it arrives at its answers.
Some doctors believe AI will improve the medical field because it may empower patients and make it easier for doctors to communicate with them. However, ChatGPT can’t replace doctors.
Check out The Rise Of Artificial Intelligence: Dangers, Benefits, Realities And Opportunities Of AI.