
Stop using ChatGPT for these 11 things immediately.

When you know how to prompt, ChatGPT and other AI chatbots can be remarkably effective natural-language tools. ChatGPT can help you plan your weekly meal prep, save money on travel, or even change careers.

I'm a fan, but I'm also aware of ChatGPT's limits, and whether you're a novice or an expert, you should be too. Experimenting with new recipes, learning a foreign language, or planning a trip can be fun, but you shouldn't hand ChatGPT complete control over your life. It isn't good at everything; in fact, it can be downright unreliable at plenty of things.

ChatGPT doesn't always have the most recent information, and it sometimes presents hallucinated material as fact. It is supremely confident even when it's completely wrong. (The same goes for other generative AI tools, of course.)

That matters more as the stakes rise, such as when bank balances, court dates, medical bills, or taxes enter the conversation. If you're not sure when turning to ChatGPT might be risky, here are 11 situations where you should close the chatbot and choose a different option. Avoid using ChatGPT for any of the following.

(Disclosure: CNET’s parent company, Ziff Davis, sued ChatGPT creator OpenAI in April, alleging that OpenAI infringed Ziff Davis’ copyrights in training and operating its AI systems.)

1. Making a physical health diagnosis

Curiosity led me to feed my symptoms into ChatGPT, and the results can read like your worst nightmare. As you scroll through the possible diagnoses, you can go from the flu and dehydration to cancer. I told ChatGPT I had a lump on my chest, and it suggested I might have cancer. Fantastic! In fact, I have a lipoma, a benign growth that occurs in about one in every 1,000 people. That's what my licensed physician told me.

That's not to say ChatGPT has no useful health applications: it can help you build a symptom timeline, translate medical jargon, and draft questions for your next appointment so you arrive better prepared. That could ease some of the anxiety of going to the doctor. But AI can't order labs or examine you, and it doesn't carry malpractice insurance. Know its limits.

2. Attending to your emotional well-being

ChatGPT may be able to offer grounding techniques, but it can't help when you're facing serious mental health issues. I know some people use ChatGPT in place of a traditional therapist. CNET's Corin Cesaric found it mildly helpful for processing grief, as long as she kept its limits in mind. But as someone with a very real, very human therapist, I can tell you that ChatGPT is at best a pale imitation and at worst genuinely dangerous.

ChatGPT has no real empathy, no lived experience, and no ability to read your tone or body language; all it can do is mimic them. A licensed therapist operates under professional codes and legal obligations designed to protect you from harm. ChatGPT doesn't. Its advice can misfire, miss warning signs, or inadvertently reinforce biases baked into its training data. Leave the deeper work, the hard, messy, human work, to a human trained to handle it. If you or someone you love is in crisis, please dial 988 in the US or your local hotline.

3. Making snap judgments about safety

If your carbon monoxide alarm starts beeping, please don't open ChatGPT and ask it whether you're really in danger. Go outside first and ask questions later. Large language models can't smell gas, detect smoke, or dispatch an emergency crew. In a crisis, every second you spend typing is a second you're not evacuating or dialing 911. ChatGPT can only work with the scraps of information you feed it, and in an emergency that can be too little, too late. Treat your chatbot as a post-incident explainer, never a first responder.

4. Obtaining customized tax or financial planning

ChatGPT can explain what an ETF is, but it doesn't know your debt-to-income ratio, state tax bracket, filing status, deductions, retirement goals, or risk tolerance. And because its training data may not cover the current tax year or the most recent rate changes, its advice could already be stale when you hit enter.

Several of my friends dump their 1099 totals into ChatGPT for a do-it-yourself return. But a chatbot simply can't replace a CPA who can catch a mistake that might cost you thousands or spot a hidden deduction worth a few hundred dollars. When real money, filing deadlines, and IRS penalties are on the line, call a professional, not AI. Also remember that anything you share with an AI chatbot, including your income, Social Security number, and bank routing information, will most likely end up as training data.

5. Handling regulated or private information

As a tech journalist, I get embargoes in my inbox daily, but I've never considered pasting any of those press releases into ChatGPT for a summary or clarification. If I did, the text would leave my control and land on a third-party server, outside the bounds of my nondisclosure agreement.

The same risk applies to client contracts, medical charts, and anything covered by the GDPR, HIPAA, the California Consumer Privacy Act, or plain old trade-secret law. That includes your passport, driver's license, birth certificate, and tax returns. Once sensitive data enters the prompt window, you can't be sure where it's stored, who can review it internally, or whether it will be used to train future models. ChatGPT is also exposed to security flaws and hackers. Don't type anything into ChatGPT that you wouldn't post in a public Slack channel.

6. Engaging in any unlawful activity

This one should go without saying.

7. Plagiarism in academic assignments

I'd be lying if I claimed I'd never cheated on a test. In high school, I used my first-generation iPod Touch to sneak a peek at a few tricky equations I couldn't memorize for AP calculus, and I'm not proud of it. But the scale of modern AI-assisted cheating makes that look almost quaint.

Professors can already hear the “ChatGPT voice” from a mile away, and Turnitin and similar detectors get better at flagging AI-generated text every semester (thanks for ruining my beloved em dash). The risks of suspension, expulsion, and revoked licenses are real. ChatGPT works best as a study partner, not a ghostwriter. Besides, if you have ChatGPT do the work for you, you're only cheating yourself out of an education.

8. Information tracking and breaking news

Since OpenAI launched ChatGPT Search in late 2024 (and opened it to everyone in February 2025), the chatbot can pull up fresh web pages, stock quotes, gas prices, sports scores, and other real-time numbers, complete with clickable citations so you can verify the source. But it won't stream continuous updates on its own. Every refresh takes a new prompt, so when speed matters, live data feeds, official press releases, news sites, push alerts, and streaming coverage remain your best options.

9. Gambling

I'd never recommend ChatGPT for gambling, even though I did once hit a three-way parlay during the NCAA men's basketball tournament with its help. I've watched ChatGPT hallucinate and serve up false information on win-loss records, player stats, and injury reports. I cashed out only because I double-checked every claim against real-time odds, and even then I got lucky. ChatGPT can't see tomorrow's box score, so don't rely on it to win you anything.

10. Creating a will or other enforceable legal agreement

ChatGPT is great at explaining basic concepts. If you're curious about a revocable living trust, ask away. But the moment you ask it to draft actual legal language, you're rolling the dice. Estate and family law rules vary by state and sometimes by county, and omitting a witness signature or the notarization clause can get your entire document thrown out. Let ChatGPT help you build a list of questions for your lawyer, then pay that lawyer to turn those questions into a document that stands up in court.

11. Creating artwork

I don't believe AI should be used to make art. That's my personal opinion, not an objective truth, and I'm by no means anti-AI. I use ChatGPT to brainstorm fresh ideas and help with my headlines, but that's a supplement, not a substitute. Use ChatGPT all you like, just don't use it to make art that you then pass off as your own. That's a little gross.

Categories: Tech
Author: Abu Bakar