Five things you should never share with ChatGPT if you want to protect your privacy

People turn to ChatGPT for all sorts of things – couples therapy, help writing a professional email, turning pictures of their dogs into humans – leaving the artificial intelligence platform privy to some deeply personal information.

And apparently, there are some specific things you should never share with the chatbot.

When you type something into a chatbot, “you lose possession of it,” Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told The Wall Street Journal.

There are ways to protect your privacy when using an AI chatbot. Getty Images

“Please don’t share any sensitive information in your conversations,” OpenAI writes on its website, while Google urges Gemini users not to “… enter confidential information or any data you wouldn’t want a reviewer to see.”

On that note, here are the five things you should never tell ChatGPT or any other AI chatbot.

ChatGPT has a “temporary chat” mode that operates much like an internet browser’s incognito mode. CFOTO/Future Publishing via Getty Images

Identity information

Do not disclose any identifying information to ChatGPT. Information such as your Social Security number, driver’s license and passport numbers, as well as your date of birth, address and phone number, should never be shared.

Some chatbots work to redact this information, but it is safer not to share it at all.
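You can also redact documents yourself before pasting anything into a chatbot. A minimal sketch in Python – the patterns and placeholder labels here are illustrative assumptions, not any chatbot’s actual filter, and no pattern list replaces a careful manual read of the document:

```python
import re

# Hypothetical redaction rules: pattern -> placeholder label.
# These are rough illustrations and will miss many real-world formats.
REDACTIONS = {
    r"\b\d{3}-\d{2}-\d{4}\b": "[SSN]",                 # US Social Security numbers
    r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b": "[PHONE]",     # 10-digit phone numbers
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b": "[EMAIL]",         # email addresses
}

def redact(text: str) -> str:
    """Replace anything matching the patterns above with its placeholder."""
    for pattern, placeholder in REDACTIONS.items():
        text = re.sub(pattern, placeholder, text)
    return text

print(redact("Call 555-867-5309 or email jane.doe@example.com, SSN 123-45-6789."))
```

Running this locally scrubs the obvious identifiers before the text ever leaves your machine, which is the point: anything the chatbot never sees cannot be retained.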

“We want our models to learn about the world, not private individuals, and we actively minimize the collection of personal information,” an OpenAI spokeswoman told the WSJ.

Medical results

While the health care industry values patient confidentiality to protect personal information and guard against discrimination, chatbots are not typically covered by those confidentiality protections.

If you feel the need to ask ChatGPT to interpret lab work or other medical results, King suggested cropping or editing the document before uploading it, keeping it “just to the test results.”

There are five specific things you should never tell ChatGPT or another AI chatbot. REUTERS/Dado Ruvic/Illustration

Financial accounts

Never reveal your bank and investment account numbers. This information could be hacked and used to monitor or access your funds.

Login information

It may seem as though there is reason to give a chatbot your account usernames and passwords, given the rise of AI agents able to perform useful tasks, but these agents are not vaults and do not keep account credentials secure. It is a better idea to keep that information in a password manager.

Proprietary corporate information

If you are using ChatGPT or other chatbots for work – such as drafting emails or editing documents – there is the possibility of mistakenly exposing client data or non-public trade secrets, the WSJ said.

Some companies subscribe to an enterprise version of AI or have their own custom programs with safeguards in place to protect against these issues.

If you still want to get personal with an AI chatbot, there are ways to protect your privacy. According to the WSJ, your account should be protected with a strong password and multifactor authentication.

Privacy-conscious users should delete every conversation after it is finished, Jason Clinton, Anthropic’s chief information security officer, told the outlet, adding that companies typically purge “deleted” data permanently after 30 days.

