If you’ve recently needed help troubleshooting an issue or have tried contacting customer service through a website, chances are you had the opportunity to talk with a chatbot.

Although artificial intelligence (AI) tools like chatbots are valuable for troubleshooting common issues, it’s important to be mindful of the information you share with any AI chatbot.

AI tools can be particularly useful for tailoring your resume and searching for jobs; however, experts say you should never input your personal information. Read: How to utilize AI and protect your personal information while job searching

During Data Privacy Week, University of Kentucky Information Technology Services (UK ITS) wants the UK community to safeguard their personal information when using AI technology.  

UK ITS Director of Privacy and Governance, Risk, and Compliance Michael P. Sheron said it’s important to keep your personal information private because once you share any information with an AI tool, there’s no option to delete it.

Sheron advises everyone to keep the following information secret when using chatbots.

Personally Identifiable Information. Never share information like your Social Security number, driver’s license or passport details. Hackers could misuse this data in a breach, leading to identity theft or financial fraud with long-term consequences.

Login Credentials. Never input usernames, passwords or any login information. Use password managers instead — they are safer and can create strong, unique passwords for you.

Contact Information. Avoid sharing your phone number, address or email address. Even if you think it might help the chatbot provide better answers, sharing this information increases the risk of phishing attempts or unauthorized access to your accounts.

Work-Related Data. Even the developers of AI chatbots often prohibit sharing work-related information with their platforms. Inputting sensitive data could unintentionally expose it to training datasets or other users. Always check your organization’s policies and avoid using AI for confidential projects.

Medical Information. Avoid sharing personal health details, including X-rays, scans or medical records. Consumer AI apps are not bound by healthcare privacy laws like HIPAA, meaning your data could be used in training datasets or accessed by others.

Intellectual Property. Protect your creative work and ideas by not sharing them with AI. Once uploaded, there’s a risk they could be exposed to others via breaches or public datasets.

Financial Information. Don’t share paystubs, bank details, investment accounts, or credit card information. While AI can offer general financial advice, it’s safer to consult a financial advisor for personal matters to avoid the risk of hacking or data misuse.

For more cybersecurity tips, news and alerts, sign up for ITS newsletters and follow us on social media.