Chatbots are gaining traction and are increasingly being exposed to consumers. Whether you’re communicating with your favorite brands on Facebook Messenger or any other messaging app, it is likely that you are going to come across a chatbot. Chatbots are clever pieces of technology that can answer basic customer queries – such as HR questions – or pass the conversation to a human who can assist further.
Unsurprisingly, chatbots are privy to a lot of sensitive consumer information and, if that data is mishandled or stored insecurely, they pose a serious risk. Because chatbots can be handed SSNs, phone numbers, addresses, and billing information, it is vital that any organization using them takes the necessary steps to secure chatbot data and protect user privacy. Naturally, many consumers are careless with the information they give to chatbots and will often disclose far more than they need to.
Not all chatbots are the same, however: different chatbots serve different functions, have different vulnerabilities, and collect different pieces of information. When implementing a chatbot, it is important that you understand exactly how it works, which data points it analyzes, how this information can be accessed, and by whom.
#1: Control Data & Who Can View It
Not only does an IT department need to be prepared for unexpected data to appear in a chatbot’s storage system, but it also needs a way to filter this information out from the relevant data and dispose of it regularly. Many enterprise-level chatbot solutions can distinguish between different types of data and prevent sensitive items from being stored alongside the information you actually need.
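As a rough illustration of that filtering step, here is a minimal sketch of redacting unexpected sensitive values before a message is stored. The regex patterns and the `redact_pii` function are illustrative assumptions – real PII detection in enterprise products is considerably more sophisticated than a pair of regular expressions.

```python
import re

# Illustrative patterns for data the chatbot should not retain;
# real-world PII detection goes well beyond simple regexes.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact_pii(message: str) -> str:
    """Replace unexpected sensitive values before the message is stored."""
    for label, pattern in PII_PATTERNS.items():
        message = pattern.sub(f"[REDACTED {label.upper()}]", message)
    return message

print(redact_pii("My SSN is 123-45-6789, call me on 555-123-4567."))
# → My SSN is [REDACTED SSN], call me on [REDACTED PHONE].
```

Running the redaction at write time, rather than cleaning stored transcripts later, means the sensitive values never land in your storage system in the first place.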
Additionally, human review of chatbot conversations is one of the most effective ways to improve them, but you need to be in full control of who has access to this information. If human review is going to be part of your process – and it probably will be – then you need to ask yourself who is going to perform these reviews and whittle that list down to only those who genuinely need access.
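One common way to enforce that restriction is a role-based check before anyone can open a transcript. The role names and function below are hypothetical, just to show the shape of the gate:

```python
# Hypothetical roles allowed to review chatbot transcripts;
# the role names here are illustrative assumptions.
REVIEW_ROLES = {"qa_reviewer", "privacy_officer"}

def can_review_transcripts(user_roles: set) -> bool:
    """Allow transcript review only for users holding an approved role."""
    return bool(REVIEW_ROLES & user_roles)

print(can_review_transcripts({"qa_reviewer"}))  # → True
print(can_review_transcripts({"developer"}))    # → False
```

Keeping the approved set small – and auditing it regularly – is the practical version of “whittling down” who can see user conversations.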
#2: Your Obligations Under the GDPR
If you’ve not heard about the GDPR yet, where have you been? The GDPR (General Data Protection Regulation) is a piece of EU law which came into force on May 25, 2018. It applies to you if anybody located within the European Union discloses their personal data to your chatbot(s); it doesn’t matter whether you’re based in the EU, the U.S., or Australia – you have to comply with it.
Although the GDPR mostly affects the people who develop chatbots, you still have an obligation to ensure that any data harvested and processed through your chatbot is handled in accordance with the regulation. Even though the data disclosed to your chatbot is provided voluntarily by the end-user – effectively ticking the “consent” box for you to process it – you still need to make clear what kind of data you are collecting and for what purpose it will be used.
#3: Optimize Your Chatbot
If you find that many people using your chatbot disclose information they do not need to, you may want to optimize your chatbot so that it is perfectly clear to the end-user which data you need. For example, if you need a customer’s reference number and their date of birth, you could program your chatbot to say this to the end-user:
“Please provide me with your Customer Reference Number and date of birth, and only these two pieces of information.”
If the end-user reads this, they will know to provide only those two pieces of information and not, for instance, the first line of their address – something many companies’ support flows invite when customers seek help with an account. The more intuitive your chatbot is, the better; if your end-users too frequently provide excess data, it’s a sign that you need to optimize your chatbot and make it easier to use.
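The prompt above pairs naturally with validation of the reply: accept exactly the two requested fields and re-prompt for anything else. The field formats below (two letters plus six digits for the reference number, an ISO date for the date of birth) are assumptions for the sake of the sketch, not a real specification:

```python
import re

# Illustrative formats for the two requested fields;
# the exact patterns are assumptions, not a real company's spec.
REF_RE = re.compile(r"^[A-Z]{2}\d{6}$")
DOB_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def validate_reply(reference: str, dob: str) -> str:
    """Re-prompt until the user supplies exactly the two expected values."""
    if not REF_RE.match(reference):
        return "Sorry, that doesn't look like a Customer Reference Number."
    if not DOB_RE.match(dob):
        return "Please give your date of birth as YYYY-MM-DD."
    return "Thanks! Looking up your account now."

print(validate_reply("AB123456", "1990-01-31"))
# → Thanks! Looking up your account now.
```

Rejecting malformed input with a specific re-prompt nudges users toward supplying only what was asked for, which is exactly the data-minimization behavior this section is after.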
Chatbots are becoming increasingly popular with companies that want to streamline their customer service while cutting costs. Because they are privy to a lot of sensitive user data – often data that is not necessary – they carry an inherent risk of being compromised and having this data leaked. As a result, especially with the arrival of the GDPR, it is important that you ensure the security of this data and operate strict controls over who has access to it. Chatbots should also be optimized so that they collect only the information they need and not whatever the end-user thinks they need to provide.