Keeping Clients Happy and Secure:
Best Practices for Research Tax Credit Consultants and Their Clients When Using Open AI Software
Open AI software systems have been around for quite a while, but they have gained tremendous popularity in recent months, especially ChatGPT. Information about these tools is accumulating and spreading so quickly that most people feel overwhelmed by the sheer volume of chat groups, articles, and posts. In addition to exploring how such software fits into everyday life, consultants are also looking into AI software's usefulness in different studies, such as the Research Tax Credit (RTC).
When considering how to use Open AI software like ChatGPT in RTC studies, taxpayers and consultants need to be aware of the importance of confidentiality when handling sensitive data. ChatGPT and similar Open AI software systems are open platforms – data that is uploaded is not guaranteed to remain private; it may be retained, used to train the underlying models (i.e., for "machine learning"), and ultimately drawn on when generating responses for other users.
Although it's convenient and efficient to use ChatGPT as a tool to aid in summarizing and analyzing data and trends with respect to the Research Tax Credit, it's critical to be mindful that uploading technical data carries risk for a number of reasons.
The Risks of Using Open AI Software Systems
RTC Consultants need to be aware of the risks and exercise caution when working with client files, whether financial or technical. Keep in mind that sensitive and confidential information includes not only financial information (e.g., wages) but also technical documentation, such as plans for a part on the latest space shuttle. This information is the property of the taxpayer, and all of it is confidential.
Below are a few examples of risks to consider and be mindful of when using ChatGPT and other Open AI software systems.
- Risk of data breaches
Uploading files to any Open AI software source always carries the risk that privileged information may somehow become accessible on the web. Nothing uploaded to an Open AI source should be assumed to be shielded and secure.
- Risks of violating client confidentiality agreements
Consultants and vendors need to be aware of their agreements with their clients. They may be in breach of clauses in their agreements if they upload and share their privileged files to any outside software.
- Risks of violating intellectual property agreements
RTC claims usually involve client SMEs divulging state-of-the-art technologies that they are working on. Uploading any of these types of files to an Open AI software system may violate IP clauses in the agreements between consultants and their clients.
How to Protect Confidential and Sensitive Data
Protecting clients’ confidential and sensitive information is critical for consultants to maintain the taxpayer’s trust and credibility. Here are a few simple steps that can be taken to ensure that client data is protected:
- Use secure systems.
Anyone working with highly technical data should always work within secure systems. This includes firewalls, passwords, and other security features put in place by your IT department.
- Use encryption and authentication protocols.
Many companies require encryption along with two-factor authentication protocols to ensure only authorized users are accessing servers and files.
- Use appropriate access controls.
Understanding and following the security levels set for server folders and files is important to ensure that only the appropriate individuals have access. This helps ensure that well-intentioned mistakes do not lead to security breaches.
- Update software and security patches regularly.
Companies should always ensure that regular software updates are completed and that security patches are current so that data remains safe and secure.
- Implement a secure client interface to protect client data.
Consultants should ensure a safe and secure data-sharing system is in place so that clients can transfer files without worrying that their sensitive wage or technical files are at risk of being hacked or lost.
- Avoid uploading specific identifiable data to Open AI software.
Consultants may want to consider a 'worst-case' scenario and pause to ask: if any of the uploaded files were leaked, could anything be linked back to a client employee, a specific client, or proprietary information? If the answer is yes, do not use the software system.
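The 'worst-case' pause in the last bullet can be made concrete with a simple pre-upload screen. Below is a minimal sketch, assuming a Python workflow; the regex patterns and placeholder tags are illustrative assumptions only – a real redaction pass would be tailored to the client's actual files and reviewed by the engagement team.

```python
import re

# Illustrative patterns for identifiable data in RTC work papers.
# These are assumptions for the sketch, not a complete or vetted list.
REDACTION_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),            # U.S. Social Security numbers
    (re.compile(r"\$\s?\d[\d,]*(\.\d{2})?"), "[WAGE AMOUNT]"),  # dollar figures such as wages
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),        # employee email addresses
]

def redact(text: str) -> str:
    """Replace identifiable details with placeholder tags before any upload."""
    for pattern, placeholder in REDACTION_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

def contains_identifiable_data(text: str) -> bool:
    """The 'worst-case' question: could anything here be linked back to a client?"""
    return any(pattern.search(text) for pattern, _ in REDACTION_PATTERNS)
```

A screen like this only catches patterns it has been told about – it does not recognize project names, trade secrets, or context that identifies a client indirectly – so it supplements, rather than replaces, the human judgment call described above.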
Put the Client First and Use Caution When Using AI Software
While the use of Open AI software systems is here to stay, individuals are just beginning to explore the best ways to use ChatGPT to build efficiencies in the workplace. This includes prompts to assist in marketing, engineering, report writing, and data and comparative analytics.
The entire premise of these Open AI software systems is that the more information fed or uploaded into them, the more data they can draw from to generate the next response. For RTC consultants, this is important to understand: documents they upload (many of which contain confidential and sensitive data) may then be used by the software to create new responses for the public.
Not only could this provide an open forum for proprietary client information to become public, but it could also constitute a breach of the contractual agreements between RTC Consultants and their clients. To avoid any unintended breaches, companies ought to follow best practices and use AI software systems responsibly.
Put the client first, protect your relationship, and understand the liability that any potential breach could expose your company (or you personally) to. Given the nature of the work we do, RTC Consultants are privy to significant amounts of technological and innovative information. Extra caution should be taken to ensure that data security is held to the utmost standards. Questions? Get in touch with DST Advisory Group.