We’re still in the midst of the ‘big exit’. Individuals are changing jobs at an unprecedented rate following the COVID-19 pandemic, and the risk of leavers taking information from their employer is greater than ever.
Stealing something physical is easy to detect and prove: walking out with the company phone or laptop after handing in your notice isn’t subtle. What isn’t so obvious, or always thought about, or appropriately protected against, is the download of proprietary data and the carefully curated work of your organisation, which other businesses can subsequently take advantage of.
Almost any activity across your organisation leaves evidence behind. The best place to look is the log files distributed across your infrastructure. Databases typically have some form of auditing mechanism to ensure that any addition, deletion, amendment, or download is recorded. If this is not the case, you can turn to your network data: looking at it at a user level can indicate how much is being downloaded, by whom, and when.
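As a minimal sketch of that user-level view, the following aggregates download volume per user from network log entries and flags heavy downloaders. The log format, field names, and threshold are all invented for illustration; a real deployment would read from your proxy or firewall logs.

```python
# Toy example: per-user download totals from a hypothetical CSV-style log.
# Format assumed here: "timestamp,user,bytes_downloaded" (an illustration only).
from collections import defaultdict

log_lines = [
    "2024-05-01T09:12:00,alice,120000",
    "2024-05-01T09:30:00,bob,45000000",
    "2024-05-01T10:05:00,alice,80000",
    "2024-05-01T11:40:00,bob,61000000",
]

totals = defaultdict(int)
for line in log_lines:
    _, user, size = line.split(",")
    totals[user] += int(size)

# Flag anyone whose total exceeds a simple (arbitrary) threshold.
THRESHOLD = 50_000_000  # ~50 MB, purely an example figure
flagged = [user for user, total in totals.items() if total > THRESHOLD]
print(flagged)  # → ['bob']
```

Even something this simple answers the “how much, by whom, and when” question after the fact, which is exactly the audit role the logs play.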
These are great tools for auditing something that has already happened, but they don’t stop the ship from sailing. If prevention is better than cure, the question remains: what can be done to stop it?
As soon as a leaver is identified, it’s possible to revoke access to company systems, although this type of solution can have a negative impact on the business. This is especially the case where a smooth information handover is required for other resources, and for business continuity within that team or division. Also, not all leavers are bad leavers. It’s easy to damage the goodwill built up over years of the employer-employee relationship, and jeopardise a productive future, if the leaver immediately feels cut off and isolated.
This is the perfect problem for Artificial Intelligence (AI) to solve; monitoring the behaviour of users across the environment in a non-intrusive manner, to ensure that no one is downloading data as they leave, is well within its capability. We need to be careful with current AI systems, however: a model that has already learned from existing behaviour may not be able to identify the problem. If leavers are already taking information with them, the AI will have learned this and associated it with ‘normal’ behaviour. This is a key pitfall with many of the products being deployed across the marketplace.
What’s more, many of the existing solutions can be gamed. Consistent downloads of data, spread out over a lengthy time period, would allow a bad leaver to slip through the cracks. To the AI, this would look like the user is operating normally throughout their day-to-day role.
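This evasion is easy to demonstrate. The toy detector below (not any vendor’s product) flags a day’s download volume when it sits far above a user’s historical average, using a simple z-score; all the numbers are invented. A one-day burst is caught, but the same total drip-fed in small daily increments stays under the threshold.

```python
# Why "low and slow" beats a naive baseline: a z-score check on daily
# download volume flags a burst but misses the same data spread over months.
from statistics import mean, stdev

def is_anomalous(history, today_mb, z_threshold=3.0):
    """Flag today's volume if it sits well above the user's historical mean."""
    mu, sigma = mean(history), stdev(history)
    return (today_mb - mu) / sigma > z_threshold

# Typical daily download volume (MB) for one user over a week.
history = [100, 120, 90, 110, 105, 95, 115]

# Pulling 2 GB in a single day is an obvious outlier...
print(is_anomalous(history, 2000))  # → True

# ...but adding ~25 MB a day on top of normal usage moves the same 2 GB
# over a few months without ever tripping the threshold.
print(is_anomalous(history, 130))   # → False
```

To a detector like this, the patient bad leaver simply looks like a slightly busy employee, day after day.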
This means our AI solutions cannot operate alone; they need to be more robust, accounting not just for behaviour, but for context as well.
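One hypothetical way to fold context into a behavioural signal is to weight the model’s anomaly score by what else is known about the user. The function, weights, and scores below are all invented for illustration: the point is that the same download pattern warrants more scrutiny once HR has flagged the user as a leaver, or when the data touched is sensitive.

```python
# Illustrative only: combining a behavioural anomaly score with context flags.
# Weights and thresholds are made up; a real system would tune these.
def risk_score(behaviour_score, is_leaver, touches_sensitive_data):
    score = behaviour_score        # e.g. a 0-1 anomaly score from the model
    if is_leaver:
        score *= 2.0               # notice period handed in: scrutinise harder
    if touches_sensitive_data:
        score *= 1.5               # proprietary repositories, client lists...
    return score

# The same mild behavioural signal, with and without risky context.
print(risk_score(0.4, is_leaver=False, touches_sensitive_data=False))
print(risk_score(0.4, is_leaver=True, touches_sensitive_data=True))
```

A borderline score that would be ignored for most users rises above an alerting threshold once the context says this person is on their way out the door.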
To help our clients make informed decisions about new technologies, we have opened up our research and development facilities, and we actively encourage customers to try the latest platforms using their own tools and, if necessary, together with their existing hardware. Remote access is also available.
Boston are exhibiting at Gitex 2024!