Who should be on an insider risk team?
28 February 2017
Left to chance, you might never catch an insider red-handed unless you happen to bump into someone leaving the building with a box full of documents. That is where an insider risk team comes in: a group of employees from various departments who establish policies and build a system for noticing when confidential material has left the building.
“Insider risk is a real cybersecurity challenge. When a security professional or executive gets that call that there’s suspicious activity — and it looks like it’s someone on the inside who turned rogue — the organisation needs to have the right policies and playbooks, technologies, and right team ready to go,” said Rinki Sethi, senior director of information security at Palo Alto Networks.
“In short, an insider programme should be able to curate data points that reveal a toxic risk score of ‘who’ might be high concern for malicious activity,” said Chris Camacho of Flashpoint.
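Curating data points into a risk score, as Camacho describes, can be sketched as a simple weighted aggregation. The signal names, weights and threshold below are illustrative assumptions, not any vendor's actual model:

```python
# Illustrative sketch: aggregate insider-risk signals into a capped score.
# Signal names and weights are hypothetical examples.

SIGNAL_WEIGHTS = {
    "bulk_file_copy": 30,
    "after_hours_access": 15,
    "usb_device_attached": 20,
    "resignation_notice_filed": 25,
    "failed_privilege_escalation": 35,
}

def risk_score(observed_signals):
    """Sum the weights of the observed signals, capped at 100."""
    total = sum(SIGNAL_WEIGHTS.get(s, 0) for s in observed_signals)
    return min(total, 100)

def high_concern(observed_signals, threshold=50):
    """Flag a user whose curated score crosses the review threshold."""
    return risk_score(observed_signals) >= threshold

# Example: bulk copying combined with removable media crosses the bar.
print(risk_score(["bulk_file_copy", "usb_device_attached"]))  # 50
```

In practice the signals would come from HR systems, endpoint agents and log pipelines; the point of the sketch is only that disparate data points reduce to a single reviewable score.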
Steve Mancini, senior director of information security at Cylance, takes the disgruntled employee’s point of view, arguing that employees need outlets and recourse for their grievances before malicious actions occur. “Fellow employees and managers need to be trained to spot the signs of disgruntled employees and given channels to report concerns in a manner that does not judge the potentially disgruntled employee, but instead put the right people in their path to help them resolve whatever grievance they have before it escalates.”
But not all companies are that advanced in anticipating what an angry employee might do in retaliation. Policies cover the obvious situations, such as an employee making an inordinate number of photocopies or an alert firing when a USB drive is plugged into a computer, but it gets tricky with scenarios that are not out in the open for all to see. It is the insider risk team that must work through every hypothetical scenario in order to stay ahead of a disgruntled employee bent on fulfilling a vendetta.
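The "obvious" policy cases above amount to simple rules over monitored activity events. A minimal sketch, with event fields and thresholds as assumptions:

```python
# Minimal sketch of policy rules firing alerts on activity events.
# Event schema and the 200-page threshold are illustrative assumptions.

def check_event(event):
    """Return the list of policy alerts triggered by one activity event."""
    alerts = []
    if event.get("type") == "usb_inserted":
        alerts.append("Removable media attached to " + event["host"])
    if event.get("type") == "print_job" and event.get("pages", 0) > 200:
        alerts.append("Inordinate print volume by " + event["user"])
    return alerts

# A USB insertion trips one rule; an ordinary print job trips none.
print(check_event({"type": "usb_inserted", "host": "wks-042"}))
print(check_event({"type": "print_job", "user": "jdoe", "pages": 12}))
```

Rules like these handle the visible scenarios; the hypothetical, less visible ones the team brainstorms are precisely the gaps such rule sets leave.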
“Insider risk tends to happen less frequently than external threats, but the negative impact can be tenfold. Having the right insider risk team with risk management expertise is a must to assess the situation, pinpoint the culprit and execute your counterattack plan,” Sethi said.
Who should be on this team?
Many security experts made it clear that watching for signs of an insider threat is everyone’s responsibility. But in terms of the team’s makeup, it should be representative of the entire company.
The team should include the technical IT and Security teams, as well as non-technical stakeholders such as members of the C-suite, the legal counsel and human resources, said Veriato’s CSO David Green.
“The latter three will likely be unfamiliar with the fact that traditional security solutions don’t always work to prevent insider threats because, first, they are largely focused on perimeter security, and, second, they aren’t intended to identify or prevent problems stemming from insiders who are authorised to access sensitive data or systems,” he said. “But these departments should come together to discuss the various challenges associated with insider threats and establish policies and procedures to prevent and detect them while protecting employee privacy.”
Here’s what each department should bring to the table:
C-Suite: A member of the executive team should be present because you’ll need executive buy-in to ensure the other departments represented on the insider risk team have the authority to establish a risk-based monitoring programme, sign off on an Acceptable Use Policy (if one isn’t already established), set boundaries between acceptable and unacceptable behaviour, tie the plan to the company’s strategic objectives and help outline a security policy.
Legal: The legal team should be present to ensure all employee/user monitoring activities meet any local, state and federal laws. They should also help define what is permissible to monitor, such as email and instant messages, the web sites employees visit, online apps they use or any content they download or print. Recording employees as they log into their bank accounts online could be a legal risk for the company if something happened to the employee’s account. Also, since IT might not be permitted to review the activity of higher-level employees, legal will work with the security team to determine which roles within the organisation can review which sets of activity.
Human Resources: HR can help create the processes necessary to ensure there is a warranted and documented need for any monitoring, and that the security team is made aware of these issues without breaking any privacy laws. For example, they might be aware of an employee leaving (a potential risk) or an employee’s personal or financial issues that might make them high-risk and worth investigating. The HR team (or anyone in the department) would communicate this threat through the pre-determined risk level of the position, not the name of the individual employee.
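Communicating by positional risk level rather than by name can be sketched as a small lookup; the position labels and levels here are hypothetical:

```python
# Sketch: HR raises a concern keyed to a position's pre-determined risk
# level, never to an employee's name. Positions and levels are examples.

POSITION_RISK = {
    "database_administrator": "high",
    "finance_analyst": "medium",
    "front_desk": "low",
}

def hr_notice(position, reason):
    """Build a privacy-preserving notice for the security team."""
    return {
        "risk_level": POSITION_RISK.get(position, "unknown"),
        "position": position,
        "reason": reason,
    }

notice = hr_notice("database_administrator", "resignation submitted")
print(notice["risk_level"])  # high
```

The security team can then prioritise review of that position's activity without HR ever disclosing the individual's identity outside the agreed process.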
IT/Security: IT – or whoever will be involved in both evaluating possible technology solutions and implementing the selected one – will provide the non-technical team members with context around which users have access to what sensitive data, as well as what is possible when it comes to monitoring activity; all of this will be invaluable when putting the team’s planning and preparation into practice. Technologies such as user behaviour analytics look at patterns of behaviour and do not require inspection of the content of an employee’s activity to detect insider threats. User activity monitoring software lets you capture and review the specific actions an employee takes, including their emails or texts, if needed. Both come in versions that let you configure which types of activity are monitored to align with your organisation’s goals, with privacy protections woven throughout to address HR concerns.
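The content-free flavour of user behaviour analytics can be sketched as comparing a user's activity count against their own baseline. The 3-sigma threshold and sample counts are illustrative assumptions, not a product's actual algorithm:

```python
import statistics

# Sketch of content-free user behaviour analytics: flag a day whose
# file-access count deviates sharply from the user's own baseline.
# Only counts are inspected, never the content of the activity.

def is_anomalous(baseline_counts, todays_count, sigmas=3.0):
    """True if today's count sits more than `sigmas` deviations from the mean."""
    mean = statistics.mean(baseline_counts)
    stdev = statistics.pstdev(baseline_counts)
    if stdev == 0:
        return todays_count != mean
    return abs(todays_count - mean) > sigmas * stdev

baseline = [40, 52, 47, 45, 50, 44, 48]  # typical daily file accesses
print(is_anomalous(baseline, 49))   # an ordinary day is not flagged
print(is_anomalous(baseline, 400))  # a bulk-access spike is flagged
```

This is the privacy trade-off the passage describes: the analytic sees that something unusual happened, while activity-monitoring software would be needed to see what was actually accessed.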
“The risk of malicious activity from the seemingly trusted insider is still an ongoing reality for organisations worldwide. IT can’t implement a full insider risk programme on its own – or keep one working properly,” Green said.
Each organisation needs to establish an “insider risk” team that specifically addresses the associated challenges — from determining who has (or should have) access to confidential corporate and client data and what each positional “risk level” should be to what constitutes inappropriate user behaviour, how their activity will be monitored and how the organisation will communicate which behaviour is acceptable and the ramifications for breaking “the rules,” he added.
Scottie Cole, network and security administrator at AppRiver, said insider risk teams are vital to an organisation’s security. However, they don’t need to be staffed with dedicated, full-time positions; drawing on a broad spectrum of roles brings the most holistic security perspective.
For an insider risk team to be successful it takes collaboration across the company, said Shawn Burke, Global CSO at Sungard Availability Services: procurement for vendor due diligence; human resources for screening, internal communication and consequence protocols; and the risk committee for overall response strategy. The general counsel and the chief compliance officer are also key stakeholders, as insider monitoring must comply with a spate of new state and national privacy legislation.