Undergraduate, graduate, and professional students of cybersecurity from around the world gathered earlier this year to participate in a competition that simulated the international policy challenges of a global cyberattack. While the goal was to practice sound policy decisions, the majority of competing teams unintentionally led the U.S. into an international war. Given a variety of diplomatic and other means of responding to cyberattacks, participants largely took the aggressive approach of hacking back in response to cyberattacks from China, with disastrous consequences.
While the competition’s participants are all students today, they may well go on to be the corporate directors and government leaders of tomorrow. Based on the current debate about how private-sector organizations should respond to cyberattacks, the actions taken by these students may well be representative of a broader trend. In fact, there is enough of a push for organizations to be legally authorized to “hack back” that earlier this year a member of Congress proposed a bill to empower people “to defend themselves online, just as they have the legal authority to do during a physical assault.”
As a business leader, I believe this measure would do more harm than good.
What Is Hack Back?
Hack back, sometimes called counterstrike, refers to an organization taking offensive action to pursue, and potentially subdue, cyberattackers that have targeted it. For the purposes of this article, I am specifically talking about action taken by private sector organizations that affects computers external to their own networks. We are not discussing government actions, which tend to occur within existing legal frameworks and are subject to government oversight.
Hack back activities go beyond defensive measures that organizations may put in place to protect their environments. It is generally understood that hack back activities extend beyond the victim’s own network, systems, and assets, and may involve accessing, modifying, or damaging computers or networks that do not belong to the victim. Directors should note that today it is illegal under the Computer Fraud and Abuse Act for private parties to access or damage computer systems without authorization from the technology owners or an appropriate government entity, even if these systems are being used to attack you. That is what proponents of hack back want to change, and the proposed bill goes some way towards doing this.
The Case for “Self Defense”
In response to this legal restriction, proponents of a law to legalize hacking back often argue that the same principle that allows U.S. citizens to defend themselves against intruders in their homes—even with violent force—should apply online. While it may sound reasonable to employ equal force to defend a network, the Internet is a space of systems designed specifically for the purpose of interacting and communicating, and technology and users are increasingly interconnected. As a result, it is almost impossible to ensure that defensive action targeted at a specific actor or group of actors will affect only the intended targets.
The reality of the argument for hacking back in self-defense is unfortunately more akin to standing by your fence and lobbing grenades into the street, hoping to get lucky and stop an attacker as they flee. With such an approach, even if you do manage to reach your attacker, you’ll almost certainly cause terrible collateral damage. Can your organization afford to clean up such a mess? What would be the repercussions for your reputation and position in the marketplace?
Another significant challenge for private sector organizations looking to hack back is that, unlike governments, they typically do not have the large-scale, sophisticated intelligence gathering programs needed to accurately attribute cyberattacks to the correct actor. Attackers constantly change their techniques to stay one step ahead of defenders and law enforcement, including leveraging deception techniques. This means that even when there are indications that point to a specific attacker, it is difficult to verify that they have not been planted to throw off suspicion, or to incriminate another party.
Similarly, it is difficult to judge motivations accurately and to determine an appropriate response. There is a fear that once organizations have hack back in their arsenal, it will become the de facto response, crowding out the broad range of options that otherwise exists. This is even more problematic when you consider that devices conscripted into a botnet may be used to carry out an attack. These infected devices and their owners are as much victims of the attacker as the primary target, and any attempt to hack back could cause them further harm.
The Security Poverty Line
Should hack back become a lawful response to a cyberattack, effective participation is likely to be costly, as the technique requires specialized skills. Not every organization will be able to afford to participate. If the authorization framework is not stringent, many organizations may try to participate with insufficient expertise, which is likely to be ineffective, damaging, or both. Still other organizations will lack the maturity or budget to participate even in this way.
These are the same organizations that today cannot afford a great deal of in-house security expertise and technologies to protect themselves, and currently are also the most vulnerable. As organizations that do have sufficient resources begin to hack back, the cost of attacking these organizations will increase. Profit-motivated attackers will eventually shift towards targeting the less-resourced organizations that reside below the security poverty line, increasing their vulnerability.
A Lawless Land
Creating a policy framework that provides sufficient oversight of hack-back efforts would be impractical and costly. Who would run it? How would it be funded? And why would it be significantly more desirable than the status quo? When the U.S. government takes action against attackers, it must meet a stringent burden of proof for attribution, and even then, strict parameters determine the types of targets that can be pursued and the kinds of action that can be taken.
Even if such a framework could be devised and policed, significant legal risks would remain for a variety of stakeholders at a company. While the Internet is a borderless space accessed from every country in the world, each of those countries has its own legal system. Even if an American company were authorized to hack back, how could it ensure it would not fall afoul of the laws of another country, not to mention international law?
What Directors Can Do
The discussion around hacking back so far has largely been driven by hyperbole, fear, and indignation. Feelings of fear and indignation are certainly easy to relate to, and as corporate directors, powerlessness does not sit well with us. It is our instinct and duty to defend our organizations from avoidable harm.
The potential costs of a misstep or of unintended consequences from hack back should deter business leaders from undertaking such an effort. If another company or a group of individuals is affected, the company that hacked back could face expensive legal proceedings, reputational damage, and a loss of trust among many of its stakeholders. Attempts to exempt organizations from this kind of legal action are problematic because they raise the question of how we can spot and stop accidental or intentional abuses of the system.
It’s one thing for students to unintentionally trigger war in the safe confines of a competitive mock scenario, and another thing entirely to be the business leader who does so in the real world. Directors of companies must instead work together to find better solutions to our complex cybersecurity problems. We should not legitimize vigilantism, particularly given its significant potential risks and dubious benefits.
Corey Thomas is CEO of Rapid7. All opinions expressed here are his own.
NACD held its third annual Cyber Summit in Chicago on June 21, 2017, in partnership with the Internet Security Alliance (ISA). This year’s event followed in the wake of cyber incidents such as WannaCry and the hacking of the Democratic National Committee’s email account, as well as Europe’s adoption of the General Data Protection Regulation (GDPR) and the implementation of China’s Cybersecurity Law.
Speakers acknowledged this context and focused on topics such as building a cyber-risk culture, insider threats, cyber-risk regulation, the threat of state-sponsored attacks, and the economics of cybersecurity.
NACD members left the Cyber Summit with valuable lessons to share with their colleagues.
Five key takeaways emerged for director attendees at the 2017 NACD Cyber Summit:
1. Actively learn from cyber incidents at other companies. A bill that aims to require cyber expertise on public company boards has surfaced twice in Congress since 2015. However, Melissa Hathaway—president at Hathaway Global Strategies and senior advisor at Harvard Kennedy School’s Belfer Center for Science and International Affairs—believes boards do not necessarily need to have a director who is an expert in cybersecurity. Hathaway, who delivered a keynote at the Cyber Summit, suggests boards regularly hold conversations about current events in cybersecurity, and review a cyber-event case study at each quarterly meeting.
2. Work toward a public-private partnership. Hathaway emphasized the benefit of forming a public-private partnership in the United States to serve as a medium for information sharing about cyberattacks. Canadians have already formed such an organization: the Canadian Cyber Threat Exchange, an independent nonprofit that functions as an intermediary between the public and private sectors. According to Hathaway, the U.S. government itself has been the victim of a number of cyberattacks exposing personal data, which has cost it credibility with the private sector. Thus far, U.S. corporations have been largely reluctant to share information about cyberattacks with a government that may not be seen as equipped to respond adequately. At the same time, the government classifies data on cyberattacks, which limits information sharing with the private sector.
3. Consider having the CISO report directly to the board. The 2016–2017 NACD Public Company Governance Survey indicates that only 31 percent of boards receive reports directly from the chief information security officer (CISO), despite the increased prevalence and importance of the role. Bret Arsenault, corporate vice president and CISO at Microsoft, indicated that the frequency of meetings between the CISO and the board depends on the board’s existing cyber knowledge. As Microsoft’s CISO, Arsenault conducts a quarterly review with both the full board and the audit committee, in addition to meeting with the CEO and the full leadership team for a half hour once each week. Having all members of senior management involved in the conversation helps set the tone at the top around cyber culture. See the 2017 Cyber-Risk Oversight Handbook for guidance on building a relationship with the CISO (p. 38) and questions for the board to ask management about cybersecurity (p. 21).
4. Strengthen a culture of secure behaviors. In providing oversight of cybersecurity, one aspect of the board’s role is to ensure that the organizational culture reinforces healthy cybersecurity behaviors. For this culture to take hold, it is essential that any cybersecurity-related issues be explained to the board—and employees—in a clear, understandable way. For example, the CISO should speak in business terms to the board and avoid using technical language, according to Arsenault. John Lhota, managing principal for global cybersecurity consulting services at SecureWorks, also suggested using gamification for employee cyber education programs. Directors should evaluate whether a culture of awareness about the importance of cybersecurity truly exists, beginning at the board level. See NACD’s Cyber-Risk Oversight Handbook for tools on assessing the board’s cybersecurity culture (p. 27) and establishing board-level cybersecurity metrics (p. 28).
5. Ensure access rights are limited and continuously monitored. Directors should discuss with management what the company’s most critical data assets—or “crown jewels”—are, and who can access them. Many high-profile breaches have been carried out by employees or contractors with access to company networks. Robert Clyde, vice chair of ISACA and managing director of Clyde Consulting LLC, indicated that the hiring process can aid in selecting trustworthy employees, but employees with administrative privileges (i.e., the ability to install certain software, access certain files, or change configuration settings) can become very destructive if they make a mistake or retaliate against the company after a job loss. The board should check with the CISO to make sure that only a very small number of employees have administrative privileges on an everyday basis, with slightly more granted access in an emergency. Adding secondary approvals—so that two people must be involved in a process—further constrains the possibility of someone deleting data accidentally or removing it intentionally. Access for those with administrative privileges should be updated as soon as those individuals change jobs, according to Robert Zandoli, director of the ISA and global chief information security officer at BUNGE Ltd.
For more information on providing cybersecurity oversight, please see the following NACD resources:
Investors now see corporate governance as a hallmark of the board’s effectiveness and one of the best sources of insight into the way companies operate. In response to this trend, Farient Advisors LLC, in partnership with the Global Governance and Executive Compensation Group, produced the report 2017—Global Trends in Corporate Governance, an analysis of corporate governance practices in the areas of executive compensation, board structure and composition, and shareholder rights covering 17 countries across six continents.
NACD, Farient Advisors LLC, and Katten Muchin Rosenman LLP cohosted a meeting of the NACD Compensation Committee Chair Advisory Council on April 4, 2017, during which Fortune 500 compensation committee chairs discussed the report’s findings in the context of the current proxy season. The discussion was held under a modified version of the Chatham House Rule, under which participants’ quotes (italicized below) are not attributed to those individuals or their organizations, with the exception of cohosts. A list of attendees’ names is available here.
Global Governance Trends
2017—Global Trends in Corporate Governance finds that governance standards around the world have strengthened in response to financial crises and breakdowns in corporate ethics and compliance. Those crises and breakdowns have led to greater pressure from governments and investors, who are demanding economic stability and safe capital markets. In regard to executive compensation, the report notes a number of global governance trends:
Source: Farient Advisors, 2017—Global Trends in Corporate Governance, p. 18.
Most of the 17 countries surveyed (94%) require executive compensation disclosure, although what is disclosed and the quality of those disclosures vary from country to country. The surveyed countries with the least developed disclosure practices are South Africa, China, Brazil, and Mexico.
Say-on-pay voting is mandatory in most developed countries, although whether the votes are binding varies. In developed countries where the vote is voluntary (e.g., Canada, Belgium, Germany, and Ireland), it nevertheless remains a leading practice.
Common leading practices are to use competitive benchmarks, such as peer groups to establish rationales for pay, and to provide investors with information on components of pay packages and performance goals.
2017 Proxy Season Developments
Meeting participants shared a number of observations and practices from the current proxy season:
Continuous improvement on disclosures
Council participants indicated they are sharing more information with shareholders, in a more consumable way. “We want to be in the front ranks as far as providing information to shareholders,” said one director. “Instead of asking ‘why should we share that?’ we’re starting to ask ‘why not?’” Another director added, “Over the last few years we’ve moved from a very dense legalistic document to something that’s much more readable. Our board set up a process to do a deep-dive review every two years; this fall is our next review. It’s a way to ensure our disclosures keep pace with current practices and also reflect where we are as a company and board.”
Council members also discussed the status of Dodd-Frank rulemaking, given the new presidential administration and SEC commission. S. Ward Atterbury, partner at Katten Muchin Rosenman LLP, said, “While it’s unclear exactly what the SEC will do with Dodd-Frank requirements in the future, investors have spoken on some of the issues, especially on things like say on pay and pay for performance. There may be less formal regulation, but the expectations on companies and boards are still there [to provide pay-for-performance disclosure].”
Growing interest in board processes
According to one director, “We’re hearing more interest about CEO succession as it relates to strategy. Investors are asking us to describe our process—they understand we can’t discuss specifics.”
Director pay
Dayna Harris, partner at Farient Advisors LLC, discussed the increased focus on director pay: “Given the recent lawsuits regarding excessive director compensation and an increase in director pay proposals in 2016, Institutional Shareholder Services (ISS) created a new framework for shareholder ratification of director pay programs and equity plans.” ISS’ framework evaluates director pay programs based on stock ownership guidelines and holding requirements, equity vesting, mix of cash and equity, meaningful limits on director pay, and quality of director pay disclosure. ISS’ updated factors for evaluating director equity plans include relative pay magnitude and meaningful pay limits.
Environmental, social, and governance (ESG) issues
Meeting participants agreed that ESG issues, such as climate change and gender pay equity, are of growing interest to investors. In particular, nonbinding shareholder proposals on climate change received majority support this year at Exxon Mobil Corp., Occidental Petroleum Corp., and PPL Corp.
Refining approaches to outreach and engagement with investors
Meeting participants discussed leading practices for engaging shareholders. Some directors indicated that investors have turned down their offers to speak on a regular basis because of time constraints. One delegate emphasized that simply making the offer to meet with shareholders is appreciated, even if the offer is turned down. One director said, “We invited one of our major long-term shareholders to speak at one of our off-site [meetings] as part of a board-education session. It was a different type of engagement and very valuable.”