While prominent companies and healthcare institutions around the world were reacting to a ransomware attack known as WanaCryptor 2.0, or WannaCry, a young man working for a cybersecurity firm in southwest England landed on a solution that cost just $10.69. He found the so-called “kill switch” in the malware’s code: triggering it required nothing more than the purchase of an unregistered domain name. He promptly registered the domain, halting WannaCry’s spread. The identity of this cyberknight remains anonymous, but one notable fact about his background has emerged: he’s only 22 years old.
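The mechanism was simple: before doing anything else, the malware tried to resolve a hard-coded, unregistered domain and quietly shut down if the lookup succeeded. A minimal sketch of that logic in Python (the domain and function names here are illustrative stand-ins, not WannaCry’s actual code):

```python
import socket

# Hypothetical stand-in for the malware's hard-coded, gibberish domain.
KILL_SWITCH_DOMAIN = "example-unregistered-killswitch-domain.test"

def kill_switch_engaged(domain, resolve=socket.gethostbyname):
    """Return True if the kill-switch domain resolves (i.e., someone registered it)."""
    try:
        resolve(domain)
        return True
    except OSError:
        # An unregistered domain fails to resolve, so the malware keeps going.
        return False

def run_payload(domain=KILL_SWITCH_DOMAIN):
    # The malware only proceeded when the domain did NOT resolve;
    # registering the domain flipped this check worldwide.
    if kill_switch_engaged(domain):
        return "halt"
    return "spread"
```

Because registering the domain made the DNS lookup succeed everywhere at once, a $10.69 purchase was enough to stop the outbreak globally.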
According to a 2015 study by the Center for Cyber Safety and Education, the average age of security practitioners is 45 years old. Many security professionals will leave the workforce within the next 20 years, but younger professionals are not seeking careers in cybersecurity at a pace sufficient to keep up with companies’ demands. Developing a workforce that will be prepared to meet companies’ increasingly complex cybersecurity needs means companies—and educators—will need to build a bigger, more inclusive talent pipeline for people interested in the practice.
When I spoke with cybersecurity expert Summer C. Fowler for the cover story of the May/June 2017 issue of NACD Directorship magazine, I asked about her work at Carnegie Mellon University to recruit diverse candidates to the programs she leads at the CERT Division of the Software Engineering Institute. One look at her Twitter profile illustrates that she’s a passionate supporter of the Cyburgh, PA Initiative, a program developed in partnership between Carnegie Mellon and the Pittsburgh Technology Council to advance the city’s status as a leader in cybersecurity technology. The initiative could not be successful without being inclusive.
“The issue of building a talent pipeline is such a challenge because of what we’re offering by way of schooling,” Fowler said about the role of university-level education in developing the cybersecurity talent pipeline. She then drew a parallel between the education and training of doctors in the 1970s and the challenges the cybersecurity sector faces in finding diverse candidates. “When you look back to the early 1970s, the medical field was exactly the same. Only about 11 percent of doctors were women. There also were not many minority doctors in this country. We’re investigating what changes in the medical community were made to bring in more women and underrepresented minorities, so that we can do the exact same thing with computer science and engineering fields.”
Fowler pointed out that there needs to be further delineation of roles in the cybersecurity industry to clarify the hierarchy of talent desired. “When we talk about cybersecurity, we all think about a Ph.D. from Carnegie Mellon or from Stanford,” Fowler said. “We need to get better at differentiating the roles and what the training requirements are. When we get there, I think that striation of roles will naturally open a pipeline to more people who are interested in the field because it would no longer be seen as this daunting job that requires a Ph.D.”
Still another challenge exists: getting diverse talent interested in the topic to begin with. I shared with Fowler an anecdote from my own high school experience. My path diverged from that of a male friend who was interested in white-hat hacking, the technology industry’s term for the benevolent hacking of systems to detect vulnerabilities. While I was curious about the world of professionals who were defending against cyberattacks, I had no outlet for learning about programming at the time. No one at my public high school in inner-city Memphis was engaging young women in learning about computer science in 2004, while my friend had family who supported and encouraged his interest.
Fast forward nearly 13 years, and my friend is a practicing white-hat hacker for a Fortune 500 company. I, on the other hand, earned my bachelor’s degree in creative writing, and have since revived my interest in the topic and write about it from a governance perspective. Could I have been working at the same company with the helpful nudges of invested educators, or with after-school programs for young women like Girls Who Code that are sponsored by interested corporations? Fowler seems to think the answer is “yes.”
She suggests that the solution now will not be to bring girls and young women to technology, but to bring discussions of technology to them within contexts that interest them. “Instead of saying to girls, ‘You need to belong to the computer science club,’ talk to them about what computer science might mean to ballet, or to whatever program they’re involved in.” She suggested discussing breaches to the entertainment industry with young people interested in acting or movies, for instance, as a way to pique their interest in a field they might not have considered before.
Ultimately, one of the greatest challenges to building the cybersecurity pipeline will involve developing aptitude tests, then encouraging promising young people to pursue a career down that pipeline. “It’s also a matter of figuring out what the specific competencies are. We’ve done a really good job for lots of different types of jobs at being able to say, ‘Let’s perform an assessment to see what your skills are and what you’d like to be doing.’ That process enables us to say, ‘Wow, you would make a great attorney, or you would make a really good financial analyst.’ We don’t have that in the realm of cybersecurity.”
Building out more roles in cybersecurity and advocating for the inclusion of those roles in career aptitude tests would help young people—and perhaps even more women—get excited to join the ranks of cyberknights in the future.
Katie Swafford is assistant editor of NACD Directorship magazine and also serves as editor of NACD’s Board Leaders’ Blog.
The following blog post is one installment in a series related to board oversight of corporate culture. The National Association of Corporate Directors (NACD) announced in March that its 2017 Blue Ribbon Commission—a roster of distinguished corporate leaders and governance experts—would explore the role of the board in overseeing corporate culture. The commission will produce a report that will launch during NACD’s Global Board Leaders’ Summit Oct. 1–4.
As many as 95 percent of breaches to companies’ data have a human element associated with them. It is no wonder, then, that security teams call people “the weakest link” in securing an organization and direct their defense investments elsewhere. Despite companies’ deep investments in security technology over the years, security breaches continue to increase in frequency and cost.
The conventional approach misses a significant opportunity to utilize people as a defense strategy against the ever-changing threat landscape. In fact, only 45 percent of respondents in the National Association of Corporate Directors’ 2016–2017 Public Company Governance Survey reported that their boards assessed security risks associated with employee negligence or misconduct. Organizations that have fostered intentional security cultures from the boardroom to the server room have managed to transform employees into their strongest asset in defending against attacks, gaining advantages in both protecting against and detecting cyber threats.
What is security culture?
Culture-competent boards and management teams understand that culture is the set of behaviors employees exhibit without being told. In simpler terms, it’s “the way things are done around here.” There are many subcultures within an organization, and security culture is one that often looks quite different from the expectations set by policy. Security culture influences the outcome of everyday business decisions, leaving employees to judge for themselves how much weight security should carry in each choice. For instance, employees frequently encounter questions such as:
Is it OK to release insecure code, or should we test more and accept a delay?
Do I feel safe to report that I may have incorrectly shared a critical password?
Do I prioritize a secure vendor over a less expensive one?
Each of these decisions, when made without security in mind, adds to the organization’s security debt. While it is unlikely that any one of these decisions on its own will lead to the downfall of the organization, each risky action increases the probability of being targeted and successfully compromised by cyberattackers. On the other hand, if the questions above are answered with a secure mindset, over time an organization can expect to see more secure code, better data-handling processes, and an increased ability to detect cyberattacks, to name a few examples. A positive, security-first culture makes it more difficult for an attacker to find and exploit vulnerabilities without detection, incentivizing a different choice of target. Directors at companies across industries should carefully evaluate whether management has established a security-first culture as part of their greater cyber-risk oversight strategy.
It is worth realizing that security-minded employees will not solve all security headaches. However, a company’s talent is an essential third leg of the business stool, partnered with technology and processes. An organization that does not invest in training and empowering its employees to prioritize security is only defending itself with two-thirds of the options available to it.
How do you practice it?
The first step boards and executives can take to shape security culture is to identify the most critical behaviors for their employees. Historically, security culture programs were based on compliance and asked, “How many people completed a training?” or “How much time is an employee spending on education?” These are not the right questions. Instead, we should ask, “What will my people do differently after my program is in place?”
Prioritize behaviors by their impact on the security of your organization, customers, and data. Ideally this will distill down into two to three measurable actions that boards and executives can encourage employees to take in the short-term to be security minded. Most mature security culture programs have the following three capabilities to help develop these behaviors: measure, motivate, and educate.
1. Measure
It is critical to have measures in place to show progress against culture change. When an organization can measure its key desired behaviors, it can start answering critical questions such as:
– Are my campaigns effective at changing this behavior?
– What groups are performing better? Why?
– Has the company already met its goals? Can I focus on the next behavior?
Measuring culture is notoriously tricky because of its qualitative nature, but it can be done using measures such as the number of malware infections, incident reports, or even surveys that test for the knowledge of, and adherence to, policy and process. Surveys should also test for employees’ perception of the burden of security practices, as well as a self-assessment of individual security behavior.
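As a sketch of the per-group comparison described above, the reporting rate from a phishing-simulation exercise might be computed per department like this (the field names, groups, and function name are hypothetical, not from any particular tool):

```python
from collections import defaultdict

def reporting_rate_by_group(events):
    """Fraction of simulated phishing emails that were reported, per group.

    Each event is a dict with a "group" (e.g., department) and a
    "reported" flag indicating whether the employee reported the email.
    """
    totals = defaultdict(lambda: [0, 0])  # group -> [emails sent, emails reported]
    for event in events:
        totals[event["group"]][0] += 1
        if event["reported"]:
            totals[event["group"]][1] += 1
    return {group: reported / sent for group, (sent, reported) in totals.items()}
```

Tracking a measure like this over successive campaigns shows whether awareness efforts are changing the target behavior, and which groups may need tailored follow-up.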
2. Motivate
Effective behavior change requires motivation. Spending time explaining the purpose behind each security measure goes a long way toward getting employees on board. For example, sharing case studies of successful attacks and lessons learned demonstrates to employees that the threat is real and applicable to their work. Other great ways of motivating follow-through on security behaviors include public recognition of outstanding behavior, gamification, and rewards for success.
3. Educate
Employees cannot change their behavior if they are not fully trained to do so. Ensure employees have the knowledge and tools to complete security tasks. Ideally, the information presented should be tailored by role and ability level to make it as relevant and interesting to the employee as possible. One key focus should be educating senior executives on the trade-offs between risk and growth in a company. Consider providing scenarios based on real cyberattacks that explore the long-term impact of risky business decisions. Add these discussion opportunities to existing leadership courses to help model a security mindset as a valued leadership trait.
Senior level engagement
While the above framework can help boards and executives drive security behavior change from the bottom up, leadership also has an important role in setting the security culture. Executives can publicly demonstrate the value of security as employees themselves, reinforcing the importance they place on a proper security culture for the organization and the customers it serves. Executives should hold their businesses accountable for executing on key security behaviors and publicly call out examples that have affected the security of the organization, either positively or negatively. Finally, boards should press executives to ensure that their people-centric security programs focus on the highest areas of risk, not just what is easy to measure.
Masha Sedova is the co-founder of Elevate Security, a company delivering interactive and adaptive security training based on behavioral science. Before Elevate, Masha was a security executive at Salesforce.com, where she built and led the security engagement team focused on improving the security mindset of employees, partners, and customers.
China’s legislature approved its Cybersecurity Law this past November, solidifying China’s regulatory regime for cyberspace and potentially disrupting foreign companies that use or provide telecommunications networks in China. The law takes effect June 1, 2017, and reflects China’s desire for “cyber-sovereignty” (regulating the Internet in China according to national laws, despite the global nature of the World Wide Web). As the Chinese Communist Party (CCP) faces pressure from slowing economic growth and foreign influence, the Cybersecurity Law is one in a series of laws the Chinese government has implemented recently to uphold state security.
Significant Provisions of the Law
Though the wording of the law is vague, it formalizes many current practices and aims to consolidate cybersecurity authority under the Cyberspace Administration of China. While the government is expected to offer more clarification through implementing rules, how the law plays out in practice will be the ultimate indicator of its severity. These three aspects of the law have the greatest potential to affect multinational companies (MNCs) doing business in China, according to an NACD analysis:
1. Data localization: Article 37 of the law is one of the most contentious provisions and requires that “critical information infrastructure” (CII) operators store, in mainland China, the personal information and other important data they gather or generate there. CII operators must obtain government approval to transfer this data outside the mainland when doing so is “truly necessary.” The definition of CII is a catch-all, including public communication and information services, power, traffic, water, finance, public services, electronic governance, and any other infrastructure that would impact national security if its data were compromised.
Impact: The broad applicability of the CII definition raises the concern that any company using a telecommunications network to operate or provide services in China would be required to store data in mainland China, possibly even affecting those that store data to clouds with servers located outside mainland China.
2. Support for Chinese security authorities: Article 28 requires “network operators” to provide technical support to security authorities for the purposes of upholding national security and conducting criminal investigations. Network operators are broadly defined as those that own or administer computer information networks or are network service providers, which may include anyone operating a business over the Internet or networks.
Impact: The loose definition of “technical support” creates the concern that MNCs will be required to grant Chinese authorities access to confidential information, compromising private information and intellectual property that may be shared with state-owned competitors. Although not stated in the final version of the law, there is also the possibility that companies may be required to provide decryption assistance and backdoor access to authorities upon request.
3. Certified network equipment and products: For network operators, Article 23 indicates that “critical network equipment” and “specialized network security products” must meet national standards and pass inspection before they can be sold or supplied in China. A catalogue providing more specification on these types of products will be released by the government administrations handling cybersecurity. Under Article 35, CII operators are also required to undergo a “national security review” when purchasing network equipment or services that may affect national security.
Impact: Chinese companies and government agencies have historically relied on computer hardware and software manufactured by foreign companies, although this is now shifting in favor of domestic IT products. Opportunities for hacking and espionage put China at risk of losing sensitive information to foreign governments or companies, and China has already started conducting reviews of the IT security products used by the central levels of government. This provision of the Cybersecurity Law demonstrates China’s resolve to mitigate this risk and may pose a significant barrier to foreign IT equipment manufacturers selling products in China.
How Directors Can Prepare
China’s Cybersecurity Law has been criticized by the foreign business community, and, depending on the law’s implementation, it may make doing business in China not only more complex but also riskier for MNCs. Tom Manning, a China specialist at the University of Chicago Law School and director of Dun & Bradstreet, CommScope, and Clear Media Limited, advises boards to consider the effect of the Cybersecurity Law in the greater context of China’s rise: “The Chinese economy is increasingly more self-sufficient. Domestic companies are growing stronger and are more capable, while multinational companies are finding it more difficult to compete.”
Manning suggests boards conduct an overall China risk assessment, with the Cybersecurity Law as the focal point. While some companies may determine the risk of doing business in China is too high, Manning says, others might decide they need to invest more in China to be profitable. Ultimately, creating alliances with domestic firms, which have a greater influence over the government’s implementation of the law, may be key. “Leading domestic companies have a stake in seeing a better definition of the law, and their interests aren’t unaligned with multinational companies,” Manning says. “Chinese Internet companies can explain to the government how the law will affect their business models and be more effective in doing so than Western companies.”
Although how the law will be enforced remains to be seen, boards can consider the following questions when evaluating the impact of China’s Cybersecurity Law:
Are we storing information generated or gathered in mainland China on servers in mainland China? Do we need to create separate IT systems for China-specific data? Are we reliant on cross-border data transfers, and how would we approach this need with the Chinese government?
What is our risk exposure stemming from the potential loss of intellectual property or encryption information as a result of this law? How would our business be affected should our Chinese competitors gain access to this information?
For computer hardware or software manufacturers, are we willing to share our source code with the Chinese government?
For technology firms, how does the law alter the playing field for our company to compete in China against domestic firms?
What additional investments do we need to make in order to comply with this law?