Category: Technology

Building a Cybersecurity Talent Pipeline

Published by

Katie Swafford

While prominent companies and healthcare institutions around the world were reacting to a ransomware attack known as WanaCryptor 2.0, or WannaCry, a young man working for a cybersecurity firm in southwest England landed on a solution that cost just $10.69. He found the so-called “kill switch” in the malware’s code, which could be activated simply by purchasing an unregistered domain name. He promptly registered the domain, halting WannaCry’s spread. This cyberknight remains anonymous, but one notable fact about his background has emerged: he’s only 22 years old.
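For the technically curious, the mechanics were remarkably simple: before spreading, the malware tried to contact a hardcoded, unregistered web domain and stood down if the request succeeded. Here is a minimal sketch of that logic in Python (the domain below is a placeholder, not the actual kill-switch address):

    import urllib.request

    # Placeholder for the hardcoded, unregistered domain baked into the malware.
    KILL_SWITCH_URL = "http://kill-switch-placeholder.test"

    def kill_switch_tripped() -> bool:
        """Return True if the kill-switch domain responds, i.e., someone has registered it."""
        try:
            urllib.request.urlopen(KILL_SWITCH_URL, timeout=5)
            return True   # Domain is live: stand down.
        except Exception:
            return False  # Domain unreachable: the malware keeps spreading.

    if kill_switch_tripped():
        print("Kill switch tripped; propagation halts.")
    else:
        print("Kill switch dormant; the malware would keep spreading.")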

According to a 2015 study by the Center for Cyber Safety and Education, the average age of security practitioners is 45 years old. Many security professionals will leave the workforce within the next 20 years, but younger professionals are not seeking careers in cybersecurity at a pace sufficient to keep up with companies’ demands. Developing a workforce that will be prepared to meet companies’ increasingly complex cybersecurity needs means companies—and educators—will need to build a bigger, more inclusive talent pipeline for people interested in the practice.

Summer Fowler

When I spoke with cybersecurity expert Summer C. Fowler for the cover story of the May/June 2017 issue of NACD Directorship magazine, I asked about her work at Carnegie Mellon University to recruit diverse candidates to the programs she leads at the CERT Division of the Software Engineering Institute. One look at her Twitter profile shows that she’s a passionate supporter of the Cyburgh, PA Initiative, a program developed in partnership between Carnegie Mellon and the Pittsburgh Technology Council to advance the city’s status as a leader in cybersecurity technology. The initiative could not succeed without being inclusive.

“The issue of building a talent pipeline is such a challenge because of what we’re offering by way of schooling,” Fowler said of the role of university-level education in developing the cybersecurity talent pipeline. She then drew a parallel between the education and training of doctors in the 1970s and the challenges the cybersecurity sector faces in finding diverse candidates. “When you look back to the early 1970s, the medical field was exactly the same. Only about 11 percent of doctors were women. There also were not many minority doctors in this country. We’re investigating what changes in the medical community were made to bring in more women and underrepresented minorities, so that we can do the exact same thing with computer science and engineering fields.”

Fowler pointed out that the cybersecurity industry needs further delineation of roles to clarify the hierarchy of talent it requires. “When we talk about cybersecurity, we all think about a Ph.D. from Carnegie Mellon or from Stanford,” Fowler said. “We need to get better at differentiating the roles and what the training requirements are. When we get there, I think that striation of roles will naturally open a pipeline to more people who are interested in the field because it would no longer be seen as this daunting job that requires a Ph.D.”

Still another challenge exists: getting diverse talent interested in the topic to begin with. I shared with Fowler an anecdote from my own high school experience. My path diverged from that of a male friend who was interested in white-hat hacking, the technology industry’s term for the benevolent hacking of systems to detect vulnerabilities. While I was curious about the world of professionals who were defending against cyberattacks, I had no outlet for learning about programming at the time. No one at my public high school in inner-city Memphis was engaging young women in learning about computer science in 2004, while my friend had family who supported and encouraged his interest.

Fast forward nearly 13 years, and my friend is a practicing white-hat hacker for a Fortune 500 company. I, on the other hand, earned my bachelor’s degree in creative writing, and have since revived my interest in the topic, writing about it from a governance perspective. Could I have been working at the same company with the helpful nudges of invested educators, or with after-school programs for young women like Girls Who Code that are sponsored by interested corporations? Fowler seems to think the answer is “yes.”

She suggests that the solution now is not to bring girls and young women to technology, but to bring discussions of technology to them within contexts that interest them. “Instead of saying to girls, ‘You need to belong to the computer science club,’ talk to them about what computer science might mean to ballet, or to whatever program they’re involved in.” She suggested discussing breaches in the entertainment industry with young people interested in acting or movies, for instance, as a way to pique their interest in a field they might not have considered before.

Ultimately, one of the greatest challenges to building the cybersecurity pipeline will be developing aptitude tests, and then encouraging promising young people to pursue a career down that pipeline. “It’s also a matter of figuring out what the specific competencies are. We’ve done a really good job for lots of different types of jobs at being able to say, ‘Let’s perform an assessment to see what your skills are and what you’d like to be doing.’ That process enables us to say, ‘Wow, you would make a great attorney, or you would make a really good financial analyst.’ We don’t have that in the realm of cybersecurity.”

Building out more roles in cybersecurity and advocating for their inclusion in career aptitude tests would help young people—and perhaps even more women—get excited to join the ranks of cyberknights in the future.


Katie Swafford is assistant editor of NACD Directorship magazine and also serves as editor of NACD’s Board Leaders’ Blog.

Click here to learn more about NACD’s online cyber-risk oversight program for directors.

Questions to Ask After the WannaCry Attack

Published by

Major General (Ret.) Brett Williams

After last week’s devastating global ransomware attack, now known as WannaCry, directors will once again be questioning management teams to make sure the company is protected. The challenge is that most directors do not know what questions they should be asking.

If I were sitting on a board, this attack would prompt me to ask questions about the following three areas:

  • End of Life (EOL) software;
  • patching; and
  • disaster recovery.

EOL Software. EOL software is software that is no longer supported by the company that developed it in the first place, meaning that it is not updated or patched to protect against emerging threats. WannaCry took advantage of versions of the Microsoft Windows operating system that were beyond EOL and had well-known security vulnerabilities.

Typically, a company runs EOL software because it has a critical application that requires customized software that cannot run on a current operating system. This situation might force the company to maintain an EOL version of Windows, for example, to run the application. In the instance of WannaCry, Windows XP and Windows 8 in particular were targeted. Boards should be asking: What risks are we taking by allowing management to continue running EOL software? Are there other options? Could we contract for the development of a new solution? If not, what measures have we taken to mitigate the risks of relying on EOL software?

Other times, companies run EOL software because they do not want to pay for the new software or they expect an unacceptable level of operational friction during the transition from the old version to the new. Particularly in a large, complex environment, the cross-platform dependencies can be difficult to understand and predict. Again, it is a risk assessment. What is the risk of running the outdated software, particularly when it supports a critical business function? If the solution is perceived as unaffordable, how does the cost of a new solution compare to the cost of a breach? Directors should also ask where the company is running EOL software and why.
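A practical way to ground those questions is a standing inventory that flags systems past end of life. Here is a minimal sketch in Python, assuming a hand-maintained asset list and a vendor lifecycle table (the host names are hypothetical; the dates are Microsoft’s published end-of-support dates):

    from datetime import date

    # End-of-support dates; in practice, pull these from vendor lifecycle pages.
    EOL_DATES = {
        "Windows XP": date(2014, 4, 8),
        "Windows 8": date(2016, 1, 12),
        "Windows 7": date(2020, 1, 14),
    }

    # Hypothetical asset inventory: host name -> operating system.
    INVENTORY = {
        "hr-workstation-01": "Windows XP",
        "plant-controller-02": "Windows 7",
        "web-server-01": "Windows Server 2016",
    }

    def eol_hosts(inventory, today=None):
        """Yield (host, os) pairs whose operating system is past end of life."""
        today = today or date.today()
        for host, os_name in inventory.items():
            eol = EOL_DATES.get(os_name)
            if eol and eol < today:
                yield host, os_name

    for host, os_name in eol_hosts(INVENTORY):
        print(f"{host} runs {os_name}, which is past end of life")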

Patching. Software companies regularly release updates to their software called patches. Patches address performance issues, fix software bugs, add functionality, and eliminate security vulnerabilities. At any one time, even a mid-sized company could have a backlog of hundreds of patches that have not been applied. This backlog develops for a variety of reasons, but the central issue is that information technology staff worry that applying a patch may “break” some process or software integration and impact the business. This is a valid concern.

In the case of WannaCry, Microsoft issued a patch in March (security bulletin MS17-010) that eliminated the vulnerability that allowed the malware to spread. Two months later, hundreds of thousands of machines remained unpatched and were successfully compromised.

Directors should ask for a high-level description of the risk management framework applied to the patching process. Do we treat critical patches differently than lower-grade patches? Have we identified the software that supports critical business processes, and do we apply a different time standard for patching it? If a patch will close a critical security vulnerability but may also disrupt a strategic business function, are leaders at the appropriate level of the business planning to manage the disruption while also securing the enterprise? Have we invested in solutions that expedite the patching process so that we can patch as efficiently as possible?
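To make “treating critical patches differently” concrete, here is a minimal sketch in Python of triaging a patch backlog by severity and by whether the affected system supports a critical business process. The patch records, thresholds, and timelines are invented for illustration; a real framework would set its own:

    # Hypothetical backlog entries: (patch id, CVSS-style severity 0-10, critical system?)
    BACKLOG = [
        ("KB-1001", 9.8, True),
        ("KB-1002", 4.3, False),
        ("KB-1003", 7.5, True),
    ]

    def patch_priority(severity: float, critical_system: bool) -> str:
        """Map a patch to an illustrative response-time tier."""
        if severity >= 9.0 or (severity >= 7.0 and critical_system):
            return "apply within 48 hours"
        if severity >= 7.0:
            return "apply within two weeks"
        return "apply in the next maintenance window"

    # Surface the riskiest patches first.
    for patch_id, severity, critical in sorted(BACKLOG, key=lambda p: -p[1]):
        print(patch_id, "->", patch_priority(severity, critical))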

Disaster Recovery. It is considered a disaster when your company ceases to execute core business functions because of a cyberattack. In the case of WannaCry, many businesses, including essential medical facilities in the United Kingdom, could not function. WannaCry was a potent example of how a cyberattack, which is an abstract concept for many business leaders, can have a devastating impact in the physical world.

One aspect of disaster recovery is how quickly a company can recover data that has been encrypted or destroyed. Directors should have a strategic view of the data backup and recovery process. Have we identified the critical data that must be backed up? Have we determined the period of time the backup needs to cover and how quickly we need to be able to switch to the backup? Have we tested ourselves to prove that we could successfully pivot to the backup? What business impact is likely to occur?
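These questions have testable answers. As one illustration, a periodic restore drill can prove that recovered data matches the original bit for bit; the Python sketch below does so with checksums (the file paths are placeholders):

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Compute a SHA-256 checksum of a file, reading in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def restore_is_faithful(original: Path, restored: Path) -> bool:
        """A restore drill passes only if the restored file matches the original exactly."""
        return sha256_of(original) == sha256_of(restored)

    # Placeholder paths for a scheduled restore drill.
    if restore_is_faithful(Path("data/customers.db"), Path("restore-test/customers.db")):
        print("Restore drill passed.")
    else:
        print("Restore drill FAILED; investigate the backup pipeline.")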

The hospitals impacted by WannaCry present another angle of the disaster recovery scenario. For these hospitals, the disaster wasn’t limited to the loss of data. Most medical devices in use today interface with a computer for command and control of that device. During this attack, those command and control computers were rendered inoperative when the ransomware encrypted the software that allows the control computer to issue commands to the connected device. In many cases there is no way to revert to “manual” control. This scenario is particularly troubling given the potential to cause bodily harm.

It is easy to see a similar attack hitting a manufacturing plant, where a disabled control unit could bring an assembly line to a halt. And it is not hard to imagine a threat to life and limb in a scenario where we rely on computer control to maintain temperatures and pressures at safe levels in a nuclear power plant.

Directors should ask about the process to recover control of critical assets. Can we activate backup systems that were not connected to the network at the time of the attack? If we bring the backup system online, how do we know it will not be infected by the same malware? Have the appropriate departments practiced recovery scenarios? What was the level of business disruption? Does everyone in the company know his or her role in getting critical operations back up and running?

Directors provide oversight of the risk management process—they do not execute the process. Understanding how the company is managing risk around EOL software, patching, and disaster recovery sets the right tone at the top and ensures that the company is better prepared for the inevitable next round of attacks.


Major General (Retired) Brett Williams is a co-founder of IronNet Cybersecurity and the former Director of Operations at U.S. Cyber Command. He is an NACD Board Governance Fellow and a faculty member with NACD’s Board Advisory Services, where he conducts in-depth cyber-risk oversight seminars for member boards. Brett is also a noted keynote speaker on a variety of cyber-related topics.

Looking to strengthen your board’s cyber-risk oversight? Click here to review NACD’s Cyber-Risk Oversight Board Resource Center.

Why Are People Part of the Cybersecurity Equation?

Published by

Masha Sedova

The following blog post is one installment in a series related to board oversight of corporate culture. The National Association of Corporate Directors (NACD) announced in March that its 2017 Blue Ribbon Commission—a roster of distinguished corporate leaders and governance experts—would explore the role of the board in overseeing corporate culture. The commission will produce a report that will launch during NACD’s Global Board Leaders’ Summit Oct. 1–4.

As many as 95 percent of breaches of companies’ data have a human element associated with them. It is no wonder, then, that security teams call people “the weakest link” in securing an organization and choose to invest in other defenses instead. Yet despite companies’ deep investments in security technology over the years, security breaches continue to increase in frequency and cost.

The conventional approach misses a significant opportunity to use people as a defense against the ever-changing threat landscape. In fact, only 45 percent of respondents to the National Association of Corporate Directors’ 2016–2017 Public Company Governance Survey reported that their boards assessed security risks associated with employee negligence or misconduct. Organizations that have fostered intentional security cultures from the boardroom to the server room have managed to transform employees into their strongest asset in defending against attacks, gaining advantages in both protecting against and detecting cyber threats.

What is security culture?

From the boardroom to the server room, people could be your greatest security asset.

Culture-competent boards and management teams understand that culture is the set of behaviors employees exhibit without being told. In simpler terms, it’s “the way things are done around here.” There are many subcultures within an organization, and security culture is one that often looks quite different from the expectations set by policy. Security culture has the power to influence the outcome of everyday business decisions, leaving employees to judge for themselves how much weight security carries in a given decision. For instance, employees frequently encounter questions such as:

  • Is it OK to release insecure code, or should we test more and accept a delay?
  • Do I feel safe reporting that I may have incorrectly shared a critical password?
  • Do I prioritize a secure vendor over a less expensive one?

Each of these decisions, when made without security in mind, adds to the organization’s security debt. While it is unlikely that any one of these decisions on its own will lead to the downfall of the organization, each risky action increases the probability of being targeted and successfully compromised by cyberattackers. On the other hand, if the questions above are answered with a security mindset, over time an organization can expect to see more secure code, better data-handling processes, and an increased ability to detect cyberattacks, to name just a few examples. A positive, security-first culture makes it more difficult for an attacker to find and exploit vulnerabilities without detection, incentivizing attackers to choose a different target. Directors at companies across industries should carefully evaluate whether management has established a security-first culture as part of their broader cyber-risk oversight strategy.

It is worth noting that security-minded employees will not solve every security headache. A company’s talent is, however, an essential third leg of the business stool, alongside technology and processes. An organization that does not invest in training and empowering its employees to prioritize security is defending itself with only two-thirds of the options available to it.

How do you practice it?

The first step boards and executives can take to shape security culture is to identify the most critical behaviors for your employees. Historically, security culture programs were based on compliance and asked, “How many people completed a training?” or “How much time is an employee spending on education?” These are not the right questions. Instead, we should ask, “What will my people do differently after my program is in place?”

Prioritize behaviors by their impact on the security of your organization, customers, and data. Ideally this will distill into two or three measurable actions that boards and executives can encourage employees to take in the short term to be security minded. Most mature security culture programs have the following three capabilities to help develop these behaviors: measure, motivate, and educate.

1. Measure. It is critical to have measures in place to show progress against culture change. When an organization can measure its key desired behaviors, it can start answering critical questions such as:

–  Are my campaigns effective at changing this behavior?
–  What groups are performing better? Why?
–  Has the company already met its goals? Can I focus on the next behavior?

Measuring culture is notoriously tricky because of its qualitative nature, but it can be done using measures such as the number of malware infections, incident reports, or even surveys that test for the knowledge of, and adherence to, policy and process. Surveys should also test for employees’ perception of the burden of security practices, as well as a self-assessment of individual security behavior.
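As an illustration of how such measures might roll up for a dashboard, the Python sketch below blends a few hypothetical indicators into a single quarterly index. The fields and weights are invented for the example; a real program would calibrate its own:

    # Hypothetical quarterly indicators for a security-culture dashboard.
    QUARTERS = [
        {"quarter": "Q1", "malware_infections": 42, "incidents_reported": 15, "survey_score": 3.1},
        {"quarter": "Q2", "malware_infections": 30, "incidents_reported": 28, "survey_score": 3.6},
    ]

    def culture_index(q: dict) -> float:
        """Blend indicators into one score (weights are illustrative).

        Fewer infections is good; more self-reporting is good, since it signals
        that employees feel safe speaking up; higher survey scores (1-5) are good.
        """
        return (
            -0.5 * q["malware_infections"]
            + 1.0 * q["incidents_reported"]
            + 10.0 * q["survey_score"]
        )

    for q in QUARTERS:
        print(q["quarter"], "culture index:", round(culture_index(q), 1))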

2. Motivate. Effective behavior change requires motivation. Spending time explaining the purpose behind each security measure goes a long way toward getting employees on board. As an example, sharing case studies of successful attacks and the lessons learned helps demonstrate to employees that the threat is real and applicable to their work. Other great ways of motivating follow-through on security behaviors include public recognition of outstanding behavior, gamification, and rewards for success.

3. Educate. Employees cannot change their behavior if they are not trained to do so. Ensure employees have the knowledge and tools to complete security tasks. Ideally, the information presented should be tailored by role and ability level to make it as relevant and interesting to each employee as possible. One key focus should be educating senior executives on the trade-offs between risk and growth in a company. Consider providing scenarios based on real cyberattacks that explore the long-term impact of risky business decisions. Add these discussion opportunities to existing leadership courses to help model a security mindset as a valued leadership trait.

Senior-level engagement

While the above is a framework that boards and executives can use to drive security behavior change from the bottom up, leadership also has an important role in setting the security culture. Executives can publicly model good security behavior themselves, reinforcing the importance of security culture to the organization and to the customers it serves. Executives should hold their businesses accountable for executing on key security behaviors and publicly call out examples that have affected the security of the organization, either positively or negatively. Finally, boards should press executives to ensure that their people-centric security programs focus on the highest areas of risk, not just on what is easy to measure.

Masha Sedova is the co-founder of Elevate Security, a company delivering interactive and adaptive security training based on behavioral science. Before Elevate, Masha was a security executive at Salesforce.com, where she built and led the security engagement team focused on improving the security mindset of employees, partners, and customers.