My introduction to cybercrime came seven years ago as a bolt from the blue. I Googled myself and found that four of the top five search results showed I was on the Federal Bureau of Investigation’s (FBI) Top Ten Most Wanted List.
After checking outside my front door to make sure no FBI agents were lining up to arrest me, I researched what had happened. I was the victim of an Internet stalker: a former business associate looking to mar the reputations of people with whom this person had had no contact for nearly two decades.
This experience taught me firsthand the harm that can be done through the Internet and the unique nature of the risks involved, and it sparked my commitment to practicing sound cyber-risk oversight.
Cybersecurity as a Risk
Cyber risk has characteristics that few of the more than 60 different risks reported in public companies' 10-K filings share. Most other risks and the damage they cause, although highly detrimental to a company, can be assessed and quantified (consider, for example, the cost of rebuilding after a fire). Cyber risk is different because the victim of a cyberattack may never be able to find out who carried out the attack, where it came from, what was taken, or how long it had been under way.
The most striking feature of cyberattacks is their anonymity. It is very difficult to trace an attacker who wants to stay anonymous. An attacker can create dummy corporations, hijack e-mail accounts, and use multiple servers to become virtually untraceable. Another method hackers use to hide themselves is the virtual private network, which makes it very challenging to track where an attack originated. Say the intrusion appears to have come through a server in Singapore; the attacker could actually be in Estonia. Even if you can trace the perpetrator, getting redress would mean international litigation.
What are they taking? Unless the attacker is confronting you with a ransom demand for your data, you may not know what is being taken or corrupted without extensive and time-consuming forensics.
Lastly, how long has this been going on? For the same reasons that it is difficult to identify what is being stolen, the time at which an attack began is hard to pin down. Malicious software often known as a "logic bomb" can lie dormant for long periods, sometimes years, before it is activated. The classic example is the disgruntled employee who leaves behind malware that activates itself on the anniversary of his firing.
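The logic bomb idea is simple enough to sketch in a few lines. The snippet below is a hypothetical, harmless illustration of the trigger pattern, not real malware; the trigger date is invented for the example. It shows why dormant code is so hard to date: until the condition is met, nothing observable happens.

```python
# A minimal, benign sketch of a date-triggered "logic bomb" check.
# The trigger date is hypothetical; a real logic bomb would attach a
# malicious payload to this condition.
import datetime

TRIGGER = datetime.date(2018, 6, 1)  # e.g., the anniversary of a firing


def should_fire(today):
    """Return True only on the month/day anniversary of TRIGGER.

    Every other day of the year the code is inert, which is why
    forensics often cannot say how long it has been sitting there.
    """
    return today.month == TRIGGER.month and today.day == TRIGGER.day


print(should_fire(datetime.date(2019, 6, 1)))   # True: anniversary reached
print(should_fire(datetime.date(2019, 9, 15)))  # False: still dormant
```

The check costs essentially nothing to run, so the planted code looks indistinguishable from benign software until the anniversary arrives.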
You Are Not Invulnerable
One of the worst mistakes a board can make is to assume that the company faces a lower level of cyber risk because it is not a bank or does not store credit card information. If the company transfers money and is connected to the Internet, which describes just about every company in the United States and many around the world, it is at high risk of being attacked. Banks and retailers are at extremely high risk. Low risk simply does not exist on the cyber-risk spectrum.
For most companies, the principal vulnerability is economic. Simply put, attackers are trying to make money. Besides stealing information such as employee health care data or Social Security numbers that can be sold on the black market, an increasingly popular form of attack is to lock the company out of its data, or encrypt it, and charge a ransom to release or decrypt it.
Brand and reputation attacks are another vulnerability; they are carried out to discredit a company for competitive or political motives. To take an obvious example, imagine the damage to a cybersecurity company's reputation if its own firewalls were breached. Such an attack would strike at the core promise a cybersecurity company makes to its customers: to secure their enterprises.
Hacktivism, as the name connotes, is an attack launched based on the attacker’s beliefs and ideologies. For instance, a company that tests its products on animals could find itself as a hacktivism target. Typically, the attacker will post messages about the cause on the company’s website or contact its customers and suppliers.
Lastly, malicious attacks can be launched simply to inconvenience and disrupt the company, as in the logic bomb attack described above. There is usually no economic motive; vengeance is the principal one.
Since her “arrival” on the FBI’s Top Ten Most Wanted list, Wendy Luscombe has led a real estate investment trust as CEO, served as a director on European and American boards, and studied cybersecurity and cyber-reputation management. All views and opinions expressed here are the author’s own.
While prominent companies and healthcare institutions around the world were reacting to a ransomware attack known as WanaCryptor 2.0, or WannaCry, a young man working for a cybersecurity firm in southeast England landed on a solution that cost just $10.69. He found the so-called "kill switch" in the malware's code, which involved the simple purchase of an unregistered domain name. He promptly registered the domain, halting WannaCry's spread. This cyberknight remains anonymous, but one notable fact about his background has emerged: he is only 22 years old.
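The mechanism behind that $10.69 fix can be sketched briefly. WannaCry checked whether a hardcoded, unregistered domain resolved, and stopped spreading if it did; registering the domain flipped that check worldwide. The sketch below is a simplified illustration under that description; the domain name and address are placeholders, not the actual WannaCry domain.

```python
# Simplified sketch of a DNS-based kill switch. `resolve` is any
# callable that maps a hostname to an address and raises OSError when
# the name does not exist (socket.gethostbyname behaves this way, since
# socket.gaierror subclasses OSError).

def kill_switch_active(domain, resolve):
    """Return True if the kill-switch domain resolves."""
    try:
        resolve(domain)
        return True   # domain registered -> malware halts its spread
    except OSError:
        return False  # domain unregistered -> malware keeps spreading


# Before the researcher registered the domain, lookups failed:
def unregistered(name):
    raise OSError("name does not resolve")


# After registration, lookups succeed everywhere at once:
def registered(name):
    return "203.0.113.7"  # placeholder address from the documentation range


print(kill_switch_active("example-killswitch.invalid", unregistered))  # False
print(kill_switch_active("example-killswitch.invalid", registered))    # True
```

Because every copy of the malware performed the same lookup, a single domain registration acted as a global off switch, which is why one purchase halted the outbreak.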
According to a 2015 study by the Center for Cyber Safety and Education, the average age of security practitioners is 45 years old. Many security professionals will leave the workforce within the next 20 years, but younger professionals are not seeking careers in cybersecurity at a pace sufficient to keep up with companies’ demands. Developing a workforce that will be prepared to meet companies’ increasingly complex cybersecurity needs means companies—and educators—will need to build a bigger, more inclusive talent pipeline for people interested in the practice.
When I spoke with cybersecurity expert Summer C. Fowler for the cover story of the May/June 2017 issue of NACD Directorship magazine, I asked about her work at Carnegie Mellon University to recruit diverse candidates to the programs she leads at the CERT Division of the Software Engineering Institute. One look at her Twitter profile illustrates that she’s a passionate supporter of the Cyburgh, PA Initiative, a program developed in partnership between Carnegie Mellon and the Pittsburgh Technology Council to advance the city’s status as a leader in cybersecurity technology. The initiative could not be successful without being inclusive.
“The issue of building a talent pipeline is such a challenge because of what we’re offering by way of schooling,” Fowler said about the role of university-level education in developing the cybersecurity talent pipeline. She then drew a parallel between the education and training of doctors in the 1970s to the challenges the cybersecurity sector has with finding diverse candidates. “When you look back to the early 1970s, the medical field was exactly the same. Only about 11 percent of doctors were women. There also were not many minority doctors in this country. We’re investigating what changes in the medical community were made to bring in more women and underrepresented minorities, so that we can do the exact same thing with computer science and engineering fields.”
Fowler pointed out that there needs to be further delineation of roles in the cybersecurity industry to clarify the hierarchy of talent desired. "When we talk about cybersecurity, we all think about a Ph.D. from Carnegie Mellon or from Stanford," Fowler said. "We need to get better at differentiating the roles and what the training requirements are. When we get there, I think that striation of roles will naturally open a pipeline to more people who are interested in the field because it would no longer be seen as this daunting job that requires a Ph.D."
Still another challenge exists: getting diverse talent interested in the topic to begin with. I shared with Fowler an anecdote from my own high school experience. My path diverged from that of a male friend who was interested in white-hat hacking, which is the technology industry term for the benevolent hacking of systems to detect vulnerabilities. While I was curious about the world of professionals who were defending against cyberattacks, I had no outlet for learning about programming at the time. No one at my public high school in inner-city Memphis was engaging young women in learning about computer science in 2004, and my friend had family who supported and encouraged his interest.
Fast forward nearly 13 years, and my friend is a practicing white-hat hacker for a Fortune 500 company. I, on the other hand, earned my bachelor's degree in creative writing, and have since revived my interest in the topic; I now write about it from a governance perspective. Could I have been working at the same company with the helpful nudges of invested educators, or with after-school programs for young women, like Girls Who Code, that are sponsored by interested corporations? Fowler seems to think the answer is "yes."
She suggests that the solution now will not be to bring girls and young women to technology, but to bring discussions of technology to them within contexts that interest them. “Instead of saying to girls, ‘You need to belong to the computer science club,’ talk to them about what computer science might mean to ballet, or to whatever program they’re involved in.” She suggested discussing breaches to the entertainment industry with young people interested in acting or movies, for instance, as a way to pique their interest in a field they might not have considered before.
Ultimately, one of the greatest challenges to building the cybersecurity pipeline will involve developing aptitude tests, then encouraging promising young people to pursue a career down that pipeline. "It's also a matter of figuring out what the specific competencies are. We've done a really good job for lots of different types of jobs at being able to say, 'Let's perform an assessment to see what your skills are and what you'd like to be doing.' That process enables us to say, 'Wow, you would make a great attorney, or you would make a really good financial analyst.' We don't have that in the realm of cybersecurity."
Building out more roles in cybersecurity and advocating for the inclusion of those roles in career aptitude tests would help more young people, and perhaps more women, get excited to join the ranks of cyberknights in the future.
Katie Swafford is assistant editor of NACD Directorship magazine and also serves as editor of NACD’s Board Leaders’ Blog.
The following blog post is one installment in a series related to board oversight of corporate culture. The National Association of Corporate Directors (NACD) announced in March that its 2017 Blue Ribbon Commission—a roster of distinguished corporate leaders and governance experts—would explore the role of the board in overseeing corporate culture. The commission will produce a report that will launch during NACD’s Global Board Leaders’ Summit Oct. 1–4.
As many as 95 percent of data breaches at companies have a human element associated with them. It is no wonder, then, that security teams call people "the weakest link" in securing an organization and direct their defense investments elsewhere. Yet despite companies' deep investments in security technology over the years, breaches continue to increase in frequency and cost.
The conventional approach misses a significant opportunity to make people part of the defense strategy against an ever-changing threat landscape. In fact, only 45 percent of respondents to the National Association of Corporate Directors' 2016–2017 Public Company Governance Survey reported that their boards assessed security risks associated with employee negligence or misconduct. Organizations that have fostered intentional security cultures from the boardroom to the server room have managed to transform employees into their strongest asset in defending against attacks, gaining advantages in both protecting against and detecting cyber threats.
What is security culture?
From the boardroom to the server room, people could be your greatest security asset.
Culture-competent boards and management teams understand that culture is the set of behaviors employees engage in without being told. In simpler terms, it's "the way things are done around here." There are many subcultures within an organization, and security culture is one that often looks quite different from the expectations set by policy. Security culture has the power to influence the outcome of everyday business decisions, leaving employees to judge for themselves how important security is to a decision. For instance, employees frequently encounter questions such as:
Is it OK to release insecure code, or should we test more, resulting in a delay?
Do I feel safe to report that I may have incorrectly shared a critical password?
Do I prioritize a secure vendor over a less expensive one?
Each of these decisions, when made without security in mind, adds to the organization's security debt. While none of these decisions on its own is likely to lead to the downfall of the organization, each risky action increases the probability of being targeted and successfully compromised by cyberattackers. On the other hand, if the questions above are answered with a secure mindset, over time an organization can expect to see more secure code, better data-handling processes, and an increased ability to detect cyberattacks, to name a few examples. A positive, security-first culture makes it more difficult for an attacker to find and exploit vulnerabilities without detection, incentivizing a different choice of target. Directors at companies across industries should carefully evaluate whether management has established a security-first culture as part of their greater cyber-risk oversight strategy.
It is worth realizing that security-minded employees will not solve all security headaches. However, a company’s talent is an essential third leg of the business stool, partnered with technology and processes. An organization that does not invest in training and empowering its employees to prioritize security is only defending itself with two-thirds of the options available to it.
How do you practice it?
The first step boards and executives can take to shape security culture is to identify the most critical behaviors for employees. Historically, security culture programs were based on compliance and asked, "How many people completed a training?" or "How much time is an employee spending on education?" These are not the right questions. Instead, we should ask, "What will my people do differently after my program is in place?"
Prioritize behaviors by their impact on the security of your organization, customers, and data. Ideally this will distill into two to three measurable actions that boards and executives can encourage employees to take in the short term to be security minded. Most mature security culture programs have the following three capabilities to help develop these behaviors: measure, motivate, and educate.
1. Measure. It is critical to have measures in place to show progress against culture change. When an organization can measure its key desired behaviors, it can start answering critical questions such as:
– Are my campaigns effective at changing this behavior?
– Which groups are performing better? Why?
– Has the company already met its goals? Can I focus on the next behavior?
Measuring culture is notoriously tricky because of its qualitative nature, but it can be done using measures such as the number of malware infections, incident reports, or even surveys that test for the knowledge of, and adherence to, policy and process. Surveys should also test for employees’ perception of the burden of security practices, as well as a self-assessment of individual security behavior.
2. Motivate. Effective behavior change requires motivation. Spending time explaining the purpose behind each security measure goes a long way toward getting employees on board. For example, sharing case studies of successful attacks and lessons learned helps demonstrate to employees that the threat is real and relevant to their work. Other good ways of motivating follow-through on security behaviors include public recognition of outstanding behavior, gamification, and rewards for success.
3. Educate. Employees cannot change their behavior if they are not trained to do so. Ensure employees have the knowledge and tools to complete security tasks. Ideally, the information presented should be tailored by role and ability level to make it as relevant and interesting to the employee as possible. One key focus should be educating senior executives on the trade-offs between risk and growth in a company. Consider providing scenarios based on real cyberattacks that explore the long-term impact of risky business decisions. Adding these discussion opportunities to existing leadership courses helps model a security mindset as a valued leadership trait.
Senior level engagement
While the above is a framework that boards and executives can use to drive security behavior change from the bottom up, leadership also has an important role in setting the security culture. Executives can publicly share, as employees themselves, the value they place on security, reinforcing its importance to the organization and the customers it serves. Executives should hold their businesses accountable for executing on key security behaviors and publicly call out examples that have affected the security of the organization, positively or negatively. Finally, boards should press executives to ensure that their people-centric security programs focus on the highest areas of risk, not just what is easy to measure.
Masha Sedova is the co-founder of Elevate Security, a company delivering interactive and adaptive security training based on behavioral science. Before Elevate, Masha was a security executive at Salesforce.com, where she built and led the security engagement team focused on improving the security mindset of employees, partners, and customers.