Culture and Compliance: Board Lessons From Volkswagen

This blog post is one installment in a series related to board oversight of corporate culture. The National Association of Corporate Directors announced in March that its 2017 Blue Ribbon Commission—a roster of distinguished corporate leaders and governance experts—would explore the role of the board in overseeing corporate culture. The commission will produce a report that will be released at NACD’s Global Board Leaders’ Summit, Oct. 1–4.

A panel discussed how the iconic company became embroiled in scandal.

Wells Fargo & Co., Volkswagen AG (VW), Mylan NV, and Valeant Pharmaceuticals International are just a few of the companies that have recently experienced high-profile corporate crises stemming from ethics and compliance breakdowns. As corporate directors look to learn from these scandals, the John L. Weinberg Center for Corporate Governance, the Association of Corporate Counsel, and Bloomberg Law® this April co-hosted the event Volkswagen Emissions Scandal—Lessons for Investors, Boards, Chief Legal Officers and Compliance & Governance Professionals.* The panel discussed the VW emissions scandal and lessons for boards of directors and general counsel (GCs) on instituting a corporate culture that promotes ethics and compliance.

Corporate Governance Causes of the VW Scandal

Charles M. Elson, director of the University of Delaware’s John L. Weinberg Center for Corporate Governance, notes in an article that three main governance practices at VW created a perfect environment for noncompliant behavior stemming from a lack of independent shareholder representation on the board:

  1. A complicated web of interests with dual-class stock, pyramidal ownership, and family control. The Porsche and Piëch families own just over 50 percent of VW’s voting rights through their preferred class stock in Porsche Automobil Holding SE, which in turn owns shares of VW (known as pyramidal ownership). Ferdinand Piëch, the grandson of Porsche company founder Ferdinand Porsche, was chair of VW’s supervisory board at the time of the scandal and served as CEO from 1993 to 2002. Piëch’s primary goal is said to have been to create the largest automaker in the world, with less regard for profit and shareholder value. This directive from the company leader, in an environment where shareholders outside of the family had little influence over the board, created a corporate culture in which employees chose noncompliant behavior over admitting failure when designing the “defeat devices” used to cheat U.S. emissions tests.
  2. The government as a major shareholder. VW was a state-owned enterprise until it was privatized in 1960, leaving the German state of Lower Saxony with a 20 percent stake in the company. Elson opines that the interest of government officials is to be re-elected, which is often achieved through high employment rates. Therefore, government representatives on the board of VW were driven to create jobs at VW, the largest employer in Lower Saxony, even when adding those jobs was detrimental to profits.
  3. Labor representation on the board (codetermination). German law requires all companies with more than 2,000 employees to fill half of the board with employee representatives. Elson argues that the board’s ability to provide effective compliance oversight was diluted by labor representatives on the board who were essentially monitoring themselves, and hence more focused on obtaining higher compensation and decent working hours for employees.

In light of these conditions at VW, panelists shared a number of leading practices for GCs and directors in creating a compliant corporate culture:

Lessons for GCs

  • “You can’t legislate ethics, but you can promote them,” said one panelist. Be the devil’s advocate and stress the importance of risk management and cultural tones at different levels of the organization, i.e., the so-called tone at the top, mood at the middle, and buzz at the bottom.
  • Ensure your board spends adequate time on compliance issues. Directors are often bogged down by compliance and want to spend more time on strategy, but prioritizing compliance at the board level will create a culture that allows strategy to be carried out successfully.
  • Get the right information to the board at the right time. According to one panelist, “The GC—as well as risk managers and in-house lawyers—needs to be tough enough to speak up and report to the board. At Lehman Brothers, the CEO was known as the ‘gorilla on Wall Street.’ He doubled down on real estate, which risk officers beneath him knew was risky, but their concerns were never made known to the board.”
  • Remember that your duty is to the company—not the CEO—even if you’re reporting to him or her. “If [you as] the GC [are] aware of a violation, you need to do the right thing and not be swayed,” said one speaker.

Lessons for Directors

  • Increase your exposure to more employees, including mid-level employees, to get a better sense of the corporation’s culture in practice below the C-suite.
  • Create straight reporting lines from the compliance officer, chief risk officer, and internal auditor to committee chairs. This empowers these officers to speak openly with board members about their concerns without management present. (See NACD’s brief on Audit Committee Oversight of Compliance, which is open to the public for download.)
  • Incentivize compliance through compensation metrics. See NACD’s briefs on Incentives and Risk-Taking and Board-Management Dialogue on Risk Appetite for guidance on designing incentive programs that promote high performance while limiting unhealthy risk-taking.
  • If your company has a multiclass stock structure in place, reevaluate it in light of investor perspectives. Research from the Investor Responsibility Research Center Institute shows that “controlled companies generally underperform on metrics that affect unaffiliated shareholders,” while the “Commonsense Corporate Governance Principles,” released by major institutional investors and others, says that “dual class voting is not best practice.”

 

* The distinguished panel of speakers included: Robert E. Bostrom, senior vice president, general counsel, and corporate secretary at Abercrombie & Fitch Co.; Charles M. Elson, Edgar J. Woolard, Jr. chair in corporate governance, director of the John L. Weinberg Center for Corporate Governance, and professor of finance at the University of Delaware; Meredith Miller, chief corporate governance officer at UAW Retiree Medical Benefits Trust; Gloria Santona, retired executive vice president, general counsel, and secretary at McDonald’s Corp.; Professor Christian Strenger, academic director, Center for Corporate Governance at the HHL Leipzig Graduate School of Management; Anton R. Valukas, chairman at Jenner & Block LLP; and The Honorable James T. Vaughn, Jr., justice of the Delaware Supreme Court. Italicized comments above are from panelists who participated in this event. However, this discussion was conducted under the Chatham House Rule, so quotes are not attributed to individuals or organizations.

Building a Cybersecurity Talent Pipeline

While prominent companies and healthcare institutions around the world were reacting to a ransomware attack known as WanaCryptor 2.0, or WannaCry, a young man working for a cybersecurity firm in southeast England landed on a solution that cost just $10.69: he found a so-called “kill switch” in the malware’s code, triggered by the simple purchase of an unregistered domain name. He promptly registered the domain, halting WannaCry’s spread. The identity of this cyberknight remains unknown, but one notable fact about his background has emerged: he’s only 22 years old.
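The mechanism behind that $10.69 fix is simple enough to sketch in a few lines. Before spreading, WannaCry attempted to contact a long, unregistered domain; as long as the lookup failed, the worm kept going, and once the domain was registered the lookup succeeded and the malware stood down. The sketch below illustrates that logic only; the domain name is a hypothetical placeholder, not the real kill-switch domain.

```python
import socket

def kill_switch_active(domain: str, resolve=socket.gethostbyname) -> bool:
    """Return True if the kill-switch domain resolves.

    The resolver is injectable so the check can be tested without
    touching the network.
    """
    try:
        resolve(domain)
        return True   # domain is registered -> the worm halts itself
    except OSError:
        return False  # domain is dark -> the worm would keep spreading

# Hypothetical placeholder, for illustration only.
KILL_SWITCH_DOMAIN = "example-kill-switch-domain.test"

def should_propagate() -> bool:
    # Propagation continues only while the kill-switch domain is unregistered.
    return not kill_switch_active(KILL_SWITCH_DOMAIN)
```

Registering the domain flipped `kill_switch_active` to true worldwide at once, which is why a single purchase halted the outbreak.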

According to a 2015 study by the Center for Cyber Safety and Education, the average age of security practitioners is 45 years old. Many security professionals will leave the workforce within the next 20 years, but younger professionals are not seeking careers in cybersecurity at a pace sufficient to keep up with companies’ demands. Developing a workforce that will be prepared to meet companies’ increasingly complex cybersecurity needs means companies—and educators—will need to build a bigger, more inclusive talent pipeline for people interested in the practice.

Summer Fowler

When I spoke with cybersecurity expert Summer C. Fowler for the cover story of the May/June 2017 issue of NACD Directorship magazine, I asked about her work at Carnegie Mellon University to recruit diverse candidates to the programs she leads at the CERT Division of the Software Engineering Institute. One look at her Twitter profile illustrates that she’s a passionate supporter of the Cyburgh, PA Initiative, a program developed in partnership between Carnegie Mellon and the Pittsburgh Technology Council to advance the city’s status as a leader in cybersecurity technology. The initiative could not be successful without being inclusive.

“The issue of building a talent pipeline is such a challenge because of what we’re offering by way of schooling,” Fowler said about the role of university-level education in developing the cybersecurity talent pipeline. She then drew a parallel between the education and training of doctors in the 1970s and the challenges the cybersecurity sector has with finding diverse candidates. “When you look back to the early 1970s, the medical field was exactly the same. Only about 11 percent of doctors were women. There also were not many minority doctors in this country. We’re investigating what changes in the medical community were made to bring in more women and underrepresented minorities, so that we can do the exact same thing with computer science and engineering fields.”

Fowler pointed out that there needs to be further delineation of roles in the cybersecurity industry to clarify the hierarchy of talent desired. “When we talk about cybersecurity, we all think about a Ph.D. from Carnegie Mellon or from Stanford,” Fowler said. “We need to get better at differentiating the roles and what the training requirements are. When we get there, I think that striation of roles will naturally open a pipeline to more people who are interested in the field because it would no longer be seen as this daunting job that requires a Ph.D.”

Still another challenge exists: getting diverse talent interested in the topic to begin with. I shared with Fowler an anecdote from my own high school experience. My path diverged from that of a male friend who was interested in white-hat hacking, which is the technology industry term for the benevolent hacking of systems to detect vulnerabilities. While I was curious about the world of professionals who were defending against cyberattacks, I had no outlet for learning about programming at the time. No one at my public high school in inner-city Memphis was engaging young women in learning about computer science in 2004, and my friend had family who supported and encouraged his interest.

Fast forward nearly 13 years, and my friend is a practicing white-hat hacker for a Fortune 500 company. I, on the other hand, earned my bachelor’s degree in creative writing, and have since revived my interest in the topic and now write about it from a governance perspective. Could I have been working at the same company with the helpful nudges of invested educators, or with after-school programs for young women like Girls Who Code that are sponsored by interested corporations? Fowler seems to think the answer is “yes.”

She suggests that the solution now will not be to bring girls and young women to technology, but to bring discussions of technology to them within contexts that interest them. “Instead of saying to girls, ‘You need to belong to the computer science club,’ talk to them about what computer science might mean to ballet, or to whatever program they’re involved in.” She suggested discussing breaches to the entertainment industry with young people interested in acting or movies, for instance, as a way to pique their interest in a field they might not have considered before.

Ultimately, one of the greatest challenges to building the cybersecurity pipeline will involve developing aptitude tests, then encouraging promising young people to pursue a career down that pipeline. “It’s also a matter of figuring out what the specific competencies are. We’ve done a really good job for lots of different types of jobs at being able to say, ‘Let’s perform an assessment to see what your skills are and what you’d like to be doing.’ That process enables us to say, ‘Wow, you would make a great attorney, or you would make a really good financial analyst.’ We don’t have that in the realm of cybersecurity.”

Building out more roles in cybersecurity and advocating for the inclusion of those roles in career aptitude tests would help young people—and perhaps even more women—to get excited to join the ranks of cyberknights in the future.


Katie Swafford is assistant editor of NACD Directorship magazine and also serves as editor of NACD’s Board Leaders’ Blog.

Click here to learn more about NACD’s online cyber-risk oversight program for directors.

Questions to Ask After the WannaCry Attack

Major General (Ret.) Brett Williams

After last week’s devastating global ransomware attack, now known as WannaCry, directors will once again be questioning management teams to make sure the company is protected. The challenge is that most directors do not know what questions they should be asking.

If I were sitting on a board, this attack would prompt me to ask questions about the following three areas:

  • End of Life (EOL) software;
  • patching; and
  • disaster recovery.

EOL Software. EOL software is software that is no longer supported by its developer, meaning it no longer receives updates or patches to protect against emerging threats. WannaCry took advantage of versions of the Microsoft Windows operating system that were beyond EOL and had well-known security vulnerabilities.

Typically, a company runs EOL software because it has a critical application that requires customized software that cannot run on a current operating system. This situation might force you to maintain an EOL version of Windows, for example, to run the software. In the instance of WannaCry, Windows XP and 8 in particular were targeted. Boards should be asking: What risks are we taking by allowing management to continue running EOL software? Are there other options? Could we contract for the development of a new solution? If not, what measures have we taken to mitigate the risks of relying on EOL software?

Other times companies run EOL software because they do not want to pay for the new software or they expect an unacceptable level of operational friction during the transition from the old version to the new. Particularly in a large, complex environment, the cross-platform dependencies can be difficult to understand and predict. Again, it is a risk assessment. What is the risk of running the outdated software, particularly when it supports a critical business function? If the solution is perceived as unaffordable, how does the cost of a new solution compare to the cost of a breach? Directors should also ask where we are running EOL software and why.

Patching. Software companies regularly release updates to their software called patches. The patches address performance issues, fix software bugs, add functionality, and eliminate security vulnerabilities. At any one time, even a mid-sized company could have a backlog of hundreds of patches that have not been applied. This backlog develops for a variety of reasons, but the most central issue is that information technology staff are concerned that applying the patch may “break” some process or software integration and impact the business. This is a valid concern.

In the case of WannaCry, Microsoft issued a patch in March that would eliminate the vulnerability that allowed the malware to spread. Two months later, hundreds of thousands of machines remained unpatched and were successfully compromised.

Directors should ask for a high-level description of the risk management framework applied to the patching process. Do we treat critical patches differently than we treat lower-grade patches? Have we identified the software that supports critical business processes, and do we apply a tighter time standard for patching it? If a patch will close a critical security vulnerability, but may also disrupt a strategic business function, are leaders at the appropriate level of the business planning to manage disruption while also securing the enterprise? Have we invested in solutions that expedite the patching process so that we can patch as efficiently as possible?
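The risk management framework behind those questions can be made concrete. One common approach is a severity-based service-level agreement: each patch gets a deadline based on how critical it is, tightened further when the patched software supports a critical business process. The sketch below is a minimal illustration of that idea; the `SLA_DAYS` values and field names are hypothetical, not a standard, and a real program would draw severity from vendor or CVSS data.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Patch:
    patch_id: str
    severity: str            # "critical", "high", "medium", or "low"
    released: date
    business_critical: bool  # does the patched software support a critical process?

# Hypothetical SLA: days allowed between a patch's release and its deployment.
SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}

def is_overdue(patch: Patch, today: date) -> bool:
    """A patch is overdue once it has sat unapplied past its SLA window."""
    sla = SLA_DAYS[patch.severity]
    if patch.business_critical:
        sla //= 2  # tighter window for software behind critical processes
    return today - patch.released > timedelta(days=sla)

def triage(backlog: list[Patch], today: date) -> list[Patch]:
    """Return the overdue patches, most severe first, for board-level reporting."""
    order = {"critical": 0, "high": 1, "medium": 2, "low": 3}
    overdue = [p for p in backlog if is_overdue(p, today)]
    return sorted(overdue, key=lambda p: order[p.severity])
```

Run against the WannaCry timeline, a policy like this makes the failure visible: the March 14 patch for a critical vulnerability would have shown as overdue within days, two months before the May 12 outbreak.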

Disaster Recovery. It is considered a disaster when your company ceases to execute core business functions because of a cyberattack. In the case of WannaCry, many businesses, including essential medical facilities in the United Kingdom, could not function. WannaCry was a potent example of how a cyberattack, which is an abstract concept for many business leaders, can have devastating impact in the physical world.

One aspect of disaster recovery is how quickly a company can recover data that has been encrypted or destroyed. Directors should have a strategic view of the data backup and recovery process. Have we identified the critical data that must be backed up? Have we determined the period of time the backup needs to cover and how quickly we need to be able to switch to the backup? Have we tested ourselves to prove that we could successfully pivot to the backup? What business impact is likely to occur?
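The two questions about backup coverage and switchover speed correspond to two standard disaster-recovery metrics: the recovery point objective (RPO, how much data, measured in time, the business can afford to lose) and the recovery time objective (RTO, how long restoring service may take). A minimal sketch of an RPO check follows; the data set names and objective values are hypothetical examples, set in practice by the business, not by IT.

```python
from datetime import datetime, timedelta

# Hypothetical recovery objectives per data set, agreed with the business:
#   rpo: maximum tolerable data loss (age of the newest usable backup)
#   rto: maximum tolerable time to restore service from that backup
OBJECTIVES = {
    "patient_records": {"rpo": timedelta(hours=1), "rto": timedelta(hours=4)},
    "marketing_site":  {"rpo": timedelta(days=1),  "rto": timedelta(days=2)},
}

def backup_meets_rpo(dataset: str, last_backup: datetime, now: datetime) -> bool:
    """True if restoring the newest backup would lose no more data
    than the agreed recovery point objective allows."""
    return now - last_backup <= OBJECTIVES[dataset]["rpo"]
```

A check like this only answers the first board question; the others, whether the switchover has actually been rehearsed and what business impact it causes, can be proven only by periodic restore tests, not by monitoring.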

The hospitals impacted by WannaCry present another angle of the disaster recovery scenario. For these hospitals, the disaster wasn’t limited to the loss of data. Most medical devices in use today interface with a computer for command and control of that device. During this attack, those command and control computers were rendered inoperative when the ransomware encrypted the software that allows the control computer to issue commands to the connected device. In many cases there is no way to revert to “manual” control. This scenario is particularly troubling given the potential to cause bodily harm.

It is easy to see a similar attack in a manufacturing plant, where a control unit could be disabled, bringing an assembly line to a halt. And it is not hard to imagine a threat to life and limb in a scenario where we rely on computer control to maintain temperatures and pressures at a safe level in a nuclear power plant.

Directors should ask about the process to recover control of critical assets. Can we activate backup systems that were not connected to the network at the time of the attack? If we bring the backup system online, how do we know it will not be infected by the same malware? Have the appropriate departments practiced recovery process scenarios? What was the level of business disruption? Does everyone in the company know his or her role in getting critical operations back up and running?

Directors provide oversight of the risk management process—they do not execute the process. Understanding how the company is managing risk around EOL software, patching, and disaster recovery sets the right tone at the top and ensures that the company is better prepared for the inevitable next round of attacks.


Major General (Retired) Brett Williams is a co-founder of IronNet Cybersecurity and the former Director of Operations at U.S. Cyber Command. He is an NACD Board Governance Fellow and faculty member with NACD’s Board Advisory Services, where he conducts in-depth cyber-risk oversight seminars for member boards. Brett is also a noted keynote speaker on a variety of cyber-related topics.

Looking to strengthen your board’s cyber-risk oversight? Click here to review NACD’s Cyber-Risk Oversight Board Resource Center.