In October 2017, former Equifax CEO Richard Smith stood before Congress to explain how a major cyberattack could have happened on his watch: how a still-unknown set of attackers hacked into the company's systems and gained access to the sensitive information of more than 140 million American consumers. Driver's license numbers, Social Security numbers, credit card information. It was all compromised.

As Smith said in that testimony, “The people affected by this are not numbers in a database. They are my friends, my family, members of my church, the members of my community, my neighbors. This breach has impacted all of them. It has impacted all of us…The investigation continues, but it appears that the breach occurred because of both human error and technology failures. These mistakes–made in the same chain of security systems designed with redundancies–allowed criminals to access over 140 million Americans’ data.”

Human Error.

Unfortunately, this is far from the only case of human error playing a role in a cybersecurity breach. According to CompTIA, the IT industry trade association, human error accounts for 52 percent of breaches. IBM's 2017 Cyber Security Intelligence Index puts that number much higher, finding that 95 percent of security incidents involve human error.

With the average company facing nearly 200,000 security threats per day, it is critical that we equip students with the analytical and communication skills to spot and stop threats before they become breaches. The cybersecurity professionals of today and the coming years must be able to adapt to ever-changing threats with a keen and consistent attention to detail.

On a nearly daily basis, I come across news supporting the pursuit of careers in cybersecurity. And it's easier than ever to understand why. Here's how a recent VentureBeat story presented the argument:

1. Jobs. There are more than 300,000 cybersecurity jobs up for grabs in the U.S. in 2018. And the Bureau of Labor Statistics predicts a steep 28 percent growth rate for cybersecurity positions by 2026 — that’s 300 percent higher than the prediction for all occupations.

2. It’s a job seeker’s market. For every U.S. job opening requiring cybersecurity skills, only 2.5 qualified candidates exist. In comparison, the national average across all job types is 6.5 candidates.

3. Highly competitive salaries. With 200,000 threats per day, companies are willing to pay top dollar for cybersecurity and information security expertise. According to the Bureau of Labor Statistics, the median salary for an information security analyst was $95,510 in 2017. That's above the $84,580 median for other computer-related occupations and more than double the national median salary for all U.S. jobs ($37,690).

4. Job security. This sector is still in its infancy, and the security breaches of recent years are not a passing trend. Students who enter cybersecurity positions now have significant growth potential.

As strongly as the signs point toward entering the field, students who hope to succeed in it must show the marks of a detail-driven professional. The more opportunities students have to learn to solve complex problems in dynamic settings, the better equipped they will be for the cybersecurity workplace. Putting the human factor at the center of the educational experience is exactly what ABET-accredited programs are designed to do well.

At ABET, we see the need to develop the cyber workforce of tomorrow. We are developing cybersecurity and cybersecurity engineering criteria that require programs to teach both hard skills and transferable skills. The cybersecurity challenges we face demand innovative educational approaches that transcend single disciplines, which is why we are in the process of establishing broad cybersecurity accreditation criteria.

As I've said before, with all the focus on the digital personal assistant, it's easy to forget the real intelligence behind the artificial: the engineers, designers, developers, and programmers who bring the artificial to life. It's these details that produce white-hat hackers instead of black hats, employees who flag security vulnerabilities instead of exploiting them, and companies that act with urgency to protect consumer data.

Yes, human error is inevitable. But the ways in which we educate and engage our students, the attention we give to those details, will have a direct influence on the future state of our information security.