Companies have never collected data at the rate they do now, processing it and turning it into actionable insights. Because of this, data is highly valuable; some say it may become more valuable than money itself. As such, data breaches increasingly impact a business's ability to function, and any loss of personal data can trigger punitive policies. Data breaches, and other nefarious cyber events, can do real harm to human lives and livelihoods. For companies, the average cost of a data breach today is $4.24 million, higher than ever before.
Today, more people have more capable devices than ever, transmitting more data than ever before. (Using an employer-issued device to stream or download can open the door to a security breach across an entire organisation.) What's more, millions of roles — including key tech jobs — are now sitting vacant. Critically, vacant roles often mean dormant user accounts, which afford hackers more leeway to experiment and learn from each attack; at small organisations, such intrusions can go undetected for considerable periods. It is easy to see, then, how today's high-churn, low-retention digital business environment lends itself to increasing risk. The question is this: beyond cursory investments in cybersecurity, how are organisations keeping up with the pace of change and the changing face of threats? Are they keeping up? Can they?
One thing is clear: companies face a growing burden of security risk, and it is a serious problem. But it also raises an interesting point: we should all be thinking about cybersecurity less in terms of a physical (or digital, as it were) investment in tech itself, and more in terms of an investment in the knowledge and competency of people in tech roles. Having highly skilled people isn't just part of a strategy; it is the strategy. Without those people, everything falls apart. Admittedly, finding and keeping capable and driven tech workers today may seem easier said than done.
What is demanded of workers is always changing, as we saw following the move to remote work in 2020. As such, the breadth and depth of skills required of the workforce, and of key tech roles in particular, is an enormous concern and a source of confusion for many businesses. HR departments often struggle to understand how to hire for the requisite competencies and staff cybersecurity roles. Tech team managers similarly struggle to get everyone working from the same (ever-changing) playbook, which can lead to serious quality and performance issues. Beyond the challenge of staffing cybersecurity roles, risk-averse businesses have been slow to adopt emerging technologies like blockchain and AI-enabled tools, fearing that a poorly managed adoption will hinder business or hurt their reputation. It's easy to understand their conundrum: if you're struggling to secure your house, is it wise to fill it with more valuables, or to build entire rooms you can't see into?
For cybersecurity, going it alone no longer works
It's no secret that the tech industry is rapidly and constantly evolving, and the landscape can feel overwhelming for decision-makers in IT. The need for upskilling as the driving force behind any organisation's cyber strategy, though, has never been clearer, and there is a way to do it right using economies of scale. Tech teams shouldn't have to reinvent the wheel every time they spec out a new tech job ad or design a training programme for newly onboarded IT or security managers. The competencies, training tools, and certifications already exist, and the legwork of ensuring those standards are bulletproof has been done. Pathways like apprenticeships are an excellent way to ensure that learning happens consistently and leads to skills that can get the job done. Once workers are on the job, training can and should happen several times a year, if not continuously, rather than every two to three years, an outdated but all too common approach to tech upskilling. Nurturing internal talent in this way, or by offering apprentices a job at the end of their apprenticeship, is not only smart for cybersecurity; it is a sound strategy for talent development and retention. It also ensures the consistent presence of skilled teams and reduces the need to recruit new staff. In short, keeping the focus on upskilling existing people is smarter and more efficient than leaning on new hires.
For cyber teams today, what's essential is this: they must fully understand digital best practices and maintain a continually evolving understanding of cyber hygiene (things like zero-trust policies and two-factor authentication), because the norms themselves are changing rapidly. Even now, too many organisations are trying to go it alone on cybersecurity, with teams working in isolation and relying on best guesses to keep them safe. Industry-recognised training and certifications, as well as on-ramps like apprenticeships, can remove the guesswork from cybersecurity upskilling and ensure access to the most up-to-date tools and techniques. Getting everyone reading from the same playbook is a first step, and gives organisations a fighting chance to weather inevitable cyber storms. When organisations make early, sound investments in training and upskilling their tech workers, they free up resources to use tech not merely as a defence tactic but as a strategy for growth and evolution. When they don't, they run the very real risk not only of becoming a target, but of losing relevance in an increasingly digital, connected, and data-driven business world.
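To make one of those hygiene practices concrete: the two-factor authentication codes produced by common authenticator apps typically follow the TOTP standard (RFC 6238), which derives a short-lived numeric code from a shared secret and the current time. Below is a minimal sketch in Python using only the standard library; the secret shown in the usage note is the RFC's published test seed, and all parameter names are illustrative rather than tied to any particular product.

```python
import base64
import hashlib
import hmac
import struct


def totp(secret_b32: str, for_time: int, step: int = 30, digits: int = 6) -> str:
    """Compute a TOTP code (RFC 6238) for a given Unix timestamp."""
    # Decode the shared base32 secret, as distributed by most authenticator setups
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of time steps (default 30 s) since the Unix epoch
    counter = struct.pack(">Q", for_time // step)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): the low nibble of the last byte picks an
    # offset; take 4 bytes there and clear the top bit
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Against the RFC 6238 test seed (`GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`, the base32 encoding of the ASCII string `12345678901234567890`), `totp(seed, 59, digits=8)` reproduces the published SHA-1 test vector `94287082`. The point of the sketch is that the code a user types is not a password at all but a proof of possessing the secret at a given moment, which is why a stolen code expires within seconds.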