427 Followers
144 Following
182 Posts
#InfoSec researcher and data storyteller; Cyentia Institute (@cyentiainst) co-founder; Virginia Tech professor; Verizon DBIR and VERIS creator; Advisor for RSA Conference and FAIR Institute; Star Wars fanboi; Coffee addict; Beer and cocktail snob; Father of 5.

I use this account on occasion to share and discuss research findings I consider informative and/or useful. Typically more active on LinkedIn: https://www.linkedin.com/in/drwadebaker

What happens when a Dungeons & Dragons nerd takes 100 million user behaviors and events and builds a human risk character alignment chart?

You don't even need to roll for initiative to find out! Just read this article featuring a fun line of analysis from a recent Cyentia Institute and Living Security report.

https://www.linkedin.com/pulse/whats-your-dd-human-risk-alignment-wade-baker-ph-d--kazme/

Had the opportunity to write a guest post over on Cobalt's blog sharing why measuring vulnerability half-life using survival analysis is a more realistic performance metric for remediation than MTTR. I also compare the half-life of #pentest findings to that of other #infosec domains: traditional vulnerability management, application security, and third-party risk assessments.

Check it out, and lemme know whatcha think!

#cybersecurity #infosec #appsec #vulnerabilitymanagement #thirdpartyrisk

https://www.cobalt.io/blog/half-life-tells-the-whole-remediation-story
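For the curious, here's a minimal sketch of the half-life idea, using a Kaplan-Meier survival estimator on toy remediation data (the report's actual methodology may differ; the data below is purely illustrative). The key advantage over MTTR: findings that are still open get counted as censored observations instead of being silently dropped.

```python
# Hypothetical sketch: estimating vulnerability half-life with a
# Kaplan-Meier survival curve. Each observation is (days_open, remediated);
# remediated=False means the finding was still open when last observed
# (right-censored). Toy data only -- not from the Cobalt/Cyentia study.

def kaplan_meier_half_life(observations):
    """Return the day by which half of findings are remediated,
    or None if the survival curve never drops to 0.5."""
    events = sorted(observations)
    n_at_risk = len(events)
    survival = 1.0
    for days_open, remediated in events:
        if remediated:
            survival *= (n_at_risk - 1) / n_at_risk  # KM step at each remediation
            if survival <= 0.5:
                return days_open
        n_at_risk -= 1  # censored findings leave the risk set without an event
    return None  # more than half still open -- a fact MTTR would hide

# Six remediated findings plus four still open (censored).
data = [(5, True), (9, True), (14, True), (30, True), (45, True),
        (60, True), (90, False), (90, False), (120, False), (180, False)]
print(kaplan_meier_half_life(data))  # → 45
```

Note that naive MTTR over just the six remediated findings here would report ~27 days, even though 4 of 10 findings were never fixed at all. That's the bias survival analysis avoids.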

Is your organization now more or less likely to experience a significant #cybersecurity event than it was 10y ago?

Well, that depends. Let's look at some data from Cyentia Institute's recent 2025 Information Risk Insights Study (IRIS).

The chart below depicts the annualized incident probability for firms in each revenue tier. I won't go into the details here of how we modeled this, but the methodology appendix in the report does get into that (link below). And if you want even more detail, Joran Elias has an excellent blog post for Cyentia Institute members (free account). For now, just assume we've used many incidents over many years to model the probabilities you see here.

From the chart, you can see why I say "that depends" to the lead question. The probability of a <$100M firm suffering a #securityincident has more than doubled, while the chance of a $100B+ megacorporation having an event has dropped by a third over the same time frame. Meanwhile, the incident probability for organizations in the $1B to $100B range has remained relatively static.

Unfortunately, our dataset is silent on the underlying factors behind these #cyberevent trends, but we can engage in some informed speculation. And LinkedIn is the perfect platform for it. I'll start.

To me, this chart hammers home Wendy Nather's concept of the security poverty line. Giant corporations, with giant budgets to hire the best people, buy the best technology, and implement the best processes, are finding success. But the pace of digitalization has outpaced SMBs’ ability to defend their growing attack surfaces and mitigate #cyberrisk.

I have many other thoughts regarding the factors underlying what we see here, but I'd rather hear from you. What do you see as key contributors?

****
Get the IRIS 2025 here: https://www.cyentia.com/iris2025/

You'll have the option to just download it or join Cyentia's free membership program for the report plus a bunch of bonus analytical content.

Are #cybersecurity incidents growing more costly?

Cyentia Institute's recent Information Risk Insights Study points to a 15-fold increase in the cost of #incidents and #databreaches over the last 15 years.

The chart on the left shows the distribution of known/reported financial losses from incidents across the time period of the study. The typical (median) incident costs about $600K, while more extreme (95th percentile) losses swell to $32M. Note that the chart uses a log scale, so the tail of large losses is a lot longer than it appears.

The chart on the right trends the escalating costs of cyber events over time. Median losses from a security incident have absolutely exploded over the last 15 years, rising 15-fold from $190K to almost $3 million! The cost of extreme events has also risen substantially (~5x). So, yeah—cyber events are definitely growing more costly.

That said, this picture looks a lot different among different types and sizes of organizations. How are financial losses and other #cyberrisk factors trending for orgs like yours?
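To make the median-vs-95th-percentile framing above concrete, here's a small sketch using synthetic heavy-tailed (lognormal) losses. The parameters are chosen purely for illustration to land near the study's headline numbers; this is NOT the IRIS dataset or methodology.

```python
# Hypothetical sketch: why median and 95th-percentile losses diverge so
# much on heavy-tailed data. Synthetic lognormal losses, illustrative only.
import math
import random

random.seed(1)
# Parameters picked so the median lands near ~$600K (an assumption for
# illustration; the real loss distribution is estimated in the report).
losses = [random.lognormvariate(math.log(600_000), 2.4) for _ in range(100_000)]
losses.sort()

median = losses[len(losses) // 2]
p95 = losses[int(len(losses) * 0.95)]
print(f"median ~${median/1e3:.0f}K, p95 ~${p95/1e6:.1f}M")
# On a log axis these two look deceptively close; in raw dollars the
# 95th percentile is tens of times the median -- that's the long tail.
```

This is also why a log scale understates the tail visually: equal distances on the axis represent multiplicative, not additive, jumps in dollars.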

Download the full IRIS 2025 to find out!
Free with no reg req'd - though you can join Cyentia's free membership forum for bonus analytical content related to the report.

https://www.cyentia.com/iris2025/

Are security incidents becoming more common? That's the first question we seek to answer in the upcoming 2025 Information Risk Insights Study. Check out this article for a preview. #breaches #incident #cyberrisk

https://www.linkedin.com/pulse/security-incidents-becoming-more-common-wade-baker-ph-d--navhe

Are security incidents becoming more common?

To many of you, the answer to this question seems so obvious that it’s hardly worth asking. But we’re not ones to let any assumption go unchallenged.

Presenting this morning at #RSAC2025 on findings from a massive study analyzing long-term cyber loss trends.

https://path.rsaconference.com/flow/rsac/us25/FullAgenda/page/catalog/session/1727461550962001LYM8

RSA Conference

Cyber risk is not evenly distributed across users in your workforce. In fact, it's very lopsided. A large majority of risk events in your organization probably tie back to a relatively small population of users.

The attached figures provide some stats supporting that statement:

- Just 1% of users are behind 44% of all clicked phishing emails. 5% of users are responsible for 83.4% of all clicks.

- 1% of users are behind 92% of all malware events! 5% of users are responsible for ALL malware events. The remaining 95% had a clean record.

I don't think the proper response to these statistics is to grab torches and pitchforks and purge these users from among us. Rather, these results present an opportunity to make a big impact on risk reduction by doing a more focused, effective job of educating, incentivizing, and influencing the behavior we want to see among the riskiest users.
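The "top X% of users drive Y% of events" stat above is easy to compute for your own org if you have per-user event counts. A minimal sketch (the toy numbers below are contrived to echo the phishing figures; they are not the Mimecast/Cyentia data):

```python
# Hypothetical sketch: share of all risk events attributable to the
# riskiest fraction of users. Illustrative data only.

def top_share(counts, fraction):
    """Share of total events from the top `fraction` of users by count."""
    ranked = sorted(counts, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

# 100 users: one serial clicker, a few repeat offenders, mostly clean.
clicks = [44] + [8, 7, 6, 5] + [1] * 30 + [0] * 65
print(round(top_share(clicks, 0.01), 2))  # → 0.44 (top 1% of users)
print(round(top_share(clicks, 0.05), 2))  # → 0.7  (top 5% of users)
```

Plotting `top_share` across all fractions gives you a Lorenz-style concentration curve, which is a handy way to benchmark how lopsided your own user risk is.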

Full report "Exposing Human Risk" from Mimecast and Cyentia Institute is available here (no reg req'd): https://assets.mimecast.com/api/public/content/mimecast-exposing-human-risk

#cybersecurity #cyberrisk #insiderthreat #malware #phishing

@cymaphore That's a good question, and unfortunately, I don't have a clear answer. The primary data for the blue line comes from vuln scans of infrastructure conducted by security teams.

I wonder if it may have something to do with the deployments of those assets in organizations. Critical servers would likely show slower remediation than *nix end user devices (OSX is super quick, for ex).

@dragonfrog These could all be contributors. One clarification:

- even though the jumpiness is for remediation rather than exploitation, the sample of vulns for which we're measuring remediation is limited to those with exploitation activity. So the line for all *nix vulns would look different. We were trying for an "apples to apples" comparison here, so the lines cover the same vulns.

@dragonfrog Here's another chart for *nix remediation. It's a larger sample not limited to only vulns with exploitation activity. This one is noticeably smoother.