This post is as much an internal sit-rep as it is one for others to witness, share and challenge. I'm trying to understand the Top10's value eleven years on. The current Top10 (2021) is in draft and open for comment; I've put my comments here, as well as the opening to this conversation on github
Looking over the list, I can see a few different maturities in play, a natural progression driven by those who have taken the time to be a part of OWASP (something I've failed to do, other than being a bit of a backseat driver).
If we abandon technical attack visibility as a Top10 capture (and we have, aside from SSRF in 2021), we are left with a set of broad architectural principles. I'm okay with that if that's the direction of travel, but there needs to be a conversation about it: the language we use, why we use it, and guide rails so readers are on the same page. It would be great if MITRE and OWASP could find a means to both support one schema to rule them all (evil laugh?).
I can see why the Top10 is morphing into security architecture, but I feel it is a bit of scope creep on the Top10. Here's why...
Let's pull bug classes into the 'Cause' category and make the 'Effect' an outcome. We could agree that XSS allowed an attacker to steal the user's data, or that a SQL Injection bug was exploited, allowing an attacker to steal a copy of the database. That's how I see cause and effect: Cause (bug class), Effect (unwanted outcome). If you wanted to be super pedantic you might argue the parser, or the code, or the compiler, or the CPU is the root of it all; if you're that person, pop round for a free 5-minute hug (no eye contact). You're not wrong, but... not now.
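To make the cause/effect split concrete, here's a minimal sketch (names and schema are illustrative, not from any OWASP material): the *cause* is a SQL Injection bug, string-building a query from untrusted input; the *effect* is the attacker reading rows they shouldn't.

```python
import sqlite3

def find_user_vulnerable(conn, name):
    # Cause: untrusted input concatenated straight into the SQL (injection bug).
    return conn.execute(
        "SELECT username FROM users WHERE username = '" + name + "'"
    ).fetchall()

def find_user_safe(conn, name):
    # Technical defence: parameterised query; input stays data, never SQL.
    return conn.execute(
        "SELECT username FROM users WHERE username = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("alice",), ("bob",)])

payload = "' OR '1'='1"  # classic injection string
print(find_user_vulnerable(conn, payload))  # effect: every row leaks
print(find_user_safe(conn, payload))        # no match, no leak
```

The Top10 entry "Injection" names the cause; "an attacker stole a copy of the database" is the effect you'd write in the incident report.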
If you're on the same page as me re: cause and effect, you'll have a hard time with most of the issues presented in the OWASP Top10 (it lists effects), aside from SSRF (a cause).
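SSRF earns its place as a cause: the server fetches an attacker-supplied URL, and the effect is reaching internal services it was never meant to touch. Below is a hedged, naive sketch of one defence (the `is_safe_target` helper is hypothetical); a real SSRF defence must also handle DNS resolution, redirects, and IPv6 tricks.

```python
import ipaddress
from urllib.parse import urlparse

def is_safe_target(url):
    """Reject URLs that point at obviously internal addresses.

    Illustration only: checks scheme and literal-IP targets, nothing more.
    """
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False  # block file://, gopher://, etc.
    host = parsed.hostname or ""
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        # Hostname, not a literal IP; would need DNS resolution to judge.
        return True
    # Block loopback, RFC 1918 and link-local (e.g. cloud metadata) targets.
    return not (addr.is_private or addr.is_loopback or addr.is_link_local)

print(is_safe_target("http://169.254.169.254/latest/meta-data/"))  # False
print(is_safe_target("https://example.com/report"))                # True
```

Point being: SSRF reads like the older Top10 entries, a bug class you can point a test at, rather than a category of outcome.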
So, if we have moved away from causes... okay, but let's have that conversation. I'm a big fan of the ASVS project: it's free and you are not the product. In the time you spend wondering if you should read it, you could have read it, so... go read it. Make people read it, align to it, embed it, print it out and kiss it.
The reason I mention the ASVS project is that if the Top10 has shifted left into architectural principles, then it really needs to be 'ASVS-Lite'. It's a perfect opportunity for the Top10 to have a coming-of-age party: to move away from listing the technical causes towards the preventative practices that promote good architecture and more resilient apps. They've pretty much done that in 2021, but there hasn't been much of a conversation on how it's matured and what we should expect. So if you have an ASVS-Lite (Top10) and the ASVS... you really want people on ASVS.
The best thing the Top10 has going for it, aside from its authors' commitment to improving AppSec globally, is its reputation. Many non-technical leaders still expect their stuff to be 'Top10 proof', and many of us have had requests for 'test for OWASP Top10' style security assessments. Actually, they need to be tested against the OWASP Testing Guide (another great project), but that's usually a little too late from a security architecture perspective.
If I had my way, ASVS would be the champion, the Testing Guide would cover security assurance testing and validation, and the Top10 would focus on readiness rather than a measurement of failures.
If the Top10 were to persist as a list, I'd want it to be something like this:
A) Issue > effect > technical defence > policy control
If it wanted to shift further left, something like this:
B) technical defence > policy control || Issue > effect
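Schema A) above can be sketched as data. This is a hypothetical entry of my own making, not an official OWASP mapping; the field values are illustrative:

```python
# One Top10 entry walked through schema A:
# Issue > effect > technical defence > policy control
schema_a_entry = {
    "issue": "SQL Injection",
    "effect": "attacker reads or modifies the database",
    "technical_defence": "parameterised queries, least-privilege DB accounts",
    "policy_control": "input-validation requirements embedded in the SSDLC",
}

for phase in ("issue", "effect", "technical_defence", "policy_control"):
    print(f"{phase}: {schema_a_entry[phase]}")
```

Each entry would then carry a reader from the bug class all the way to the organisational control, rather than stopping at the outcome.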
Option A) above would be an ideal Top10 for the 'old spirit' of the Top10, I think, but I don't get that from recent OWASP Top10s; it's a blend of a few things, and it's clear rounding up has become more of a theme. Option B) would be a good change of direction.
We don't get that. Bearing in mind how I feel about cause and effect, and thinking about the logical journey each item should take, starting and finishing with a considered set of phases (A and B above), it's harder to make sense of the Top10 list.
Have a look...
Top10 over the decade+
| 2010 | 2013 | 2017 | 2021 |
|---|---|---|---|
| A1) Injection | A1) Injection | A1) Injection | A1) Broken Access Control |
| A2) Cross-Site Scripting | A2) Broken Authentication and Session Management | A2) Broken Authentication | A2) Cryptographic Failures |
| A3) Broken Authentication and Session Management | A3) Cross-Site Scripting | A3) Sensitive Data Exposure | A3) Injection |
| A4) Insecure Direct Object References | A4) Insecure Direct Object References | A4) XML External Entities | A4) Insecure Design |
| A5) Cross-Site Request Forgery | A5) Security Misconfiguration | A5) Broken Access Control | A5) Security Misconfiguration |
| A6) Security Misconfiguration | A6) Sensitive Data Exposure | A6) Security Misconfiguration | A6) Vulnerable and Outdated Components |
| A7) Insecure Cryptographic Storage | A7) Missing Function Level Access Control | A7) Cross-Site Scripting | A7) Identification and Authentication Failures |
| A8) Failure to Restrict URL Access | A8) Cross-Site Request Forgery (CSRF) | A8) Insecure Deserialization | A8) Software and Data Integrity Failures |
| A9) Insufficient Transport Layer Protection | A9) Using Components with Known Vulnerabilities | A9) Using Components With Known Vulnerabilities | A9) Security Logging and Monitoring Failures |
| A10) Unvalidated Redirects and Forwards | A10) Unvalidated Redirects and Forwards | A10) Insufficient Logging and Monitoring | A10) Server-Side Request Forgery |
If we could pull the authors of the 2018 OWASP Proactive Controls and the Top10 together, and let that output act as a gateway to ASVS under the banner of the Top10, there is an opportunity to use the Top10's fantastic reputation to shine a light on the right way to do security architecture, or to mature your SSDLC by way of ASVS. It would be the best gift for all.
The OWASP Top 10 2021 is now out of draft. The GitHub issue highlighting my concerns had no response until the night before publication, when it was closed citing appreciation for the feedback. here