
What should the OWASP Top 10 end result look like? (Summit Process session, Mon AM1) #8

Closed
tghosth opened this issue Jun 12, 2017 · 9 comments

tghosth (Contributor) commented Jun 12, 2017

Possible options:

  1. Stay as it is - a top 10 list of application security risks based on some aggregation of categories?
  2. Change to a "league table" of specific CWEs purely based on data gathered?
  3. Evolve to consider wider issues in application security - this seems to have been the rationale behind "2017 RC1 A7 - Insufficient Attack Protection"?
  4. Something else...?

My personal preference is option 2 with greater focus given to the OWASP Top 10 Proactive Controls or failing that, option 1.

jmanico (Member) commented Jun 12, 2017 via email

raesene commented Jun 12, 2017

There is a challenge, of course, in placing too much reliance on a data-driven model: only findings for which data has been gathered get considered for inclusion in the Top 10, and depending on what the basis is, that might be overly restrictive. This leads to a potential chicken-and-egg scenario where data isn't gathered widely enough on new and emerging issues, making them ineligible for inclusion in the Top 10.

To provide a couple of concrete examples: 2013's new A9, "Using components with known vulnerabilities", got quite a bit of push-back as I recall, as there wasn't at the time a lot of historical data supporting its inclusion.

However, if you look at the period from 2013 to now, I'd suggest that the wide range of high-profile issues we've had which would fall into that category makes it look like a good choice for inclusion.

Also looking at things like the current draft's proposed A7, this type of proactive control would never be eligible under a model which used vulnerability data as the primary source of metrics to decide inclusion...

Not to say that data doesn't have a place in analysing what's happening in the AppSec world and what makes sense for inclusion, but just to sound a note of caution on the potential downsides of placing more reliance on it.

jmanico (Member) commented Jun 12, 2017 via email

raesene commented Jun 12, 2017

Well, I guess I'd look at universal applicability as part of the criteria (i.e. what percentage of applications is potentially affected).

If we take XML parsers as an example, issues with them obviously only apply where an XML parser is used, so it's a subset of applications that could potentially have the issue. Anecdotally, from my perspective as an application security tester, I'd say I'm seeing fewer of those than I used to (several stacks are focusing more on JSON/REST setups), but then perhaps prevalence is something we could gather data about.

By comparison pretty much every application I test lacks any form of active defence or response capability and I know for a fact that my life as a "pseudo bad-guy" would be made far more difficult if they did deploy even basic mechanisms to deter automated attacks.

If I were comparing AppSec to the overall infosec industry, I'd point to the general realisation that preventative controls alone are not enough, and that focus needs to be placed on detective and reactive controls to provide a strong security model.

AppSec is unfortunately badly lacking in detective and reactive controls, in my experience, and that's where I think more could be done to drive developer awareness.
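As a rough illustration of the kind of detective/reactive control being discussed, the sketch below counts suspicious events per user and blocks the user once a threshold is crossed. This is a minimal sketch of the idea only, not the actual OWASP AppSensor API; all class and method names here are hypothetical.

```python
from collections import defaultdict

class DetectionPoint:
    """AppSensor-style detection point: detect repeated suspicious
    activity per user and trigger a reactive response (lockout)."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.events = defaultdict(int)   # user -> suspicious event count
        self.locked = set()              # users with an active response

    def record(self, user, event):
        """Record a suspicious event; lock the user out past the threshold."""
        self.events[user] += 1
        if self.events[user] >= self.threshold:
            self.locked.add(user)        # reactive control: block further requests

    def is_blocked(self, user):
        return user in self.locked

dp = DetectionPoint(threshold=3)
for _ in range(3):
    dp.record("attacker", "forced-browsing attempt")
print(dp.is_blocked("attacker"))      # True
print(dp.is_blocked("normal-user"))   # False
```

Even a mechanism this basic raises the cost of automated attacks, which is the point being made above: the detective part (counting anomalous events) feeds the reactive part (lockout or throttling).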

Cheers

Rory

jmanico (Member) commented Jun 12, 2017 via email

raesene commented Jun 12, 2017

Interesting ideas Jim. I think it's fair to say we have a difference of opinion here, but then I think there's a wide range of opinions to be considered as part of the Top 10 process, and I'll be very interested to see where it all goes.

As to your point about the developer effort to deploy some form of detective/reactive controls in applications, well that's exactly where I'd see OWASP being able to help, to make that easier.

Obviously we have the work already done by the excellent OWASP AppSensor project, and with more attention being paid to the topic (perhaps due to inclusion in the Top 10 ;o) ) it should be possible to ease that initial effort for developers and make it easier for them to include this kind of control in their applications...

jmanico (Member) commented Jun 12, 2017 via email

tghosth (Contributor, Author) commented Jun 13, 2017

This was discussed in the session. The outcome, based on @vanderaj's summary, was basically:

  • "...there should always be room for forward looking inclusions. "
  • "Audience is everyone in appsec, and not just developers"
  • "The basis for the OWASP Top 10 is 'risks'. I (@vanderaj) have suggested we adopt the ISO 31000:2015 standard definition for risk."
  • "There will be a transparent and documented decision to ensure that up to 2 of the OWASP Top 10 issues will be forward looking, and that the community should drive the consensus for what they will be."

@tghosth tghosth changed the title What should the OWASP Top 10 end result look like? What should the OWASP Top 10 end result look like? (Summit Process session, Mon AM1) Jun 13, 2017
@vanderaj vanderaj modified the milestone: RC2 Jun 13, 2017
@vanderaj vanderaj self-assigned this Sep 22, 2017
vanderaj (Member) commented
I'm pretty sure this issue is now closed as this is exactly what's happening. Please follow along in GitHub as we modify A7 / A10 and re-order based on data from Friday onwards.

sslHello pushed a commit that referenced this issue May 24, 2018
sslHello pushed a commit that referenced this issue Dec 12, 2018
sslHello pushed a commit that referenced this issue Feb 25, 2019
sslHello pushed a commit that referenced this issue Dec 15, 2021
Update translation to Indonesia according on OWASP top 10 master