Monday, January 01, 2007

2007 web application security project ideas

Update: Dmitry from SecuriTeam has posted his own TODO list as well.

Over the past few months I’ve released most of my dust-collecting browser hacks that I thought people would find interesting. It’s fun speaking with people about them, seeing them check out the code, and in many cases take it to the next level. And when conversing with others and thinking deeply about the industry, it’s common to brainstorm new “project” ideas that would be cool to work on. I’ve collected a ton of great stuff. The problem is there’s simply not enough time to work on them all. The result, again, is a dust-collecting list of the undone. It’s time to empty a few of those out too.

Standardized Severity Rating System for Web Application Vulnerabilities
Only a tiny fraction of people in the web application security industry use the available severity rating systems like DREAD, TRIKE, and CVSS. I can’t say exactly why, but the feeling I get is that those systems just don’t fit this field well. Something webappsec-specific needs to be developed.

Web Application Security Professional Certification
There is a huge shortage of experienced and knowledgeable web application security professionals. As a result there’s a big need to train developers, infosec staff, and managers to fill the gap. And when organizations hire for webappsec roles, it’s difficult for them to find the people who really know what they’re doing and weed out the rest. There’s a growing need for a CISSP- or SANS-style certification for web application security roles.

Client-side solutions against XSS and CSRF
We all know it: there’s not a whole lot we can do to protect ourselves from XSS and CSRF attacks. And we’re the people in the know, so think about what everyone else is going to have to put up with. It doesn’t appear to me that the browser vendors are doing much of anything to remedy the situation either. So if you have any bright ideas on the subject, now is the time to voice them. Perhaps they could be prototyped as plug-ins before going into the major browser releases.

Open source web application scanner
In the network security field there are plenty of respectable free and open source vulnerability scanners like Nessus, SAINT, etc. They serve a variety of purposes, not the least of which is pushing the commercial guys to make their products better and worth the money people spend. There needs to be the same choice in the web application security world. I’m talking way beyond Nikto and the other CGI scanners. There needs to be an open source product that can log in, crawl, inject, force browse, etc. This is not easy to build, but it’s an idea whose time has definitely come.
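To make the “crawl, inject” part concrete, here is a toy sketch of two core scanner building blocks: pulling links out of a page for the crawler, and mutating query parameters with probe payloads for the injector. Everything here (the function names, the two sample payloads) is illustrative, not taken from any real product; a real tool layers session handling, logout detection, and response analysis on top of this.

```python
# Toy sketch of two scanner building blocks: crawling (link extraction)
# and injection (payload mutation of query parameters).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit, urlunsplit, parse_qsl, urlencode

class LinkExtractor(HTMLParser):
    """Collect absolute hrefs found in a fetched page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# Two illustrative probe strings; a real scanner carries per-class payload lists.
PAYLOADS = ["'", "<script>alert(1)</script>"]

def injection_variants(url):
    """For each query parameter, yield the URL with that parameter's value
    replaced by each probe payload (one mutation at a time)."""
    scheme, netloc, path, query, frag = urlsplit(url)
    params = parse_qsl(query, keep_blank_values=True)
    for i, (key, _) in enumerate(params):
        for payload in PAYLOADS:
            mutated = params[:i] + [(key, payload)] + params[i + 1:]
            yield urlunsplit((scheme, netloc, path, urlencode(mutated), frag))
```

Feeding each fetched page through `extract_links` drives the crawl queue, and each discovered URL through `injection_variants` drives the test requests.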


I'll publish a few more later. If you have other ideas you want to share, you're welcome to.

13 comments:

Sven / Disenchant said...

Hi Jeremiah,
I fully agree with this stuff.

FYI: I'm actually working on the “Standardized Severity Rating System for Web Application Vulnerabilities” point [1]. Because I haven't had much time in the last two months I'm still in the research phase, but it will come :)

[1] http://www.disenchant.ch/blog/webapplication-security-risk-calculation/21

Regards,
Sven / Disenchant

Mike Andrews said...

Generally I agree with your posts, but not this one so I'll break with my usual lurker tradition and comment.

Having any "standardized" severity rating system is nigh-on impossible. Each of the models suffers from subjectivity problems (what values do you plug in, how do you give a number for "damage", what is the difference between a 6 and a 7, etc.), and they are only really useful if one set of people (or very like-minded/highly trained people) do all the "scoring". With this in mind, I think that you will only get standardization within organizations by using one model, which I think to some degree is already happening. Achieving this outside, for all findings/vulnerabilities, probably isn't possible IMHO.

As for a certification, the world does not need another cert :) It might be an honorable idea and get more people thinking about web security (and better identify those who know the field), but certs are easily gamed and lose value very quickly IMO (lots of individuals have meaningless letters after their name; I have one set but never use them). Judge people by their experience and knowledge rather than some studying and a quick test. What I *do* think we need is lots more information out there on web-based vulns, and easier systems for people to follow and test their apps. Not just point-and-click automated tools, but honest-to-goodness methodologies (not checklists or brief write-ups).

Finally, an OSS scanner would be a great thing to have out there. I even donated a codebase to OWASP to help start the Beretta project off (http://www.owasp.org/index.php/OWASP_FOSBBWAS_(code_name_Beretta)). However, I had to quickly withdraw from that project because of this: http://tinyurl.com/2zndg, which amazingly still hasn't been challenged yet. I'm not a lawyer, but I believe that anyone who works on such a project inside the US risks litigation from Watchfire, the current patent holder (or anywhere else the patent is registered). The guys working on it at the moment are probably getting away with it because Europe doesn't have software patents (probably one of only a small number of things they actually do right!). Just this sad fact alone means seeing an OSS scanner come out of the US is unlikely IMO, unless someone has the money to challenge the patent and fight the inevitable lawsuit for loss of earnings that Watchfire/SPI (a licensee of the patent, as I understand it) would be sending your way :(

So sorry Jeremiah, although I love reading your posts, generally agree with you, and find you have great insights in this field, I have to beg to differ on this one :)

Martin said...

Hi Jeremiah,

on the topic of client side protection: There are at least two approaches for client side XSS protection that I know of: Noxes (http://www.seclab.tuwien.ac.at/papers/noxes.pdf) and NoMoXSS (http://www.seclab.tuwien.ac.at/projects/jstaint/).

Furthermore, (shameless plug) we (Justus Winter and myself) developed RequestRodeo, a client-side protection solution to protect users against CSRF. Here is the project site with the code: http://savannah.nongnu.org/projects/requestrodeo And here is a link to the paper: http://www.owasp.org/images/4/42/RequestRodeo-MartinJohns.pdf
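The core idea behind a RequestRodeo-style client-side defense can be sketched in a few lines: when a request originates from a different site than its target, strip the implicit authentication (cookies and the like) before forwarding it, so a forged request arrives unauthenticated. This is a simplified illustration of the concept only, not the actual RequestRodeo implementation; the function and header names are assumptions for the sketch.

```python
# Sketch of the RequestRodeo concept (simplified): a client-side proxy
# removes implicit credentials from cross-site requests so that a forged
# (CSRF) request reaches the target without the victim's session cookies.
from urllib.parse import urlsplit

def same_origin(url_a, url_b):
    a, b = urlsplit(url_a), urlsplit(url_b)
    return (a.scheme, a.netloc) == (b.scheme, b.netloc)

def filter_request(target_url, referer, headers):
    """Return the headers to forward. Cookies (and other implicit
    credentials) survive only when the requesting page and the target
    share an origin; unknown origins are treated as cross-site."""
    if referer is None or not same_origin(referer, target_url):
        headers = {k: v for k, v in headers.items()
                   if k.lower() not in ("cookie", "authorization")}
    return headers
```

Legitimate same-site navigation keeps its session; a request triggered from an attacker-controlled page loses it, which defeats the classic CSRF scenario without any server-side changes.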

Jeremiah Grossman said...

Sven,

Cool! It looks like you're thinking about the problem correctly. Good luck with the project and keep us apprised. Also, don't be shy about soliciting feedback from the web security mailing list.

Jeremiah Grossman said...

Hi Mike,

> Generally I agree with your posts, but not this one so I'll break with my usual lurker tradition and comment.

If everyone agreed with all my posts all the time I'd have to question the value and insightfulness I was providing. Either that or I'd be telepathic.

> Having any "standardized" severity rating system is nigh-on impossible. Each of the models suffers from subjectivity problems.

You might be right, but I think subjectivity is probably OK. If a flexible system were developed, and the subjectivity variables were assigned by the organization itself (or a paid expert), then the output would be what they are looking for. As it stands people are just looking for ANYTHING workable to help in prioritization, we’ll get perfection later.

> As for a certification, the world does not need another cert :)

I hear ya. The last thing we need is another crappy certification to snicker at. The thing is though organizations are looking for solid webappsec people. They’re asking what to look for as a minimum bar to qualify candidates before the interviewing process begins. This need will only increase over the next year. So maybe certification isn’t the answer, but then what is?

> Finally, an OSS scanner would be a great thing to have out there.

Yes, I’m well aware of the Watchfire patent and the sad state of affairs that is the U.S. patent system. Generally speaking you can’t code a for loop anymore without violating someone’s patent. And I refuse to let that stop me from developing software or working on a project. Otherwise I better pick a new career. I’ll continue to write code that brings value, fills a need, and I’ll take the battles as they come. If Watchfire wants to be known for viciously attacking open source communities, so be it. I’m sure they’d suffer equally in return. At least, that’s my opinion on that matter.

> http://tinyurl.com/2zndg
You sure that’s the link you meant?

Anyway, thanks for commenting and please break your silence more often. :)

dre said...

good post and great replies so far!

for severity rating systems - i have read through all of the work on DREAD (MS Threat Modeling), TRIKE (went to the toorcon preso 1.5 years ago), and CVSS (read all the stuff on the FIRST page). I read a few more sources, specifically on vulnerability management. it seems that threat modeling (more STRIDE instead of DREAD) applies fully, while rating systems do not. at least that's the impression i'm getting. there are categories of websites, and each type of website has different threats. universal metrics are going to be difficult to apply without making this distinction.

as far as the cert goes - i also recommend against such a concept. certs are a part of a more global picture of instructional capital. what you want is training - and sharing of ideas. this means conferences and local meetings. there is no conference covering just "web application security", and there should definitely be more than one at this point (one "all expenses paid" and one that costs like $50 for students at a hotel with cheap rates). local chapters of OWASP have started up (there is one in my city now)... so people should start going to these events and presenting regularly.

in regards to client-side security: i also wanted to mention noxes, but someone else already has. httpOnly and content-restrictions should also find a way into plug-ins and browsers quickly and easily (and they should scale so as not to affect performance). features in WAFs and XML gateways should make their way into cable modems, DSL routers, WiFi routers, and personal firewall software if possible (again, while affecting performance as little as possible). layered defense-in-depth seems to have worked over time for system and network vulnerability protection.

Chris E said...

Standardizing severity ratings is something that the industry has needed for some time now. As an application security consultant for 6 years (my previous job), there were countless times where individuals disagreed on the severity of a particular vulnerability or class of vulnerability. An objective baseline for each type of vulnerability is a good foundation but there has to be some flexibility built in for subjectivity (within reason) and context. For example, avoiding denial of service vulnerabilities is far more important to some applications than others, so in those cases it would need to be weighted somewhat higher.

As for a certification I have to disagree wholeheartedly. The last thing we need is another CISSP-like cert for people to pad their resumes with. The best penetration testers I've worked with had no industry certifications and most opposed the idea that they were any indication of skill. In my experience, the more industry certifications a person has on their resume to prop themselves up, the less likely they are to stand on their skills alone. There will obviously be exceptions to this rule, but by and large I've found it to be fairly consistent, at least within the penetration testing space. Software development or equipment-specific certifications may be more effective, but I have little perspective on that. We need some way to identify solid web application people but I would advocate giving it some more thought before going down the certification route.

Jeremiah Grossman said...

wow, some great comments here, hard to keep up.

Chris,

> We need some way to identify solid web application people but I would advocate giving it some more thought before going down the certification route.

Solid advice. There must be some way to get a decent minimum bar barometer.... *thinking*

Chris W said...

A good place to start on standardizing severity levels is the CVSS framework. It is being used by the NVD and has been adopted by Oracle. I believe strongly in not reinventing the wheel. If there are aspects of web vulnerabilities that differ from the broader universe of vulnerabilities, then CVSS should be strengthened, not redone.

CVSS computes a base score from the impact on confidentiality, integrity, and availability, then uses a bias based on the business context of the application to weigh one of those dimensions more heavily.
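A rough sketch of the base equation Chris describes: an impact score over C/I/A, with a bias that lets the business context weight one dimension more heavily. The numeric weights below approximate the published CVSS v1 tables from memory and should be checked against the specification before being relied on; everything else (function names, defaults) is illustrative.

```python
# Rough sketch of a CVSS-v1-style base score: impact over C/I/A plus a
# context "bias". Weights approximate the published v1 tables; verify
# against the CVSS specification before relying on them.

IMPACT = {"none": 0.0, "partial": 0.7, "complete": 1.0}

def bias_weights(biased):
    """'normal' weighs C/I/A equally; otherwise the named dimension
    gets half the weight and the other two share the remainder."""
    if biased == "normal":
        return {"c": 0.333, "i": 0.333, "a": 0.333}
    w = {"c": 0.25, "i": 0.25, "a": 0.25}
    w[biased] = 0.5
    return w

def base_score(conf, integ, avail, bias="normal",
               access_vector=1.0, access_complexity=1.0, authentication=1.0):
    w = bias_weights(bias)
    impact = (IMPACT[conf] * w["c"] + IMPACT[integ] * w["i"]
              + IMPACT[avail] * w["a"])
    return round(10 * access_vector * access_complexity
                 * authentication * impact, 1)
```

For example, a flaw with partial confidentiality and integrity impact and no availability impact scores higher on a confidentiality-biased site (`base_score("partial", "partial", "none", bias="c")`) than on an availability-biased one, which is exactly the context sensitivity Chris E asked for above.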

I would be interested in working with people on this.

-Chris W.

Sylvan von Stuppe said...

Actually, on the OS scanner, I would say that being able to log in is one of the LAST things it needs to be able to do. Login functionality is so different from app to app that staying logged in is the one thing all of the off-the-shelf tools CAN'T seem to do right.

I think it would be more profitable to use existing cookies and parameter values to perform analysis during an expert crawl. And of course, the tool needs to be able to crawl the rest as well.

Sylvan von Stuppe said...

Also, I think there need to be some better training sites available than "Hack this site" and webgoat. There needs to be a high-publicity project available for breaking that does more than just have a vulnerability for defacement.

Jeremiah Grossman said...

> Actually, on the OS scanner, I would say that being able to log in is one of the LAST things it needs to be able to do.

Last on my tiny list maybe, but certainly not at the bottom. And when I say login, I really mean login with logout detection. The last thing you want is invalid scans and having to babysit the scanner for hours on complex sites.

> Login functionality is so different from app to app that staying logged in is the one thing all of the off-the-shelf tools CAN'T seem to do right.

Tell me about it. We struggle with it in our own technology as well. Crawling properly in many ways is just as hard as login.

> I think it would be more profitable to use existing cookies and parameter values to perform analysis during an expert crawl. And of course, the tool needs to be able to crawl the rest as well.

I think I can agree there. Have to start somewhere and no reason to bite off more than one can chew.

> There needs to be a high-publicity project available for breaking that does more than just have a vulnerability for defacement.

Great idea. I wholeheartedly agree. So much work, so little time. If someone decides to start any of these projects, I'll do whatever I can do publicize them.

vanderaj said...

Jeremiah,

there's a PR going out this afternoon from a largish organization on a web app sec certification. When it happens, I'll shoot you a copy.

Andrew