Thursday, June 05, 2008

Site Security Policy – open for comments

OK gang, this is one of those rare moments where feedback from the community will directly influence a security feature that’ll make a real difference. First some background...

About 6 months ago Brandon Sterne left a cushy infosec position at eBay for Mozilla to solve an extremely important Web security problem he couldn’t solve while he was there. It’s the exact same problem a lot of major website properties have, including Google, Yahoo, MySpace, Microsoft, Facebook, and so on: business requirements say that users must be able to upload dynamic content (HTML/JavaScript) that will interact with other users. The other half of the problem is including CDN content (advertising) supplied by multiple unknown upstream providers. We all know the damage this stuff can do when abused.

Unfortunately, browsers lack any mechanism that lets a website specify what its content should be able to do and where that content is supposed to originate. When accepting user-supplied dynamic content on a website, it’s all or nothing. Website owners need more granularity. This is where the idea of content-restrictions came from years ago, proposed, ironically, by RSnake, who also worked for eBay years back. The idea never really got off the paper and into browser code despite a lot of experts, including myself, pleading for even a limited implementation. This is where Brandon comes in, along with the presentation on “Web Application Security and the Browser” he recently gave during Yahoo Security Week.

Brandon is in the process of creating Site Security Policy: a specification for people to comment on and a proof-of-concept extension for people to play around with. He’s got policy provisions worked in to help prevent XSS, CSRF, and even Intranet Hacking, plus some cool client-side IDS features. The vision is to later formalize the specification through the W3C and integrate the feature natively into the browser once the trouble spots are ironed out.
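To give a flavor of how this might work, here’s a purely hypothetical sketch of a policy delivered as an HTTP response header. The header name and directives below are mine, invented for illustration; pinning down the real syntax is exactly what the specification process is for:

    X-Site-Security-Policy: allow-scripts self ads.partner-cdn.example;
                            allow-frames self;
                            report-uri /policy-violation-report

A browser that honored something like this would refuse to execute script from any origin not on the white list, no matter how the markup got injected into the page.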

Comment away!

16 comments:

Anonymous said...

This sounds like a job for Giorgio Maone. It is an interesting concept, and hopefully I am not misunderstanding how it would be implemented (the policy returned in the response to an HTTP HEAD request), but wouldn't it be a bit easier if it were standardized in a way similar to robots.txt, crossdomain.xml, or even favicons to an extent?

Jeremiah Grossman said...

I believe Giorgio is actually helping Brandon out, but I could be wrong. Either way, that type of feedback is exactly what they need. No one is right or wrong here, as it's never been done before. What we do know, however, is that the solution needs to provide enough value to be worthwhile and be easy enough for all the major vendors to implement in a reasonable amount of time. Otherwise, it has limited adoption potential.

Anonymous said...

@ Awesome Andrew:

We have considered the question of "Headers vs External File" for policy delivery. It's one that is certainly still open for debate and one that I don't personally feel too strongly about. I will say that there was some backlash against the favicon/crossdomain.xml model because some people feel it creates "log spam" for sites that don't have one.

Also, NoScript is a huge source of inspiration for this project. I've used it for a long time. Part of the spirit of this proposal, though, is to put the decision about what scripts should and shouldn't execute in the hands of websites. There are many users who may not be sophisticated enough to make those decisions. As it pertains to XSS mitigation, I tend to think of Site Security Policy as "NoScript with site-defined white lists".

Anonymous said...

For me, I'm a fan of the config file over headers (I guess both could be supported?). Having an extra entry for every host in a log file isn't a terrible compromise when the alternative involves changing code (some frameworks will allow adding headers, but not all).

The ability to restrict certain actions from being performed would have the single largest impact on appsec of anything I can think of. In 5 years I suspect we'll be saying to each other, 'Remember back in the day when script includes from third parties had full DOM access?' :)

My related rant on this topic from back in November
http://www.cgisecurity.com/2007/11/08

- Robert

Anonymous said...

Well, I fully and wholeheartedly agree with the comments and opinions made by both Robert and Giorgio on these matters. In my mind it would be ideal, and most beneficial, to allow both methods as options rather than solely one over the other. Regardless, my thought on the subject is that if the "standardized file" method were indeed implemented, any 404 or 408 HTTP status code received by the client as a direct result of the initial request, for what would appear to be a nonexistent or temporarily unreachable resource, could trigger a "global denial" scenario that prevents any third-party scripts from executing, as an additional security measure. Giorgio disagrees, however, or at least does not find this idea suitable, as it would inevitably "break the 'original concept' of the Web (a collection of freely linkable documents)."

With respect to log spam, I do not believe it would be any more of a nuisance than those damn favicon requests, which are generated whether a favicon is explicitly defined or fails to exist, and again, as Giorgio stated, "The log spam objection is a moot point if you know how to setup log filters." I have also noticed an increased number of requests for "sitemap.xml" on my own domains lately, even though none are in use at this point in time.

My only other true concern is that the very same people who have inspired the need for this additional layer of security, namely the good majority of web application developers and administrators (both "amateur" hobbyists and "professionals") who remain uneducated in the area of Cross-Site vulnerabilities, would more than likely either not understand how to utilize these methods (which is why I developed the "de facto" rule) or remain unaware of them entirely. I believe the latter is more likely than someone completely disregarding such a feature, but considering how many other classifications of vulnerabilities there are in the web application space alone, we must take into account that the targeted consumer (I use the term loosely) will probably stay incognizant. Realistically, how mature has web application security become compared to its state earlier in the decade, and yet how many websites are still vulnerable to something as basic as directory indexing, predictable resource locations, or information leakage?

dre said...

@ Jeremiah:

Keep up the good posts! This is highly educational and I believe does something good for the world.

Many of us have had our eyes on content-restrictions for years now. SSP for Firefox is a dream made reality.

The Grumpy Hacker said...

Simple. RFC 3514 compliance.

(You did say no one is right or wrong here...)

Anonymous said...

I've commented on Giorgio's blog about this, and I firmly believe that there's no point in doing it unless "No restrictions" is removed as an option and "None" is the default.

I'd go for a text file, because many developers would find it easier to implement, and if the file doesn't exist then the browser wouldn't load the site and would raise an error.

The spec also needs to be familiar to a developer, maybe using HTML-like tags, e.g. <site url="externalsite.com" script="yes" iframe="no" cookies="yes" />
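A whole policy file in that style might read something like this (the syntax is invented, purely to illustrate):

    <policy>
      <site url="self" script="yes" iframe="yes" cookies="yes" />
      <site url="externalsite.com" script="yes" iframe="no" cookies="yes" />
      <site url="*" script="no" iframe="no" cookies="no" />
    </policy>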

Anonymous said...

In addition, the security policy could be part of XHTML validation, making it easier for web designers to understand and implement. Their site won't pass validation unless the policy exists, thereby encouraging change.

Jeremiah Grossman said...

People are moving away from XHTML though, as it's too hard to write in strict form. It's true though, this could perhaps be wedged into HTML5, if that ever takes off.

Anonymous said...

Hi Jeremiah/Brandon,

I would want to suggest the usage of an external file, for the following reasons:

* Allows browser caching, so it can potentially cost less bandwidth
* Easier to implement on sites where no scripting is available
* Removes the HEAD-before-POST request, making it faster and costing less bandwidth
* Allows the policy to be enforced for a complete domain, instead of having to add it to each and every script

The last one is the most important to me.

Anonymous said...

One more comment about HEAD before POST...

It would break a lot of existing scripts, since the request will always happen before a POST.
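To spell out the flow I mean (a rough sketch of the proposal as I understand it):

    1. The browser wants to submit:  POST /transfer HTTP/1.1
    2. The browser first sends:      HEAD /transfer HTTP/1.1
    3. The server replies with its policy headers (no body)
    4. The browser sends the real POST only if the policy allows it

Any server-side script that treats the HEAD as a real submission, or that logs or rate-limits per request, would suddenly see its traffic double.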

John said...

I certainly understand the reasoning behind adding a policy like this, but I feel the need to chime in on behalf of users and developers who see user control of their content as a good thing. Obviously from a security perspective, it's not. But it's only in the last few years that things like bookmarklets, Greasemonkey, etc. have given the user the ability to manipulate what is, after all, the content on their screen.

If this policy is added, I would like at least the ability to white-list a script source at the browser level that can override the site policy. I'm sure the complaint will be that this defeats the point, but it doesn't have to if the interface is designed correctly. I'm just concerned that, with the ocean of bathwater this will throw out, a few babies might be considered an acceptable loss when they really shouldn't be.

Anonymous said...

How can anybody favor config files over headers or meta element http-equiv attributes?

Want a config file? Set a mod_headers directive in your server config or .htaccess!
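For example, with Apache's mod_headers (the directive syntax is real; the header name below is a placeholder for whatever the spec ends up choosing):

    # .htaccess (requires mod_headers to be enabled)
    <IfModule mod_headers.c>
        Header set X-Site-Security-Policy "allow-scripts self ads.example.com"
    </IfModule>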

Anonymous said...

@Anonymous

A meta element is pointless, because if you can inject into the page you can just as easily modify the tag.

Not everyone has access to their server config or .htaccess. Yes, server admins can do this, but who is actually developing the web sites...

Web designers don't want to be messing around with server configs; they need a simple file that they can understand and know what it does.

Remember, not all of them have a highly technical background.

mme said...

Is there a CAPS extension in the works, too?
