Tuesday, June 24, 2008

Twitter users angry about SQL Injection hacks on their websites

The mass SQL injection attacks have impacted the lives of a lot of Twitter users out there. I did a search for “SQL Injection” and the results are page after page of misery, time wasted cleaning things up, and cursing up a storm. You can really feel their pain, and the worst is probably not yet over. Still gotta fix all that legacy code. Here are some of my favorite tweets…

shartley: Cleaning yet another SQL injection attack. I'm F'n sick of cleaning up after lazy programming that took place during my year away.

jamesTWIT: To the hacker who designed the SQL injection bot. I hope you die and not a fast death...something slow and painful. Like caught in a fire!

chadmonahan: Dearest SQL Injection people, I don't like you. Yours, CM

programwitch: F'n SQL injection hacks.

Anirask: Damnit. Our main website is down cause of SQL Injection attacks. You figure devs would sanitize their inputs against this shit..

9 comments:

Rob Ragan said...

To all those angry twitter users: Find the SQL Injection before the bot finds it for you. Check out http://www.communities.hp.com/securitysoftware/blogs/spilabs/archive/2008/06/24/finding-sql-injection-with-scrawlr.aspx

Jeremiah Grossman said...

Odd, based on the licensing restrictions, Scrawlr basically seems unusable. Maybe for like a REALLY small online store or something...

* Will only crawl up to 1500 pages
* Does not support sites requiring authentication
* Does not perform Blind SQL injection
* Cannot retrieve database contents
* Does not support JavaScript or flash parsing
* Will not test forms for SQL Injection (POST parameters)

Rob Ragan said...

Scrawlr is limited. However, research has shown that the SQL Injection bots are not very sophisticated, e.g. only targeting ASP pages and only auditing parameters found in requests using the GET verb.

Of course there are ways of doing more comprehensive testing for SQL Injection. I'm sure followers of your blog can attest to that.
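[Ed. note: as a rough sketch of just how unsophisticated that targeting is, the bot's filter can be modeled as a couple of URL checks. This is a hypothetical illustration in Python, not the actual bot's code: it only bothers with classic ASP pages that carry at least one GET parameter to inject into.]

```python
from urllib.parse import urlparse, parse_qs

def is_bot_target(url: str) -> bool:
    """Rough model of the mass-injection bot's target filter:
    classic .asp pages with at least one GET query parameter."""
    parsed = urlparse(url)
    is_asp = parsed.path.lower().endswith(".asp")
    has_get_params = bool(parse_qs(parsed.query))
    return is_asp and has_get_params

print(is_bot_target("http://example.com/product.asp?id=42"))  # True
print(is_bot_target("http://example.com/product.php?id=42"))  # False: not ASP
print(is_bot_target("http://example.com/index.asp"))          # False: nothing to inject
```

By that model, a PHP site with the exact same SQL injection flaw sails right past the bot, which is why "the bot hasn't hit me" says nothing about whether you're vulnerable.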

Jeremiah Grossman said...

Oh I dunno about that, whoever developed this payload is definitely no noob.
http://isc.sans.org/diary.html?storyid=4565

Payload aside, I was talking more about the 1,500-page crawl limit. Unless your vulnerable webapp is within those URLs, well, you're outta luck I guess. And it's tough to compete with the crawling capabilities of Google, since that's essentially what's being used for target list acquisition.

Don't get me wrong, I'm not saying you should be giving anything more away for free, it just is what it is.

kuza55 said...

Lol, pwned.

Erwin Geirnaert said...

If you Google for m.js filetype:.aspx you will see that there are already aspx sites infected.

People need help asap.

It's a good thing that tools and best practices for fixing it are out there, but I have a flashback to the I Love You virus.

This is the beginning of a new decade in web application security.

Matt Presson said...

I love how some of the Twitter users think the solution is sanitization when it is actually validation and parameterization. Oh well, maybe they will learn.
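[Ed. note: Matt's point is worth a minimal sketch. Using Python's sqlite3 here purely as an illustration (the same idea applies to any database API): concatenating user input into the SQL string lets the input rewrite the query, while a parameterized query passes it as data.]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Classic injection payload: closes the string and adds an always-true clause.
malicious = "x' OR '1'='1"

# Vulnerable: input concatenated into the SQL text, so the
# OR '1'='1' clause becomes part of the query and matches every row.
rows_vuln = conn.execute(
    "SELECT name FROM users WHERE name = '" + malicious + "'"
).fetchall()
print(rows_vuln)  # [('alice',)] -- every row leaks

# Parameterized: the driver binds the value as data, not SQL,
# so the injection attempt is just a weird name that matches nothing.
rows_safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()
print(rows_safe)  # []
```

No amount of blacklist "sanitizing" beats simply never letting user input become query syntax in the first place.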

Jeremiah Grossman said...

@Erwin, I agree. I think the webappsec industry's Slammer/Blaster is upon us. Could happen at any moment.

@matt, do ya blame them really though? I mean, there's so much conflicting information out there, it's really hard for them to sort through it all.

Rob Ragan said...

Rejoice! There is no crawl limit!
The limit is a lie.