Saturday, March 31, 2012

facebook blocks spam URLs, but their method looks useless

Facebook runs a user-security service that checks URLs posted by its users for spam or malicious content and blocks any that appear on Facebook's blacklist.
More about it at

Some important text from the link above:
These automated systems don't just prevent spam and other annoyances. They also protect against dangerous websites that damage your computer or try to steal your information. ..........
Sometimes, spammers try to hide their malicious links behind URL shorteners like Tiny URL or, and in rare cases, we may temporarily block all use of a specific shortener. If you hit a block while using a URL shortener, try a different one or just use the original URL for whatever you're trying to share.
These systems are so effective .......... 
In a very recent post on Facebook, I was just trying to share the very awesome Google search link that displays a 3D graph shaped like a heart (the surviving fragment of the URL reads: *x))*cos(100*y)%2B1.5*sqrt(abs(x))+%2B+0.8+x+is+from+-1+to+1%2C+y+is+from+-1+to+1%2C+z+is+from+0.01+to+2.5)
and Facebook refused to accept my link, saying it had been flagged as a spammy link.
So actually, without initially thinking about it from a security perspective, I converted it to a short URL and tried that instead, and yeah..... it worked (that's why I'm writing about it, obviously).

So, how it works
The way I think it works is plainly by matching the URL (excluding any GET parameters passed to it) against the blacklist of URLs that Facebook maintains.
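If that's really all the check does, a minimal sketch might look like the following. The blacklist entries and the is_blacklisted helper are hypothetical, purely to illustrate matching on host + path while ignoring the query string:

```python
from urllib.parse import urlparse

# Hypothetical blacklist entries; a real one would come from
# Facebook's internal data and be far larger.
BLACKLIST = {"evil.example.com/landing", "spam.example.net/offer"}

def is_blacklisted(url):
    """Match a URL against the blacklist, ignoring the query string."""
    parts = urlparse(url)
    key = parts.netloc + parts.path  # host + path only, query dropped
    return key in BLACKLIST

print(is_blacklisted("http://evil.example.com/landing?id=42"))  # True
print(is_blacklisted("http://fine.example.org/page"))           # False
```

Note that by this logic the match only ever sees the literal URL string the user typed, which is exactly why a redirecting intermediary slips past it.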

The Problem
Bypassing such a system is really, really easy... just get the link redirected through any of the crowd of URL shorteners, page translators, or proxies, or..... simply spin up a new machine on the cloud and have it bounce requests on to the desired URL.

Even if FB's awesome team succeeds in blacklisting the ever-growing crowd of proxy and URL-shortener services, this technique of theirs still wouldn't catch your freshly launched redirect service before it had enjoyed a decent window of response time.

What I think would solve it
A security-savvy operation like Facebook would never ship that blacklist to the client side, for several good reasons. So they must be checking the URL in the post request on their FB servers and then responding with any concerns about it.

In such a scenario, WHY don't they simply crawl the URL forward to the last URL that responds, sending no HTTP referrer along the way?
That is, the same method I use to fetch the final URL behind a shortened or redirected link for people who want a suspicious link validated.
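As a sketch of that crawl-forward idea (assuming the chain consists of plain HTTP redirects), the snippet below follows redirects to the last responding URL. The local Shortener server is a stand-in for a real URL-shortener service, spun up only so the example is self-contained:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def resolve_final_url(url):
    """Follow the redirect chain and return the last URL that responds."""
    # urllib follows 3xx responses automatically and sends no
    # Referer header, so geturl() is the chain's final destination.
    with urllib.request.urlopen(url) as response:
        return response.geturl()

# Local stand-in for a shortener: /short 302-redirects to /final.
class Shortener(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/short":
            self.send_response(302)
            self.send_header("Location", "/final")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"destination page")

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Shortener)
threading.Thread(target=server.serve_forever, daemon=True).start()

final = resolve_final_url(f"http://127.0.0.1:{server.server_port}/short")
print(final)
```

A checker built this way would run the blacklist match against the resolved final URL rather than the one the user typed, so no intermediary hides the destination.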

This way they would never have to blacklist the URL-shortener services, or any other legitimate URL bases for that matter, just to cut off their chance of redirecting to malicious links.