Cloaking and Faking the Referrer

Faking the Referrer

To fake the referrer outright we’ll take advantage of PHP and cURL. The function below is a simple example of how to request a page from a website while sending faked header information.

<?php
function fake_it($url, $ref, $agent) 
{ 
  $curl = curl_init(); 
  $header[0] = "Accept: text/xml,application/xml,application/xhtml+xml,"; 
  $header[0] .= "text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5"; 
  $header[] = "Cache-Control: max-age=0"; 
  $header[] = "Connection: keep-alive"; 
  $header[] = "Keep-Alive: 300"; 
  $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7"; 
  $header[] = "Accept-Language: en-us,en;q=0.5"; 
  $header[] = "Pragma: "; // browsers keep this blank. 
 
  curl_setopt($curl, CURLOPT_URL, $url); 
  curl_setopt($curl, CURLOPT_USERAGENT, $agent); 
  curl_setopt($curl, CURLOPT_HTTPHEADER, $header); 
  curl_setopt($curl, CURLOPT_REFERER, $ref); 
  curl_setopt($curl, CURLOPT_ENCODING, 'gzip,deflate'); 
  curl_setopt($curl, CURLOPT_AUTOREFERER, true); 
  curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1); 
  curl_setopt($curl, CURLOPT_TIMEOUT, 30); // timeout is in seconds, not milliseconds
 
  $html = curl_exec($curl);
  curl_close($curl);
 
  // returns the content provided by the site
  return $html;
}
 
//The call below sends a request to the URL, with the second parameter as the referrer.
//By default it's usually a good idea to use your own browser as the User-Agent.
echo fake_it('http://www.shoemoney.com/', 'http://www.dennis-yu.com/how-I-scamed-yu/',
$_SERVER['HTTP_USER_AGENT']);
 
//You can of course fake the User-Agent further by supplying an option such as, say,
//Googlebot... because who would bat an eye at Googlebot when you're scraping them?
echo fake_it('http://www.wickedfire.com/', 'http://www.google.com/search?q=HuurDuur', 
 'Googlebot/2.1 (+http://www.google.com/bot.html)');
?>

Now the obvious downside to the above method is that while you can fake the referrer, you cannot fake the server’s IP address. A couple thousand hits with varying referrers and user-agents, all from the same IP, would look rather suspicious. Then again, most people wouldn’t bat an eye if they saw “Googlebot” scraping all their content.
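The Googlebot disguise isn’t bulletproof either, for what it’s worth: a site owner who cares can check the claim, since genuine Googlebot requests come from IPs whose reverse DNS ends in googlebot.com. Here’s a rough sketch of that check from the server’s side; the helper name `looks_like_googlebot` is mine, not a standard function:

```php
<?php
// Hypothetical server-side check: a real Googlebot request should come
// from an IP whose reverse DNS ends in googlebot.com (or google.com).
function looks_like_googlebot($host)
{
    // preg_match returns 1 when the hostname ends in either domain
    return (bool) preg_match('/\.(googlebot|google)\.com$/', $host);
}

// In a live check you would resolve the visitor's IP first, e.g.:
//   $host = gethostbyaddr($_SERVER['REMOTE_ADDR']);
//   if (!looks_like_googlebot($host)) { /* treat it as an impostor */ }

var_dump(looks_like_googlebot('crawl-66-249-66-1.googlebot.com')); // bool(true)
var_dump(looks_like_googlebot('my-php-scraper.example.com'));      // bool(false)
?>
```

Most sites never bother with this lookup, which is exactly why the disguise usually works.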

And of course, if you have multiple servers, IPs and so forth, you might decide to do something sinister, like flooding a competitor’s analytics with worthless keywords via fake Google search referrers.
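A sketch of what that might look like, assuming the `fake_it()` function above is available — the `fake_search_referrer()` helper and the keyword list are my own inventions for illustration:

```php
<?php
// Build a fake Google search referrer for a given keyword.
// urlencode() takes care of spaces and special characters in the query.
function fake_search_referrer($keyword)
{
    return 'http://www.google.com/search?q=' . urlencode($keyword);
}

// Made-up junk keywords; in practice you'd pick whatever worthless
// terms you want showing up in the target's analytics.
$keywords = array('huur duur', 'worthless phrase', 'another junk term');

foreach ($keywords as $kw) {
    $ref = fake_search_referrer($kw);
    // Each request would arrive with a different fake search referrer:
    // echo fake_it('http://www.example.com/', $ref, $_SERVER['HTTP_USER_AGENT']);
    echo $ref, "\n";
}
?>
```

Spread across several IPs, each hit looks like a separate visitor arriving from a different Google search.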

So there you have it: a method to cloak your referrer, and a method to fake it. Both of course have drawbacks, but I’m sure you can figure out something useful.

2 comments

  1. Victory says:

    is it possible to just stick to POST/form-submit on all browsers?

  2. kbeezie says:

    @Victory
    It’s possible, but you would have to update the JavaScript to take into account the DOM selection differences between browsers.