There are a variety of applications out there that will simulate web traffic. But nothing beats true traffic: load a page, click on a link, wait for the next page to load, click on another link, and so on. Traditional load simulators simply aren't real-world enough in most cases. There are more realistic simulators out there, but they're typically cost-prohibitive for the use I recently encountered. So I started looking at using Automator.
In its simplest form, you can just fire up Automator, click on the Record button and then perform an action. However, this is going to perform the same action over and over and over. Let's say you have a MySQL database and you want to loop through calling a lot of records from the web site. The Record button in Automator is going to look for the same pattern, so you could manually go through all the records while recording. But that is time-intensive, and any misclicks get replicated into your Automator workflow; especially if, for each record, you have to click on a button to open it up, which is pretty typical.
Ottomate is a collection of six Automator actions that will perform GUI-level actions without using the Record button. Therefore, you can build a button on a page and tell Ottomate to click it, then save that as a workflow and call it up from a shell script. So let's say we have an Automator workflow called clickbutton that we built using Ottomate. We can call it using the following command:
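A minimal sketch of that call, using the automator command-line tool that ships with Mac OS X. The workflow path is an assumption (wherever you saved clickbutton), and the RUN=echo line makes this a dry run that just prints the command; delete the echo on your Mac to actually fire the workflow:

```shell
WORKFLOW="$HOME/Documents/clickbutton.workflow"   # hypothetical save location
RUN="echo"   # delete this echo on your Mac to run the workflow for real
$RUN automator "$WORKFLOW"
```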
This is where a quick and easy while loop can come into play. Let’s say that one record for a standard WordPress deployment loads in a browser as:
Hacking at Random 2009 Conference
And another loads in the browser as:
The page variable (p=) is increasing one per new article (boy, that sure is a lot of articles, btw). This makes it pretty simple. We're going to create a script called test.sh. In the script we're going to use a variable, x, which will be the page number that we loop through. Because the shell isn't going to be a fan of the "?" character, we're going to escape it by putting a backslash (\) in front of it. So we can open the next link in Safari using the following command from the shell:
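Something along these lines, with www.example.com standing in for your own site's address (the real URL isn't shown here). Note the backslash keeping the shell from interpreting the question mark; the RUN=echo guard just previews the command, so delete the echo on the Mac to really open Safari:

```shell
RUN="echo"   # delete this echo on your Mac to really open the page
$RUN open http://www.example.com/\?p=3086
```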
At this point we could put the following into the shell script and have it open the site and click on the desired link:
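A first cut of test.sh might look like this sketch: open the page, then run the Ottomate-built workflow to click the button. The URL and workflow path are assumptions, and RUN=echo keeps it a dry run (delete the echo on the Mac driving Safari):

```shell
#!/bin/sh
# test.sh, first cut: open one record, then click via the Automator workflow
RUN="echo"   # delete this echo to actually run the commands
$RUN open http://www.example.com/\?p=3086
$RUN automator ~/Documents/clickbutton.workflow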
We could paste that in over and over and over, manually changing the page (p=). Or we could just loop it, converting the page to a variable. So now we're going to parameterize the page by converting the number to a variable called x:
This would result in a little shell script like so:
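Something like the following sketch, identical to the hard-coded version except that the page number now lives in x (same caveats: hypothetical URL and workflow path, RUN=echo for a dry run):

```shell
#!/bin/sh
# test.sh with the page number pulled out into a variable
x=3086
RUN="echo"   # delete this echo to actually run the commands
$RUN open http://www.example.com/\?p=$x
$RUN automator ~/Documents/clickbutton.workflow
```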
Next, you put a quick while loop around it. Here, we will increase the x variable by one per iteration until it reaches 3086:
x=0
while [ $x -lt 3086 ]
do
x=`expr $x + 1`
done
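Putting the pieces together, the finished test.sh might look like the following sketch. The URL and workflow path are placeholders for your own, and the RUN=echo line makes it a dry run that prints each command rather than driving Safari (delete the echo on the test Mac):

```shell
#!/bin/sh
# test.sh: walk pages 1 through 3086, loading each and clicking the button
RUN="echo"   # delete this echo on your Mac so the commands actually run
x=0
while [ $x -lt 3086 ]
do
    x=`expr $x + 1`
    $RUN open http://www.example.com/\?p=$x
    $RUN automator ~/Documents/clickbutton.workflow
done
```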
Now, a couple of caveats. First, the site needs to finish loading before Automator can click on the link. You can assign timeout values in your Automator workflow, but you can also use the sleep command in your shell script. Either way, when you're simulating real-world traffic, look through your web logs to determine the average amount of time someone spends on one page before moving to the next; that makes a good number for the sleep. Additionally, Safari is simply going to keep opening new pages. While you might be able to open a few hundred, you're likely going to need to occasionally close the window, or close Safari as an application. If you are just closing the Safari window, you can place a Command-W into your workflow. If you want to close Safari entirely you would use the following command in your script (yes, it's case sensitive):
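The case-sensitive command being referred to here is almost certainly killall with the application's process name, capital S and all. Again sketched with a RUN=echo guard so it only prints the command; delete the echo to actually quit Safari:

```shell
RUN="echo"   # delete this echo on your Mac to actually quit Safari
$RUN killall Safari
```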
Also, don't be afraid of getting a little more complicated. You can use nested loops and random number generators to vary the sleep times and the order in which pages load. This way you can be a little more human-like and a little less likely to have cached data or identical patterns skewing your findings. You will also want to run the scripts from a number of hosts in order to determine the load placed by concurrent computers. In a way, this is how you would create your own little Mac OS X based botnet (perhaps in a future article I'll explain how to deploy the botnet payload). This script runs in the foreground; anything you do with Automator is going to. Therefore, all of the hosts running it will display the pages on the screen and will be pretty much unusable for other tasks while the script is running. Depending on the type of site you're looking at, you could ditch Automator and use Lynx, which is easily installed using MacPorts (or from source), if you wanted it to run in the background, although clicking buttons is a little more complicated to script if the browser cannot interpret certain types of said buttons… 😉
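One way to sketch that randomization, using bash's $RANDOM to pick both the page and the pause between loads; the loop count, ranges, URL, and workflow path are all assumptions you'd tune, and RUN=echo keeps it a dry run as before:

```shell
#!/bin/bash
# More human-like variant: random page order and random think time
RUN="echo"   # delete this echo on your Mac to actually drive Safari
i=0
while [ $i -lt 10 ]
do
    i=`expr $i + 1`
    x=$(( (RANDOM % 3086) + 1 ))     # pick a random article number
    pause=$(( (RANDOM % 20) + 5 ))   # think for 5-24 seconds
    $RUN open http://www.example.com/\?p=$x
    $RUN automator ~/Documents/clickbutton.workflow
    $RUN sleep $pause
done
```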
This article brings up one of the reasons that Captcha and other anti-spam techniques are such good ideas once you go live with a site: it's just too easy to write a bot these days (and higher unemployment rates mean more people with spare time to do so). The text in this article took way longer to write than the script. I'm not really that great at scripting, but you could easily do something like perform a Google search for all MediaWiki sites, go to a specified link on each site, and edit that page with garbage (oddly enough, really common). The same rings true for advertisements for all sorts of scams in your forums, and of course click fraud. Either way, this walkthrough is meant for testing your own site for load, and maybe for showing how easy it is to automate web tasks, not to propagate FUD, show how to engage in click fraud, or create a malicious botnet. Have fun with it.