Performance Blog

Load Testing: Emulated or real browsers?

Posted on: February 15, 2012

Enterprise applications are typically tested for load, performance, and scalability using a driver that emulates a real client by sending requests to the application much as a real user would. For web applications, the client is usually a browser and the driver is a simple HTTP client. Emulated HTTP clients can be extremely lightweight, allowing one to run several hundred or even thousands of driver agents (depending on the think time) on a single box. Of course, tools vary widely – some can be quite heavyweight, so it’s important to do a thorough evaluation first. But I digress.
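To make this concrete, here is a minimal sketch of such an emulated-client driver using only the Java standard library. The user count, think time, and iteration count are illustrative, and the example spins up a local stand-in server so it is self-contained; a real test would point the agents at the application under test.

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.atomic.AtomicLong;

public class MiniLoadDriver {
    public static void main(String[] args) throws Exception {
        // Stand-in server so the example runs on its own; a real load test
        // would target the application under test instead.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", ex -> {
            byte[] body = "ok".getBytes();
            ex.sendResponseHeaders(200, body.length);
            ex.getResponseBody().write(body);
            ex.close();
        });
        server.start();
        String url = "http://localhost:" + server.getAddress().getPort() + "/";

        int users = 20;          // emulated users (illustrative)
        int requestsPerUser = 5; // iterations per user
        long thinkTimeMs = 10;   // pause between requests, emulating user think time

        HttpClient client = HttpClient.newHttpClient();
        AtomicLong completed = new AtomicLong();
        Thread[] agents = new Thread[users];
        for (int i = 0; i < users; i++) {
            agents[i] = new Thread(() -> {
                try {
                    for (int r = 0; r < requestsPerUser; r++) {
                        HttpRequest req = HttpRequest.newBuilder(URI.create(url)).GET().build();
                        HttpResponse<String> resp =
                            client.send(req, HttpResponse.BodyHandlers.ofString());
                        if (resp.statusCode() == 200) completed.incrementAndGet();
                        Thread.sleep(thinkTimeMs);
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            agents[i].start();
        }
        for (Thread t : agents) t.join();
        server.stop(0);
        System.out.println("completed=" + completed.get());
    }
}
```

Each agent here is just a thread with an HTTP client and a sleep for think time, which is why hundreds of them fit comfortably on one box where a real browser per user would not.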

As web pages grew larger and incorporated more components, performance testing tools started including a recording tool that could capture the HTTP requests as a user navigates the application using a regular browser. The driver agent would then “play back” the captured requests. Many tools also allow modification of the requests – unique logins, cookies, etc. – to correctly emulate multiple users. Such a “record-and-playback” methodology is part of most enterprise load testing tools.
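The per-user modification step usually amounts to templating the captured request. A hypothetical sketch in Java – the placeholder names and header values are illustrative, not taken from any particular tool:

```java
public class PlaybackTemplate {
    // A captured request with placeholders inserted after recording.
    // The {{USER}} and {{SESSION}} markers are illustrative conventions.
    static final String RECORDED =
        "GET /account HTTP/1.1\r\n" +
        "Host: app.example.com\r\n" +
        "Cookie: JSESSIONID={{SESSION}}\r\n" +
        "X-User: {{USER}}\r\n\r\n";

    // Substitute unique values so each emulated user looks distinct to the server.
    static String forUser(String user, String session) {
        return RECORDED.replace("{{USER}}", user)
                       .replace("{{SESSION}}", session);
    }

    public static void main(String[] args) {
        // Each emulated user replays the same capture with its own login/session.
        for (String[] u : new String[][]{{"alice", "s-101"}, {"bob", "s-102"}}) {
            System.out.print(forUser(u[0], u[1]));
        }
    }
}
```

Without this substitution, every agent would replay the same session cookie, and the server would see one very busy user instead of many.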

Today’s web applications are complex and sophisticated, with tons of JavaScript that tracks and handles all sorts of mouse movements, multiple XHR requests on the same page, persistent connections using the Comet model, etc. If JavaScript generates dynamic requests, composing the URLs on the fly, the recorded scripts will fail. Of course, if the performance testing tool provides a rich interface giving the tester full flexibility to modify the load driver, it is still possible to create a driver that can drive these rich Web 2.0 applications.

Browser Agents

Increasingly, many in the performance community are abandoning the old-style HTTP client drivers in favor of browser agents, i.e., an agent/driver that runs an actual full-featured browser. The obvious advantage of going this route is the dramatic simplification of test scripts – you can give it a single URL and the browser will automatically fetch all of the components on the page. If the page contains JavaScript that in turn generates more requests – no problem. The browser will handle it all.

But at what cost?

If you’re thinking that this sounds too easy, what’s the catch … you’re right. There is a price to pay for this ease of use, in both CPU and memory resources on the test driver systems. A real browser can consume tens to hundreds of megabytes of memory and significant CPU resources as well – and this is just for driving a single user! Realistically, how many browsers can you run on a typical machine, especially considering that driver boxes are typically older, slower hardware?

So what can we do to mitigate this problem?

Emulated Browsers with Real Javascript Engine

A compromise solution is to use a thin browser that does not have all of the functionality of a real browser but does include a real JavaScript engine. An example is HtmlUnit, a Java library that is much lighter-weight than a real browser like IE or Firefox. The caveat here is that your performance testing tool must provide the capability to make calls to arbitrary third-party libraries. Many tools have very simplistic scripting capabilities that may not allow using HtmlUnit.
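The HtmlUnit side of such a driver can be quite small. A sketch (it requires the HtmlUnit jar on the classpath, and the URL is a placeholder):

```java
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class HtmlUnitAgent {
    public static void main(String[] args) throws Exception {
        WebClient webClient = new WebClient();
        // HtmlUnit fetches the page, executes its JavaScript (via Rhino),
        // and downloads referenced resources, much like a real browser would.
        HtmlPage page = webClient.getPage("http://app.example.com/login");
        System.out.println(page.getTitleText());
        webClient.closeAllWindows();
    }
}
```

Because a WebClient is just a Java object, a load tool that can call third-party libraries can instantiate one per emulated user at a fraction of a real browser’s footprint.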

Summary

Many people seem to think that just because they have JavaScript or XHR requests, they need to use a real browser for scalability testing. This is untrue – in all but the most complex cases, you can still use an emulated client (the exception is when requests are generated from JavaScript based on complex logic that is not easy to reproduce). Keep in mind that the purpose of load/scalability testing is to measure throughput. To do so, you want the lightest possible client so you can run the maximum number of emulated users with the minimum amount of hardware. Using a real browser should be the last option to consider.


3 Responses to "Load Testing: Emulated or real browsers?"

Basically you’ve summed up what I found out in a thesis, too.
Personally, I don’t trust HtmlUnit that much. It seems very likely to fail with modern, very common, yet complex libraries like jQuery or Sencha ExtJS, and probably others, too. One reason may be that it uses the JavaScript interpreter Rhino. I wouldn’t call Rhino a “real JavaScript engine”. Rhino is not used in any real browser. It is “simply” a JavaScript interpreter, without all the browser objects like “window” or “document” for the DOM.

Those browser objects are provided by HtmlUnit. But as you indicated, HtmlUnit is an emulator. Its implementation of “window” and the DOM is purely custom. HtmlUnit is a unique browser product “with its own quirks”, as the Selenium folks like to say. However, at the moment, I don’t see any good alternative to HtmlUnit, either. This is actually sad.

I often wish there were more “integrators” in place for Java. An integrator, in my terms, would ideally integrate WebKit, which is used in Chrome and Safari. Really promising at this point is “WebKitDriver”, a Selenium WebDriver for WebKit. Unfortunately, the project does not seem to be very actively pushed forward at the moment. At the time of writing, I couldn’t build it the way their docs describe.

Apart from those solutions for the Java platform, some good headless browsers can be found in the JavaScript space. There are also emulators like Zombie.js, and integrators like PhantomJS, which integrates WebKit. Unfortunately, I couldn’t figure out so far how PhantomJS could be run and controlled in parallel so that it can be used for load or scalability testing.

Personally, I don’t think that HtmlUnit will keep up with the changes the web will face in the near future, mainly due to its conceptual shortcomings of using Rhino and implementing a fully custom web stack. Maintaining this stack and keeping pace with the legions of developers at Apple, Mozilla, Google, or Microsoft is maybe more than the group behind HtmlUnit is capable of dealing with. But maybe I’m wrong.

As much as I like HtmlUnit and value the work done by its developers, I see a need for better alternatives. And in my opinion, if using real browsers is unavoidable to ensure the quality of modern web applications nowadays, then more pressure must be put on the browser vendors to make their PLATFORMS also runnable in a lightweight headless mode, which would be a tremendous step towards more efficient toolchains and development environments.

Super post!
2012.10.06. 22:32:33

Thanks for the post. I opened a discussion on G+. https://plus.google.com/108668658731420474501/posts/dLQAEiBZgBE

I guess sooner or later using real browsers for load testing will be unavoidable, even though it will cost a ton of resources.

