Sometimes a little sleep is ok

Thread.sleep() has always been considered a “bad thing” in programming, something you do only when you don’t have a better alternative. You could sleep() while polling for a long-running operation, but the better solution is to get a notification when the operation completes, to avoid blocking a thread and generally wasting time: if the operation finishes in 600 milliseconds and you poll only once a second, you’ve completely wasted 400 milliseconds, right?
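
To make the polling-vs-notification trade-off concrete, here is an illustrative sketch in the same JavaScript style as the snippet below; checkJobStatus, onJobCompleted, jobId and the push channel are hypothetical stand-ins, not anything from this post:

// Polling: ask the server every second whether the long operation finished.
// If it actually finished after 600 ms, we still sit idle until the next
// tick, "wasting" roughly 400 ms.
var poller = setInterval(function() {
  checkJobStatus(jobId, function(status) { // hypothetical status endpoint
    if (status.done) {
      clearInterval(poller);
      onJobCompleted(status);
    }
  });
}, 1000);

// Notification: the server pushes a message the moment the operation
// completes (e.g. over a websocket), so nothing is wasted between polls.
socket.on('job-completed', onJobCompleted); // hypothetical push channel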

Well, it turns out there is at least one use case where you really should waste those extra 400 milliseconds: the user interface.

I recently implemented a web UI component that checks if another website is reachable, and only then proceeds. If the website is not reachable, the user has to stay on that page and try again (perhaps he mistyped the URL). I implemented a server method that checks if the URL is alive, and called it from the client side via AJAX. The code looks something like this:

showLoadingIcon(); // displays a "loading" GIF
$.post(url, function(result) {
  hideLoadingIcon();
  if (result.good) {
    // proceed
  } else {
    // Display error message
  }
});

Well, it turns out that while this code is perfectly correct and efficient, it feels erratic from a usability perspective. If the remote page took a long time to reach, the UI behaved fine; but if the check returned very quickly (within, say, 200 milliseconds), the “loading” icon would flicker, creating an annoying experience for the user.

The solution is adding a manual, “artificial” sleep(). If the check takes long enough, no extra delay is needed; but if it returns too quickly, I add a call to setTimeout to ensure the loading image keeps spinning for at least a full second.
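
A minimal sketch of that approach, reusing the snippet above; the MIN_SPINNER_MS constant and the Date.now() bookkeeping are my illustration of the idea, not the original code:

var MIN_SPINNER_MS = 1000; // "at least a full second"

showLoadingIcon(); // displays a "loading" GIF
var startTime = Date.now();

$.post(url, function(result) {
  // How much of the minimum display time is left, if any?
  var elapsed = Date.now() - startTime;
  var remaining = Math.max(0, MIN_SPINNER_MS - elapsed);

  setTimeout(function() {
    hideLoadingIcon();
    if (result.good) {
      // proceed
    } else {
      // Display error message
    }
  }, remaining);
});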

While this is not literally an example of Thread.sleep() (JavaScript doesn’t really have an equivalent), it does show that sometimes delaying an interaction is required to reduce the user’s pain or confusion.

6 Comments

  1. Avish:

    So your solution for “it happens too fast and my long-loading icon is flickering” is to artificially make it slower? Wouldn’t it be better to just address the flickering, by e.g. showing the spinner only after a certain time or fading it in/out so it doesn’t flicker? You’re literally saying “dear user, this could’ve been instantaneous but for your convenience I’ve made it take longer so you could appreciate my nice spinner”. I don’t think that’s a very good idea.

  2. ripper234:

    The whole page is supposed to close and move to the next one when the callback finishes.
    No point in making just the “loading” icon fade out, if the entire page changes or closes.

    Also, the loading GIF is animated in the first place; adding a second dynamic effect like fading it out isn’t good, in my subjective opinion.

    I’m arguing that the delay does not detract from the user experience, but rather makes it smoother and more consistent.

    From the comments here, and on Facebook and G+, it looks like I’m in the minority.

  3. M. A. Hanin:

    One thing I’ve learned is that it’s OK to be religious about different aspects of software development, but being fanatical is usually counter-productive. Even in “AntiPatterns”, the author wisely specifies exceptions – cases where an AntiPattern might be acceptable. That includes Gotos, DoEvents/ProcessMessages, Waits/Sleeps… don’t judge a tool by its misuse.

  4. ripper234:

    @Hanin – Definitely! I always support practicality over religious “it should be this way because … the book says so” arguments.

  5. sun:

    Another exception to the rule: part of a project I am working on crawls other sites, and I send the 6 crawling threads to sleep after every page read to be polite to the crawled sites.

  6. ripper234:

    @sun – Nice use case.

    When you’re writing web crawlers, you often need to be mindful of your resources, and since sleep() wastes a thread, you might want to look at other solutions like async I/O.

    Still, I agree that if resources on your own crawler are not an issue, a quick-and-dirty sleep() on the crawler threads might be the best solution. Your comment takes me back a bit to my days of writing web crawlers at Delver 1.0. (We were a social search engine once)