26.10.12

REST chattiness

Over the last few days I have been trying out a very fine-grained implementation of graph-data access over REST. As a result, the traffic between my client (backbone+marionette) and server (nginx+passenger+RoR) turned out to be rather chatty, which is a potential performance concern. Here's what Chrome reported about it:


I think that handling over 110 requests in about 3 seconds sounds pretty good, especially considering that rendering kicks in as soon as the list of issues related to the project has been fetched, about 500ms after the start.
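To make "fine-grained" concrete: instead of one aggregate payload, the client pulls many small resources and renders as soon as the list arrives. The sketch below only illustrates that pattern, not the actual code; the Issue/Issues resources, the URLs, and the `title` attribute are made up, and it uses present-day Backbone/Marionette APIs:

```typescript
import * as _ from "underscore";
import * as Backbone from "backbone";
import * as Marionette from "backbone.marionette";

// Each issue is its own REST resource; related data (comments, links, ...)
// sits behind further small endpoints, which is where the chattiness comes from.
const Issue = Backbone.Model.extend({ urlRoot: "/issues" });

const Issues = Backbone.Collection.extend({
  model: Issue,
  // Project id hard-coded for the sketch; the real app parametrizes it.
  url: "/projects/42/issues",
});

// Hypothetical views: one <li> per issue, assuming each issue has a "title".
const IssueView = Marionette.View.extend({
  tagName: "li",
  template: _.template("<%= title %>"),
});

const IssueListView = Marionette.CollectionView.extend({
  tagName: "ul",
  childView: IssueView,
});

// Rendering starts only once the issue list has arrived (~500ms in);
// each rendered issue can then pull its own details with further requests.
const issues = new Issues();
issues.fetch({
  success: () => new IssueListView({ collection: issues }).render(),
});
```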


The tested set-up is as follows:
  • Server-side: Ubuntu 11.04 VM running inside VirtualBox
    • 4 cores of a Core 2 (6MB of cache)
    • 1GB of RAM
    • 8 nginx worker processes
    • 100Mbit wired Ethernet
  • Proxy: CentOS VM running on some infrastructure
    • node.js based http-proxy forwarding HTTP and WebSockets (see the sketch after this list)
    • single core, 1GB RAM, 100 or 1000Mbit Ethernet
  • Client-side: OS X with Google Chrome (24.0 - Canary)
    • 8 cores of an i7
    • 8GB of RAM (with a ton of applications running)
    • 100Mbit wired Ethernet
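The proxy layer itself is tiny. Below is a minimal sketch of what it does, written against the current http-proxy API (the 2012-era node-http-proxy API differed slightly); the upstream address and listen port are made up:

```typescript
import * as http from "http";
import * as httpProxy from "http-proxy";

// Made-up upstream address; in the real setup this would point at the
// nginx+passenger VM.
const proxy = httpProxy.createProxyServer({ target: "http://10.0.0.10:80" });

// Plain HTTP requests (all the REST chatter) are forwarded as-is.
const server = http.createServer((req, res) => {
  proxy.web(req, res);
});

// WebSocket upgrades are forwarded too, so push traffic keeps working
// through the proxy.
server.on("upgrade", (req, socket, head) => {
  proxy.ws(req, socket, head);
});

server.listen(8000);
```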
I don't know about you, but this sounds fairly good to me.