On a recent project at Tradier, we relied heavily on Redis hashes. We really like Redis for its versatility: its various data types open up lots of possibilities for solutions. We were thoroughly impressed by its tolerance of the write-heavy data stream we were pushing into it. But as fast as we were able to write data in, we found multi-key reads to be more cumbersome than we'd like.
Using the Redis Ruby gem, we first turned to pipelined requests. Pipelined requests return futures, meaning that in order to fully queue and then load the results, you essentially have to loop twice:
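A sketch of that double loop (the method name and shape here are illustrative, not the post's exact code; it assumes the "redis" gem and hash-typed keys):

```ruby
# First pass queues one HGETALL per key on the pipeline; the futures only
# carry values once the pipelined block has completed, forcing a second pass.
def pipelined_hgetall(redis, keys)
  futures = {}
  redis.pipelined do |pipe|
    keys.each { |key| futures[key] = pipe.hgetall(key) }
  end
  keys.each_with_object({}) do |key, result|
    result[key] = futures[key].value
  end
end
```

With a live connection this would be invoked as `pipelined_hgetall(Redis.new, keys)`.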
While this does the job, it's tedious, and with large key sets it's not as performant as we'd like. What we'd really like is something closer to Memcached's multi-get support. As we considered other solutions, we decided to take a look at Redis' scripting support to see if it could help. Not knowing much about Lua going in, we were pretty surprised by how powerful it is. Using Lua, we can make a single request to Redis, passing all of the keys as an argument to the Lua script:
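A sketch of such a script, kept in a Ruby string so it can be sent with `EVAL` (the `collate` helper is a local Lua function we define; the names are ours):

```ruby
# Each entry in KEYS names a Redis hash; collate loads one of them.
COLLATE_SCRIPT = <<~LUA
  local function collate(key)
    -- HGETALL replies with a flat { field1, value1, field2, value2, ... }
    return redis.call('HGETALL', key)
  end

  local results = {}
  for i, key in ipairs(KEYS) do
    results[i] = collate(key)
  end
  return results
LUA
```

On the Ruby side a single call dispatches it — `redis.eval(COLLATE_SCRIPT, keys: keys)` — since redis-rb's `eval` exposes a `keys:` array that becomes `KEYS` inside the script.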
The code was fairly simple. We can loop through KEYS and invoke the collate function we defined to load the hash data. The challenge then became passing this data back to our Ruby code. We found that while Lua objects will not easily serialize back to Ruby objects, Redis' Lua implementation offers some options: namely cjson and cmsgpack. We need only encode the result and return it from the script, and we're now returning data back:
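A sketch of that return path with cjson (cmsgpack has the same shape, swapping in `cmsgpack.pack`). The `Hash[*flat]` step rebuilds each Ruby hash from HGETALL's flat field/value array; the helper name is an assumption:

```ruby
require "json"

JSON_COLLATE_SCRIPT = <<~LUA
  local results = {}
  for i, key in ipairs(KEYS) do
    results[i] = redis.call('HGETALL', key)
  end
  -- cjson is built into Redis' Lua runtime; one encoded blob comes back
  return cjson.encode(results)
LUA

# One round trip, then decode and rebuild the hashes in Ruby.
def multi_hgetall_json(redis, keys)
  payload = redis.eval(JSON_COLLATE_SCRIPT, keys: keys)
  JSON.parse(payload).map { |flat| Hash[*flat] }
end
```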
What we found is that between pipelined requests, Lua + JSON, and Lua + MessagePack, MessagePack was the best performer of the three. Our final solution ended up looking something like this:
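Something like the following sketch (the real code and names differed; decoding assumes the "msgpack" gem):

```ruby
MSGPACK_COLLATE_SCRIPT = <<~LUA
  local results = {}
  for i, key in ipairs(KEYS) do
    results[i] = redis.call('HGETALL', key)
  end
  -- cmsgpack, like cjson, ships inside Redis' Lua runtime
  return cmsgpack.pack(results)
LUA

def multi_hgetall(redis, keys)
  require "msgpack"  # gem install msgpack
  packed = redis.eval(MSGPACK_COLLATE_SCRIPT, keys: keys)
  MessagePack.unpack(packed).map { |flat| Hash[*flat] }
end
```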
Of course, no post like this would be complete without a benchmark (using 10K keys):
```
                      user     system      total        real
lua + json        0.350000   0.010000   0.360000 (  1.242315)
lua + msgpack     0.260000   0.020000   0.280000 (  1.146377)
redis pipelined   1.070000   0.020000   1.090000 (  1.759858)
```
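Numbers in that shape come out of Ruby's stdlib Benchmark module; a sketch of the harness (the connection and the two Lua script bodies are assumed, not shown):

```ruby
require "benchmark"

# Compares the three strategies; json_script / msgpack_script stand in for
# the Lua variants and redis for a live connection.
def compare_strategies(redis, keys, json_script, msgpack_script)
  Benchmark.bm(15) do |bm|
    bm.report("lua + json")      { redis.eval(json_script, keys: keys) }
    bm.report("lua + msgpack")   { redis.eval(msgpack_script, keys: keys) }
    bm.report("redis pipelined") do
      redis.pipelined { |pipe| keys.each { |key| pipe.hgetall(key) } }
    end
  end
end
```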
Overall, we were pleasantly surprised by Redis and Lua, and it's definitely a combination we'll turn to again in the future.