Summary scores: performance at a glance


Have you considered any aggregate performance scores for RubyBench? There are a couple of interesting questions…

Which tests are most relevant, that is, which tests best reflect ruby performance under different circumstances (e.g. Rails web app, sinatra/rack web app, number crunching, etc)?

At a glance, how does a version of ruby compare to other versions of ruby? Is ruby getting faster? Is garbage collection less disruptive (or more)?

I see value in the specific tests, but as a first-time visitor or as a general ruby developer, it’s not obvious how a version of ruby compares to others, holistically.

Have you discussed aggregate metrics or a small set (1-3) of metrics that give viewers an at a glance view of performance?



The macro benches (the suite of Discourse benches) are good at telling a lot of this stuff; you can gauge GC progress by looking at the 90th / 99th percentile response times.
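To make the percentile point concrete, here is a minimal sketch of how a list of per-request timings could be reduced to tail percentiles using the nearest-rank method. The `timings` values and method names are made up for illustration, not taken from the RubyBench code.

```ruby
# Sketch: reduce per-request timings (ms) to tail percentiles.
# Uses the nearest-rank method on a pre-sorted array.
def percentile(sorted, pct)
  # index of the value below which pct% of samples fall
  index = ((pct / 100.0) * sorted.length).ceil - 1
  sorted[index]
end

# Hypothetical response times; the two outliers stand in for GC pauses.
timings = [12.1, 13.4, 12.8, 55.0, 14.2, 13.1, 98.7, 12.5, 13.9, 14.0].sort

puts "median: #{percentile(timings, 50)} ms"
puts "p90:    #{percentile(timings, 90)} ms"
puts "p99:    #{percentile(timings, 99)} ms"
```

The median barely moves when GC pauses occur, while p90/p99 jump, which is why the tail percentiles are the useful GC signal here.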

At the moment the biggest low-hanging fruit missing there is memory usage. @tgxworld how can we wire this into the Discourse benches?
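As one possible starting point, a bench run could sample the process's resident set size plus Ruby's own heap counters after each run. This is only a sketch: the `/proc` read is Linux-specific, and the helper name is hypothetical.

```ruby
# Sketch: sample memory usage at the end of a bench run.
# Linux-specific; reads resident set size from /proc/<pid>/status.
def rss_kb
  File.readlines("/proc/#{Process.pid}/status")
      .find { |line| line.start_with?('VmRSS:') }
      .split[1].to_i
end

GC.start  # settle the heap before sampling so numbers are comparable

puts "RSS: #{rss_kb} KB"
puts "live heap slots: #{GC.stat[:heap_live_slots]}"
```

Reporting both numbers is useful: RSS captures total process footprint (including malloc'd memory), while `GC.stat` shows what the Ruby object heap itself is doing.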

I am kind of against bundling every microbench into one big number, it feels contrived and is not really demonstrating anything “real”.

People are much more interested in “real” problems like:

  1. How fast will my website run?
  2. How long will my test suite take (we should add this @tgxworld )
  3. What is the memory impact of an upgrade?
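For reference, when people do collapse a benchmark suite into one big number, the conventional approach is a geometric mean of per-benchmark speedups against a baseline ruby. The speedup figures below are invented purely to show the mechanics, which may help explain why such a number feels contrived: it hides exactly the per-workload differences the list above cares about.

```ruby
# Sketch: collapsing a suite into one number via geometric mean.
# A geometric mean is used (rather than arithmetic) so the result
# does not depend on which ruby is chosen as the baseline.
def geometric_mean(ratios)
  ratios.reduce(1.0) { |acc, r| acc * r } ** (1.0 / ratios.length)
end

# Hypothetical speedups of a candidate ruby vs a baseline on three benches:
# 10% faster, 5% slower, 30% faster.
speedups = [1.10, 0.95, 1.30]

puts format('aggregate speedup: %.3fx', geometric_mean(speedups))
```

The single aggregate here reads as roughly "1.1x faster overall", even though one of the three workloads actually regressed.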


I’ll get it up soon. We actually have the memory results / time loading rails but they are all in the labels when you scroll over the chart.


Hmm I think Discourse benchmarks can already answer point 1 and 3. I’ll have to look into point 2. Not sure how I’ll do it right now.


I share your concerns as well. Currently, RubyBench only serves the Ruby core team but isn’t useful to the general public. I’d like to cater to both groups, but with our limited human resources right now, I think we should focus on working with Ruby core first since it has a greater impact on the whole Ruby ecosystem.

Honestly, not yet. :stuck_out_tongue_closed_eyes:


That makes sense, but it assumes people know to check out the Discourse benches to see how a rails website might perform. I had a suspicion that was the best benchmark for that since I read your blog, but I wasn’t sure and didn’t really know which of the Discourse benches was the best. As far as most of the benchmarks go, I have no idea unless I go in and look at the code. I don’t know why all of the “so” tests are grouped together, or why there are 4 “vm” groupings.

I guess my question is really around the first time visitor experience. Right now it’s geared towards people who know what these benchmarks are. Maybe that’s good enough, though I think it’s not going to be very interesting to everyday ruby developers. It seems like there’s an opportunity to explain to visitors why they should care about a set of benchmarks, or any specific benchmark.

If you’ve chatted about it, I’d be interested in knowing where it landed. We might be able to help implement some ideas (or rev on ideas).



I think it would be awesome to improve the “first impressions” here.

Perhaps a great first move would be a PR to improve the blue blurb in , maybe even make people go one level deeper to see benchmarks thus giving us a nice overview page?


@sam @xianpants I’ve been thinking about the summary problem and would like to propose an alternate view. How about a table like this? (pardon my poor sketching)

I think it’ll make more sense for people to view the percentages for all benchmark types in one place instead of looking at the graphs of each individual benchmark type.