GSoC Project: Improving RubyBench

The top-level goals for the project suggest that we need a framework similar to ‘git bisect’ for our
benchmarks. Can someone please elaborate on that? That is, what final outcome is expected from creating an equivalent of git bisect for our benchmarks?

3 Likes

Hi @shahsaurabh0605!

Congrats on being accepted!

To start, we should think about how we can quickly benchmark across multiple commits.

From my previous discussions with @sam, we want to build a docker image for each commit made to Ruby and tag it with the commit’s SHA1. So a few things to think about here:

  • How can we automate the building of those images?
  • How do we build each image fast? (One solution is to build each image using the previous image as the base so we don’t have to recompile everything each time)
  • Are we able to store everything on Docker hub?

Once this is done, we can start thinking about how to make the magic for bisecting happen :slight_smile:

cc @prathamesh

2 Likes

Hi @system,

Thanks a lot for selecting me.

I am currently busy with my university examinations, which start on 29th April and end on 8th May. Will it be fine if we continue our discussions on the project after that?

Once my examinations are done, I have no other commitments this summer and can completely dedicate my time to the project.

Thanks.

1 Like

No worries! Good luck with your exams! Just ping us here once you’re ready!!

1 Like

I can imagine the image sizes accumulating, since the Ruby binaries are indeed different for every commit. But the gemsets seem reusable.

- base_system
  |- Ruby 2.1.x gemsets
  |    |- Ruby 2.1.0 commit xxx binary
  |    |- Ruby 2.1.1 commit xxx binary
  |- Ruby 2.2.x gemsets
       |- Ruby 2.2.0 commit xxx binary
       |- Ruby 2.2.1 commit xxx binary
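
As a very rough sketch of what I mean (the image names and paths below are made up, not what RubyBench actually uses), the shared layer could carry a local cache of .gem files so each per-commit image only compiles Ruby and installs gems from that cache:

    # Rough sketch only: image names and paths are made up.
    # rubybench/base_system would hold the build tools plus a local cache of
    # .gem files in /vendor/cache, shared by every per-commit image.
    cat > Dockerfile <<'EOF'
    FROM rubybench/base_system
    RUN git clone https://github.com/ruby/ruby.git /ruby \
     && cd /ruby && autoconf && ./configure && make -j4 && make install \
     && gem install --local /vendor/cache/*.gem
    EOF
    docker build -t rubybench/gemset-demo .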

Hi @system,

I am new to RubyBench and benchmarking, so I would like to get up to speed quickly in order to contribute. I went through the various repositories to understand how benchmarking works and how RubyBench generates its results. I also read about Rubyfy.Me in “Development of benchmarking suite for various ruby implementations” to gain deeper insight. Is there a similar thesis or documentation written for RubyBench so that I can thoroughly understand the workflow and how Docker is used?

When I went through the benchmarks on RubyBench, the Ruby releases benchmarks show graphical results across the different releases plus the ‘latest’ commit. On the other hand, the Ruby commit benchmarks show results for commits from a few months ago rather than the latest commits. Is your suggestion to build a Docker image for each commit made to Ruby aimed at fixing this?

1 Like

Note, just going to mention this here.

I think the bisecting work is very interesting and important; however, I feel that the most important thing is getting our benchmarks in order.

I feel there is a huge gap around our Rails benchmarks. In particular, have a look at, say:

https://rubybench.org/rails/rails/commits?result_type=activerecord/mysql2_scope_all_with_default_scope&display_count=200

  • What if this was implemented raw against the mysql gem, how fast would it be? How many objects would be allocated?

  • What if this was implemented in Sequel? How fast would it be? How many objects would be allocated?

I feel we should split all “database” specs off from the Rails umbrella and make the suite of tests compare “raw” to “Sequel” to “Active Record”. As it stands, it is totally unclear how much performance is left on the table, because there is nothing to compare “Active Record” against.

2 Likes

To be able to quickly benchmark across multiple commits, we need to build our Docker images fast. As far as I understand it, we are currently building a Docker image for each commit, but our choice of base image means that for every single commit we have to start from the very beginning and build the image from scratch.

At present, for ruby_trunk say, we have our base image and then we clone the repositories in the Dockerfile. One possible solution would be to have the base image ready with the repositories already cloned, so that we can just run ‘git pull’ in the Dockerfile to fetch the latest commits. This would save a lot of time, and the images could also be stored on Docker Hub.

cc @prathamesh

1 Like

This doesn’t sound right. The Docker image is only built once, and we run a container based on the relevant Docker image whenever we need to benchmark something. For each commit, what we do now is pull the latest changes from Ruby’s repository, compile it, and then run our benchmarks. The problem with this approach is that compiling Ruby takes up a lot of time. Instead, we want to build an image for each commit of Ruby and tag it with the relevant commit SHA.
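
To make the difference concrete, here is roughly what happens today versus what we want; the paths and script name are only illustrative, not the actual runner:

    # Illustrative only; the real runner script differs. Today, inside the
    # long-lived benchmark container, each run does roughly:
    git -C /ruby pull origin trunk
    cd /ruby && autoconf && ./configure && make -j4 && make install
    ruby /benchmarks/run_benchmarks.rb       # hypothetical entry point
    # What we want instead is a pre-built image per commit, so a run becomes:
    # docker run rubybench/ruby:<commit-sha> ruby /benchmarks/run_benchmarks.rb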

I’m not quite following here. Aren’t we already doing this? :slight_smile:

Now I understand this properly. I think we can keep the base image as it is and change our Dockerfile. We can maintain a variable that points to the commit’s SHA1. As soon as there is a commit in Ruby, we can build the Docker image: the variable will point to the new commit, the Dockerfile will build from there, and a new image will be created. The runner will benchmark this new commit and exit once it’s done. This way we can build a Docker image for each commit.

But as you mentioned, recompiling EVERYTHING is the main hurdle here. I am not able to figure out how I can use the previous Docker image to avoid recompiling everything. Do you have any ideas?

We can have a base image that compiles a particular commit of Ruby, and then every subsequent commit will build a new image based on it. This way, you just pull in the differences and run make && make install, which will only recompile the parts that have changed.
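
A minimal sketch of what that Dockerfile could look like (the image names and commit SHAs below are placeholders, not the actual setup):

    # Sketch only: image names and commit SHAs are placeholders.
    PREV_SHA=abc1234   # commit the previous image was built from
    NEW_SHA=def5678    # commit we want to build now
    cat > Dockerfile <<EOF
    FROM rubybench/ruby:$PREV_SHA
    # /ruby already holds the compiled tree from the previous build, so make
    # only recompiles what changed between the two commits
    RUN cd /ruby && git fetch origin && git checkout $NEW_SHA \
     && make -j4 && make install
    EOF
    docker build -t rubybench/ruby:$NEW_SHA .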

The task at hand here is that we want to be able to automate the build of the image for each commit.

Once we have set up a new image for the new commit, we can programmatically trigger the builds on Docker Hub through Remote Build Triggers. Can this be done?
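
For reference, a remote build trigger is just an HTTP endpoint the runner could hit with something along these lines; the exact URL and payload come from the trigger settings page, so treat everything below as placeholders:

    # Placeholders throughout: <user>, <repo> and TRIGGER_TOKEN come from the
    # Build Triggers page of the automated build on Docker Hub.
    TRIGGER_TOKEN=xxxxxxxx
    curl -X POST \
         -H "Content-Type: application/json" \
         --data '{"source_type": "Branch", "source_name": "master"}' \
         "https://registry.hub.docker.com/u/<user>/<repo>/trigger/$TRIGGER_TOKEN/"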

1 Like

Yup that could work :slight_smile:

1 Like

Now, what I plan to do is this:

We can have our base image set up for a particular commit of Ruby. We will put the Docker Hub trigger curl in the runner script to trigger the build. So whenever there is a commit in Ruby, the runner script will run, fetch the commit and trigger the build. This will keep building image layers for specific commits, and compilation will take less time.

A few doubts I have here:
---- Can I create a Dockerfile with any commit as my base image? Probably the latest commit right now.

---- I am unable to understand the runner script completely. As soon as there is a commit in Ruby, the runner script responds immediately and does its work, running once per commit. Am I correct?

---- I am confused about where the Docker Hub curl URL will go. It will obviously be after I fetch the commit in the runner, but where exactly?

We will want to backfill at least 2000 commits. So probably start with that :slight_smile: If possible, see if you can build the next commit based off the previous image.

Check out https://github.com/ruby-bench/ruby-bench-web/blob/master/app/jobs/remote_server_job.rb. This is how we trigger the benchmark runs.

Basically, we have a cron job on the server that pulls the latest commits from ruby/ruby and pushes them to tgxworld/ruby. On tgxworld/ruby, we have a GitHub hook that hits an endpoint on rubybench.org, which triggers one of the remote server jobs you see there. That is how we run the benchmarks.
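
In other words, the cron side is essentially a small mirroring script along these lines (only a sketch; the real one lives on the RubyBench server):

    # Hypothetical cron script: mirror upstream Ruby commits into the fork
    # that has the rubybench.org webhook configured. Paths and branch name
    # are illustrative.
    cd /srv/ruby-mirror
    git fetch https://github.com/ruby/ruby.git trunk
    git push git@github.com:tgxworld/ruby.git FETCH_HEAD:trunk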

Ok, I think you are confused here. The Docker images which contain the runner scripts are just meant for running the benchmarks. Ideally, that should be the only thing they need to do.

What I think you can do for this week is fork Ruby and see if you can get a Docker image for each commit made to Ruby. Basically, try to get Automated Builds (see “Set up Automated Builds” in the Docker docs) working with your fork of the Ruby repository.

By the way, don’t hesitate to ask as many questions as you need. :grin:

For now, I have created a repository, Ruby-Docker, which works as follows:

In Docker_base I have created the Dockerfile for the base image, which goes back 2000 commits, while Automated_Docker just uses the previous image and pulls the latest commit.

Then I created an automated build on Docker Hub. First, I manually trigger the base image build. As soon as there is a commit in Ruby, the trigger command executes in remote_server_job.rb. (For now I don’t have the webhook set up, so as soon as there is a commit in Ruby I run the command from the terminal.) This triggers the automated build for the other Dockerfile. For successive commits in Ruby, a new image is formed using the previous image as the base.

Please comment on the improvements to be made!

The base image has to be built before remote_server_job is triggered, because remote_server_job will eventually have to use that image to run the benchmarks. Basically, don’t worry about anything in ruby-bench-web for now.

Ok, can you get an image of Ruby for a particular commit onto Docker Hub to show that it works?

Sure, remote_server_job has to use the built image to run the benchmarks. So here I triggered Docker Hub and built the image first. Then we can pull this built image and run the benchmarks.

To show that each image is currently built for a particular commit of Ruby, you can view the build status:
There are three recent successes in the build details. First, head over to the third-from-last success: it shows that I pulled a particular commit of Ruby. In the second-from-last success I simply triggered the process again, and the logs clearly show that the repository was already up to date. Then I made a commit in my forked Ruby repo and triggered the build again, so the logs of the last success show that this image just pulled the latest commit.

So, on the whole, for every commit I triggered the build, and this built an image layer for that particular commit.

I forgot to mention that I am just using git pull in my Dockerfile for demonstration. We will use the runner script in that place.

1 Like

Ah, I see. Ok, in your automated build Dockerfile, you need to be compiling Ruby. Plus, each image you build has to be tagged with the commit SHA1.

Ok, now that the automated build is going, is it possible to do the following yet?

Assuming commit 1234 was just pushed to Ruby trunk, I should have an image rubybench/ruby:1234 where I can do docker run rubybench/ruby:1234 ruby -v and be running Ruby built with commit 1234 as the latest.
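
So, as a sketch, the end-to-end flow for one commit would look roughly like this (the build step stands in for whatever the automated build ends up doing):

    # Sketch of the per-commit flow; SHA stands for the commit being built.
    SHA=$(git -C ruby rev-parse --short HEAD)
    docker build -t rubybench/ruby:$SHA .          # compiles Ruby at that commit
    docker push rubybench/ruby:$SHA
    docker run --rm rubybench/ruby:$SHA ruby -v    # prints the Ruby just built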