I analysed our test suite speed in order to reduce build time. Here are my results :)

Hello all, this is my first post here - nice to e-meet you all :wave: :slight_smile:

I’m a developer at Funding Circle. I love writing tests, and I have recently become obsessed with optimising tests for speed. The reason: faster tests mean faster builds, which in turn mean more productivity and a happier dev/product team :raised_hands:

After a conversation with @filipefurtado over on Slack, I decided to run an analysis of OFN’s test suite, focusing on its speed. The hope was to come out with some actions that I/we could work on in order to make the test suite faster.

I’ve now finished my analysis, and I am excited to present my results here (see bottom of post).
Before you jump into the slides, some quick notes:

  • There are only 9 sparse slides, but for the time-sensitive among you, there is a TL;DR slide at the beginning
  • This is only my second time running such an analysis. If there is anything I’m missing, please do let me know :pray: I’m very keen to know people’s thoughts on this topic. I am fairly new to tech in general, so I am still in my learning phase :nerd_face:
  • I’m keen to start speeding up OFN’s test suite, so let me know if you agree with my “conclusion” slide and I can get cracking on with that work :muscle:

View analysis HERE


Thank you for that analysis! It’s good to know that there are no obvious improvements to make. The two suggestions from your analysis are already in progress:

  • Convert feature specs to system specs.
  • Work on slowest examples to speed them up.

So I don’t think that we need to take any additional action here. Do you agree?

Hey @shen-sat, a big thanks for this :raised_hands: :star_struck:
Thanks for your interest and initiative, and also for sharing the results and how to reproduce them.

I think your timing could not be better: we are assuming that the migration from feature → system specs will speed up our build, but I totally agree it is important to be able to measure and demonstrate this. Your procedure gives us a tool to do that.

In that sense, benchmarking the build before the spec migration and then repeating the data collection after we’re done should give us a good idea of the impact of this effort.

Just to be sure: at the time of data collection we only had one system spec merged, right? This is what I understand from the TAG_PROF=type analysis you shared here (slide 5).

Also, I found it very interesting that you looked for causes like common patterns/bottlenecks and factory cascades (slides 6 and 7) - something to watch out for. Before your slides I was totally new to this, so, again, thank you for sharing!
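
If anyone wants to reproduce that part locally, I believe the factory-cascade numbers come from test-prof’s FactoryProf - that’s my assumption about the setup behind the slides, and the `spec/features` path is just an example - but something along these lines should work:

```
# Profile factory usage across the feature specs; the report shows how often
# each factory is built and how much time is spent in it, which is where
# cascades (factories triggering other factories) show up.
FPROF=1 bundle exec rspec spec/features

# Optional: render the same data as a flamegraph to make deep cascades easier to spot.
FPROF=flamegraph bundle exec rspec spec/features
```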

Hi @maikel and @filipefurtado, thank you so much for the feedback :slight_smile:

@maikel - apart from the conversion of feature to system specs, I was unaware of any open PRs relating to speeding up the test suite. If you know of any, would you mind sharing the PR links? I’d love to look at them :nerd_face:

@filipefurtado - you are correct, at the time of analysis, we only had one system spec. How does the following plan sound in terms of benchmarking:

  • I’ll run the suite one more time with the TAG_PROF=type analysis, this time against the most up-to-date version of master. This will give us a benchmark for all feature tests (commands sketched below).
  • I’ll then run the test suite with the usual --profile flag. This will give us a benchmark for the slowest individual feature specs.
  • I’ll then work on converting the slowest feature spec → system spec. Once done, we can look at the impact of the conversion in terms of individual test time and overall feature-spec time.

What do you reckon?
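
To make the plan concrete, these are the commands I have in mind for the first two steps (TAG_PROF=type is test-prof’s TagProf, which is what the original analysis used; the profile count of 10 is just an example):

```
# Step 1: per-type timing breakdown (test-prof's TagProf), run against latest master.
TAG_PROF=type bundle exec rspec

# Step 2: RSpec's built-in profiler - list the 10 slowest examples.
bundle exec rspec --profile 10
```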

Sounds great @shen-sat,

On the points you mention:
i) Sounds good - it would be great to have that up-to-date version and see if and how things have evolved.
ii) Following our discussions at the time, I opened this epic, which gathered some outcomes on the slowest-performing specs (using the --profile flag) from a particular build run. It might not be up to date, but it’s worth mentioning here, I think.
iii) Sure, all help is welcome on this effort :muscle: It’s a WIP, so feel free to chip in. Some useful references are the section on system specs in the testing handbook, and the respective epic.

Also: I was thinking of adding a chapter on spec performance to the testing handbook, based on your findings and linking to this post.

Hi @filipefurtado, apologies for the late reply - life has been mad of late!

All your points make sense to me :+1: And thank you for the links :pray:t5: I won’t have time to start migrating the slowest feature spec to a system spec until the New Year - I hope that’s ok?

That’s a cool idea to add a chapter on spec performance! I’d be happy to add to it once I’ve successfully migrated the slowest feature spec to a system spec - I’m sure there will be some interesting findings to geek out about :nerd_face:
