Running JMH benchmark from Eclipse

What issue I’m trying to solve

A few months ago, we started using JMH in our project to test for and track down performance issues.
The tool provides multiple modes and profilers, and we found them useful for our purposes.

IntelliJ IDEA, which I use, has a useful plugin for the Java Microbenchmark Harness. Its functionality is similar to the JUnit plugin: it simplifies benchmark development and debugging by letting you run benchmarks together or separately.

But… half of our team uses Eclipse as their main IDE, and it has no plugin or support for the tool.
We can run benchmarks from a main method, but it is inconvenient to keep changing the include pattern and to remember to revert those changes before committing them into git.

So, after a short brainstorming session, we decided to write a custom JUnit runner that can run benchmarks.

JUnit 4 runners

JUnit 4 has an API for running any class as a test suite.
All you need to do is:

  1. Write a class extending org.junit.runner.Runner:

     public class BenchmarkRunner extends Runner {
         //...
     }

  2. Implement the constructor and a few methods:

     public class BenchmarkRunner extends Runner {
         public BenchmarkRunner(Class<?> benchmarkClass) {
         }

         public Description getDescription() {
             //...
         }

         public void run(RunNotifier notifier) {
             //...
         }
     }

  3. Add the runner to your test class:

     @RunWith(BenchmarkRunner.class)
     public class CustomCollectionBenchmark {
         //...
     }
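For context, here is a minimal sketch of what such a benchmark class might contain. The class name comes from the article; the workload, the addElements method, and the annotation values are illustrative assumptions.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.junit.runner.RunWith;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;

@RunWith(BenchmarkRunner.class)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
public class CustomCollectionBenchmark {

    // Hypothetical workload: measure adding elements to an ArrayList.
    // Returning the list keeps JMH from dead-code-eliminating the loop.
    @Benchmark
    public List<Integer> addElements() {
        List<Integer> list = new ArrayList<>();
        for (int i = 0; i < 1_000; i++) {
            list.add(i);
        }
        return list;
    }
}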

Implementing Benchmark JUnit runner

First, we need to provide information about tests to the JUnit engine.

JUnit runners have the getDescription() method for that. But how do we get information about the test classes and test methods?

From JavaDoc:

When creating a custom runner, in addition to implementing the abstract methods here you must also provide a constructor that takes as an argument the Class containing the tests.

So we can take the class as a constructor argument and gather all the information with reflection's help.

public class BenchmarkRunner extends Runner {
    private final Class<?> benchmarkClass;        // (1)
    private final List<Method> benchmarkMethods;  // (2)

    public BenchmarkRunner(Class<?> benchmarkClass) {
        this.benchmarkClass = benchmarkClass;
        this.benchmarkMethods = Arrays.stream(benchmarkClass.getDeclaredMethods())
                .filter(m -> m.isAnnotationPresent(Benchmark.class))
                .collect(Collectors.toList());
    }
    //...
}

Now we have all the required information to provide a test suite description to the JUnit engine:

  • Actual benchmark class
  • All methods marked with @Benchmark in this class
public class BenchmarkRunner extends Runner {
    //...
    @Override
    public Description getDescription() {
        Description result = Description.createSuiteDescription(benchmarkClass);
        benchmarkMethods.stream()
                .map(m -> Description.createTestDescription(benchmarkClass, m.getName()))
                .forEach(result::addChild);
        return result;
    }
    //...
}

Once we run the test, the JUnit view shows something like this: the benchmark class as a suite, with all of its @Benchmark methods listed underneath as tests.

Let’s run our benchmarks.
We need to implement the org.junit.runner.Runner.run(RunNotifier) method, where RunNotifier is responsible for notifying the engine about test runs.

The idea is to run the benchmark methods sequentially, one by one, each in a separate org.openjdk.jmh.runner.Runner.

public class BenchmarkRunner extends Runner {

    //...

    @Override
    public void run(RunNotifier notifier) {
        for (Method benchmarkMethod : benchmarkMethods) {
            Description testDescription = getBenchmarkMethodDescription(benchmarkMethod);
            try {
                notifier.fireTestStarted(testDescription);

                Options opt = new OptionsBuilder()
                        .include(".*" + benchmarkClass.getName() + "." + benchmarkMethod.getName() + ".*")
                        .jvmArgsAppend("-Djmh.separateClasspathJAR=true")
                        .build();

                new org.openjdk.jmh.runner.Runner(opt).run();

                notifier.fireTestFinished(testDescription);
            } catch (Exception e) {
                notifier.fireTestFailure(new Failure(testDescription, e));
                return;
            }
        }
    }

    private Description getBenchmarkMethodDescription(Method benchmarkMethod) {
        return Description.createTestDescription(benchmarkClass, benchmarkMethod.getName());
    }
}

The options mean the following (a sketch of further options appears right after this list):

  • include – the benchmark we would like to include in the run.
  • jvmArgsAppend("-Djmh.separateClasspathJAR=true") – a JMH-specific option that tells it to build a classpath.jar, avoiding the "The filename or extension is too long" error on Windows.
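As a hedged sketch, the same OptionsBuilder could also pin the usual JMH knobs so IDE runs stay short. The forks, warmup, and measurement values below are illustrative assumptions, not settings from our runner:

Options opt = new OptionsBuilder()
        .include(".*" + benchmarkClass.getName() + "." + benchmarkMethod.getName() + ".*")
        .jvmArgsAppend("-Djmh.separateClasspathJAR=true")
        .forks(1)                  // one forked JVM per benchmark (assumed value)
        .warmupIterations(2)       // short warmup for quick IDE feedback (assumed value)
        .measurementIterations(3)  // a few measured iterations (assumed value)
        .build();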

As you can see, we notify the RunNotifier when we start and finish each test, whether it succeeds or not.
Looks good, but we still run all the benchmarks even when we choose to run only a single one.

Filtering

Our runner should implement the org.junit.runner.manipulation.Filterable interface so the JUnit engine can tell our code which tests should be run.
The interface has only a single method, void filter(Filter), and its org.junit.runner.manipulation.Filter argument has a shouldRun(Description) method we can use to find out whether a test was requested to run.
The filter method doesn’t return anything, so it looks like we need to store the filtered result and use it later.

public class BenchmarkRunner extends Runner implements Filterable {

    //...

    private List<Method> readyToRunMethods; // <= add new field to store filter result

    @Override
    public Description getDescription() {
        Description result = Description.createSuiteDescription(benchmarkClass);
        readyToRunMethods.stream() // <= use the field here
                .map(this::getBenchmarkMethodDescription)
                .forEach(result::addChild);
        return result;
    }

    //...

    @Override
    public void run(RunNotifier notifier) {
        for (Method benchmarkMethod : readyToRunMethods) { // <= and here
            //...
        }
    }

    @Override
    public void filter(Filter filter) throws NoTestsRemainException {
        List<Method> filteredMethods = new ArrayList<>();

        for (Method benchmarkMethod : benchmarkMethods) {
            if (filter.shouldRun(getBenchmarkMethodDescription(benchmarkMethod))) {
                filteredMethods.add(benchmarkMethod);
            }
        }

        if (filteredMethods.isEmpty()) {
            throw new NoTestsRemainException();
        }

        this.readyToRunMethods = filteredMethods;
    }
}

Now it runs only the methods we ask it to run.
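To see the filtering in action outside an IDE, we could drive the runner programmatically. This launcher is purely an illustration, not part of the original setup; the addElements method name is an assumption carried over from the sketch above.

import org.junit.runner.JUnitCore;
import org.junit.runner.Request;
import org.junit.runner.Result;

// Hypothetical launcher: runs a single benchmark method through the custom runner,
// just as an IDE does when you select one method in the test view.
public class SingleBenchmarkLauncher {
    public static void main(String[] args) {
        Request request = Request.method(CustomCollectionBenchmark.class, "addElements");
        Result result = new JUnitCore().run(request);
        System.out.println("Benchmarks run: " + result.getRunCount()
                + ", failures: " + result.getFailureCount());
    }
}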

Conclusion

Finally, we’ve got a unified way to run benchmarks from any IDE during development and debugging. This really simplifies our daily work.
We still run the benchmarks from a main method for the final measurements, to reduce environment noise and get reliable results for analysis.
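For reference, such a main method looks roughly like the minimal sketch below; it is the standard JMH entry point rather than code taken from our project.

import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

public class BenchmarkMain {
    public static void main(String[] args) throws RunnerException {
        // Include every benchmark in the class; tighten the pattern to narrow the run.
        Options opt = new OptionsBuilder()
                .include(CustomCollectionBenchmark.class.getSimpleName())
                .build();
        new Runner(opt).run();
    }
}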

You may find the code on GitHub.

If you find the post helpful, please support me.
