
jmeter-java-dsl's Issues

Groovy code debug

Any ideas on how to debug Groovy code effectively and quickly? Is it possible to run it without the full test, but with some kind of context?
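
One option, if the script logic can be written in Java, is the DSL's lambda variant of the JSR223 elements, which lets you set breakpoints in the IDE instead of debugging Groovy strings. A minimal sketch, assuming your DSL version provides the lambda overload and that it exposes the usual script bindings (vars, prev, log); note that lambda scripts can't be saved to JMX or run on remote engines, and the endpoint below is hypothetical:

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import org.junit.jupiter.api.Test;

public class DebugScriptTest {

  @Test
  public void debugPostProcessor() throws Exception {
    testPlan(
        threadGroup(1, 1,
            httpSampler("https://myservice.example/api") // hypothetical endpoint
                .children(
                    // Java lambda instead of a Groovy string: a breakpoint here stops the run in the IDE
                    jsr223PostProcessor(s -> s.log.info(s.prev.getResponseDataAsString())))))
        .run();
  }
}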

httpDefaults implementation

It's really hard to use httpDefaults because its methods now differ from httpRequest's, and there is no way to set only the port and protocol, for example, so the .url method is not really useful.

Ability to specify a duration instead of iterations

TestPlanStats stats = testPlan(
    threadGroup(
        18,
        10, // here I would like to specify a duration (x seconds or a time unit) instead
        httpSampler(ENDPOINT)
            .header("content-type", JSON.toString())
            .header("Authorization", "Bearer " + getAccessToken()) // getAccessToken(): caller's own helper
            .post(payload, MimeTypes.Type.APPLICATION_JSON)),
    influxDbListener("http://localhost:8086/write?db=jmeter"),
    // this is just to log details of each request
    jtlWriter("test.jtl"))
    .run();
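
For reference, a minimal sketch of what duration-based execution could look like, assuming the DSL provides (or adds) a threadGroup overload that takes a java.time.Duration instead of an iteration count; endpoint, payload, and token are placeholders for the caller's own values:

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import java.time.Duration;
import org.eclipse.jetty.http.MimeTypes;
import us.abstracta.jmeter.javadsl.core.TestPlanStats;

public class DurationRunSketch {

  public TestPlanStats runForDuration(String endpoint, String payload, String token) throws Exception {
    return testPlan(
        // 18 threads held for 30 seconds instead of a fixed number of iterations
        threadGroup(18, Duration.ofSeconds(30),
            httpSampler(endpoint)
                .header("Authorization", "Bearer " + token)
                .post(payload, MimeTypes.Type.APPLICATION_JSON)),
        jtlWriter("duration-test.jtl"))
        .run();
  }
}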

Functionally Defined Request Properties

I'd love to make the request building a little more functional to allow the generation of url/headers/body per request through code. We would love to use our model factories that we already have in our code base in order to performance test our endpoints.

I have a branch where I have implemented this for bodies:
master...MrThreepwood:FunctionalRequest

I'd be happy to continue building out this pattern for the other properties in the builder if this is something you would want to accept. If it's not, we can work around it by creating our own DslHttpSampler. Currently DslHttpSampler uses a lot of private fields, so even if you don't want this functionality in the code base, it would help people looking to extend it if protected accessors (getBody(), getHeaders(), etc.) were used instead.
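
As a workaround until something like this is available, per-request bodies can be generated in a JSR223 pre-processor and referenced from the sampler through a JMeter variable. A minimal sketch, with the Groovy one-liner standing in for your own model-factory call and the endpoint being hypothetical:

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import org.eclipse.jetty.http.MimeTypes.Type;

public class PerRequestBodySketch {

  public void run() throws Exception {
    testPlan(
        threadGroup(2, 10,
            httpSampler("https://myservice.example/api/items") // hypothetical endpoint
                .children(
                    // build a fresh body before each request and expose it as a JMeter variable
                    jsr223PreProcessor("vars.put('REQUEST_BODY', '{\"id\": ' + System.nanoTime() + '}')"))
                .post("${REQUEST_BODY}", Type.APPLICATION_JSON)))
        .run();
  }
}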

__tstFeedback called with wrong number of parameters

I get this exception when I try to run the code below. It's based on your examples, so I've probably made a mistake somewhere.

@Test
public void testPerformance() throws Exception {
  int baseRps = 14;
  String host = "https://some_host.io";
  testPlan(
      csvDataSet("showcase.csv"),
      threadGroup(1, 1,
          httpSampler(host + "/showcases/list")
              .post("{\"coordinates\": {\"latitude\": ${cur_lat},\"longitude\": ${cur_lon}},\"source\": \"tst\"}", Type.APPLICATION_JSON)
              .children(
                  jsr223PreProcessor(UtilsGet()),
                  jsonExtractor("SHOWCASE_ID", "[0].showcaseId"),
                  jsr223PostProcessor(showcaseProductGet())
              ),
          httpSampler(host + "/showcases/${SHOWCASE_ID}?source=tst&version=0")
      ),
      rpsThreadGroup()
          .maxThreads(1)
          .rampToAndHold(0.5 * baseRps, Duration.ofSeconds(10), Duration.ofMinutes(10))
          .rampToAndHold(1 * baseRps, Duration.ofMinutes(1), Duration.ofMinutes(10))
          .rampToAndHold(1.5 * baseRps, Duration.ofMinutes(1), Duration.ofMinutes(10))
          .rampToAndHold(2 * baseRps, Duration.ofMinutes(1), Duration.ofMinutes(10))
          .counting(RpsThreadGroup.EventType.ITERATIONS)
          .children(
              httpSampler(host + "/showcases/list")
                  .post("{\"coordinates\": {\"latitude\": ${cur_lat},\"longitude\": ${cur_lon}},\"source\": \"tst\"}", Type.APPLICATION_JSON)
          )
  ).run();
}

Validating HTTP Status Code

Hello,

Can we validate the HTTP status code using this library? The idea is that the test case should fail if we get an HTTP status code other than 2XX (other stats should not matter in this case).

Regards
Aabir
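
For reference, JMeter's HTTP sampler already marks non-2XX responses as failed samples by default, so one way to fail the test case on any non-2XX response is to assert on the error count after the run. A minimal sketch, assuming an errorsCount()-style accessor on the overall stats (the exact accessor name may vary between DSL versions) and a hypothetical endpoint:

import static org.assertj.core.api.Assertions.assertThat;
import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import org.junit.jupiter.api.Test;
import us.abstracta.jmeter.javadsl.core.TestPlanStats;

public class StatusCodeGateTest {

  @Test
  public void shouldOnlyGet2xxResponses() throws Exception {
    TestPlanStats stats = testPlan(
        threadGroup(2, 10,
            httpSampler("https://myservice.example/api/health"))) // hypothetical endpoint
        .run();
    // non-2XX responses are counted as errors by JMeter's HTTP sampler by default
    assertThat(stats.overall().errorsCount()).isZero();
  }
}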

JmxToDsl is it possible?

I have over a hundred test plans with complex logic and really big scenarios. To lower the entry threshold, the DSL probably needs this feature. It could also be a killer feature for people who want to try the DSL but don't know where to start :)

JDBC Sampler, SetUp and TearDown thread groups

I have already started porting my tests to the DSL, but I still have difficulty doing it one-to-one. These elements are sorely missing for launching a number of projects through the DSL. Many thanks for the support!

Getting the total response time of all the requests

Hello,

In my test scenario I have a thread count of 100, and I want to calculate the total response time of all the requests.
Is there a way to achieve that?
I have tried the elapsedTime() method, but it returned only 1 millisecond (PT0.001S).
This was not the case when the same scenario was run from the JMeter UI, where it took around 2000 milliseconds.

The code snippet is as follows:

public class Test2 {

  @Test
  public void test() throws IOException {

    JsonObject postObject = Json.createObjectBuilder()
        .build();

    TestPlanStats stats = testPlan(
        threadGroup(100, 0,
            httpSampler("https://reqres.in/api/users?page=2")
                .children(responseAssertion().containsSubstrings("Ok"))
                .header("Authorization", "Basic ")
                .post(postObject.toString(), Type.APPLICATION_JSON)))
        .run();
    System.out.println(stats.overall().elapsedTime());
  }
}

Thanks and Regards
Aabir

Test Doesn't Stop

I've probably done something wrong, but I attempted to implement a version of the DSL sampler that can provide a different body for every request. I mostly just copied and pasted DslHttpSampler and replaced the body parameter, which was a String, with a Supplier that is called every time the arguments are built.

I'm not sure why this results in the program running over and over again, but I think having different request parameters on each iteration is a fairly common use case. (I had forked the repo and was going to open a PR if I could get it working.)

forEach controller

There are many cases where you need to use jsonExtractor with the "match No." parameter set to -1. In that case you get an array of variables that looks like:
var_1="val"
var_2="val2"
var_3="val3"
var_matchNr=3

The current DSL implementation has no easy or convenient way to work with this. The best option would be a forEach controller, which would take just the "var" prefix and iterate over it, setting a local variable on each loop iteration (see the sketch below).
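
A sketch of what the requested API could look like, mirroring JMeter's ForEach Controller; if your DSL version does not provide a forEachController, treat the name and signature here as hypothetical, along with the endpoint:

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

public class ForEachSketch {

  public void run() throws Exception {
    testPlan(
        threadGroup(1, 1,
            // a previous extractor (with match No. -1) is assumed to have populated
            // var_1, var_2, ..., var_matchNr
            // hypothetical controller: iterates those values, exposing each one as ${item}
            forEachController("var", "item",
                httpSampler("https://myservice.example/items/${item}")))) // hypothetical endpoint
        .run();
  }
}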

RpsThreadGroup issues

  1. How do I execute .saveAsJmx("path") on an RpsThreadGroup?
    I get the message "The method saveAsJmx(String) is undefined for the type RpsThreadGroup".
  2. There are different tree-building styles between threadGroup and rpsThreadGroup, for example:
    threadGroup(1, 1, httpSampler("label", "/api/list"))
    versus only
    rpsThreadGroup().children(httpSampler("label", "/api/list"))
    So why can't I pass children inside the parentheses? It seems strange and non-uniform, though it's not critical.

Add ability to use custom JavaSamplers

We use a lot of custom JavaSamplers and Listeners in our projects.
I want to be able to use them without extending all of them in the DSL. Is that possible?
Example of the desired usage:

Arguments samplerArguments = new Arguments();
samplerArguments.addArgument("name", "value");
TestPlanStats stats = testPlan(
    threadGroup(2, 10,
        httpSampler("http://my.service")
            .post("{\"name\": \"test\"}", MimeTypes.Type.APPLICATION_JSON)
    ),
    // desired API: wrap an existing JavaSamplerClient without writing a DSL extension
    javaSampler("CustomSampleName", MyCustomJavaSampler.class, samplerArguments),
    // this is just to log details of each request
    jtlWriter("test" + Instant.now().toString().replace(":", "-") + ".jtl")
).run();

Push JMeter properties from Java

I am trying to pass a property to JMeter from Java code.

props.put("productSetId", productSetIds);
props.put("productSetId_#", productSetIdsCount);

doesn't work, and

JMeterUtils.getProperties()

returns null.

So what is the right way to do it?
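
One workaround until the DSL exposes a dedicated API for this is to set the property from inside the plan itself: JMeter binds a props object in every JSR223 element, so a pre-processor can publish values interpolated from Java before the samplers run. A minimal sketch (the property name and endpoint are just examples):

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

public class PushPropertiesSketch {

  public void run(String productSetIds) throws Exception {
    testPlan(
        threadGroup(1, 1,
            httpSampler("https://myservice.example/api") // hypothetical endpoint
                .children(
                    // props is the JMeter properties binding available in JSR223 scripts;
                    // the Java value is interpolated into the Groovy script string
                    jsr223PreProcessor("props.put('productSetId', '" + productSetIds + "')"))))
        .run();
  }
}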

Test manipulation buttons

Debugging is hard right now. It's much the same in JMeter, but not entirely. When we start the resultsTreeVisualizer in the DSL we see the requests, but if there are many requests it's hard to click on a particular sampler. In JMeter there is a stop button for this. Can we add a toolbar with buttons to the DSL, for example via a .toolbar method, or add it to the visualization window?

Counts Not Correct

I did a brief test with this DSL to check how it works. While it works fairly conveniently with the test shown, the results don't seem correct.

Using just the code from the docs, I end up with 20 result lines in my JTL output, while stats.overall().sampleCount() is 16 and errorCount() is 18. Obviously the URL http://my.service doesn't actually exist, so they should all be errors. Since it runs 2 threads for 10 iterations, I would expect both the error count and the sample count to be 20. That is what appears in the .jtl file; I'm not sure why the programmatic results are different.

Functionality Questions

I'm running into a few problems figuring out how to do things and how things work. We're currently using Gatling, which is fairly inconvenient because it's in Scala, seems to lack simple programmatic access to results, and has no instructions for running it outside the Gatling compiler that runs automatically.

I'm seeing the JTL output file, but I was hoping to avoid any need to interact with JMeter at all. Is there an easy way to turn that simple CSV into an HTML report?

Also, is there a way to access the results of each HTTP request programmatically? One thing I'm wondering about is how I'm supposed to tell whether a network request failed. Obviously I have the error count, but how does it decide what is and isn't an error? If I'm testing an endpoint that we've intentionally rate limited to protect the system from bad actors, how can I tell whether a load test is still overwhelming it with too many requests versus it simply returning a standard rate-limit error response?

Ideally I'd like to loop over each request and make assertions about the results programmatically. I believe Gatling lets you define a function that determines whether a call was a success or a failure. Is there something similar here?
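
On the HTML report question: the DSL ships an htmlReporter listener that generates JMeter's standard HTML dashboard from the run, so no manual interaction with JMeter is needed. A minimal sketch (the endpoint is hypothetical and the output directory name is arbitrary):

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import java.time.Instant;
import org.junit.jupiter.api.Test;

public class HtmlReportTest {

  @Test
  public void generateHtmlReport() throws Exception {
    testPlan(
        threadGroup(2, 10,
            httpSampler("https://myservice.example/api")), // hypothetical endpoint
        // generates JMeter's HTML dashboard report in the given directory after the run
        htmlReporter("reports/" + Instant.now().toString().replace(":", "-")))
        .run();
  }
}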

InfluxDB v2 Tag

Apparently, the current DSL implementation does not support InfluxDB v2 and does not support pushing custom tags.
Can you add this feature?

Generated Jmx within HtmlReport could not be opened with Jmeter UI

Steps:

  1. Run the test.
  2. Find ./build/demo.jmx.
  3. Launch JMeter.
  4. Open ./build/demo.jmx.
Actual:

Problem loading XML from:'~./build/demo.jmx'. 
Cause:
CannotResolveClassException: us.abstracta.jmeter.javadsl.core.listeners.HtmlReporter$AutoFlushingResultCollector

 Detail:com.thoughtworks.xstream.converters.ConversionException: 
---- Debugging information ----
cause-exception     : com.thoughtworks.xstream.converters.ConversionException
cause-message       : 
first-jmeter-class  : org.apache.jmeter.save.converters.HashTreeConverter.unmarshal(HashTreeConverter.java:66)
class               : org.apache.jmeter.save.ScriptWrapper
required-type       : org.apache.jmeter.save.ScriptWrapper
converter-type      : org.apache.jmeter.save.ScriptWrapperConverter
path                : /jmeterTestPlan/hashTree/hashTree/us.abstracta.jmeter.javadsl.core.listeners.HtmlReporter$AutoFlushingResultCollector
line number         : 12
version             : 5.4.1
-------------------------------

Expected: the JMX file loads successfully.

Code:

private String getTimeStamp() {
  return IsaConstants.DT_NOW.toString().replace(":", "-");
}

@Test
public void testName() throws IOException {
  DslTestPlan dslTestPlan = JmeterDsl.testPlan(
      JmeterDsl.htmlReporter("./build/" + getTimeStamp()));
  dslTestPlan.saveAsJmx("./build/demo.jmx");
}

Note:

Passed with:

  • jmeter-java-dsl 0.16
  • JMeter 5.3

Failed since:

  • jmeter-java-dsl 0.40
  • JMeter 5.3 and 5.4.1

gradle build failed

If you declare the dependency only like this:
implementation 'us.abstracta.jmeter:jmeter-java-dsl:0.24'
you will get this exception:
Execution failed for task ':app:compileJava'.

Could not resolve all files for configuration ':app:compileClasspath'.
Could not find org.apache.jmeter:bom:5.2.1.
Searched in the following locations:
- https://repo.maven.apache.org/maven2/org/apache/jmeter/bom/5.2.1/bom-5.2.1.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Required by:
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_http:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_functions:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_components:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_config:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_java:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_http:5.2.1 > org.apache.jmeter:ApacheJMeter_core:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_http:5.2.1 > org.apache.jmeter:ApacheJMeter_core:5.2.1 > org.apache.jmeter:ApacheJMeter:5.2.1
project :app > us.abstracta.jmeter:jmeter-java-dsl:0.24 > org.apache.jmeter:ApacheJMeter_http:5.2.1 > org.apache.jmeter:ApacheJMeter_core:5.2.1 > org.apache.jmeter:jorphan:5.2.1

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

  • Get more help at https://help.gradle.org

I fixed it in build.gradle:

class JMeterRule implements ComponentMetadataRule {
  void execute(ComponentMetadataContext context) {
    context.details.allVariants {
      withDependencies {
        removeAll { it.group == "org.apache.jmeter" && it.name == "bom" }
      }
    }
  }
}

dependencies {
  testImplementation 'org.junit.jupiter:junit-jupiter-api:5.6.2'
  testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine'
  implementation 'com.google.guava:guava:29.0-jre'
  implementation 'us.abstracta.jmeter:jmeter-java-dsl:0.24'
  testImplementation "org.assertj:assertj-core:3.11.1"
  components {
    withModule("org.apache.jmeter:ApacheJMeter_core", JMeterRule)
    withModule("org.apache.jmeter:ApacheJMeter_java", JMeterRule)
    withModule("org.apache.jmeter:ApacheJMeter", JMeterRule)
    withModule("org.apache.jmeter:ApacheJMeter_http", JMeterRule)
    withModule("org.apache.jmeter:ApacheJMeter_functions", JMeterRule)
    withModule("org.apache.jmeter:ApacheJMeter_components", JMeterRule)
    withModule("org.apache.jmeter:ApacheJMeter_config", JMeterRule)
    withModule("org.apache.jmeter:jorphan", JMeterRule)
  }
}

HTTP Sampler parametrization

Hi, thanks for the great tool you are developing.

I think the current implementation is missing some important functionality:

  1. It would be convenient to configure the URL from separate protocol, host, path, and port fields. This would give more flexibility when building requests and using request defaults.
  2. Once splitting the URL into components is added, would it be possible to add individual parameters as components of the HTTP sampler, with a URL Encode option?

For example, to produce:
https://example.com/any/path?key=va%20lue

httpSampler("Hello Sampler").protocol("https").host("example.com").path("/any/path").addParameter("key", "va lue", true)

  3. There is no file upload functionality.

It would be great if you could add these.

YAML scripting feature

Many people use JMeter because there is no code involved; a regular QA can understand it, and that is the reason for its popularity and large community. Perhaps in the future it's worth thinking about expanding the audience by eliminating the Java or Kotlin code layer and using a YAML interpretation instead, as Taurus does: https://gettaurus.org/docs/JMeter/#Requests
What do you think?

Temporary files are sometimes not deleted

This is not critical unless the files grow large, but when using large arrays of properties it can become a problem.

As far as I understand, the close method does not run when the test is stopped manually.

It would also be nice to be able to run JMeter from the same static environment, or to use the bin directory of an existing JMeter installation on the machine. What do you think?

JTL file error

Getting the JTL file error below when a test is executed:

org.apache.jmeter.reporters.ResultCollector - Exception occurred while initializing file output.
java.io.IOException: The filename, directory name, or volume label syntax is incorrect
at java.io.WinNTFileSystem.canonicalize0(Native Method) ~[?:1.8.0_221]
at java.io.WinNTFileSystem.canonicalize(Unknown Source) ~[?:1.8.0_221]
at java.io.File.getCanonicalPath(Unknown Source) ~[?:1.8.0_221]
at org.apache.jmeter.reporters.ResultCollector.getFileWriter(ResultCollector.java:450) ~[ApacheJMeter_core-5.2.1.jar:5.2.1]
at org.apache.jmeter.reporters.ResultCollector.testStarted(ResultCollector.java:325) ~[ApacheJMeter_core-5.2.1.jar:5.2.1]
at org.apache.jmeter.reporters.ResultCollector.testStarted(ResultCollector.java:351) ~[ApacheJMeter_core-5.2.1.jar:5.2.1]
at org.apache.jmeter.engine.StandardJMeterEngine.notifyTestListenersOfStart(StandardJMeterEngine.java:206) ~[ApacheJMeter_core-5.2.1.jar:5.2.1]
at org.apache.jmeter.engine.StandardJMeterEngine.run(StandardJMeterEngine.java:381) ~[ApacheJMeter_core-5.2.1.jar:5.2.1]
at us.abstracta.jmeter.javadsl.core.EmbeddedJmeterEngine.run(EmbeddedJmeterEngine.java:43) ~[jmeter-java-dsl-0.11.jar:?]
at us.abstracta.jmeter.javadsl.core.DslTestPlan.run(DslTestPlan.java:36) ~[jmeter-java-dsl-0.11.jar:?]
at pg.PerfTestJunit.testPerformance1(PerfTestJunit.java:37) ~[classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_221]
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_221]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_221]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.8.0_221]
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:686) ~[org.junit.platform.commons_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:205) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:201) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:137) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:71) ~[org.junit.jupiter.engine_5.6.0.v20200203-2009.jar:5.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:135) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at java.util.ArrayList.forEach(Unknown Source) ~[?:1.8.0_221]
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at java.util.ArrayList.forEach(Unknown Source) ~[?:1.8.0_221]
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51) ~[org.junit.platform.engine_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:248) ~[org.junit.platform.launcher_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$5(DefaultLauncher.java:211) ~[org.junit.platform.launcher_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:226) [org.junit.platform.launcher_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:199) [org.junit.platform.launcher_1.6.0.v20200203-2009.jar:1.6.0]
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:141) [org.junit.platform.launcher_1.6.0.v20200203-2009.jar:1.6.0]
at org.eclipse.jdt.internal.junit5.runner.JUnit5TestReference.run(JUnit5TestReference.java:98) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:41) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:542) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:770) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:464) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:210) [.cp/:?]

Sample test case:

@Test
public void testPerformance1() throws IOException {
  TestPlanStats stats = testPlan(
      threadGroup(2, 10,
          httpSampler("https://petstore.swagger.io/v2/pet/1").post("", Type.APPLICATION_JSON)
      ),
      // note: Instant.now() contains ':' characters, which are not valid in Windows file names
      jtlWriter("test" + Instant.now() + ".jtl")
  ).run();
  assertThat(stats.overall().elapsedTimePercentile99()).isLessThan(Duration.ofSeconds(5));
}

Apologies if I'm doing something wrong here.

Jmeter logging

Is there a way to write the JMeter log to the console/stdout?

Placeholder variable doesn't work in postprocessor

The code:

package com.mechanitis.demo.junit5;

import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

import org.apache.commons.io.FileUtils;
import org.eclipse.jetty.http.MimeTypes.Type;
import org.junit.jupiter.api.Test;
import us.abstracta.jmeter.javadsl.core.testelements.DslSampler;
import us.abstracta.jmeter.javadsl.core.DslThreadGroup.ThreadGroupChild;
import us.abstracta.jmeter.javadsl.core.DslThreadGroup;
import us.abstracta.jmeter.javadsl.core.TestPlanStats;
import us.abstracta.jmeter.javadsl.core.threadgroups.RpsThreadGroup;

public class PerfTest {

  private static class RpsFragmentProfile {

    private final int rps;
    private final DslSampler sampler;
    private final List<RpsFragmentProfile> children;

    private RpsFragmentProfile(int rps, DslSampler sampler, RpsFragmentProfile... children) {
      this.rps = rps;
      this.sampler = sampler;
      this.children = Arrays.asList(children);
    }

    private List<ThreadGroupChild> buildTestPlanPartWithBaseRps(int parentRps) {
      List<ThreadGroupChild> ret = new ArrayList<>();
      ret.add(sampler);
      ret.addAll(children.stream()
          .flatMap(c -> c.buildTestPlanPartWithBaseRps(rps).stream())
          .collect(Collectors.toList()));
      if (rps == parentRps) {
        return ret;
      }
      double factor = (double) rps / parentRps;
      ThreadGroupChild[] retArr = ret.toArray(new ThreadGroupChild[0]);
      if (factor > 1) {
        double fraction = factor - Math.floor(factor);
        int loops = (int) Math.ceil(factor);
        if (fraction == 0.0) {
            return Collections.singletonList(forLoopController(loops,retArr));
        } else {
            return Collections.singletonList(forLoopController(loops,percentController((float) factor / loops * 100, retArr)));
        }
      } else {
        return Collections.singletonList(percentController((float) factor * 100, retArr));
      }
    }

  }

  public static String showcaseProductGet() throws IOException {
    File file = new File("showcaseProductGet.groovy");
    return FileUtils.readFileToString(file, StandardCharsets.UTF_8);
  }

  public static String UtilsGet() throws IOException {
    File file = new File("util.groovy");
    return FileUtils.readFileToString(file, StandardCharsets.UTF_8);
  }

  private static class RpsTestPlanProfile {

    private final List<RpsFragmentProfile> fragments = new ArrayList<>();

    private RpsTestPlanProfile add(int rps, DslSampler sampler,
        RpsFragmentProfile... children) {
      fragments.add(new RpsFragmentProfile(rps, sampler, children));
      return this;
    }

    private TestPlanStats run(Function<Integer, RpsThreadGroup> threadGroupBuilder)
        throws IOException {
        int maxRps = fragments.stream().mapToInt(c -> c.rps).max().orElse(0);
        String host = "https://somehost.com";
        return testPlan(
          csvDataSet("showcase.csv"),
          threadGroup(1, 1,
              httpSampler("_POST  /showcases/list",host+"/showcases/list")
                .post("{\"coordinates\": {\"latitude\": ${cur_lat},\"longitude\": ${cur_lon}},\"source\": \"1\"}", Type.APPLICATION_JSON)
                .children(
                  jsr223PreProcessor(UtilsGet()),
                  jsonExtractor("SHOWCASE_ID", "[0].showcaseId")
                ),
              httpSampler("_GET /showcases/{showcaseId}",host+"/showcases/${SHOWCASE_ID}?source=1&version=0")
              .children(
                jsr223PostProcessor(showcaseProductGet())
              )
          ),
        csvDataSet("products.csv"),
          threadGroupBuilder.apply(maxRps)
          .counting(RpsThreadGroup.EventType.ITERATIONS)
          .children(
              fragments.stream()
                  .flatMap(r -> r.buildTestPlanPartWithBaseRps(maxRps).stream())
                  .toArray(value -> new ThreadGroupChild[value])
          ),
          jtlWriter("test" + Instant.now().toString().replace(":", "-") + ".jtl"),
          resultsTreeVisualizer()
          )
          .run();
          
    }


  }

  
  private static RpsThreadGroup buildThreadGroup(int baseRps) {
    return rpsThreadGroup()
        .maxThreads(100)
        .rampToAndHold(0.5 * baseRps, Duration.ofSeconds(10), Duration.ofMinutes(1))
        .rampToAndHold(1 * baseRps, Duration.ofMinutes(1), Duration.ofMinutes(1))
        .rampToAndHold(1.5 * baseRps, Duration.ofMinutes(1), Duration.ofMinutes(1))
        .rampToAndHold(2 * baseRps, Duration.ofMinutes(1), Duration.ofMinutes(1));

  }


  @Test
  public void test() throws Exception {
    String host = "https://somehost2.com";
    new RpsTestPlanProfile()
        .add(30, httpSampler("POST /products/search",host+"/products/search")
            .post("{ \"productSetInfos\": [{	\"productSetId\": \"23d2a667-a127-4d6c-a241-54338f01ede8\"} ], \"query\": \"${query}\" }", Type.APPLICATION_JSON)
            .children(
              jsr223PostProcessor("vars.put('PRODUCT_PAYLOAD',props.get('PRODUCTS_PAYLOAD')); System.out.println(props.get('PRODUCTS_PAYLOAD'))") /*I see the variable in log*/
            ))
        .add(6, httpSampler("POST products/dictionary ",host+"/products/dictionary")
            .post("[    ${PRODUCT_PAYLOAD} ]", Type.APPLICATION_JSON))
        .add(4, httpSampler("POST /productSets/{productSetId}/products/batch",host+"/productSets/23d2a667-a127-4d6c-a241-54338f01ede8/products/batch?version=1")
            .post("[    ${PRODUCT_PAYLOAD} ]", Type.APPLICATION_JSON))
        .run(PerfTest::buildThreadGroup);
  }
}

The PRODUCT_PAYLOAD variable doesn't work here, yet in the previous post-processor I can see its value, which is strange. Any ideas?

Default elements in test plan

JMeterDSL adds an HTTP Cookie Manager and an HTTP Cache Manager by default.
At the same time, when we create a test plan natively in JMeter, there are no default elements.
I propose removing the default elements from the jmeter-java-dsl test plan.
Moreover, JMeterDSL and JMeter are multi-protocol load testing tools (HTTP, JDBC, …), so a test may not contain HTTP requests at all.

Workaround, according to the docs:

testPlan(
    httpCookies().disable(),
    httpCache().disable(),
    // ... rest of the test plan
)

JMeter Master and Slave

How do I work with master and slave setups in jmeter-java-dsl? I didn't see anything about it in the documentation. I need to generate a high load from several machines.

Ability to use assertThat instead of the JMeter AutoStop plugin

I'm not great with JUnit or the AssertJ library, but I want to understand how this works.

When I run the test, do the assertions happen online (while the test is running) or only after the test finishes?
Is it possible to evaluate them online in such a way that they stop the test (or trigger some kind of logic)?

What do I want? I am currently building a quality gate that should work according to a number of parameters. For example, a certain request should fit into the 99th percentile within 1 second, and there should be no more than a certain number of errors. These requirements can apply to the entire test or to a specific request. When they are not met, there is no point in running the test any further; it's a waste of time. We are working to ensure that load testing has a minimal effect on time to market, and this is one way to reduce that time.

The AutoStop plugin is not bad, but it's quite limited in functionality and works according to logic that doesn't entirely suit me: it kills the JMeter process, while I need it to launch the tearDown thread group, which is a different mechanism (StopTest rather than Shutdown, as far as I remember).
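
For context, assertions written after .run() with AssertJ are only evaluated once the whole test plan has finished, since run() blocks until completion; they cannot stop the test while it is executing. A minimal sketch of that post-run quality-gate style (the endpoint and threshold are arbitrary examples):

import static org.assertj.core.api.Assertions.assertThat;
import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import java.time.Duration;
import org.junit.jupiter.api.Test;
import us.abstracta.jmeter.javadsl.core.TestPlanStats;

public class QualityGateTest {

  @Test
  public void qualityGate() throws Exception {
    TestPlanStats stats = testPlan(
        threadGroup(2, 10,
            httpSampler("https://myservice.example/api"))) // hypothetical endpoint
        .run(); // blocks until the test plan finishes
    // evaluated after the run, not while the test is executing
    assertThat(stats.overall().elapsedTimePercentile99()).isLessThan(Duration.ofSeconds(1));
  }
}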

Integration with RestAssured

This is another feature request 💯.
It would be nice to have a way to integrate with RestAssured so the existing steps can be reused.

API to save response body

Can you please provide an API to save response bodies?

It would be helpful for analyzing issues when something goes wrong with the application under test.

Jmeter and java versions

  1. In the POM I see JMeter 5.2.1. Do I have to use this version?
  2. I am trying to compile and run my code on Java 16, and a lot of exceptions are thrown during the test plan creation phase. Is this problem just on my side?
