moodle-performance-comparison

Tools to compare Moodle sites and/or branches performance.

Purpose

These tools can be used to compare the performance of different Moodle branches, versions, or site configurations.

Features

  • Clean site installation (from Moodle 2.5 onwards)
  • Fixed data set generation with courses, users, enrolments, module instances... (from Moodle 2.5 onwards)
  • JMeter test plan generation from course contents (from Moodle 2.5 onwards)
  • Web and database warm-up processes included in the test plan (results not collected)
  • JMeter runs gathering results about Moodle performance data (database reads/writes/query time, memory usage...); note that database query time values will differ depending on the database and hardware used
  • Comparison of run results

There are scripts for both the web server and the JMeter server sides.

  • If they are both on the same server you only need to clone the project once.
  • If they are on different servers you need to clone the project on both:
    • test_runner.sh along with jmeter_config.properties will be used on the server hosting JMeter
    • before_run_setup.sh, after_run_setup.sh and restart_services.sh will be used on the web server

Requirements

Installation

The installation process differs depending on whether you have both the web server and JMeter on the same computer or not.

Web and JMeter servers in the same computer (usually a development computer)

  • Get the code
  • Configure the tool
    • cp webserver_config.properties.dist webserver_config.properties
    • Edit webserver_config.properties with your own values
    • cp jmeter_config.properties.dist jmeter_config.properties
    • Edit jmeter_config.properties with your own values
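
For example, getting the code and creating both config files can be done like this (a minimal sketch; the clone URL assumes the moodlehq GitHub repository):

git clone https://github.com/moodlehq/moodle-performance-comparison.git
cd moodle-performance-comparison
cp webserver_config.properties.dist webserver_config.properties
cp jmeter_config.properties.dist jmeter_config.properties
# Edit both .properties files with your own values before running anything.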

Web server and JMeter running on different servers

  • Get the code on the web server
  • Get the code on the JMeter server
  • Configure the tool on the web server
    • cp webserver_config.properties.dist webserver_config.properties
    • Edit webserver_config.properties with your own values
  • Configure the tool on the JMeter server
    • cp jmeter_config.properties.dist jmeter_config.properties
    • Edit jmeter_config.properties with your own values

Usage

The simplest option is to just execute compare.sh, but it only works on development computers where JMeter is installed on the web server, and only when you are testing differences between branches. For other cases the process also differs depending on whether the web server and JMeter are on the same computer or not. There is also another alternative: you can load your own SQL dump instead of installing a brand new site with a fixed data set, so you can run the generated test plan against real site data.

The groupname and description arguments of test_runner.sh are useful to identify the run when comparing results; you can use them to set the branch name, the settings you used, or whatever else helps you identify which run it is.
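
For example, a hypothetical run named after the branch being tested:

./test_runner.sh MOODLE_25_STABLE "stock settings, default cache stores"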

Note that you can run the tests as many times as you want; you just need to run after_run_setup.sh and restart_services.sh before running test_runner.sh each time, to clean up the site.
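
So a repeated run looks like this (sketch; the group name and description are hypothetical):

./after_run_setup.sh
./restart_services.sh
./test_runner.sh MOODLE_25_STABLE "second run, same settings"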

It is recommended that you run all the scripts as the same user (there is no need for a root user at all). You can use different users to run them (there are no restrictions about it), but make sure the file permissions are correct; permission problems seem to be one of the most common issues when running this tool.

Web and JMeter servers in the same computer, to find performance differences between different branches (usually a development computer)

  • Run compare.sh; the browser will be opened automatically after both runs finish
    • ./compare.sh
  • In case the browser doesn't open the comparison page properly, browse to it manually

Web and JMeter servers in the same computer, to find performance differences changing site settings / cache stores

  • Generate the data and run the tests
    • cd /path/to/moodle-performance-comparison
    • ./before_run_setup.sh {XS|S|M|L|XL|XXL}
    • Change site settings if necessary according to what you are comparing
    • ./restart_services.sh
    • ./test_runner.sh {groupname} {description}
    • ./after_run_setup.sh
    • Change site settings if necessary according to what you are comparing
    • ./restart_services.sh
    • ./test_runner.sh {groupname} {description}
  • Check the results
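
As a single chained session, the whole comparison might look like the following sketch (size, group name and descriptions are hypothetical; replace the settings change with whatever you are comparing):

cd /path/to/moodle-performance-comparison
./before_run_setup.sh S
./restart_services.sh
./test_runner.sh settings_compare "default cache stores"
./after_run_setup.sh
# ...change the site setting you are comparing here...
./restart_services.sh
./test_runner.sh settings_compare "alternative cache stores"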

Web server and JMeter running on different servers

  • Generate the data and the test plan (web server)
    • cd /path/to/moodle-performance-comparison
    • ./before_run_setup.sh {XS|S|M|L|XL|XXL}
    • Change site settings if necessary according to what you are comparing
    • ./restart_services.sh
  • Get the test plan files (jmeter server)
  • Get the $beforebranch Moodle version data (jmeter server)
  • Run the before test (jmeter server)
    • cd /path/to/moodle-performance-comparison
    • ./test_runner.sh {groupname} {description} testplan.jmx testusers.csv site_data.properties
  • Restore the base state to run the after branch (web server)
    • cd /path/to/moodle-performance-comparison
    • ./after_run_setup.sh
    • Change site settings if necessary according to what you are comparing
    • ./restart_services.sh
  • Get the $afterbranch Moodle version data (jmeter server)
  • Run the after test (jmeter server)
    • cd /path/to/moodle-performance-comparison
    • ./test_runner.sh {groupname} {description} testplan.jmx testusers.csv site_data.properties
  • Check the results (web server)
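
Any file transfer tool works for getting the generated files onto the JMeter server; a sketch using scp, with a hypothetical hostname and paths:

# Run on the JMeter server.
scp webserver:/path/to/moodle-performance-comparison/testplan.jmx .
scp webserver:/path/to/moodle-performance-comparison/testusers.csv .
scp webserver:/path/to/moodle-performance-comparison/site_data.properties .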

Using your own sql dump (Moodle 2.5 onwards)

The installation and configuration are the same, and they also depend on whether you use the same computer for the web server and JMeter or not. The usage changes, though, when you want to use your own SQL dump: it is not that easy to automate, as you need to specify which course to use as the target course, and you cannot use before_run_setup.sh to generate the test plan and test_files.properties.

  • cd /webserver/path/to/moodle-performance-comparison
  • Restore your dataroot to $dataroot
  • Restore your database to $dbname in $dbhost
  • Get the moodle code
  • Upgrade the site to $beforebranch
    • cd moodle/
    • git checkout $beforebranch
    • php admin/cli/upgrade.php --allow-unstable --non-interactive
  • Generate the test plan, updating the test users' passwords. You need to provide the shortname of the course that will be tested
    • php admin/tool/generator/cli/maketestplan.php --size="THESIZEYOUWANT" --shortname="TARGETCOURSESHORTNAME" --bypasscheck --updateuserspassword
  • Generate the site_data.properties file, with the current moodle version data, in the root directory of moodle-performance-comparison
    • cd ..
    • ./create_site_data_file.sh
  • Download the test plan and the test users file. The URLs are provided by maketestplan.php in the previous step, before the performance info output begins.
  • Backup dataroot and database (pg_dump or mysqldump), this backup will contain the updated passwords
  • Create moodle-performance-comparison/test_files.properties with the backups you just generated and the test plan data
    • cd ../
    • Create a new /path/to/moodle-performance-comparison/test_files.properties file with the following content:

testplanfile="/absolute/path/to/testplan.jmx"
datarootbackup="/absolute/path/to/the/dataroot/backup/directory"
testusersfile="/absolute/path/to/testusers.csv"
databasebackup="/absolute/path/to/the/database/backup.sql"

  • cd /path/to/moodle-performance-comparison and continue the normal process: restart_services.sh -> test_runner.sh -> after_run_setup.sh -> restart_services.sh -> test_runner.sh (see the sketch below)
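
That continuation, spelled out (sketch; the group name and descriptions are hypothetical, and the three file arguments are the ones generated in the previous steps):

./restart_services.sh
./test_runner.sh owndump "before branch" testplan.jmx testusers.csv site_data.properties
./after_run_setup.sh
./restart_services.sh
./test_runner.sh owndump "after branch" testplan.jmx testusers.csv site_data.properties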

Using your own sql dump (before Moodle 2.5)

Moodle 2.5 introduced the site and test plan generators, so you cannot use them if you are comparing earlier branches. But you can:

  • Use the template included in the Moodle 2.5 codebase and fill the placeholders with info from one of your site's courses, plus the test plan users, loops and ramp-up period
    • The test plan template is located in admin/tool/generator/testplan.template.jmx
  • Fill a testusers.csv with the target course data
    • You will need to check that the test data has enough users according to the data you provided in the test plan
  • Generate the site_data.properties file, with the current moodle version data, in the root directory of moodle-performance-comparison
    • cd ..
    • ./create_site_data_file.sh
  • Follow the "Using your own sql dump (Moodle 2.5 onwards)" instructions

Advanced usage

  • You can override the values provided by the test plan using test_runner.sh options (see the example below):
    • -u=[users_number]
    • -l=[loops_number]
    • -r=[rampup_period]
    • -t=[throughput]
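
For instance, a hypothetical invocation overriding users and loops (this assumes the options can simply be appended to the usual arguments; check test_runner.sh for the exact expected order):

./test_runner.sh mygroup "50 users, 10 loops" -u=50 -l=10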

Security

This tool is only intended to be used in development/testing environments inside a local network. It would be insecure to expose the project root on a publicly accessible web server, and the same goes for exposing only the moodle/ directory:

  • Database connection data and other sensitive data is stored in properties files (you can change permissions)
  • It uses weak default passwords (you can change the defaults in webserver_config.properties)
  • It stores test users' credentials in Moodle's wwwroot (you can change permissions)
  • In general, the file permissions are not secure at all (you can change permissions)
  • Other things I probably forgot; in short, don't do it unless you are sure of what you are doing

Troubleshooting

  • You can find an extensive troubleshooting guide here
  • You might need to raise the PHP memory_limit (e.g. to 512M in Apache's php.ini) when comparing results of size 'M' or bigger.
  • You can find JMeter logs in logs/
  • You can find run outputs in runs_outputs/, the results in runs_samples/ and the PHP arrays generated from them in runs/
  • The generated .jtl files can be big. Don't hesitate to get rid of them if you don't need them for extra analysis.
  • The same goes for the $backupsdir/ contents; if you run before_run_setup.sh many times you will waste a lot of disk space.
  • If files named java_pid[\d]+.hprof appear in your project root, it means that JMeter is running out of resources. See http://wiki.apache.org/jmeter/JMeterFAQ#JMeter_keeps_getting_.22Out_of_Memory.22_errors.__What_can_I_do.3F for more info.


Issues

Error in recorder.bsf

Using Centos 6.4 64-bit
java version 1.7.0_45
apache-jmeter-2.11

-------------ERROR LOG--------------
2014/01/13 00:54:49 INFO - jmeter.threads.JMeterThread: Thread started: Moodle Test 2-3
2014/01/13 00:54:49 INFO - jmeter.threads.ThreadGroup: Started thread group number 2
2014/01/13 00:54:49 INFO - jmeter.engine.StandardJMeterEngine: All thread groups have been started
2014/01/13 00:54:50 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: source Sourced file: recorder.bsf : Typed variable declaration : Typed variable declaration : Attempt to resolve method: rightPad() on undefined variable or class name: StringUtils
2014/01/13 00:54:50 WARN - jmeter.visualizers.BeanShellListener: Problem in BeanShell script org.apache.jorphan.util.JMeterException: Error invoking bsh method: source Sourced file: recorder.bsf : Typed variable declaration : Typed variable declaration : Attempt to resolve method: rightPad() on undefined variable or class name: StringUtils
2014/01/13 00:54:50 INFO - jmeter.threads.JMeterThread: Thread started: Moodle Test 2-4
2014/01/13 00:54:50 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: source Sourced file: recorder.bsf : Typed variable declaration : Typed variable declaration : Attempt to resolve method: rightPad() on undefined variable or class name: StringUtils
2014/01/13 00:54:50 WARN - jmeter.visualizers.BeanShellListener: Problem in BeanShell script org.apache.jorphan.util.JMeterException: Error invoking bsh method: source Sourced file: recorder.bsf : Typed variable declaration : Typed variable declaration : Attempt to resolve method: rightPad() on undefined variable or class name: StringUtils
2014/01/13 00:54:50 INFO - jmeter.threads.JMeterThread: Thread started: Moodle Test 2-5
2014/01/13 00:54:50 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: source Sourced file: recorder.bsf : Typed variable declaration : Typed variable declaration : Attempt to resolve method: rightPad() on undefined variable or class name: StringUtils
2014/01/13 00:54:50 WARN - jmeter.visualizers.BeanShellListener: Problem in BeanShell script org.apache.jorphan.util.JMeterException: Error invoking bsh method: source Sourced file: recorder.bsf : Typed variable declaration : Typed variable declaration : Attempt to resolve method: rightPad() on undefined variable or class name: StringUtils
2014/01/13 00:54:50 INFO - jmeter.threads.JMeterThread: Thread started: Moodle Test 2-6

Create 28 branch

Moodle 2.8 has been released; set the 2.8 release commit hash as $basecommit.

When the web root is misconfigured, the tool doesn't detect it

I was getting jmeter failing:

2013/12/04 14:16:16 WARN  - jmeter.save.SaveService: Problem loading XML, cannot determine class for element: html
2013/12/04 14:16:16 ERROR - jmeter.JMeter: Error in NonGUIDriver java.lang.NullPointerException
        at org.apache.jmeter.gui.tree.JMeterTreeModel.addSubTree(JMeterTreeModel.java:92)
        at org.apache.jmeter.JMeter.runNonGui(JMeter.java:754)
        at org.apache.jmeter.JMeter.startNonGui(JMeter.java:732)
        at org.apache.jmeter.JMeter.start(JMeter.java:390)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.jmeter.NewDriver.main(NewDriver.java:259)

Turned out it was because I misconfigured my web root:

cat ../moodle/testplan.jmx
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /moodle-perf/pluginfile.php/1/tool_generator/testplan/0/testplan_201312041415_1808.jmx was not found on this server.</p>
</body></html>

Problems with test_files.properties pointing to already deleted runs...

I had a site with some old runs on it, so two days ago (October 22) I proceeded to:

  1. clean_data.sh
  2. run a new ./compare.sh, comparing 25_STABLE with master.

Today (October 24), I've applied your latest patches and also the "thresholds" pull request, and have proceeded to:

  1. clean_data.sh
  2. run a new ./compare.sh, comparing 25_STABLE with master.

Then:

  • Installation went ok. (before_branch)

  • Test execution went ok. (test_runner)

  • Got an error (after_branch):

    /Users/stronk7/Sites/perf_backups/dataroot_backup_201310221712: No such file or directory
    ./after_run_setup.sh: line 79: /Users/stronk7/Sites/perf_backups/database_backup_201310221712.sql

So, at some point, the OLD, already-deleted (October 22) backup was still referenced in test_files.properties, which was never updated to point to the new (October 24) backups.

Ciao :-)

Expand test coverage for AJAX

Noting this down for reference, following on from some discussions in Moodle Dev chat. The current JMeter scripts do not call AJAX services.
The following rely heavily on AJAX:

  • Moodle Dashboard
  • Moodle Calendar
  • Activity Completion

i.e. all of the key areas which have been found to be problematic at scale in the real world rather than through these tests.

Big value for Large Scale Moodle in improving this.

We're continuing to research this at UCL along with exploring the use of K6 on a pipeline.
Potential for a Moodle Partner to work with us on this. If we get anywhere, we'll be able to performance test our own site but will be very keen to contribute back for the benefit of all.

Add a setting for the compare.sh size

It is now hardcoded to S (30 users & 5 loops), as that is the minimum size that achieves consistent results. I usually edit the script to change it to XS while I'm developing, but it can also be useful if a developer (compare.sh should only be used by devs on development computers) wants to change it to other sizes.

PS: OK, yes, I want it for myself, for development purposes :P

non well-formed numeric value with PHP7

Notice: A non well formed numeric value encountered in /var/lib/jenkins/git_repositories/moodle-performance-comparison/webapp/classes/test_plan_runs.php on line 173
PHP Notice: A non well formed numeric value encountered in /var/lib/jenkins/git_repositories/moodle-performance-comparison/webapp/classes/test_plan_runs.php on line 173

Display numeric data in the user interface

Something like what the previous UI was doing: show absolute numbers per user. The numbers are currently displayed, but you have to hover over the bars/lines to see them, which is not especially convenient.

We could also consider whether it would be better to display it for each step too...

delete_files() and wildcards

First of all thank you for this great set of tools. This makes Moodle performance testing a breeze.

I noticed that delete_files() won't work for strings like "runs/*.php" in clean_data.sh.
It seems like the whole string is interpreted as a literal: rm "runs/*.php".
With the quotes removed in the last line of delete_files() in lib.php, it works as expected:
rm -rf $1
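
This is standard shell behaviour rather than anything specific to this tool: globs are not expanded inside quotes. A minimal demonstration:

touch a.php b.php
rm "*.php"   # fails: rm looks for a file literally named '*.php'
rm *.php     # works: the shell expands the glob before calling rm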

Allow removing and downloading runs from the web interface

Add an option to delete runs (from runs/) from the web user interface. We will need to chmod runs/* or create the files with 777 permissions, as we don't know which system user runs the web server.

It would also be good to be able to download runs quickly (without having to get into the filesystem) to share with others and/or attach to issues (renaming the file to avoid unlikely but possible remote problems would be an extra).

Add the average of $var per request

Where $var can be dbreads, dbwrites, memoryused, filesincluded, serverload, sessionsize or timeused.

An option would be to display it below all graphs.

Add info about $beforebranch and $afterbranch commit to the run info

With values like $beforebranch = MOODLE_25_STABLE we don't have any info about the commit. We can:

  • Force users to write hashes rather than branch names
  • Add info about the branch to the test run

The aim is to know exactly which commit is the last one when we look at the run info.

Move to set -e everywhere

Any sort of shell script that executes and doesn't stop on errors can become incredibly dangerous.

We should move to activating set -e everywhere. It will be a pain, but it's always worth it. Please ensure any new files start with set -e, as it will make things easier in future.
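
In practice that just means every script starts like this:

#!/bin/bash
# Abort immediately if any command exits with a non-zero status.
set -e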

Before run should stop on errors

For instance, when /your/directory/with/a/lot/of/free/space/ is not configured, it still executes:

fred@fred:~/www/moodle-performance-comparison$ ./compare.sh

Installing Moodle (d45e65ccadbf4097a970655941ac1c5cae17f26d)
Moodle site configuration finished successfully.
mkdir: cannot create directory `/your/directory/with/a/lot/of/free/space': No such file or directory
Creating Moodle (d45e65ccadbf4097a970655941ac1c5cae17f26d) database and dataroot backups
cp: cannot create directory `/your/directory/with/a/lot/of/free/space/dataroot_backup_201310091356': No such file or directory
./before_run_setup.sh: line 190: /your/directory/with/a/lot/of/free/space/database_backup_201310091356.sql: No such file or directory
Upgrading Moodle (d45e65ccadbf4097a970655941ac1c5cae17f26d) to d45e65c
No upgrade needed for the installed version 2.6dev (Build: 20130927) (2013092700). Thanks for coming anyway!

'Before' run setup finished successfully.

Error in NonGUIDriver

I have spent nearly a week trying to get this tool to work, with little success. I have attempted the following:

CENTOS 6.5
Java 1.6 or Java 1.7 (64 bit)
PHP 5.5 and 5.3
jmeter versions 2.7, 2.8, 2.9, 2.10, and 2.11

jmeter and moodle on the same server.
jmeter and moodle on different servers.
Using own mysqldumps of Moodle.

I keep getting this error when running ./compare.sh
"
ERROR - jmeter.JMeter: Error in NonGUIDriver com.thoughtworks.xstream.io.StreamException: : only whitespace content allowed before start tag and not F (position: START_DOCUMENT seen F... @1:1)
at com.thoughtworks.xstream.io.xml.XppReader.pullNextEvent(XppReader.java:124)
at com.thoughtworks.xstream.io.xml.AbstractPullReader.readRealEvent(AbstractPullReader.java:148)
at com.thoughtworks.xstream.io.xml.AbstractPullReader.readEvent(AbstractPullReader.java:141)
at com.thoughtworks.xstream.io.xml.AbstractPullReader.move(AbstractPullReader.java:118)
at com.thoughtworks.xstream.io.xml.AbstractPullReader.moveDown(AbstractPullReader.java:103)
at com.thoughtworks.xstream.io.xml.XppReader.<init>(XppReader.java:63)
at com.thoughtworks.xstream.io.xml.AbstractXppDriver.createReader(AbstractXppDriver.java:54)
at com.thoughtworks.xstream.XStream.fromXML(XStream.java:913)
at org.apache.jmeter.save.SaveService.loadTree(SaveService.java:501)
at org.apache.jmeter.JMeter.runNonGui(JMeter.java:749)
at org.apache.jmeter.JMeter.startNonGui(JMeter.java:732)
at org.apache.jmeter.JMeter.start(JMeter.java:390)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.jmeter.NewDriver.main(NewDriver.java:259)
Caused by: org.xmlpull.v1.XmlPullParserException: only whitespace content allowed before start tag and not F (position: START_DOCUMENT seen F... @1:1)
at org.xmlpull.mxp1.MXParser.parseProlog(MXParser.java:1519)
at org.xmlpull.mxp1.MXParser.nextImpl(MXParser.java:1395)
at org.xmlpull.mxp1.MXParser.next(MXParser.java:1093)
at com.thoughtworks.xstream.io.xml.XppReader.pullNextEvent(XppReader.java:109)
... 16 more
"

I am assuming this is a user error on my part. Is there any way we could have a very detailed step-by-step guide on how to get this tool working, including all the dependencies needed for it to run?

cache code deletion and git checkout

I am trying to begin using this fine-looking project in the most basic way possible, that is, running a test on the 3.8 branch on the local machine, and I'm having problems that confuse me and that don't seem documented.

  1. With a vanilla test run, the call to admin/cli/install_database.php breaks Moodle because dataroot points to the Moodle source dir and cachedir is not set. In lib/upgradelib.php::install_core(), we wipe the cache/ folder of the source tree, which is not a cache but code managing caches, and that creates a require error later on. To make a test run, I need to modify config.php.template to add a cachedir override.
  2. In before_run_setup.sh, we wipe $dataroot and then do a git checkout in it. Unfortunately, deleted files stay deleted until a git reset --hard is run. I can't make a test run without adding git reset --hard to checkout_branch.

I'm new to moodle so I'm very probably missing something obvious. Could you please point me towards ways to make test runs without having to make those two modifications? I could then improve the readme to save the next poor soul the troubles :)

Also: do you think we could change the mysqli checks in the bash scripts to accept mysqli or mariadb? I use the mariadb driver and hacked my way around those checks, and things seem to work.

Ability to export data

Many things to discuss here... Run data is all contained in PHP files, one per run, so the fastest way to implement this would be to allow exports & imports of those .php files.

I doubt whether it is worth investing resources in creating a new import/export format, as those files are pretty readable, containing the run data at the beginning. Waiting for feedback.

Achieve results consistency through averages rather than forcing settings

As discussed in the integration chat, it would be dangerous to continuously add settings to config.php to get constant and stable results, as:

  • Real sites most of the time use the standard configuration
  • Real site users' behaviour is random
  • The cost of adapting the settings to new changes would be high
  • It would be hard to compare old major releases with new ones if we add settings to control them

On the other hand, tests will need more time to finish and they will not be as exact.

We need to change the tool to put more stress on loops and test plan repeats, and adapt the results collector and/or the displayed results accordingly.

sudo delete_files not working

In clean_data.sh the following calls don't work:
sudo delete_files "cache"
sudo delete_files "$backupsdir/*"
It looks like sudoing a bash function is not possible.

I replaced "delete_files" with "rm -rf" and removed the quotes to make it work:
sudo rm -rf cache
sudo rm -rf $backupsdir/*

Get and measure the number of requests per page

It would be useful to add a new measured variable comparing the number of requests per page, because there are changes in how we join JS files, for example.

Allow CI servers to specify if they want to know about all changes (totals + per step) or only totals

Today @rajeshtaneja and I have been talking about this tool and how the results we are getting are not satisfying us. There are multiple issues we should look at; this issue is about the first of them:

  • The per-step results are not reliable; considering that we send requests in a random order, IMO we should increase the per-step thresholds or just disable them. I think we should choose between testing our real-site-mock performance as a whole or testing each page's performance. A real site will receive requests from concurrent users in a random order; we try to mimic this random human behaviour by configuring JMeter to act like this, but that means the conditions when a user reaches each step are different and the results will vary a lot.

If we have to choose, I think we may be more interested in whole-site performance, to see how major core changes affect the general site performance. We could introduce a setting to select whether we want a comparison of the whole site's performance or a comparison of each step's performance, changing JMeter settings and threshold values accordingly. But from what we have been seeing, and given the random behaviours of Moodle (caches, LASTACCESS_UPDATE_SECS...), I think it would be hard to get consistent per-step results. So I would vote for, at least initially, increasing the per-step threshold values and adding an extra param to report::calculate_big_differences() to ignore per-step results, so we can rely on CI servers warning us about changes affecting the whole system's performance. I'm setting this as the issue's name.

  • #48 We are missing some important data being sent after we output the MDL_PERF footer info:

I'm currently testing an alternative: instead of using the MDL_PERFTOFOOT var + MDL_PERF_TEST (the one we created to catch data from redirects), remove all references to MDL_PERFTOFOOT and keep a simple echo $performanceinfo['txt'] at the end of the shutdown function, based on MDL_PERF_TEST, in the same place where we write to error_log. MDL_PERF_TEST is only set in this tool's tests (as collateral damage we remove testing-framework references from the codebase). Probably @danpoltawski can comment on it, as he has experience working with Apache logs, but this solution would work exactly like reading from Apache logs with less trouble, from what I can see. Using this alternative we will catch everything, including redirections. @danpoltawski, do you see any problem with this approach? I will paste all this in #48.

  • From what Rajesh told me, the CI performance server is randomly reporting some misses (404 HTTP code) on some runs. This is not expected and breaks the reliability of all the results: even though it is just one request, the results would be affected.

Probably the machine runs out of resources; we need to properly tune the web server and the database to ensure that we don't stress the machine. We are not doing stress tests, just performance tests, but the more users we use, the more stable the results we can get.

Look for exceptions in jmeter logs

JMeter does not always return an error code when there was an error. We should grep the generated log files to look for errors ("Exception", "Error"...) and stop execution if something went wrong.
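
A minimal sketch of such a check (the log path follows the logs/ naming this tool uses; the pattern list is illustrative):

# Stop if the JMeter logs mention errors or exceptions.
if grep -qE "Exception|Error" logs/jmeter.*.log; then
    echo "Error found in JMeter logs, aborting." >&2
    exit 1
fi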

README file indicates incorrect order of parameters to test_runner.sh script

The README file indicates the following parameter order for executing the JMeter tests:

./test_runner.sh {groupname} {description} testusers.csv testplan.jmx site_data.properties

But, after failures and examining the test_runner.sh script, I realised that the parameter order should be:

./test_runner.sh {groupname} {description} testplan.jmx testusers.csv site_data.properties

Refine charts info

#17 (comment):

Copy & paste from Eloy's comments: how to improve the new page. IMO there are things that should be improved. For example, the legends in the grouped charts when the number of steps grows (and they are going to grow). Or showing the +-% difference on each graph.

Allow multiple repositories

After switching from single to multiple, and from multiple back to single, we finally allow multiple repositories :) which means that every branch can use a different repo. With this we can run daily comparisons between moodle.git and integration.git.

We are missing info

I'm sad :( https://tracker.moodle.org/browse/MDL-47900?focusedCommentId=320129&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-320129

We process the MDL_PERF data displayed in the page footer to know how things went, but that's not the end of the page's scripts: the standard log db write-per-page-request is not counted there, and the same now happens when trying to test the events monitor; these db writes are executed before closing the page request, in a register_shutdown_function(). We are missing all this data and we want it. I am thinking of another register_shutdown_function() (ensuring it is the last registered function) that will output something we can parse later from this tool if MDL_PERF_TEST is enabled, and moving from parsing the footer HTML like we currently do to parsing this.

Add a defaults file

If we add new stuff to webserver_config.properties.dist (for example) and users already have a webserver_config.properties, the new vars will not be added after checking out the latest code.

Providing a .defaults file with non-sensitive vars and loading it (overridden by webserver_config.properties) would solve the problem in most cases (see the sketch below).
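
A minimal sketch of that loading order (the .defaults filename and location are hypothetical):

# Load shipped defaults first; the user's file overrides them.
if [ -f webserver_config.properties.defaults ]; then
    . ./webserver_config.properties.defaults
fi
. ./webserver_config.properties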

Update basecommits after MDL-42447

Previous run results cannot be compared after MDL-42447 got integrated; we should update all the branches to the latest 25, 26 and master hashes.

Add a base branch

I would paste the conversation with @stronk7, but it is in Spanish, so it is better to summarize it.

Thinking about our CI jobs and their historical data, $beforebranch and $afterbranch are not enough: if there is a tool_generator change between $beforebranch and $afterbranch, the results cannot be compared. We need to define a common base branch and keep the generators as static as possible.

$basebranch should also be included in the information passed to the test run.

Add more data to the runs about the server environment

Copied from #17 (comment):

Consider adding some extra information to runs. In order to be able to compare different OSes, web servers, databases, PHP versions... we should be able to grab that information on each run. Note it's information to be fetched from the server. That way we could compare performance between MySQL and Postgres, or PHP 5.3 vs 5.5, or Linux vs Windows... or whatever. The only way to do so is to know against which server each run was done; right now we lack that completely. Those new "variables" should not affect bases nor groupings... but perhaps they should be searchable (in the form) and also readable (in the list of runs). Again, please file a new issue for that.

JMeter Keystore Warning

I ran moodle-performance-comparison from my client PC against our existing test environment and got the following error. Nothing that prevents the test from completing, but I wonder how to fix it anyway.

$ ./test_runner.sh first "first test" moodle-testplan_XS.jmx moodle-users-XS.csv site_data.properties
#######################################################################
Test running... (time for a coffee?)
Error: "WARN" found in jmeter logs, read logs/jmeter.202307251209.log to see the full trace. If you think that this is a false failure report the issue in https://github.com/moodlehq/moodle-performance-comparison/issues.

from logs/jmeter.202307251209.log

2023-07-25 12:09:42,539 INFO o.a.j.u.SSLManager: JmeterKeyStore Location:  type JKS
2023-07-25 12:09:42,541 INFO o.a.j.u.SSLManager: KeyStore created OK
2023-07-25 12:09:42,542 WARN o.a.j.u.SSLManager: Keystore file not found, loading empty keystore

My client is Arch Linux with JMeter 5.5 installed from the AUR.

Not showing valid value after test_runner.sh

Here is my issue: I ran this script to benchmark my Moodle site.

However, I did not see any values in the results except latency.

$results[87][] = array(
'thread'=>87,
'starttime'=>1586339991470,
'dbreads'=>0,
'dbwrites'=>0,
'dbquerytime'=>0,
'memoryused'=>'0',
'filesincluded'=>'0',
'serverload'=>'0',
'sessionsize'=>'0',
'timeused'=>'0',

.........

All the result values are zero.
What happened here?

Thanks

Typed variable declaration : Object constructor

Hello,

After the user warm-up, I get many of these:

2016/07/11 12:10:18 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: source Sourced file: /home/killian/Documents/moodle/moodle-performance-comparison-master/recorder.bsf : Typed variable declaration : Object constructor 
2016/07/11 12:10:18 WARN  - jmeter.visualizers.BeanShellListener: Problem in BeanShell script org.apache.jorphan.util.JMeterException: Error invoking bsh method: source    Sourced file: /home/killian/Documents/moodle/moodle-performance-comparison-master/recorder.bsf : Typed variable declaration : Object constructor 
2016/07/11 12:10:18 INFO  - jmeter.threads.JMeterThread: Thread finished: Moodle Test 2-407 
2016/07/11 12:10:18 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: source Sourced file: /home/killian/Documents/moodle/moodle-performance-comparison-master/recorder.bsf : Typed variable declaration : Object constructor 
2016/07/11 12:10:18 WARN  - jmeter.visualizers.BeanShellListener: Problem in BeanShell script org.apache.jorphan.util.JMeterException: Error invoking bsh method: source    Sourced file: /home/killian/Documents/moodle/moodle-performance-comparison-master/recorder.bsf : Typed variable declaration : Object constructor 
2016/07/11 12:10:18 INFO  - jmeter.threads.JMeterThread: Thread finished: Moodle Test 2-643 
2016/07/11 12:10:18 ERROR - jmeter.util.BeanShellInterpreter: Error invoking bsh method: source Sourced file: /home/killian/Documents/moodle/moodle-performance-comparison-master/recorder.bsf : Typed variable declaration : Object constructor 
2016/07/11 12:10:18 WARN  - jmeter.visualizers.BeanShellListener: Problem in BeanShell script org.apache.jorphan.util.JMeterException: Error invoking bsh method: source    Sourced file: /home/killian/Documents/moodle/moodle-performance-comparison-master/recorder.bsf : Typed variable declaration : Object constructor 
2016/07/11 12:10:18 INFO  - jmeter.threads.JMeterThread: Thread finished: Moodle Test 2-375 

Can you help me with that please? :)

I'm using Ubuntu 15.10 & JMeter 3.0 from the upstream downloadable archive (not a package).
My test plan:
testplan_201607111203_1807.jmx.tar.gz

Thanks for your time !

HEADs UP: This repo has moved from "master" to "main"

Hi all,

today (19th Jan 2024), we have proceeded to rename the "master" branch of this repository to "main".

You can find more details about the migration @ https://tracker.moodle.org/browse/MDLSITE-7418

For some guidance about how to proceed with your clones, you can follow the information that GitHub provides on the home page of the repo and also take a look at this document at the Moodle Developer Resources centre:

https://moodledev.io/general/community/plugincontribution/master2main

Sorry for the trouble and ciao :-)

PS: This issue will be closed in a few weeks.

-Jbeanshell.listener.init=recorderfunctions.bsf: command not found

Hello all,

My environment:

JMeter: Version 5.4.1
java.version=1.8.0_292
java.vm.name=OpenJDK 64-Bit Server VM
os.name=Linux
os.arch=amd64
os.version=4.18.0-305.el8.x86_64
Max memory =68719476736


After running:

$ ./test_runner.sh Group1 Run1

The end result:

summary =  90000 in 12:30:00 =    2.0/s Avg: 19408 Min:    30 Max: 261061 Err: 14945 (16.61%)
Tidying up ...    @ Mon Jun 14 12:45:38 WIB 2021 (1623649538374)
... end of run
./test_runner.sh: line 168: -Jbeanshell.listener.init=recorderfunctions.bsf: command not found
Error: Jmeter

Why does it say Jbeanshell.listener.init=recorderfunctions.bsf: command not found and return Error: Jmeter?

What do you suggest?

Thank you in advance.

Include performance info also when the user is redirected

Following https://tracker.moodle.org/browse/MDL-42447 where we will add a new var (CFG or const) to show the performance info also when redirections are executed.

  • We have 303 redirections, which are followed by JMeter by default, getting the last page's performance info (we can change this behaviour by disabling follow-redirects)
  • We have 200 redirections, where we show a message to the user and add a meta tag with a refresh value

The current test plan is affected by the second case: when we send the data to add a new discussion post, the performance info is not shown, as we end up with a 200 HTTP response and a message stating that the user has 30 minutes to edit the post.

We would be affected by the first case if, for example, we tested the performance of adding an activity instance: after adding it we perform a redirect with a 303 HTTP response, which leads to the course main page or the activity page depending on the selected submit button.

This issue is about adding defined('MDL_PERF_TEST') (or whatever var we end up using) to the config.php template.

Decide what to do with the previous web interface to compare runs

Following what was commented in #17 (comment) and #17 (comment): now we are linking to the previous tool's UI, so we are using both.

In my opinion it would be better to add more functionality to the new one and deprecate the previous one; my reasons are:

  • The old one is harder to maintain compared with the new one, which uses an external API to create the charts
  • It is costly to maintain two interfaces at the same time
  • It has many things to fix (which was one of the reasons to write a new one)

In general I think it would be faster to add the functionality we want to the new one rather than fixing the current one's issues; that's in the short term, and in the long term maintaining two UIs also seems more expensive. Maybe @samhemelryk can give us his opinion about it.
