
moodle-qtype_formulas's Introduction

Formulas question type for Moodle

Automated code checks Automated acceptance tests Automated unit tests Coverage Status GitHub Release

This is a question type plugin for Moodle with random values and multiple answer fields. The answer fields can be placed anywhere in the question, so questions can be created with various answer structures such as coordinates, polynomials and matrices. Other features such as unit checking and multiple subquestions are also available.

These functionalities can simplify the creation of questions in many fields related to mathematics, numbers and units, such as physics and engineering.

This question type was written by Hon Wai Lau, and versions for Moodle 1.9 and 2.0 are still available on the original author's website as of this writing. It was then upgraded to the new question engine introduced in Moodle 2.1 by Jean-Michel Védrine.

This version is compatible with Moodle 3.9 and newer. It has been tested with:

  • Moodle 3.9 using PHP 7.4
  • Moodle 3.11 using PHP 7.4 and PHP 8.0
  • Moodle 4.0 using PHP 7.4 and PHP 8.0
  • Moodle 4.1 using PHP 7.4, PHP 8.0 and PHP 8.1
  • Moodle 4.2 using PHP 8.0, PHP 8.1 and PHP 8.2
  • Moodle 4.3 using PHP 8.0, PHP 8.1 and PHP 8.2
  • Moodle 4.4 using PHP 8.1, PHP 8.2 and PHP 8.3

Requirements

You will need to install Tim Hunt's Adaptive question behaviour for multi-part questions (qbehaviour_adaptivemultipart) prior to installing the formulas question type. You can also get it from GitHub.

You need version 3.3 or newer of this behaviour; the Formulas question type will not work with earlier versions.

Installation

Installation from the Moodle plugin directory (preferred method)

  1. Download the plugin from the Moodle plugin directory.
  2. Install as any other Moodle question type plugin.

Installation Using Git

To install using Git, type these commands in the root directory of your Moodle installation:

$ git clone https://github.com/FormulasQuestion/moodle-qtype_formulas.git question/type/formulas
$ echo '/question/type/formulas/' >> .git/info/exclude

Installation From Downloaded ZIP file

Alternatively, download the zip and unzip it into the $MOODLE_ROOT/question/type folder. Do not forget to rename the new folder to formulas.

Creating formulas questions

This question type is very powerful and permits the creation of a wide range of questions. Mastering all of its possibilities requires some practice, however, and there is a learning curve to creating Formulas questions.

Here are some pointers to the available help:

  • First, you can import the Moodle XML file samples/sample-formulas-questions.xml and play with the included formulas questions.
  • You can visit the documentation written by Dominique Bauer. As there is little or no difference in the Formulas question type plugin across recent versions of Moodle (2.0 and above), the documentation has been moved to this location but applies to all Moodle versions, including the current release.
  • You can read discussions about the Formulas question type in the Moodle Quiz forum, for example the thread where Jean-Michel Védrine presents the (then) new version for Moodle 2.0, or this one from Hon Wai Lau.
  • You can post your own questions in this forum.

Reporting bugs, problems

Please open an issue on GitHub.

You can also open an issue in the Moodle Tracker.

To create a new tracker issue:

  1. Log in and click on the Create button in the menu bar.
  2. Choose Plugins (CONTRIB) in the "Project" field.
  3. Set the "Component(s)" field to Question type: Formulas.
  4. Try to include as many details as you can so that the problem can be reproduced.

moodle-qtype_formulas's People

Contributors

dbauer-ets, golenkovm, jmvedrine, lucaboesch, philippimhof


moodle-qtype_formulas's Issues

Fix phpunit

phpunit is failing.

This file is working

RUN PHPUnit tests for qtype_formulas
Moodle 3.8.3+ (Build: 20200618), 898415efec81bc62bde20b862d1b139407a7329a
Php: 7.3.14, pgsql: 9.6.17, OS: Linux 5.0.0-1031-gcp x86_64
PHPUnit 7.5.20 by Sebastian Bergmann and contributors.
....................................................... 55 / 55 (100%)
Time: 3.54 seconds, Memory: 254.50 MB
OK (55 tests, 748 assertions)

These tests should still be reviewed completely.

"Exception 5" after upgrading Moodle from 3.11 to 4.1

Description of bug / unexpected behavior

After upgrading Moodle from 3.11 to 4.1, I found that Formulas questions cause errors in the question bank.

As soon as an affected Formulas question is listed, a warning box appears in the "Needs checking?" column that reads: "Exception - 5: Some expressions cannot be evaluated numerically." Below that question, no further questions are listed.

One can get rid of this warning box by opening the question for editing and saving it without actually making any changes. Back in the question bank, that specific question then gets listed without an error, and the next affected question shows the warning and prevents listing of any further questions.

I hope I have described this issue properly. I don't know whether this is a bug in Formulas or in Moodle itself, but I only seem to see this problem with Formulas questions.

Expected behavior

There should not be those warning boxes, i.e. all questions should be upgraded to Moodle 4.1 without any errors.

How to reproduce the issue

  1. Start with several Formulas questions in Moodle 3.11.
  2. Upgrade Moodle to 4.1.
  3. Open question bank.
  4. See error.
Screenshots: (screenshot attached: Screen Shot 2023-04-24 at 22 09 33)

Environment

System Details
  • Moodle version: 4.1.3
  • Plugin version: 5.2.1
  • PHP version: 8.0.28
  • Browser: Safari/Firefox

Additional comments

pick() with arrays (lists)

The following code does not work properly:

words=["just","a","little","list","of","words"];
w=pick(2,words);

The result is "just", but it should be "little". In general, when using pick() with an array, for n==1 we get the second element, and in all other cases (n==0 or any other value) we get the first element.
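For reference, a minimal PHP sketch of the intended behaviour, assuming pick(n, list) is meant to return the element at index n and to fall back to the first element when the index is out of range (this is a sketch, not the plugin's actual implementation):

function pick($index, array $list) {
    $index = (int) floor($index);
    // Out-of-range indices fall back to the first element.
    if ($index < 0 || $index >= count($list)) {
        return $list[0];
    }
    return $list[$index];
}

// pick(2, ["just", "a", "little", "list", "of", "words"]) --> "little"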

sigfig() function in a map() function

Description of bug / unexpected behavior

After the upgrade from version 4.43 to version 5.1.1, one particular thing doesn't work anymore with the sigfig() function.
This line doesn't work anymore:
liste = map("sigfig",[2.123,3.568],3);

Expected behavior

{liste[0]} should display 2.12
{liste[1]} should display 3.57

How to reproduce the issue

Try the attached XML file.
When we enter the code line in version 5.1.1, there is an error message: Exception : Call to undefined function sigfig()

Environment

  • Moodle version: 3.11.2 or 4.0.6
  • Plugin version: 5.1.1
  • PHP version: 7.4.33
  • Browser: Firefox

quiz-[B2-SP-CO] S06_Oscillateurs 2020-sigfig() in map()-20230213-1324.zip

Unit tests are failing on MOODLE_39_STABLE branch

Running the Moodle 3.9 unit tests, I get the following failures:

There were 4 failures:

1) qtype_formulas_variables_test::test_evaluate_general_expression
Failed asserting that -0.35473297204849324 matches expected -0.35473297204849.

/var/www/site/question/type/formulas/tests/variables_test.php:79
/var/www/site/lib/phpunit/classes/advanced_testcase.php:85

To re-run:
 vendor/bin/phpunit "qtype_formulas_variables_test" question/type/formulas/tests/variables_test.php

2) qtype_formulas_variables_test::test_evaluate_assignments_1
Failed asserting that two arrays are equal.
--- Expected
+++ Actual
@@ @@
     'g' => stdClass Object (
         'type' => 'ln'
         'value' => Array (
-            0 => 1
-            1 => 47
-            2 => 2
-            3 => 2.718281828459
-            4 => 16
+            0 => 1.0
+            1 => 47.0
+            2 => 2.0
+            3 => 2.718281828459045
+            4 => 16.0
         )
     )
 )

/var/www/site/question/type/formulas/tests/variables_test.php:158
/var/www/site/lib/phpunit/classes/advanced_testcase.php:85

To re-run:
 vendor/bin/phpunit "qtype_formulas_variables_test" question/type/formulas/tests/variables_test.php

3) qtype_formulas_variables_test::test_evaluate_assignments_3
Failed asserting that two arrays are equal.
--- Expected
+++ Actual
@@ @@
     'A' => stdClass Object (
         'type' => 'ln'
         'value' => Array (
-            0 => 2.718281828459
-            1 => 7.3890560989307
-            2 => 20.085536923188
+            0 => 2.718281828459045
+            1 => 7.38905609893065
+            2 => 20.085536923187668
         )
     )
 )

/var/www/site/question/type/formulas/tests/variables_test.php:461
/var/www/site/lib/phpunit/classes/advanced_testcase.php:85

To re-run:
 vendor/bin/phpunit "qtype_formulas_variables_test" question/type/formulas/tests/variables_test.php

4) qtype_formulas_variables_test::test_numerical_formula_2
Failed asserting that 51.7392700412041 matches expected 51.739270041204.

/var/www/site/question/type/formulas/tests/variables_test.php:642
/var/www/site/lib/phpunit/classes/advanced_testcase.php:85

To re-run:
 vendor/bin/phpunit "qtype_formulas_variables_test" question/type/formulas/tests/variables_test.php

This seems to be fixed already in the master branch (see d5ec6c2). It would be great to have it fixed in the MOODLE_39_STABLE branch too.

The 3.9 LTS version of Moodle still receives security updates until 11 December 2023. In practice that means that many slow-moving organisations (especially universities) will most likely stay on 3.9 for a bit longer than the bug-fixing period. If you had a chance to have the patch backported, that would be much appreciated.

I'm more than happy to make a PR for this.

Cheers,
Misha

function 'npr' returns wrong values if n-r < r

Description

The function npr, documented as \frac{n!}{\left(n-r\right)!}, has an if clause that returns wrong values if (n-r) < r.

Steps to reproduce

(Screenshot attached: Screenshot_2021-05-05_22-26-10)

Generate a random variable r

r = {3,7};

Global variables:

n = 10;
does = npr(n,r);
should = fact(n)/(fact(n-r));
check = (does==should)?1:0;

Expected result:

All generated variable pairs (n, r) should have check=1 (true), i.e. does equals should and npr returns the expected (documented) results (720 and 604800 in the example above).

Actual result:

For every n, r where (n-r) < r, check=0 (false), i.e. does does not equal should; npr returns an unexpected result (720 and 720 in the example above).

Likely cause

The relevant code snippet is in variables.php:

function npr($n, $r) {
    if ($r > $n) {
        return 0;
    }
    if (($n - $r) < $r) {  // this if clause causes the errors
        return npr($n, ($n - $r));
    }
    $return = 1;
    for ($i = 0; $i < $r; $i++) {
        $return *= ($n - $i);

Proposed fix

Remove the offending if statement.
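A corrected version would simply compute n!/(n-r)! directly, without the special branch. A minimal sketch (illustrative only, not necessarily the plugin's eventual fix):

function npr($n, $r) {
    if ($r < 0 || $r > $n) {
        return 0;
    }
    $result = 1;
    // Multiply n * (n-1) * ... * (n-r+1), i.e. n!/(n-r)!.
    for ($i = 0; $i < $r; $i++) {
        $result *= ($n - $i);
    }
    return $result;
}

// npr(10, 3) --> 720, npr(10, 7) --> 604800, matching the documented definition.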

Impact

Since Formulas is designed to work with random variables, this bug is not easy to catch; the function is most likely in use and the bug might affect students' grades.
This is especially problematic as the permutation calculator in the documentation does not exhibit the same behaviour.

DynamicCourseware.org status

Do you have any information on when the DynamicCourseware.org site might be back online? Many of our teachers like and depend on the documentation on your site :)

Thanks in advance

"One possible correct answer is:"

Philipp,

Systematically saying that there are several possible answers when there is only one, which is the case in the majority of questions, is a little misleading. I do not agree with the principle.

When displaying the correct answer automatically, only one answer is displayed. Saying that there are several possible answers and displaying only one is not logical.

Even when there are several possible answers, for certain problems it is often not important to point this out. In these cases, the student is simply satisfied to have found a correct answer.

In cases where it is important to specify that there are several possible answers, I would leave it to the teacher to do so in the wording of the question, and to confirm it in a tailor-made feedback.

In short, I suggest that we use exactly the same wording as in the other qtypes, that is to say "The correct answer is: {$a}", which seems to me sufficient and adequate in all cases.

It will also be necessary to modify the language packs of the other languages (French, Spanish, etc.). Can you do it? I think I know how to do it, if you want.

sort() function not always working correctly

Description of bug / unexpected behavior

  • sort(["4","3","A","a","B","2","1","b","c","C"]) --> ["A","B","C","a","b","c","1","2","3","4"], i.e. the letters come before the numbers.
  • sort(["B","C","A","B"]) --> ["A","B","C"], i.e. when two (or more) members are equal, they are merged into one member in the sorted array.
  • sort(["B","3","1","0","A","C","c","b","2","a"]) --> ["B","3","1","0","A","C","c","b","2","a"], i.e. the "0" element seems to interfere with the sorting.

Expected behavior

  • sort(["4","3","A","a","B","2","1","b","c","C"]) --> ["1","2","3","4","A","B","C","a","b","c"], i.e. the numbers come first.
  • sort(["B","C","A","B"]) --> ["A","B","B","C"]
  • sort(["B","3","1","0","A","C","c","b","2","a"]) --> ["0","1","2","3","A","B","C","a","b","c"]

More information

Inconsistency verifying student answer

When asking students to input the decimal fraction of randomly generated numbers based on the following syntax:
{a1}•10-{b1}= {_0}
The answers are inconsistently verified as wrong when they should be verified as correct.
Tested with the latest stable version from the moodle.org/plugins repository, on a Moodle 3.9 stable site.

Attached is a link to a formulas question XML export, for testing:
https://gist.github.com/nadavkav/bb04e7e76a0d207f9970148fa184f7c7

"Check variables instantiation" does not work

Description of bug / unexpected behavior

"Check variables instantiation" not working after updating to version 2022112700

When you click on the "instantiate" button, the following message appears:

TypeError 

column.definition is undefined

In version 2020061900 "Check variables instantiation" works.

How to reproduce the issue

Create a simple question:

Random variables: a={1:3}; b={1:3};
Global variables: c=a*b
Question text: how much is {=a}*{=b}

In v2020061900, question/type/formulas/instantiate.php returns this JSON:

{
  "names": {
    "random": [
      "a",
      "b"
    ],
    "global": [
      "c"
    ],
    "local0": [],
    "answer0": [
      "@1"
    ]
  },
  "lists": [
    {
      "random": [
        1,
        2
      ],
      "global": [
        2
      ],
      "local0": [],
      "answer0": [
        [
          2
        ]
      ]
    }
  ],
  "size": 1,
  "maxdataset": 4,
  "errors": [
    ""
  ]
}

In v2022112700, lib/ajax/service.php?sesskey=Y2qDOO70dr&info=qtype_formulas_instantiate returns this JSON:

[
  {
    "error": false,
    "data": {
      "status": "ok",
      "data": [
        {
          "randomvars": [
            {
              "name": "a",
              "value": "2"
            },
            {
              "name": "b",
              "value": "1"
            }
          ],
          "globalvars": [
            {
              "name": "c",
              "value": "2"
            }
          ],
          "parts": [
            []
          ]
        }
      ]
    }
  }
]
  • Moodle version: 3.9.18+ (Build: 20221216)
  • Plugin version: 2022112700
  • PHP version: 7.4.24
  • Browser: Firefox, chrome

Tests

This issue is meant to keep track of stuff that can / should be done w.r.t. automated testing.

Unit tests

  • is every function covered?
  • check of no. of arguments for each function
  • write specific unit tests for implementation of
    • concat
    • diff
    • fill
    • join
    • len
    • map
    • poly
    • shuffle
    • sort
    • str
    • sublist
    • sum
  • use strings from language file when checking for error messages (consistency)
  • use data providers for better diagnostic output
  • regrading

Behat

  • switch between easy and expert mode for grading criterion (one part / multiple parts)
    • given simple / expert criterion -> form initialisation correct?
    • valid simple criteria, switch to expert mode -> correct value shown?
    • invalid simple criteria -> what happens?
    • easy criterion, switch to simple mode -> conversion working?
    • set criterion (simple mode), submit -> correctly saved?
  • delete parts (once this is implemented via a button + javascript)
  • variable instantiation (number of datasets, which dataset to show, preview)
  • backup/restore for questions including an image
  • question usage in a quiz (port mobile tests to browser)
  • backup/restore of a course containing formulas question
  • moving of question between categories, with/without images, also move category to another parent category

gcd() in Formulas Update

Hi all,
I would like to suggest a change to gcd() in future versions [as it is implemented by TI-Nspire, or described on Wikipedia]:
Currently gcd(0,8) returns 1, but it should return 8!
So please change the code line
if ($a == 0 xor $b == 0) { return 1; }
to
if ($a == 0 or $b == 0) { return $a+$b; }
and you can then omit
if ($a == 0 && $b == 0) { return 0; }
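Putting the suggestion together, a minimal sketch of the proposed behaviour (illustrative only, not the plugin's actual code):

function gcd($a, $b) {
    if ($a == 0 || $b == 0) {
        return $a + $b;   // gcd(0, n) = n, and gcd(0, 0) = 0 falls out for free
    }
    // Euclidean algorithm for the general case.
    while ($b != 0) {
        list($a, $b) = array($b, $a % $b);
    }
    return abs($a);
}

// gcd(0, 8) --> 8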

Regards
Joachim

Significant figures in the "Right answer" for the Number type answer

Regarding the statement "The correct answer is: {$a}" for the Number answer type, it may be useful to allow the number of significant figures to be specified for the value of {$a}. Indeed, if the answer is calculated to be, for example, 2.039023427152, the automatic feedback will be displayed as "The correct answer is: 2.039023427152", which is probably not desirable. The feedback "The correct answer is: 2.04", for example with three significant figures, is probably more adequate. However, this presently cannot be displayed with the "Right answer" review option.

formatcheck error

Hello,

I've got an error in my console:
Uncaught SyntaxError: Unexpected token 'else'

The braces seem to be missing in the last lines of formatcheck.js:

window.onload = (function(oldfunc, newfunc) {
    if (oldfunc && typeof oldfunc == 'function')
        return function() { oldfunc(); newfunc(); }
    else
        return newfunc;
})(window.onload, function() { formulas_format_check(); });

-->

window.onload = (function(oldfunc, newfunc) {
    if (oldfunc && typeof oldfunc == 'function') {
        return function() { oldfunc(); newfunc(); }
    } else {
        return newfunc;
    }
})(window.onload, function() { formulas_format_check(); });

Moodle seems to need these braces.

atan2(): Problems marking angle questions.

I have many formula type questions with random parameters where the students have to do many calculations with complex numbers. One of these calculations involves calculating the angle. Since it's important to keep the information about which quadrant the complex number is in (they have to add the complex numbers afterwards, so it's important to know the signs of the real and imaginary parts), I've been using the atan2() function. Last year when I was developing these questions, I believe that the documentation of the atan2() read something like this: "...this function delivers a positive angle from the positive x axis in counterclockwise direction" or so. We had a Moodle update recently, and now it reads: "...returns a numeric value between -π and π". Since the parameters are random, I cannot tell in advance what angle I'll get and in which quadrant it will be.

The problem is the following: the atan2() function sometimes returns a positive (counterclockwise) angle and sometimes a negative (clockwise) angle. If Moodle calculates a positive angle (for instance +10°) and the student inputs the corresponding (same) negative angle (in this case -350°, which is strictly speaking correct), Moodle marks that part of the question as incorrect.

What can I do to get Moodle to mark a correct angle (in this case either +10° or -350°) as correct?

Say my calculation code looks like this: answer=atan2(y,x);

Is it possible to do one of the following (if so, how)?:

  • Give 3 possible answers in the corresponding Part X, so that when the student inputs one of the three, the answer is marked as correct. In this case the calculation code could look like this:
    answer1=atan2(y,x);
    answer2=360+answer1; // if answer1 was negative, answer2 is positive
    answer3=answer1-360; // if answer1 was positive, answer3 is negative
    and the three possible answers would be answer1, answer2 and answer3.

  • write an if clause in the calculation code like this:
    answer=atan2(y,x);
    if (answer<0)
    {
    answer1=360+answer; // the previously negative angle is now positive
    }
    end
    or otherwise force all negative angles to be converted to positive angles.

  • is there another function other than atan2() that would give me positive angles always?

Thanks for your help!
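One common way to handle the mathematics (a sketch of the underlying arithmetic in PHP, not an official answer or plugin code) is to normalise both the model answer and the student's angle into the range [0, 360) before comparing, so that +10° and -350° are treated as equal:

function normalize_angle($degrees) {
    $degrees = fmod($degrees, 360.0);   // now in (-360, 360)
    if ($degrees < 0) {
        $degrees += 360.0;              // shift into [0, 360)
    }
    return $degrees;
}

// normalize_angle(10) == normalize_angle(-350)  --> true (both give 10)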

Feature request: new answer types "list" and/or "set"

Description of proposed feature

Add one or both of the following answer types

  • set, i.e. a list of numbers (strictly numbers, no numerical expressions) where the order does not matter
  • list, i.e. a sequence of numbers (strictly numbers again) where the order does matter

How can the new feature be used?

Sometimes, one might ask a question like:

  • list the divisors of the number …
  • find all solutions for the equation …

Currently, it is possible to have multiple answer fields, e.g. two for a quadratic equation. However, in certain cases, one does not want to "give away" the number of solutions. Currently, one would probably write something like: "if there is only one solution, enter 0 in the remaining fields". However, it would be nicer to have a field that could accept a set of solutions.

Also, currently, most people probably use a custom grading criterion to deal with the fact that students enter the two solutions of a quadratic equation in a different order. That problem would disappear with the idea of having just a set of solutions where the order does not matter.

Add option to set size of input fields

Description of proposed feature

Input fields currently have a fixed size according to the answer type. The size can be tweaked using custom CSS in the question, but this is a bit cumbersome.

It might be nice to have an option to specify the size of the input field, e.g. for each part.

Suggested by @AviNat in #7

Should the relative error variable in grading criterion accept "== 0"?

Should the relative error variable in the grading criterion accept "== 0"? We tested a simple multiplication and division question with the Number answer type, and the correct answer was marked as incorrect in the preview window - seemingly at random. The problem vanished when we switched the grading criterion to "Relative error < 0.001" or "Absolute error < 0.000001". Is this a bug or working as intended?

The random variables we used were:
firstthing={16,24,28,32,48,56,64};
secondthing={8,10,20,40}

The global variables we used were:
answer=(4*firstthing)/secondthing*7
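If the cause is floating-point rounding (an assumption, not a confirmed diagnosis), an exact "== 0" criterion can fail even though the algebra is exact, because binary floating point cannot represent most decimal fractions exactly. The classic PHP illustration:

var_dump(0.1 + 0.2 == 0.3);              // bool(false): rounding in binary floating point
var_dump(abs(0.1 + 0.2 - 0.3) < 1e-9);   // bool(true): comparing with a small tolerance works

This is why a criterion such as "Relative error < 0.001" behaves more predictably than exact equality.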

Multiple answers and partial credits for algebraic answers

Description of proposed feature

When using algebraic answers, the teacher could give several possible answers and define how many credits the student should get. Currently, there is only one single possibility and the system checks whether the student's formula is equivalent to the one given by the teacher.

How can the new feature be used?

The teacher might ask a student to simplify an expression like a-(a+3)(a+4) where the correct answer is -a^2-6a-12. However, the student might get the sign wrong and therefore calculate a-a^2+7a+12 = 8a-a^2+12. It is clear that they should not get full marks for that, but the answer is not as bad as if they had written nothing at all; at least they managed to do the multiplication. So a teacher might anticipate that wrong answer and give 50% (or whatever) for it.

Error while exporting private data

Description of bug / unexpected behavior

During the export of private data, I get this error:

An exception occurred while calling mod_quiz\privacy\provider::export_user_data.
This means that the mod_quiz plugin has not finished processing data. The exception information is shown below. They can be sent to the plugin developer.

5: Variable "A" is not defined. in substitute_vname_by_variables


#0 /question/type/formulas/question.php(710): qtype_formulas\variables->evaluate_assignments(Object(stdClass), '@5 = @0; @6 = @...')
#1 /question/type/formulas/question.php(178): qtype_formulas_question->get_global_variables()
#2 /question/classes/privacy/provider.php(200): qtype_formulas_question->format_generalfeedback(Object(question_attempt))
#3 /mod/quiz/classes/privacy/provider.php(524): core_question\privacy\provider::export_question_usage($userid, Object(context_module), Array, 33051, Object(mod_quiz_display_options), true)
#4 /mod/quiz/classes/privacy/provider.php(336): mod_quiz\privacy\provider::export_quiz_attempts(Object(core_privacy\local\request\approved_contextlist))
#5 /lib/moodlelib.php(8139): mod_quiz\privacy\provider::export_user_data(Object(core_privacy\local\request\approved_contextlist))
#6 /privacy/classes/manager.php(578): component_class_callback('mod_quiz\\privac...', 'export_user_dat...', Array)
#7 /privacy/classes/manager.php(611): core_privacy\manager::component_class_callback('mod_quiz', 'core_privacy\\lo...', 'export_user_dat...', Array)
#8 /privacy/classes/manager.php(339): core_privacy\manager->handled_component_class_callback('mod_quiz', 'core_privacy\\lo...', 'export_user_dat...', Array)
#9 /admin/tool/dataprivacy/classes/task/process_data_request_task.php(114): core_privacy\manager->export_user_data(Object(core_privacy\local\request\contextlist_collection))
#10 /lib/cronlib.php(359): tool_dataprivacy\task\process_data_request_task->execute()
#11 /lib/cronlib.php(198): cron_run_inner_adhoc_task(Object(tool_dataprivacy\task\process_data_request_task))
#12 /lib/cronlib.php(76): cron_run_adhoc_tasks(1682511421)
#13 /admin/cli/cron.php(178): cron_run()

I do not get an error when I preview the question.

Environment

System Details
  • Moodle version: 4.1.1+ (Build: 20230217)
  • Plugin version: 2023021500
  • PHP version: 8.0.28
  • Browser: na

Additional comments

Just following the advice "The exception information is shown below. They can be sent to the plugin developer."

Remove duplicate values from an array

Description of proposed feature

We should add a function unique() that will remove duplicate values from an array.

How can the new feature be used?

unique([1,2,3,5,2,4,1]) --> [1,2,3,5,4]
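A minimal PHP sketch of such a helper (illustrative; a real implementation in variables.php would also need the plugin's usual argument checking):

function unique(array $list) {
    // array_unique() keeps the first occurrence of each value and preserves order;
    // array_values() reindexes the result so the list has consecutive keys again.
    return array_values(array_unique($list));
}

// unique([1, 2, 3, 5, 2, 4, 1]) --> [1, 2, 3, 5, 4]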

PHP 8.1

Description of bug / unexpected behavior

PHP 8.1 can be used with Moodle 4.1 and 4.2, but certain function calls trigger deprecation warnings (and will throw errors as of PHP 9).

How to reproduce the issue

  1. Install PHP 8.1
  2. Run PHPUnit tests
  3. Run Behat tests
Warnings
PHP Deprecated:  stripcslashes(): Passing null to parameter #1 ($string) of type string is deprecated in variables.php on line 665

PHP Deprecated:  explode(): Passing null to parameter #2 ($string) of type string is deprecated in variables.php on line 603
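A typical remedy for this class of deprecation (a generic sketch, not the actual patch) is to make sure a string rather than null is passed, for example with the null coalescing operator; the variable names below are illustrative:

$clean = stripcslashes($value ?? '');   // pass an empty string instead of null
$parts = explode(',', $list ?? '');     // same for the second parameter of explode()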

Environment

System Details
  • Moodle version: 4.1.3+ (Build: 20230526)
  • Plugin version: 2023042400
  • PHP version: 8.1.18

Signs

Hello!

Is it somehow possible to display the "+" sign on a positive number? I would like to set the following task:
variables:
m = {-4:4:0.1};
t = {-7:7:0.5};
Question text: ... $$ y = {m} ∗ x {t} $$ ...

If t is negative, it should display e.g. y = 4x - 5; if t is positive, y = 4x + 5.

Is that possible somehow? Many thanks for your help!
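For what it's worth, the underlying formatting idea is easy to express in PHP (illustrative only; the question text itself would need equivalent support in the plugin's variable syntax): the sprintf "+" flag forces an explicit sign on positive numbers.

echo sprintf('y = %gx %+g', 4, 5);    // prints: y = 4x +5
echo sprintf('y = %gx %+g', 4, -5);   // prints: y = 4x -5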

display:block in .que.formulas .formulation .formulaspart

Hi

We noticed that in the latest version, the display value of .que.formulas .formulation .formulaspart was changed from inline-block to block. However, in our courses this change breaks the way the parts are displayed:

For example, if the student has to answer whether x is larger or smaller than a certain value, we want the 3 components of the answer to be displayed side by side, like this:
(screenshot)
but with display:block, the components are displayed one under the other, like this:
(screenshot)

Would you mind explaining the change and, more importantly, how we should handle a case like the example above?

Array-type Variable Substitution

I believe that there is a bug in the variable substitution code when working with arrays.

I offer the following Jasmine test case to illustrate the issue.

import { substitute_variables_in_text } from './substitute_variables_in_text'

export function spec() {
    describe('substitute_variables_in_text', function() {
        // The following cases should continue to work.
        describe("working (number)", function() {
            it('{X} where X=1 => 1', function() {
                expect(substitute_variables_in_text('{X}', { X: '1' })).toBe('1')
            })
            it('[{X}] where X=1 => [1]', function() {
                expect(substitute_variables_in_text('[{X}]', { X: '1' })).toBe('[1]')
            })
        })
        // The following (disabled cases) should be fixed.
        describe("broken (arrays)", function() {
            xit('{X} where X=[0,1] => [0,1]', function() {
                expect(substitute_variables_in_text('{X}', { X: ['0', '1'] })).toBe('[0,1]')
            })
            xit('[{X}[0],{X}[1]] where X=[0,1] => [0,1]', function() {
                expect(substitute_variables_in_text('[{X}[0],{X}[1]]', { X: ['0', '1'] })).toBe('[0,1]')
            })
            // These cases demonstrate the current (broken) behavior.
            it('{X} where X=[0,1] => [0,1]', function() {
                expect(substitute_variables_in_text('{X}', { X: ['0', '1'] })).toBe('undefined')
            })
            it('[{X}[0],{X}[1]] where X=[0,1] => [0,1]', function() {
                expect(substitute_variables_in_text('[{X}[0],{X}[1]]', { X: ['0', '1'] })).toBe('[undefined[0],undefined[1]]')
            })
        })
        // The following case should continue to work, for backwards compatibility.
        // In any case, it represents evaluation within the curly braces.
        describe("workaround", function() {
            it('[{X[0]},{X[1]}], X=[0,1] => [0,1]', function() {
                expect(substitute_variables_in_text('[{X[0]},{X[1]}]', { X: ['0', '1'] })).toBe('[0,1]')
            })
        })
    })
}

Apologies for not making a PR; I'm very new to Moodle and have not yet set up a development environment. Please let me know if I can be of further assistance.

Binomialcdf in Formulas Update

Hi all,

Dominique and Philipp are doing a great job maintaining the Formulas question type, especially with the latest updates including normcdf()!

I would like to suggest including the function binomialcdf in future updates, e.g. as described in the attached file!

For my own purposes I gave binomialcdf a return value of -1 if the parameters are not consistent (which could be removed). It should be pasted into variables.php after ncr() and should also appear in the lists of functions and where the parameter list is handled :-)
In a similar way, other probability distributions such as a Poisson PDF/CDF could be added as well!

binomial.txt
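For reference, a minimal PHP sketch of such a function, following the suggestion above (illustrative only; ncr() already exists in variables.php but is repeated here so the sketch is self-contained):

// Binomial coefficient C(n, r); variables.php already provides ncr().
function ncr($n, $r) {
    $result = 1;
    for ($i = 1; $i <= $r; $i++) {
        $result = $result * ($n - $r + $i) / $i;
    }
    return $result;
}

// Cumulative binomial distribution P(X <= k) for n trials with success probability p.
function binomialcdf($n, $p, $k) {
    if ($n < 0 || $p < 0 || $p > 1 || $k < 0 || $k > $n) {
        return -1;   // inconsistent parameters, as proposed above
    }
    $sum = 0;
    for ($i = 0; $i <= $k; $i++) {
        $sum += ncr($n, $i) * pow($p, $i) * pow(1 - $p, $n - $i);
    }
    return $sum;
}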

decbin() and the like are not working

Reported in the comments

Does anyone know why formulas no longer recognizes the decbin or decoct function?

A quick check

a = decbin(15);

yields:
Try evalution error! 1: Variable 'decoct' has not been defined. in substitute_vname_by_variables

Similar errors occur for decoct(), octdec() and bindec().

The problem exists at least since 4.91.

Images get lost when moving question to other context

Description of bug / unexpected behavior

When a question is moved from one context to another, images that are embedded in the part's text get lost.

Expected behavior

The images should still be there, obviously.

How to reproduce the issue

  1. Create a new question. Embed an image in the text of part 1. Save the question. Preview the question to verify the image is there.
  2. Move the question to another course. (Moving it to a different category of the same course is not enough.)
  3. Preview the question again. The image is not there anymore.

The same happens for files embedded in parts' feedback texts.

Additional comments

When moving the files, we use the $questionid:

public function move_files($questionid, $oldcontextid, $newcontextid) {
    $fs = get_file_storage();
    parent::move_files($questionid, $oldcontextid, $newcontextid);
    $fs->move_area_files_to_new_context($oldcontextid,
            $newcontextid, 'qtype_formulas', 'answersubqtext', $questionid);
    $fs->move_area_files_to_new_context($oldcontextid,
            $newcontextid, 'qtype_formulas', 'answerfeedback', $questionid);
    $fs->move_area_files_to_new_context($oldcontextid,
            $newcontextid, 'qtype_formulas', 'partcorrectfb', $questionid);
    $fs->move_area_files_to_new_context($oldcontextid,
            $newcontextid, 'qtype_formulas', 'partpartiallycorrectfb', $questionid);
    $fs->move_area_files_to_new_context($oldcontextid,
            $newcontextid, 'qtype_formulas', 'partincorrectfb', $questionid);
    $this->move_files_in_combined_feedback($questionid, $oldcontextid, $newcontextid);
    $this->move_files_in_hints($questionid, $oldcontextid, $newcontextid);
}

However, when saving the question, we use $ans->id which is the part's ID.

$ans->subqtext = $this->import_or_save_files($subqtextarr, $context, 'qtype_formulas', 'answersubqtext', $ans->id);

Séparateur décimal / Decimal separator

Hello,

(Originally posted in French.) The default decimal separator in all Moodle questions is either the dot or the comma, and is therefore compatible with the user's language. In this question type, only the dot is allowed; otherwise the answer is considered wrong. It would be really appreciated to also allow the comma, as French requires, but I don't know how feasible this is or how much work it would take.
Thank you!

Éric
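One possible approach (a sketch, not the plugin's actual code) would be to normalise a comma to a dot before the numerical answer is parsed; whether this is safe depends on where commas can legitimately appear in an answer:

// Hypothetical pre-processing step; $studentanswer is an illustrative name.
$normalized = str_replace(',', '.', trim($studentanswer));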

Allow grading in parts with empty input.

Hi,

This issue has come up at least twice that I can see on the Moodle forums, in 2018 and recently in September 2020. I am moving the discussion from there to here, which seems more appropriate. A brief summary of the issue follows.

At the moment, it is very convenient to use multiple inputs within a part of a Formulas question, both to align the display of the parts and to do combined grading on them. For example, I can align MCE-type and input-type boxes as below (screenshot: Screen Shot 2020-10-09 at 11 48 05) so that it looks like a student can input, say, an interval or a set of real numbers with two elements as an answer. It can be a bit tricky to get students to input expressions like [a,b] into an input box and compare them, either with regular expressions or in another question type like STACK (even though it has commands like cc(a,b) that students recognise). There are lots of tradeoffs between ease of input and ease of parsing and grading the answer.

However, there is a huge drawback. Suppose the four input boxes belong to the same part, say {#1}, and are displayed like {_0:choices:MCE}{_1} , {_2} {_3:choices2:MCE}, with the answer being, for example, a numerical list [0,1,2,0]. If any of the four input boxes is left empty (as is likely to be the case; students do it all the time on paper assignments), the grading variables and marking are completely skipped and the answer is marked wrong.

The possible workarounds are quite involved, especially if we assume the average person creating the questions just wants to use the WYSIWYG editors and learn a minimal syntax for inserting parts and input boxes (the moodleformulas website is quite good at explaining these things). For instance, one has to split the four input boxes above into four separate parts and manually edit the HTML display style to get them to align; in theory you then get marks for each input even if other boxes are blank. That is, unless the grading of the inputs is codependent: imagine the answer is a set and you would like the student to be able to input either {0,1} or {1,0}. Now you can't really just split them into multiple parts, and some workarounds might involve JavaScript to copy inputs into hidden extra inputs.

Typically, one workaround leads to many more workarounds being needed. For instance, even after getting the question to display the same way using inline divs, once the question is answered the separate parts display their own feedback, which already breaks the HTML layout in a way it wouldn't if they were all one part. (Screenshot: Screen Shot 2020-10-09 at 11 48 23)

It would be great if there were a way to ignore blank input boxes. One could have a special character, comparison or function to check whether an input is empty but nonetheless proceed to the grading, so that parts with multiple inputs can be marked even if some inputs are empty.

If it is too big a change and could break old questions, then maybe allow this as a setting/toggle/checkbox that either keeps the old behaviour or allows empty inputs to still proceed to grading.

import and export features should use the @_file_upload tag

The Behat import feature uploads a file and the export feature downloads a file. This works fine when Selenium is running on the same machine where the tests are running, but when different machines are used, the steps fail.
To avoid this, there is a @_file_upload tag. When this tag is added to the two feature files, all tests run fine, even when using different machines.

Fix behat

Behat tests are now failing

With some changes, Behat is happy in Firefox and Chrome:

$ moodle-plugin-ci behat
RUN Behat features for qtype_formulas
Running single behat site:
Moodle 3.8.3+ (Build: 20200618), 898415efec81bc62bde20b862d1b139407a7329a
Php: 7.3.14, pgsql: 9.6.17, OS: Linux 5.0.0-1031-gcp x86_64
Server OS "Linux", Browser: "firefox"
Browser specific fixes have been applied.
Started at 23-06-2020, 04:06
...................................................................... 70
............................................................
7 scenarios (7 passed)
130 steps (130 passed)
3m18.62s (72.56Mb)
The command "moodle-plugin-ci behat" exited with 0.
155.14s$ moodle-plugin-ci behat --profile chrome
RUN Behat features for qtype_formulas
Running single behat site:
Moodle 3.8.3+ (Build: 20200618), 898415efec81bc62bde20b862d1b139407a7329a
Php: 7.3.14, pgsql: 9.6.17, OS: Linux 5.0.0-1031-gcp x86_64
Server OS "Linux", Browser: "chrome"
Browser specific fixes have been applied.
Started at 23-06-2020, 04:10
...................................................................... 70
............................................................
7 scenarios (7 passed)
130 steps (130 passed)
2m24.68s (72.60Mb)

Question attempts are not "self-contained"

Description of bug / unexpected behavior

Currently, reviewing a quiz attempt may fail in certain cases where a Formulas Question has been modified between the attempt and the review.

Expected behavior

Changes to a question should not make a finished attempt or an answered question undisplayable.

How to reproduce the issue

  1. Create a simple formulas question with a global variable A=1 and set the answer for part 1 to A.
  2. Create a quiz, add the question and attempt the quiz as a student.
  3. Go to the question bank. Change the global variable definition to B=1 and the model answer for part 1 to B. The question is still in a valid state.
  4. As the student, go to the quiz and review your attempt. You will get the following error:
Exception - Variable 'B' has not been defined. in substitute_vname_by_variables

Debug info:
Error code: generalexceptionmessage
Stack trace
line 703 of /question/type/formulas/variables.php: Exception thrown
line 1045 of /question/type/formulas/variables.php: call to qtype_formulas\variables->substitute_vname_by_variables()
line 799 of /question/type/formulas/question.php: call to qtype_formulas\variables->evaluate_general_expression()
line 652 of /question/type/formulas/question.php: call to qtype_formulas_question->get_evaluated_answer()
line 143 of /question/type/formulas/renderer.php: call to qtype_formulas_question->grade_responses_individually()
line 104 of /question/type/formulas/renderer.php: call to qtype_formulas_renderer->get_part_image_and_class()
line 69 of /question/type/formulas/renderer.php: call to qtype_formulas_renderer->part_formulation_and_controls()
line 385 of /question/engine/renderer.php: call to qtype_formulas_renderer->formulation_and_controls()
line 109 of /question/engine/renderer.php: call to core_question_renderer->formulation()
line 113 of /question/behaviour/behaviourbase.php: call to core_question_renderer->question()
line 907 of /question/engine/questionattempt.php: call to question_behaviour->render()
line 461 of /question/engine/questionusage.php: call to question_attempt->render()
line 1765 of /mod/quiz/attemptlib.php: call to question_usage_by_activity->render_question()
line 1727 of /mod/quiz/attemptlib.php: call to quiz_attempt->render_question_helper()
line 187 of /mod/quiz/renderer.php: call to quiz_attempt->render_question()
line 56 of /mod/quiz/renderer.php: call to mod_quiz_renderer->questions()
line 262 of /mod/quiz/review.php: call to mod_quiz_renderer->review_page()

Checking the database shows that in mdl_question_attempts the (evaluated) model answer is available. The offending variable stems from mdl_qtype_formulas_answers where the model answer is stored as B. When fetching this, we should probably also fetch the corresponding record from mdl_qtype_formulas_options, because it contains the updated variable definition B=1.

Additional comments

We have to be very careful and make sure we change this in a way that will still take into account changes, because otherwise automatic regrading (e.g. when a teacher decides they want to give partial credit for some additional answer they did not expect before) will stop working.

Function to determine number of significant figures (e.g. in answer)

Description of proposed feature

I think it would be helpful to have a function that returns the number of significant figures of some given number, e.g. an answer. This should probably be done after converting to a string in order to catch trailing zeros.

As significant figures convey meaning in physics, I would like to attribute points for rounding numeric answers to a meaningful number of significant figures.

How can the new feature be used?

Say the function was called numsigfig, then it should be used as:

numsigfig(1234) => 4
numsigfig(1.234) => 4
numsigfig(0200) => 3
numsigfig(0.200) => 3

Additional comments

I currently use the following code to determine the number of significant figures nSigFigAns in an answer:

origAns = _0;

expAns = floor( log10(abs(origAns)));
mantAns = origAns / ( 10**( expAns ));
nSigFigAns = 1;
for (i:[0:6]) {
    shiftedNum = round( 10**(i) * abs(mantAns) , 6 );
    divTest = round( shiftedNum - floor(shiftedNum) , 6);
    nSigFigAns = ( divTest > 1e-5 ) ? nSigFigAns+1 : nSigFigAns ;
}

I don't think it would be doable using Formulas' current string functions.
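As a point of comparison, a string-based helper in PHP would be fairly short. A sketch, with the hypothetical name numsigfig(); trailing zeros can only be detected if the value is passed in as a string:

function numsigfig(string $number): int {
    $digits = preg_replace('/[eE][+-]?\d+$/', '', $number);  // drop any exponent part
    $digits = str_replace(['+', '-', '.'], '', $digits);     // drop sign and decimal point
    $digits = ltrim($digits, '0');                           // leading zeros are not significant
    return strlen($digits);
}

// numsigfig("1234")  --> 4
// numsigfig("1.234") --> 4
// numsigfig("0.200") --> 3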

Disable browser autocomplete for answer fields

Hello Philip,

Below is a Formulas question with 26 parts, one for each letter of the alphabet. The answers are of type "Algebraic formula" and are in fact the names of variables (a small trick to circumvent the absence of string answers). The question behaviour is set to "Adaptive mode". I added some CSS to format the layout; among other things, I hid the answer box information (answer type and interpretation of input).

Here is an unexpected behavior: when clicking in an empty answer box, previously checked answers are displayed below the answer box. This can indeed be useful to students. Did you add this feature or was it always there?

The XML file of the question is attached below.

Hello Jean-Michel,

Were you aware of this behaviour?

(Screenshot attached: GitHub_20221127_2018)

Adult Literacy Fundamental English - Shantel Ivits-Writing - Formulas-20221127-2042.txt

Scheduled task failed: Updating overdue quiz attempts , Variable 'R' has not been defined . in substitute_vname_by_variables

Every few weeks we get errors reported at cron level for the scheduled task "Updating overdue quiz attempts".
My guess is that teachers change the question after the quiz has started, and students' attempts reference a version of the question without the relevant variables.

CRON log debugging:
php admin/tool/task/cli/schedule_task.php --execute="\mod_quiz\task\update_overdue_attempts" --showdebugging
Execute scheduled task: Updating overdue quiz attempts (mod_quiz\task\update_overdue_attempts)
Looking for quiz overdue quiz attempts...
... used 140 dbqueries
... used 1.14586186409 seconds
Scheduled task failed: Updating overdue quiz attempts (mod_quiz\task\update_overdue_attempts),Variable 'R' has not been defined. in substitute_vname_by_variables
Backtrace:

  • line 595 of /question/type/formulas/variables.php: call to qtype_formulas_variables->substitute_vname_by_variables()
  • line 873 of /question/type/formulas/question.php: call to qtype_formulas_variables->evaluate_general_expression()
  • line 723 of /question/type/formulas/question.php: call to qtype_formulas_question->get_evaluated_answer()
  • line 835 of /question/type/formulas/question.php: call to qtype_formulas_question->grade_responses_individually()
  • line 266 of /question/behaviour/adaptivemultipart/behaviour.php: call to qtype_formulas_question->grade_parts_that_can_be_graded()
  • line 370 of /question/behaviour/adaptivemultipart/behaviour.php: call to qbehaviour_adaptivemultipart->process_parts_that_can_be_graded()
  • line 203 of /question/behaviour/adaptivemultipart/behaviour.php: call to qbehaviour_adaptivemultipart->process_finish()
  • line 1355 of /question/engine/questionattempt.php: call to qbehaviour_adaptivemultipart->process_action()
  • line 1395 of /question/engine/questionattempt.php: call to question_attempt->process_action()
  • line 859 of /question/engine/questionusage.php: call to question_attempt->finish()
  • line 2171 of /mod/quiz/attemptlib.php: call to question_usage_by_activity->finish_all_questions()
  • line 2005 of /mod/quiz/attemptlib.php: call to quiz_attempt->process_finish()
  • line 79 of /mod/quiz/cronlib.php: call to quiz_attempt->handle_if_time_expired()
  • line 61 of /mod/quiz/classes/task/update_overdue_attempts.php: call to mod_quiz_overdue_attempt_updater->update_overdue_attempts()
  • line 163 of /admin/tool/task/cli/schedule_task.php: call to mod_quiz\task\update_overdue_attempts->execute()

Access to grading variables and answers in part's feedback?

In the general feedback, we can use all global variables. In the part's feedback, we can also use the part's local variables.

However, we are currently not able to use grading variables or the user's answers in the feedback. This would be a nice improvement to an already great plugin. It would allow, for example, giving more specific feedback in the case of partially correct answers.

(I also reported this to the Moodle Tracker. What platform do you prefer?)

User-specific unique variables and a seed

Dear developers,
I have enjoyed using Formulas in several courses. It is a wonderful tool for shorter questions. However, more complex tasks may require hours of calculation and interruptions of the work. Once a browser closes, the random variables are lost, and on the next attempt the random set is different. It would be nice if a student had the same set of random variables every time.

The UserID in Moodle is a unique identifier that could be used as a seed for the random number generator. I could imagine an extension such as:
Random variables box: a={1:10:0.1 | UserID}; b={1:10:0.1 | UserID+1};
which would seed the generation of the variable with the UserID. The seed could be further increased or decreased in order to yield a different "random" number from the range. The numbers a and b would then be the same every time for the same user.

Best regards. Vit
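For illustration, the underlying idea in plain PHP (a sketch only; the helper and its integration into the plugin's variable instantiation are hypothetical):

// Deterministic per-user "random" value from a range (min:max:step), seeded by the user id.
function seeded_random(int $userid, float $min, float $max, float $step): float {
    mt_srand($userid);                                // same user id => same sequence
    $steps = (int) floor(($max - $min) / $step);
    return $min + mt_rand(0, $steps) * $step;
}

// e.g. $a = seeded_random($USER->id, 1, 10, 0.1);    // identical on every attempt for this user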

Feature requests

Here is a list of possible feature requests ranked, in my opinion, in decreasing order of priority (higher priorities first). I will edit this list as time goes on. Where not already done, separate issues for discussion could be created from this list.

  • #11.
  • #45.
  • Function sen(), asen(), senh() and asenh(): sin(), asin(), sinh() and asinh() in Spanish. To avoid conflict with corresponding existing variable names that users might have used in other languages such as English, French, etc., the use of the functions sen(), senh() and asenh() could be limited to sites where the language is set to Spanish (and also other languages such as Portuguese?).
  • Fix the fmod() function https://dynamiccourseware.org/course/view.php?id=31&section=30.
  • #48
  • Trailing zeros in the sigfig().
  • Affine unit conversion https://moodle.org/mod/forum/discuss.php?d=416824.
  • #21.
  • #17.
  • Improve, if possible, the combined feedback for each part https://tracker.moodle.org/browse/CONTRIB-7280 https://tracker.moodle.org/browse/CONTRIB-7686
  • #7.
  • Answer boxes inside MathJax equations.
  • Matrix operations.
  • Improve string functions.
  • Reconsider single check button for all parts vs one check button for each part.
  • Question over more than one page.
  • Share variables between questions #29.
  • Third option for "Each attempt builds on the last", but probably relates more to quiz behaviour by Tim Hunt and others #29.
  • Add decbin(), decoct(), dechex(), and bindec(), octdec(), hexdec(). Although it is straightforward to use the PHP functions (I have done it on my localhost), these PHP functions only handle positive integers, not negative or real values. In PHP, decbin(), decoct() and dechex() yield string values. Reading a student answer in hex, which contains a-f characters, requires that the Formulas question can read strings. See also https://moodle.org/mod/forum/discuss.php?d=442216.
  • #106
  • #108
  • 2023-02-18 Modify the Algebraic formula answer to retain only 15 significant figures (or fewer) for _a and _r in order to avoid computation failure with large numbers. See https://moodle.org/mod/forum/discuss.php?d=444016.
  • 2023-02-18 Add a significant figure function that outputs numbers, not strings. It could be called sifDig. See https://stackoverflow.com/questions/37618679/format-number-to-n-significant-digits-in-php for the code with and without trailing zeros. This could perhaps be useful with large-number computations (see the item above).
