
yeoman-test


Test utilities for Yeoman generators

Installation

$ npm install --save-dev yeoman-test

Install the target environment and generator:

$ npm install --save-dev yeoman-generator@xxx yeoman-environment@xxx

Usage

Usage with the convenience last `RunResult` instance (`result`):

import helpers, { result } from 'yeoman-test';

describe('generator test', () => {
  describe('test', () => {
    beforeEach(async () => {
      await helpers
        .run(                   // instantiates RunContext
          'namespace',             // namespace or generator
          {},                      // test options
          {}                       // environment options
        )
        [.cd(dir)]                  // runs the test inside a non-temporary dir
        [.onTargetDirectory(dir => {})] // prepares the test dir
        [.withGenerators([])]       // registers additional generators
        [.withLookups({})]          // runs Environment lookups
        [.withOptions({})]          // passes options to the generator
        [.withLocalConfig({})]      // sets the generator config as soon as it is instantiated
        [.withAnswers()]            // simulates the prompt answers
        [.withMockedGenerators(['namespace', ...])]      // adds mocked generators to the namespaces
        [.withFiles({
          'foo.txt': 'bar',
          'test.json': { content: true },
        })]                         // adds files to mem-fs
        [.withYoRc({ 'generator-foo': { bar: {} } })]    // adds config to .yo-rc.json
        [.withYoRcConfig('generator-foo.bar', {})]       // same as above
        [.commitFiles()]            // commits mem-fs files to disk
        [.onGenerator(gen => {})]   // does something with the generator
        [.onEnvironment(env => {})]; // does something with the environment

      [await result.create('another-generator').run();] // instantiates a new RunContext in the same directory
    });

    it('runs correctly', () => {
      // runs assertions using mem-fs.
      [result.assertFile('file.txt');]
      [result.assertNoFile('file.txt');]
      [result.assertFileContent('file.txt', 'content');]
      [result.assertEqualsFileContent('file.txt', 'content');]
      [result.assertNoFileContent('file.txt', 'content');]
      [result.assertJsonFileContent('file.txt', {});]
      [result.assertNoJsonFileContent('file.txt', {});]
    });
  });
});

Generator compose:

import assert from 'assert';
import helpers, { result } from 'yeoman-test';

describe('my-gen', () => {
  before(() => helpers.run('my-gen').withMockedGenerators(['composed-gen']));
  it('should compose with composed-gen', () => {
    assert(result.mockedGenerators['composed-gen'].calledOnce);
  });
});

Generic test folder:

import helpers, { result } from 'yeoman-test';

describe('generic test', () => {
  before(() => helpers.prepareTemporaryDir());
  it('test', () => {
    result.assert...;
  });
});

See our api documentation for latest yeoman-test release.

See our api documentation for yeoman-test 5.0.1. Use 5.x for yeoman-environment 2.x support.

See our api documentation for yeoman-test 2.x.


License

MIT © The Yeoman Team


yeoman-test's Issues

How to test the 'yosay' calls?

I am trying to test the yosay calls to confirm that they are correctly executed.

I am using the default Jest test.

__tests__/app.js

const path = require('path');
const assert = require('yeoman-assert');
const helpers = require('yeoman-test');

describe('generator-myown-generator:app', () => {
    beforeAll(() => {
      return helpers.run(path.join(__dirname, '../generators/app'))
        .withPrompts(prompts)
        .withArguments(['noinstall'])
        // Construct a test scenario
        .inTmpDir(dir => {
          console.log('App: Files saved on temporal directory:', dir);
        });
    });
    it('Calls the yosay', () => {
        console.log = jest.fn();
        expect(console.log).toBeCalled();
    });
});

generators/app/index.js

class AppClass extends Generator {
  prompting() {
    this.log(yosay(
      'Welcome to the awesome ' + chalk.red('myown-generator') + ' generator!'
    ));
  }
}

But it did not work. I also tried with the --verbose argument, and the yosay output never shows up.
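As a generic workaround (a sketch using only Node core, not a yeoman-test API), a test can intercept process.stdout.write around the generator run and assert on the captured output. The message below merely stands in for the generator's yosay() call:

```javascript
// Sketch: capture writes to process.stdout so a test can assert on the
// welcome message. The generator name here is illustrative only.
const written = [];
const originalWrite = process.stdout.write.bind(process.stdout);
process.stdout.write = (chunk, encoding, cb) => {
  written.push(String(chunk));
  return originalWrite(chunk, encoding, cb);
};

// ... run the generator here; this console.log stands in for yosay():
console.log('Welcome to the awesome myown-generator generator!');

process.stdout.write = originalWrite; // always restore the original
const output = written.join('');
const sawWelcome = output.includes('myown-generator');
```

Restoring the original write in a finally/afterEach block matters, since a leaked spy would swallow all subsequent test output.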

[feature request] withEnd helper

My generator calls npm test at the end on the generated package:

end() {
  this.spawnCommand('npm', ['test']);
}

But this causes a problem when I run the generator tests themselves because it is also calling end(). (The generated package calls webpack as part of a pretest build script causing the generator tests to blow up when they can't find a dependency).

I'd like a way to tell the generator tests to ignore either end() or the spawn command.

Perhaps something like this:

  beforeAll(() => {
    return helpers.run(path.join(__dirname, '../generators/app'))
      .withPrompts({someAnswer: true})
      .withEnd( false )
  });

Or if there is another way to do this, please let me know.

Package: https://github.com/mitchallen/generator-mitchallen-react-component

Thanks.

It should remove tmp folder after the tests

yeoman-test creates a tmp folder to execute in. This tmp folder is not cleaned up after runs and causes our build machines to run out of disk space very quickly.

It would be great if there was a yeoman-test function that could be called to clean up (rimraf the folder it used) in an afterEach block.

Defaults are ignored in testing when ~/.yo-rc-global.json is defined

When I run using withPrompts({foo: 'bar'}) and I also have something that looks like this in my yo prompt:

prompt.push({
    type: 'confirm',
    name: 'baz',
    message: `Use baz?`,
    default: false,
    store: true
});

It answers y (true) instead of using the default value. It does this because in my ~/.yo-rc-global.json I have "qux": true, having tested once with this. If I change my personal default to false it then answers N. This makes the tests user-dependent and unstable. Any advice on ignoring my personal globals?

Test execution freezes on Windows 10 nodeJS 10

Possibly related issues: #43 and #15
Environment: Windows 10, nodeJS 10.14.1 (LTS), mocha

When I have just a single unit test, where the helpers.run occurs only once, everything is fine.

Example:

    before(() => helpers.run(__dirname)
      .withPrompts({
        name: 'mylib',
        description: 'my lib is awesome',
      }));

When I add a new describe with a different set of prompts, the tests never complete; the run just freezes.

Example:

  describe('unscoped', () => {
    before(() => helpers.run(__dirname)
      .withPrompts({
        name: 'mylib',
        description: 'my lib is awesome',
      }));

    it('should create the expected files', () => {
      assert.file([
        '.editorconfig',
        '.eslintrc.js',
      ]);
    });

    it('should have the expected package name', () => {
      assert.jsonFileContent('package.json', {
        name: 'mylib',
      });
    });
  });

  describe('scoped', () => {
    before(() => helpers.run(__dirname)
      .withPrompts({
        name: 'mylib2',
        description: 'my lib is awesome',
        scope: '@ngeor',
      }));

I can experience the same problem by changing the before to beforeEach in the first describe.

I spent a lot of time trying to troubleshoot it with no success. This also happens with Jest (I tried it with the generator produced by generator-generator).

Then I found issues #43 and #15 and I tried to run my tests within Docker. Everything worked fine... also my build in Travis runs fine, so this is a Windows-specific issue.

My temporary workaround is to run the tests locally with Docker, but this slows things down.

My generator is this one https://github.com/ngeor/generator-nodejs

createDummyGenerator throws errors in 1.7.1

Works fine in 1.7.0, but in 1.7.1

const helpers = require("yeoman-test");

...
    mockery.registerMock(require.resolve("..."), helpers.createDummyGenerator());

throws TypeError: Generator.extend is not a function

see full example

generator.run() should not fail silently if there is a problem with running generator

Hey guys,

I've recently started writing my own generator, and one of the problems I've encountered is the behaviour of RunContext when running the generator fails for some reason. In my case one of the parameters was missing in the template, and it took quite some time to figure out why this.fs.copyTpl(...) was preventing the generator in my tests from receiving the end event.

Would be great to make test helpers more developer-friendly - not sure if it would require changes in the core library.

Prompt helper function/option `validate` is not getting executed when using `withPrompts`

Prompt:

let prompts = [
   {
      name: 'hello',
      type: 'input',
      validate: function(input) {
         // Not getting executed with the `yeoman-test` unit test
         return "Wrong Input";
      }
   }
]

Unit test:

   helpers.run(path.join(__dirname, '../generators/app'))
        .withPrompts({hello: 'how are you'})       
        .on('end', done);

Looks like the inquirer module runs the validate on submit/enter event.
https://github.com/SBoudrias/Inquirer.js/blob/33403fca0078b4e1dcd4b0e185598cfe88550feb/lib/prompts/input.js#L39

Perhaps the dummy prompt provided by yeoman-test has to be extended to deal with keystroke events and other functionality?

How is one supposed to test a generator using different prompts and options?

This seems like such a simple scenario, but I can't get it to work.

I hoped to test the generator's behavior with regard to one particular question by using two separate RunContext objects in two separate tests with differing prompt answers. But when I run the tests, the whole test process just hangs.

The documentation on the Yeoman website isn't of much help. It doesn't have any examples of tests that give the different prompt answers on different tests.

EDIT: The freezing problem may be an issue with Node 10. I will upgrade my node version to see if it resolves the issue.

EDIT2: I have upgraded my Node version but according to #52 this issue affects all versions of Node >10 on Windows.

Doesn't seem to work with generators transpiled with Babel?

If my generator is transpiled through Babel and exports like:

exports.default = class extends _yeomanGenerator2.default {

I get this when I run tests:

Error: Unhandled "error" event.

But if I change it to:

module.exports = class extends _yeomanGenerator2.default {

The tests run properly. Anyone know why this is?
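A likely explanation: Babel compiles export default to exports.default, while a plain require() returns the whole exports object, so the environment ends up trying to instantiate an object rather than the class. A minimal sketch of the shapes involved (names are illustrative):

```javascript
// Sketch: Babel compiles `export default class ...` roughly to this shape,
// so `require()` yields { default: Class }, not the class itself.
const babelStyleExports = { default: class MyGenerator {} };

// A CommonJS consumer that expects `module.exports` to be the class
// gets an object instead, unless it unwraps `.default`:
const naive = babelStyleExports;                              // not a class
const interop = babelStyleExports.default || babelStyleExports; // the class

const naiveIsClass = typeof naive === 'function';
const interopIsClass = typeof interop === 'function';
```

This matches the observed fix: assigning the class to module.exports directly gives the consumer a constructable value without any unwrapping.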

ComposeWith subgenerator error.

When I test the generator manually with yo myGenerator after npm link in the myGenerator home directory, the generator executes the subgenerator composeWith('mygenerator:subgen') after prompting, and everything works as it should. But when I run the generator with helpers.run it throws an error:

     Error: You don't seem to have a generator with the name mygenerator:subgen installed.
You can see available generators with npm search yeoman-generator and then install them with npm install [name].
To see the 1 registered generators run yo with the `--help` option.
      at Environment.create (node_modules/yeoman-environment/lib/environment.js:296:7)
      at composeWith (node_modules/yeoman-generator/lib/base.js:625:26)
      at module.exports.generator.Base.extend._subgenerator (generators/app/index.js:9:2219)
      at .<anonymous> (generators/app/index.js:9:1312)

How can I set up my tests so that they run the local subgenerator?

.inTmpDir with Jest

I'm using Jest to test create-graphql

I would like to use inTmpDir to move some files (fixtures) that will be read by our generator to create some GraphQL files based on mongoose schemas

That's my test: (I would like to test if generate a GraphQL ObjectType from a mongoose schema works)

it('generate a type with Schema', async () => {
  const folder = await typeGenerator
    .inTmpDir((dir) => {
      fs.copySync(path.join(__dirname, '../fixture/Post.js'), path.join(dir, 'src/model/Post.js'));
    })
    .withArguments('Post --schema Post')
    .toPromise();

  const destinationDir = getConfigDir('type');
  const destinationTestDir = getConfigDir('type_test');

  assert.file([
    `${destinationDir}/PostType.js`, `${destinationTestDir}/PostType.spec.js`,
  ]);

  const files = {
    type: getFileContent(`${folder}/${destinationDir}/PostType.js`),
    typeTest: getFileContent(`${folder}/${destinationTestDir}/PostType.spec.js`),
  };

  expect(files).toMatchSnapshot();
});

That's my error:

Error: ENOENT: no such file or directory, stat 'create-graphql/packages/generator/generators/type/fixture/Post.js'
at Error (native)
at Object.fs.statSync (fs.js:987:18)
at null.<anonymous> (create-graphql/node_modules/graceful-fs/polyfills.js:297:22)
at null.statSync (create-graphql/packages/generator/node_modules/graceful-fs/polyfills.js:297:22)
at Object.copySync (create-graphql/packages/generator/node_modules/fs-extra/lib/copy-sync/copy-sync.js:27:84)
at RunContext.<anonymous> (create-graphql/packages/generator/generators/type/__tests__/TypeGenerator.spec.js:79:31)
at create-graphql/packages/generator/node_modules/lodash/lodash.js:5181:46
at create-graphql/packages/generator/node_modules/yeoman-test/lib/index.js:94:5
at next (create-graphql/packages/generator/node_modules/rimraf/rimraf.js:74:7)
at CB (create-graphql/packages/generator/node_modules/rimraf/rimraf.js:110:9)

Typescript definition files

Would you consider adding TypeScript definition files to the package ? I can contribute, I'm writing them at the moment.

Allow for step between tmpdir created and generator run

Allowing for a step between the tmpdir and generator run allows for testing of situations when there are files already in a directory and how the script reacts to them. Can this please be added or documented if this is already possible?

Thanks!

Breaks on node 4.7.0

I'm having some trouble running my tests on our Jenkins instance, which is (for political reasons) running node v4.7.0. I'm running the following test:

describe('ffe-generator', function() {
    it('creates projects with green builds', (done) => {
        helpers.run(path.join(__dirname, 'app'))
            .withOptions({
                skipInstall: false,
            })
            .withPrompts({
                name: 'test-component',
                description: 'test description',
                repourl: 'https://test.url',
            })
            // .then <-- includes the test case, but fails without it as well, so omitted
    });
});

I even rewrote the entire test to ES5, but that made no difference.

This happened on the newest version of both yeoman-generator and yeoman-test.

The stack trace is as follows:

Uncaught SyntaxError: Unexpected token {
      at exports.runInThisContext (vm.js:53:16)
      at require (internal/module.js:12:17)
      at Object.defineProperty.get [as ffe:app] (node_modules/yeoman-test/node_modules/yeoman-environment/lib/store.js:40:23)
      at Store.get (node_modules/yeoman-test/node_modules/yeoman-environment/lib/store.js:64:35)
      at Environment.get (node_modules/yeoman-test/node_modules/yeoman-environment/lib/environment.js:261:21)
      at Environment.create (node_modules/yeoman-test/node_modules/yeoman-environment/lib/environment.js:296:24)
      at RunContext._run (node_modules/yeoman-test/lib/run-context.js:90:29)
      at RunContext.<anonymous> (node_modules/yeoman-test/lib/run-context.js:58:10)
      at node_modules/yeoman-test/node_modules/lodash/lodash.js:5184:35
      at node_modules/yeoman-test/lib/index.js:94:5
      at next (node_modules/yeoman-test/node_modules/rimraf/rimraf.js:74:7)
      at CB (node_modules/yeoman-test/node_modules/rimraf/rimraf.js:110:9)
      at node_modules/yeoman-test/node_modules/rimraf/rimraf.js:136:14
      at FSReqWrap.oncomplete (fs.js:82:15)

The first relevant error, in yeoman-environment, references this line in store.js. I really can't see anything wrong with it - so I'm a bit unsure what to do here.

Thoughts?

New release on NPM for v2?

Hi there,

It seems there's a commit for v2, but no corresponding release on NPM. Could this be rectified please? 😄

Local config interferes with tests

If I am developing a generator and npm link it so I can run it locally, the stored config will interfere with the unit tests.

The prompts that have store: true will also use their stored values when running the tests in jest. Even passing .withLocalConfig({}) doesn't seem to alleviate the issue: the stored config is used.

Even selecting "Clear local config" in yo doesn't help.

It seems like some kind of caching issue that I can't find the root cause of. Any ideas?

Timeout on appveyor

In the code below, when run on AppVeyor, "starting test" is printed but "ready to go" and "inside then" are never printed. The error is a timeout. Even if I set the timeout to a big number (30000), it still times out.

It runs fine in Travis.

describe(`app git tests`, () => {
  it.only('should use current and parent dir as repo name and org when it is not a git repo', () => {
    console.log('starting test');
    let generator;
    return helpers.run(path.join(__dirname, '../generators/app'))
      .withOptions({
        skipConfiguring: true,
        skipDefault: true,
        skipWriting: true,
        skipInstall: true,
        skipGit: true
      })
      .on('ready', (gen) => {
        console.log('ready to go');
        generator = gen;
      })
      .toPromise()
      .then((dir) => {
        console.log('inside then', dir);
        ...
      });
  });
});

Here is the code:
https://github.com/typings/generator-typings/blob/626482658015871764f6a0eeac5e75dbdc8d6973/test/git.spec.js

Here is test result:
https://ci.appveyor.com/project/unional/generator-typings/build/1.0.11

`filter` function is not executed by test

I noticed that the filter is not being executed by the tester. I changed this line to

return Promise.resolve(this.question.filter ? this.question.filter(answer) : answer);

And now everything seems to be working as expected. Not sure if it's that easy to fix though.
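The patched behaviour can be exercised in isolation. This sketch is a simplified stand-in (not yeoman-test's actual code) for applying a question's filter to a mocked answer:

```javascript
// Sketch: mimic the patched dummy prompt, which applies the question's
// `filter` to the mocked answer (when present) before resolving it.
function applyFilter(question, answer) {
  return question.filter ? question.filter(answer) : answer;
}

const withFilter = { name: 'hello', filter: (s) => s.trim().toUpperCase() };
const noFilter = { name: 'plain' };

const filtered = applyFilter(withFilter, '  hi  ');   // filter applied
const unfiltered = applyFilter(noFilter, '  hi  ');   // passed through as-is
```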

Test execution freezes with a NodeJS version >= 10.12.0

Related issues: #51, feathers-plus/generator-feathers-plus#103, yeoman/generator#1098
Environment: Windows 10, NodeJS >= 10.12.0, mocha 5.2.0

When testing with the yeoman helper, I discovered that when several hooks call helpers.run with prompts, execution locks up at the same point.

Example:

beforeEach(async function() {
  await helpers.run(path.join(__dirname, '../generators/myGenerator'))
    .inDir(path.join(__dirname, 'tmp'))
    .withPrompts({
      name: 'aName',
      confirmation: true
    });
});

This happens using CMD and PowerShell (strangely, in each of them the tests lock up at different points). However, using bash, the tests finish successfully.


Debugging the tests, I found out that the prompting method runs fine and resolves its promise.
The problem is that the writing method never runs the second time helpers.run is called inside the beforeEach hook.

Finally, changing the prompts to arguments was the workaround I found, even though it is not the final solution I want (this works in all environments).

Example:

beforeEach(async function() {
  await helpers.run(path.join(__dirname, '../generators/myGenerator'))
    .inDir(path.join(__dirname, 'tmp'))
    .withArguments(['-n', 'aName']);
});


1.9.0 fails our tests

Jenkins build failing today with the following error:

/app/packages/generator-vj/node_modules/yeoman-test/lib/run-context.js:121
  this.generator.run().catch(err => this.emit('error', err));
                            ^

TypeError: this.generator.run(...).catch is not a function
    at RunContext._run (/app/packages/generator-vj/node_modules/yeoman-test/lib/run-context.js:121:29)
    at RunContext.<anonymous> (/app/packages/generator-vj/node_modules/yeoman-test/lib/run-context.js:59:10)
    at /app/packages/generator-vj/node_modules/lodash/lodash.js:5138:35
    at /app/packages/generator-vj/node_modules/yeoman-test/lib/index.js:93:5
    at next (/app/packages/generator-vj/node_modules/rimraf/rimraf.js:75:7)
    at CB (/app/packages/generator-vj/node_modules/rimraf/rimraf.js:111:9)
    at /app/packages/generator-vj/node_modules/rimraf/rimraf.js:137:14
    at FSReqWrap.oncomplete (fs.js:152:21)

We've fixed by locking our version down to ~1.8.0.

Test scaffolding out in alternative to the os temp folder

I am having problems running the tests in our CI environment because the OS tmp folder is locked down; after reading around I discovered the helpers.inDir method.

However, setting the parameter to a custom folder (which I can then exclude from distribution) doesn't do what I expected: the tmp folder the tests create now sits in the main project folder rather than the OS tmp folder, which is not what I was expecting.

Any idea what I am doing wrong here?

const tempFolderPath = `${path.join(__dirname, '../test_tmp/')}/${uuidv4()}/`;

helpers
    .run(path.join(__dirname, '../generators/app'))
    .inDir(tempFolderPath)
    .withOptions({ skipInstall: true })
    .withPrompts(defaultAnswers)
    .on('end', done);

I have also tried creating the temp folder myself and then cd-ing into it, but again it just creates the test generator structure in the project folder.

const tempFolderPath = `${path.join(__dirname, '../test_tmp/')}/${uuidv4()}/`;

helpers
    .run(path.join(__dirname, '../generators/app'))
    .inDir(tempFolderPath)
    .withOptions({ skipInstall: true })
    .withPrompts(defaultAnswers)
    .cd(tempFolderPath)
    .on('end', done);

Apologies in advance if this isn't a bug and is just my usage.

RunContext#withGenerators() ignore composeWith with a local property

When using composition with a local path (composeWith('gen:app', { local: 'local/path' })), it ignores/bypasses the generator mocks we set up on the RunContext with #withGenerators(), as it requires/registers the generator directly.

This should probably be fixed; I'm unsure about the correct solution ATM.

A potential solution I've come to think about is to mock the module require (with for example proxyquire).

(Moved from yeoman/generator#704)

While testing, how to pass prompt values to a dependency generator called using composeWith?

I am in the process of creating a MySQL-based generator and its sub-generators. The generators work as follows.

When the app generator is run, it prompts questions to collect values like "host", "username" and "password". With these answers the generator initiates a connection to the MySQL server and awaits the response. When the response is received, the user should be prompted to choose the schema he wants to work with. To achieve this, I have another sub-generator which gets the connection details via the composeWith options parameter, makes the MySQL connection, and shows the schemas available from the MySQL server as a list prompt. The same process is repeated for tables, columns and relationships.

I am able to write a test case that works based on the default values provided to the sub-generator's prompts, but I am not able to pass different values to the sub-generators as part of assertions. I went through the source of yeoman-test and yeoman-assert but could not find any method through which I can achieve this.

Example Code

app/index.js generator

  get install() {
    return {
      callSubgenerator() {
        this.composeWith(require.resolve('../entitycolumn/index.js'));
      },
      doInstallDependencies() {
        this.installDependencies();
      }
    };
  }

entitycolumn/index.js generator

  prompting() {
    console.log('running sub generator');

    // Have Yeoman greet the user.
    this.log(
      yosay('Welcome to the priceless ' + chalk.red('generator-jdl') + ' generator!')
    );

    //prompt for schema list should come here...
    const prompts = [
      {
        type: 'input',
        name: 'host',
        message: 'Provide the Hostname for the Mysql Server',
        default: 'localhost'
      },
      {
        type: 'input',
        name: 'username',
        message: 'Provide the Username for the Mysql Server',
        default: 'root'
      },
      {
        type: 'password',
        name: 'password',
        message: 'Provide the Password for the Mysql Server',
        default: 'root'
      }
    ];

    return this.prompt(prompts).then(props => {
      // To access props later use this.props.someAnswer;
      this.props = props;
    });
  }

tests/app.js jest test suite file

'use strict';
const path = require('path');
// Const fs = require('fs');
const assert = require('yeoman-assert');
const helpers = require('yeoman-test');

describe('generator-jdl:app', () => {
  beforeAll(() => {
    /* GLOBAL */
    jest.setTimeout(100000);
    console.log('for cred prompt', {
      host: '127.0.0.1',
      username: 'root',
      password: 'root'
    });

    var deps = [[helpers.createDummyGenerator(), '../entitycolumn/index.js']];
    // Var otherDeps = [
    //   [
    //     helpers.createGenerator(
    //       require.resolve('../generators/entitycolumn/index.js'),
    //       null,
    //       [],
    //       null
    //     ),
    //     '../entitycolumn/index.js'
    //   ]
    // ];
	
	// I need some mechanism like below to send the answers for prompts like 
	// 'withPrompts' method at line 39 in generator called with composeWith.
	// Below is not working.
    helpers.mockPrompt(deps[0][0], {});

    return (
      helpers
        .run(path.join(__dirname, '../generators/app'))
        .withGenerators(deps)
        .withArguments(['127.0.0.1', 'root', 'root'])
        .withPrompts({ schema: 'DBSCHEMA' })
        .inTmpDir(dir => {
          // `dir` is the path to the new temporary directory
          console.log(
            `${dir}/DBSCHEMA.jh \n=-=-=-=-=-=-=-=-=-=-=-=-=-=-= \n${path.join(
              __dirname,
              `dummyfile.jh`
            )}`
          );
        })
    );
  });

  it('creates files', () => {
    assert.file(['DBSCHEMA.jh']);    
  });
});

I hope the sample code above illustrates my need.

Regards,
Yogesh Surendran

inDir & cd create files in the project root path

In my case I want to share the RunContext tmp dir between two generators by setting it in the inDir or cd pipeline. It does work, but instead of generating files in the tmp dir it creates them in my project root dir. What did I do wrong?

  beforeAll(() => {
    return helpers.run(path.join(__dirname, '../generators/app'))
      .inDir(path.join(__dirname, './tmp'))
      .withOptions({'skip-install': true})
      .withPrompts({name: projectName});
  });

  beforeAll(() => {
    return helpers.run(path.join(__dirname, '../generators/module'))
      .inDir(path.join(__dirname, './tmp'))
      .withArguments([moduleName]);
  });

TypeError: Class constructor Generator cannot be invoked without 'new'

With this simple test:

it('generates a project', () => {
  return helpers
    .run(join(__dirname, '../generators/app'))
    .withPrompts({
      chooseApp: 'app-node-pl',
      patternType: '02-molecules',
      name: 'New Component Test',
    })
    .then(() => {
      console.log('Generator ran');
    });
});

Getting the following error:

TypeError: Class constructor Generator cannot be invoked without 'new'

With this stack:

      at new _class (new-component/generators/app/index.js:30:5)
      at Environment.instantiate (new-component/node_modules/yeoman-environment/lib/environment.js:470:12)
      at Environment.create (new-component/node_modules/yeoman-environment/lib/environment.js:448:17)
      at RunContext.Object.<anonymous>.RunContext._run (new-component/node_modules/yeoman-test/lib/run-context.js:91:29)
      at RunContext.<anonymous> (new-component/node_modules/yeoman-test/lib/run-context.js:59:10)
      at new-component/node_modules/lodash/lodash.js:5118:35
      at new-component/node_modules/yeoman-test/lib/index.js:93:5
      at next (new-component/node_modules/rimraf/rimraf.js:83:7)
      at CB (new-component/node_modules/rimraf/rimraf.js:119:9)
      at new-component/node_modules/rimraf/rimraf.js:145:14

With these deps:

  "dependencies": {
    "gulp-rename": "<=1.2.2",
    "lodash": "^4.17.15",
    "yeoman-generator": "^4.1.0",
    "yeoman-test": "^2.0.0"
  }

Is yeoman-test@^2.0.0 compatible with yeoman-generator@^4.1.0?

Build failing

I believe an update to yeoman-generator is breaking our thrown-error test.

This is kind of concerning because it might mean yeoman-generator is currently swallowing errors.

This is highly probable: with Promise in Node core, a lot of tools have been updated to use promises, and it's hard for the caller to know whether a callback is being run as a .then() callback. This is problematic because errors thrown in a then callback only reject the next promise, so they never actually throw.

My guess is that calling this.error() inside yeoman-generator during the run loop doesn't throw as we'd expect when no error listener is set.
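The promise-swallowing behaviour described above is easy to reproduce with plain promises (a sketch independent of yeoman-generator):

```javascript
// Sketch: an error thrown inside a .then() callback does not throw to the
// caller synchronously; it only rejects the next promise in the chain, so
// without a .catch() (or an 'error' listener) it goes unnoticed.
let observed = null;
const chain = Promise.resolve()
  .then(() => {
    throw new Error('boom'); // never reaches the caller as a sync throw
  })
  .catch((err) => {
    observed = err.message;  // the rejection is only visible here
    return observed;
  });
```

Without the .catch(), the error would surface only as an unhandled rejection, which is exactly how a test helper can appear to "swallow" a generator failure.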

Add flag to inDir() that stops the dir from being rimrafed

In order to test a sub-generator, we like to run our app generator and then run the sub-generator in the generated project. Currently we rely on a hack to stop the app-generated project from being wiped out. It would be great if there were a clean, supported way to do this, either with an alternative fluent function inExistingProjectDir() or with a flag to inDir().

--force option should be set by default

If it's not set, changing the content of an existing file will be very noisy and the trace is not really useful:

Uncaught TypeError: undefined is not a function
      at Conflicter.<anonymous> (node_modules/yeoman-generator/lib/util/conflicter.js:198:36)
      at PromptUI.onCompletion (node_modules/yeoman-test/node_modules/inquirer/lib/ui/prompt.js:57:10)
      at AnonymousObserver.Rx.AnonymousObserver.AnonymousObserver.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1550:12)
      at AnonymousObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at Subject.Rx.Subject.addProperties.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5871:19)
      at Subject.tryCatcher (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:63:31)
      at AutoDetachObserverPrototype.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5796:56)
      at AutoDetachObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at AutoDetachObserver.tryCatcher (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:63:31)
      at AutoDetachObserverPrototype.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5796:56)
      at AutoDetachObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at InnerObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:2966:65)
      at InnerObserver.tryCatcher (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:63:31)
      at AutoDetachObserverPrototype.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5796:56)
      at AutoDetachObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at AutoDetachObserver.tryCatcher (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:63:31)
      at AutoDetachObserverPrototype.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5796:56)
      at AutoDetachObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at InnerObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:2966:65)
      at InnerObserver.tryCatcher (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:63:31)
      at AutoDetachObserverPrototype.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5796:56)
      at AutoDetachObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at AutoDetachObserver.tryCatcher (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:63:31)
      at AutoDetachObserverPrototype.completed (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:5796:56)
      at AutoDetachObserver.Rx.internals.AbstractObserver.AbstractObserver.onCompleted (node_modules/yeoman-test/node_modules/inquirer/node_modules/rx-lite/rx.lite.js:1489:14)
      at node_modules/yeoman-test/node_modules/inquirer/lib/utils/utils.js:18:13
      at f (node_modules/yeoman-test/node_modules/inquirer/node_modules/run-async/node_modules/once/once.js:17:25)
      at node_modules/yeoman-test/node_modules/inquirer/lib/ui/prompt.js:86:7
      at Immediate._onImmediate (node_modules/yeoman-test/lib/adapter.js:39:5)

XO gives an error for new-cap on JSONFileContent

I'm currently using XO together with Yeoman and its test libraries; however, XO trips up on one of the asserts used by yeoman-test, `JSONFileContent`, due to the `new-cap` rule.

  test/gulp.js:22:10
  ✖  22:10  A function with a name starting with an uppercase letter should only be used as a constructor.  new-cap

  test/uploading/aws.js:22:10
  ✖  22:10  A function with a name starting with an uppercase letter should only be used as a constructor.  new-cap

  test/uploading/rsync.js:22:10
  ✖  22:10  A function with a name starting with an uppercase letter should only be used as a constructor.  new-cap

  3 errors

I'll refer to @sindresorhus's comment here.
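
One hedged workaround, assuming you configure XO through `package.json`: ESLint's `new-cap` rule accepts a `capIsNewExceptions` list, so the offending assert name can be allowlisted rather than disabling the rule entirely:

```json
{
  "xo": {
    "rules": {
      "new-cap": [
        "error",
        { "capIsNewExceptions": ["JSONFileContent"] }
      ]
    }
  }
}
```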

Remove needs for toPromise()

We could probably make the RunContext object a thenable so that the annoying `.toPromise()` call becomes unnecessary.

We'd just need to expose `then` and `catch` methods on RunContext that delegate to the promise returned by `toPromise()` internally.
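
A minimal sketch of the idea (the real RunContext has far more state; this only shows the thenable delegation, with a stand-in `toPromise()`):

```javascript
// Sketch: make RunContext awaitable by forwarding then/catch to the
// promise that toPromise() already produces. Any object with a then()
// method is a "thenable" that await and Promise.resolve() will unwrap.
class RunContext {
  toPromise() {
    return Promise.resolve('done'); // stand-in for the real implementation
  }

  then(onFulfilled, onRejected) {
    return this.toPromise().then(onFulfilled, onRejected);
  }

  catch(onRejected) {
    return this.toPromise().catch(onRejected);
  }
}

// .then() and await now work directly on the context:
new RunContext().then((value) => console.log(value)); // → done
```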

change TestAdapter to utilize the AbstractAutomationAdapter from the environment package

Need to wait for yeoman/environment#102 to be published.
TestAdapter can be rewritten like so:

'use strict';

const sinon = require('sinon');
const AbstractAutomationAdapter = require('yeoman-environment/automation/abstract-automation-adapter');

// Spy on diff requests instead of rendering an actual diff
const diffHandler = sinon.spy();

// Produce sinon-backed loggers so tests can assert on log calls
const logFactory = {
  createLog() {
    return sinon.spy();
  },
  createLogMethod(log) {
    return function () {
      return sinon.stub().returns(log);
    };
  }
};

class TestAdapter extends AbstractAutomationAdapter {
  constructor(answers) {
    super(diffHandler, answers, logFactory);
  }
}

module.exports = TestAdapter;

Test running commands in test directory

I'm writing a generator which provides certain commands as part of its creation. For example, npm run test or grunt test or whatever. I'd like to be able to verify that they will run properly in my generator's test suite. A naive attempt didn't work:

var exec = require('child_process').exec;
[snip]
    it("provides 'npm run test'", function (done) {
      exec('npm run test', function (err, stdout, stderr) {
        if (err) {
          assert.ok(false, stderr);
        } else {
          assert.ok(true, stdout);
        }
        done();
      });
    });

I get the following in err and nothing in stdout or stderr:

{ [Error: Command failed: /bin/sh -c npm run test
]
  killed: false,
  code: 1,
  signal: null,
  cmd: '/bin/sh -c npm run test' }

It's not clear what my next step would be.
