all-repos's Introduction

all-repos

Clone all your repositories and apply sweeping changes.

Installation

pip install all-repos

CLI

All command line interfaces provided by all-repos support the following options:

  • -h / --help: show usage information
  • -C CONFIG_FILENAME / --config-filename CONFIG_FILENAME: use a non-default config file (the default all-repos.json can be changed with the environment variable ALL_REPOS_CONFIG_FILENAME).
  • --color {auto,always,never}: use color in output (default auto).

all-repos-complete [options]

Add git clone tab completion for all-repos repositories.

Requires jq to function.

Add to .bash_profile:

eval "$(all-repos-complete -C ~/.../all-repos.json --bash)"

all-repos-clone [options]

Clone all the repositories into the output_dir. If run again, this command will update existing repositories.

Options:

  • -j JOBS / --jobs JOBS: how many concurrent jobs will be used to complete the operation. Specify 0 or -1 to match the number of cpus. (default 8).

Sample invocations:

  • all-repos-clone: clone the repositories specified in all-repos.json
  • all-repos-clone -C all-repos2.json: clone using a non-default config filename.

all-repos-find-files [options] PATTERN

Similar to a distributed git ls-files | grep -P PATTERN.

Arguments:

  • PATTERN: a python regex matched against the filenames tracked in each repository.
Options:

  • --repos-with-matches: only print repositories with matches.

Sample invocations:

  • all-repos-find-files setup.py: find all setup.py files.
  • all-repos-find-files --repos setup.py: find all repositories containing a setup.py.

all-repos-grep [options] [GIT_GREP_OPTIONS]

Similar to a distributed git grep ....

Options:

  • --repos-with-matches: only print repositories with matches.
  • GIT_GREP_OPTIONS: additional arguments will be passed on to git grep. See git grep --help for available options.

Sample invocations:

  • all-repos-grep pre-commit -- 'requirements*.txt': find all repositories which have pre-commit listed in a requirements file.
  • all-repos-grep -L six -- setup.py: find setup.py files which do not contain six.

all-repos-list-repos [options]

List all cloned repository names.

all-repos-manual [options]

Interactively apply a manual change across repos.

note: all-repos-manual will always run in --interactive autofixing mode.

note: all-repos-manual requires the --repos autofixer option.

Options:

  • autofix options: all-repos-manual is an autofixer and supports all of the autofixer options.
  • --branch-name BRANCH_NAME: override the autofixer branch name (default all-repos-manual).
  • --commit-msg COMMIT_MSG (required): set the autofixer commit message.

all-repos-sed [options] EXPRESSION FILENAMES

Similar to a distributed git ls-files -z -- FILENAMES | xargs -0 sed -i EXPRESSION.

note: this assumes GNU sed. If you're on macOS, install gnu-sed with Homebrew:

brew install gnu-sed

# Add to .bashrc / .zshrc
export PATH="$(brew --prefix)/opt/gnu-sed/libexec/gnubin:$PATH"

Arguments:

  • EXPRESSION: sed program. For example: s/hi/hello/g.
  • FILENAMES: filenames glob (passed to git ls-files).

Options:

  • autofix options: all-repos-sed is an autofixer and supports all of the autofixer options.
  • -r / --regexp-extended: use extended regular expressions in the script. See man sed for further details.
  • --branch-name BRANCH_NAME: override the autofixer branch name (default all-repos-sed).
  • --commit-msg COMMIT_MSG: override the autofixer commit message (default git ls-files -z -- FILENAMES | xargs -0 sed -i ... EXPRESSION).

Sample invocations:

  • all-repos-sed 's/foo/bar/g' -- '*': replace foo with bar in all files.

Configuring

A configuration file looks roughly like this:

{
    "output_dir": "output",
    "source": "all_repos.source.github",
    "source_settings":  {
        "api_key": "...",
        "username": "asottile"
    },
    "push": "all_repos.push.github_pull_request",
    "push_settings": {
        "api_key": "...",
        "username": "asottile"
    }
}
  • output_dir: where repositories will be cloned to when all-repos-clone is run.
  • source: the module import path to a source, see below for builtin source modules as well as directions for writing your own.
  • source_settings: the source-type-specific settings, the source module's documentation will explain the various possible values.
  • push: the module import path to a push, see below for builtin push modules as well as directions for writing your own.
  • push_settings: the push-type-specific settings, the push module's documentation will explain the various possible values.
  • include (default ""): python regex for selecting repositories. Only repository names which match this regex will be included.
  • exclude (default "^$"): python regex for excluding repositories. Repository names which match this regex will be excluded (see the filtering sketch after this list).
  • all_branches (default false): whether to clone all of the branches or just the default upstream branch.
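To illustrate how include and exclude combine, here is a minimal sketch of the filtering, under the assumption (suggested by the defaults above) that both regexes are applied to repository names with re.search:

import re

def filter_repos(repos, include='', exclude='^$'):
    # mirrors the defaults above: an empty include matches every name,
    # and '^$' excludes nothing
    include_re = re.compile(include)
    exclude_re = re.compile(exclude)
    return {
        name: url
        for name, url in repos.items()
        if include_re.search(name) and not exclude_re.search(name)
    }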

Source modules

all_repos.source.json_file

Clones all repositories listed in a file. The file must be formatted as follows:

{
    "example/repo1": "https://git.example.com/example/repo1",
    "repo2": "https://git.example.com/repo2"
}

Required source_settings

  • filename: path to a JSON file mapping repository names to clone URLs, as in the example above.

Directory location

output/
+--- repos.json
+--- repos_filtered.json
+--- {repo_key1}/
+--- {repo_key2}/
+--- {repo_key3}/

all_repos.source.github

Clones all repositories available to a user on github.

Required source_settings

  • api_key: the api key which the user will log in as.
    • Use the settings tab to create a personal access token.
    • The minimum scope required to function is public_repo, though you'll need repo to access private repositories.
  • api_key_env: alternatively, the name of an environment variable from which to read the API key.
  • username: the github username you will log in as.

Optional source_settings

  • collaborator (default false): whether to include repositories which are not owned but can be contributed to as a collaborator.
  • forks (default false): whether to include repositories which are forks.
  • private (default false): whether to include private repositories.
  • archived (default: false): whether to include archived repositories.
  • base_url (default: https://api.github.com) is the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).

Directory location

output/
+--- repos.json
+--- repos_filtered.json
+--- {username1}/
    +--- {repo1}/
    +--- {repo2}/
+--- {username2}/
    +--- {repo3}/

all_repos.source.github_forks

Clones all repositories forked from a repository on github.

Required source_settings

  • api_key: the api key which the user will log in as.
    • Use the settings tab to create a personal access token.
    • The minimum scope required to function is public_repo.
  • api_key_env: alternatively, the name of an environment variable from which to read the API key.
  • repo: the repo whose forks will be cloned.

Optional source_settings

  • collaborator (default true): whether to include repositories which are not owned but can be contributed to as a collaborator.
  • forks (default true): whether to include repositories which are forks.
  • private (default false): whether to include private repositories.
  • archived (default: false): whether to include archived repositories.
  • base_url (default: https://api.github.com) is the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).

Directory location

See the directory structure for all_repos.source.github.

all_repos.source.github_org

Clones all repositories from an organization on github.

Required source_settings

  • api_key: the api key which the user will log in as.
    • Use the settings tab to create a personal access token.
    • The minimum scope required to function is public_repo, though you'll need repo to access private repositories.
  • api_key_env: alternatively, the name of an environment variable from which to read the API key.
  • org: the organization to clone from.

Optional source_settings

  • collaborator (default true): whether to include repositories which are not owned but can be contributed to as a collaborator.
  • forks (default false): whether to include repositories which are forks.
  • private (default false): whether to include private repositories.
  • archived (default: false): whether to include archived repositories.
  • base_url (default: https://api.github.com) is the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).

Directory location

See the directory structure for all_repos.source.github.

all_repos.source.gitolite

Clones all repositories available to a user on a gitolite host.

Required source_settings

  • username: the user to SSH to the server as (usually git)
  • hostname: the hostname of your gitolite server (e.g. git.mycompany.com)

The gitolite API is served over SSH. It is assumed that when all-repos-clone is called, it's possible to make SSH connections with the username and hostname configured here in order to query that API.

Optional source_settings

  • mirror_path (default None): an optional mirror to clone repositories from. This is a Python format string, and can use the variable repo_name.

    This can be anything git understands, such as another remote server (e.g. gitmirror.mycompany.com:{repo_name}) or a local path (e.g. /gitolite/git/{repo_name}.git).

Directory location

output/
+--- repos.json
+--- repos_filtered.json
+--- {repo_name1}.git/
+--- {repo_name2}.git/
+--- {repo_name3}.git/

all_repos.source.bitbucket

Clones all repositories available to a user on Bitbucket Cloud.

Required source_settings

  • username: the Bitbucket username you will log in as.
  • app_password: an app password the above user will authenticate with
    • Create an application password within your account settings.
    • The required scope is Repositories -> Read.

all_repos.source.bitbucket_server

Clones all repositories available to a user on Bitbucket Server.

Required source_settings

  • base_url: the bitbucket server URL (e.g. bitbucket.domain.com)
  • username: the Bitbucket username you will log in as.
  • app_password: an app password the above user will authenticate with
    • Create an application password within your account settings.
    • The required scope is Repositories -> Read.

Optional source_settings

  • project (default None): an optional project to restrict the search for repositories.

Directory location

output/
+--- repos.json
+--- repos_filtered.json
+--- {username1}/
    +--- {repo1}/
    +--- {repo2}/
+--- {username2}/
    +--- {repo3}/

all_repos.source.gitlab_org

Clones all repositories from an organization on gitlab.

Required source_settings

  • api_key: the api key which the user will log in as.
    • Use the settings tab (e.g. https://{gitlab.domain.com}/-/profile/personal_access_tokens) to create a personal access token.
    • The required scopes are read_api and read_repository.
  • api_key_env: alternatively, the name of an environment variable from which to read the API key.
  • org: the organization to clone from.

Optional source_settings

  • base_url: (default https://gitlab.com/api/v4) the gitlab server URL
  • archived (default: false): whether to include archived repositories.

Directory location

output/
+--- repos.json
+--- repos_filtered.json
+--- {org}/
    +--- {subgroup1}/
        +--- {subgroup2}/
            +--- {repo1}/
        +--- {repo2}/
    +--- {repo3}/
    +--- {repo4}/

Writing your own source

First create a module. This module must have the following API:

A Settings class

This class will receive keyword arguments for all values in the source_settings dictionary.

An easy way to implement the Settings class is by using a namedtuple:

import collections

Settings = collections.namedtuple('Settings', ('required_thing', 'optional'))
Settings.__new__.__defaults__ = ('optional default value',)

In this example, the required_thing setting is a required setting whereas optional may be omitted (and will get a default value of 'optional default value').

def list_repos(settings: Settings) -> Dict[str, str]: callable

This callable will be passed an instance of your Settings class. It must return a mapping from {repo_name: repository_url}. The repo_name is the directory name inside the output_dir.
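Putting the two pieces together, a minimal hypothetical source module might look like the sketch below; the one-name=url-per-line file format is invented purely for illustration:

import collections
from typing import Dict

Settings = collections.namedtuple('Settings', ('filename',))

def list_repos(settings: Settings) -> Dict[str, str]:
    # hypothetical format: one `name=url` pair per line
    repos = {}
    with open(settings.filename) as f:
        for line in f:
            name, _, url = line.strip().partition('=')
            repos[name] = url
    return repos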

Push modules

all_repos.push.merge_to_master

Merges the branch directly to the default branch and pushes. The commands it runs look roughly like this:

git checkout main
git pull
git merge --no-ff $BRANCH
git push origin HEAD

Optional push_settings

  • fast_forward (default: false): if true, perform a fast-forward merge (--ff-only). If false, create a merge commit (--no-ff).

all_repos.push.github_pull_request

Pushes the branch to origin and then creates a github pull request for the branch.

Required push_settings

  • api_key: the api key which the user will log in as.
    • Use the settings tab to create a personal access token.
    • The minimum scope required to function is public_repo, though you'll need repo to access private repositories.
  • api_key_env: alternatively API key can also be passed via an environment variable
  • username: the github username you will log in as.

Optional push_settings

  • fork (default: false): (if applicable) a fork will be created and pushed to instead of the upstream repository. The pull request will then be made to the upstream repository.
  • base_url (default: https://api.github.com) is the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).

all_repos.push.bitbucket_server_pull_request

Pushes the branch to origin and then creates a Bitbucket pull request for the branch.

Required push_settings

  • base_url: the Bitbucket server URL (e.g. bitbucket.domain.com)
  • username: the Bitbucket username you will log in as.
  • app_password: an app password the above user will authenticate with
    • Create an application password within your account settings.
    • The required scope is Repositories -> Read.

all_repos.push.gitlab_pull_request

Pushes the branch to origin and then creates a GitLab pull request for the branch.

Required push_settings

  • base_url: the GitLab server URL (e.g. https://{gitlab.domain.com}/api/v4)
  • api_key: the api key which the user will log in as.
    • Use the settings tab (e.g. https://{gitlab.domain.com}/-/profile/personal_access_tokens) to create a personal access token.
    • The required scope is write_repository.
  • api_key_env: alternatively, the name of an environment variable from which to read the API key.

all_repos.push.readonly

Does nothing.

push_settings

There are no configurable settings for readonly.

Writing your own push module

First create a module. This module must have the following API:

A Settings class

This class will receive keyword arguments for all values in the push_settings dictionary.

def push(settings: Settings, branch_name: str) -> None:

This callable will be passed an instance of your Settings class. It should deploy the branch. The function will be called with the root of the repository as the cwd.
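As a sketch, a minimal hypothetical push module that simply pushes the branch to a configurable remote (the remote setting is an invention for illustration) could look like this:

import collections
import subprocess

Settings = collections.namedtuple('Settings', ('remote',))
Settings.__new__.__defaults__ = ('origin',)

def push(settings: Settings, branch_name: str) -> None:
    # cwd is the root of the repository when this is called
    subprocess.run(
        ('git', 'push', settings.remote, f'HEAD:{branch_name}'),
        check=True,
    )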

Writing an autofixer

An autofixer applies a change over all repositories.

all-repos provides several api functions to write your autofixers with:

all_repos.autofix_lib.add_fixer_args

def add_fixer_args(parser):

Adds the autofixer CLI options.

Options:

  • --dry-run: show what would happen but do not push.
  • -i / --interactive: interactively approve / deny fixes.
  • -j JOBS / --jobs JOBS: how many concurrent jobs will be used to complete the operation. Specify 0 or -1 to match the number of cpus. (default 1).
  • --limit LIMIT: maximum number of repos to process (default: unlimited).
  • --author AUTHOR: override commit author. This is passed directly to git commit. An example: --author='Herp Derp <herp.derp@example.com>'.
  • --repos [REPOS [REPOS ...]]: run against specific repositories instead. This is especially useful with xargs autofixer ... --repos. This can be used to specify repositories which are not managed by all-repos.

all_repos.autofix_lib.from_cli

def from_cli(args, *, find_repos, msg, branch_name):

Parse cli arguments and produce autofix_lib primitives. Returns (repos, config, commit, autofix_settings). This is handled separately from fix to allow for fixers to adjust arguments.

  • find_repos: callback taking Config as a positional argument.
  • msg: commit message.
  • branch_name: identifier used to construct the branch name.

all_repos.autofix_lib.fix

def fix(
        repos, *,
        apply_fix,
        check_fix=_noop_check_fix,
        config: Config,
        commit: Commit,
        autofix_settings: AutofixSettings,
):

Apply the fix.

  • apply_fix: callback which will be called once per repository. The cwd when the function is called will be the root of the repository.
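For example, an apply_fix callback that rewrites a file in the current repository might look like this sketch (the filename and the substitution are hypothetical):

def apply_fix():
    # cwd is the repository root when this runs
    with open('.travis.yml') as f:
        contents = f.read()
    with open('.travis.yml', 'w') as f:
        f.write(contents.replace('sudo: true', 'sudo: false'))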

all_repos.autofix_lib.run

def run(*cmd, **kwargs):

Wrapper around subprocess.run which prints the command it will run. Unlike subprocess.run, this defaults to check=True unless explicitly disabled.
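A short usage sketch (the commands are arbitrary; check=False shows how to opt out of the default):

from all_repos import autofix_lib

autofix_lib.run('git', 'ls-files')  # raises CalledProcessError on failure
autofix_lib.run('git', 'grep', '-q', 'TODO', check=False)  # never raises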

Example autofixer

The trivial autofixer is as follows:

import argparse

from all_repos import autofix_lib

def find_repos(config):
    return []

def apply_fix():
    pass

def main(argv=None):
    parser = argparse.ArgumentParser()
    autofix_lib.add_fixer_args(parser)
    args = parser.parse_args(argv)

    repos, config, commit, autofix_settings = autofix_lib.from_cli(
        args, find_repos=find_repos, msg='msg', branch_name='branch-name',
    )
    autofix_lib.fix(
        repos, apply_fix=apply_fix, config=config, commit=commit,
        autofix_settings=autofix_settings,
    )

if __name__ == '__main__':
    raise SystemExit(main())

You can find some more involved examples in all_repos/autofix:

  • all_repos.autofix.azure_pipelines_autoupdate: upgrade pinned azure pipelines template repository references.
  • all_repos.autofix.pre_commit_autoupdate: runs pre-commit autoupdate.
  • all_repos.autofix.pre_commit_autopep8_migrate: migrates autopep8-wrapper from pre-commit/pre-commit-hooks to mirrors-autopep8.
  • all_repos.autofix.pre_commit_cache_dir: updates the cache directory for travis-ci / appveyor for pre-commit 1.x.
  • all_repos.autofix.pre_commit_flake8_migrate: migrates flake8 from pre-commit/pre-commit-hooks to pycqa/flake8.
  • all_repos.autofix.pre_commit_migrate_config: runs pre-commit migrate-config.
  • all_repos.autofix.setup_py_upgrade: runs setup-py-upgrade and then setup-cfg-fmt to migrate setup.py to setup.cfg.

all-repos's Issues

gitolite source

Hi friend :)

I'd like to add a gitolite source and wanted to run the approach by you before sending a PR. A couple questions:

  1. For getting the list of repositories, seems like the only way provided via the gitolite API itself is the info command. It provides a handy -json flag, e.g.:

    $ ssh git@somehost info -json
    {
       "repos" : {
          "a_repo" : {
             "perms" : {
                "W" : 1,
                "R" : 1
             }
          },
          "some_repo-DEPRECATED" : {
             "perms" : {
                # "W" key is omitted entirely
                "R" : 1
             }
          }
       },
       "gitolite_version" : "1.2.3",
       "USER" : "git@somehost",
       "GL_USER" : "ckuehl",
       "git_version" : "1.2.3"
    }
    

    Potentially we could provide other ways to get this list (e.g. something like y/gits), but I'm thinking if I wanted to do that, I'd just format that into a JSON file and use the JSON file source.

    So, does over SSH sound okay? And if so, do you prefer subprocessing out to ssh or using paramiko? (I think I prefer the subprocess approach just to avoid adding a new dep?)

  2. Actually I guess I only had one question. This looks pretty straightforward to implement :P
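For reference, the subprocess approach discussed above could look roughly like this sketch (the list_repos name and SSH target are illustrative, not the eventual implementation):

import json
import subprocess

def list_repos(username, hostname):
    # gitolite answers `info -json` over SSH with the structure shown above
    out = subprocess.check_output(('ssh', f'{username}@{hostname}', 'info', '-json'))
    return sorted(json.loads(out)['repos'])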

Improve querying speed with concurrency

Currently we process grep results serially to produce nice error messages on command failure.

Change that to:

  • run the first repository and report errors on that
  • run the rest in parallel

This should improve querying speed dramatically. Expose a --jobs such that it can be turned off.
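A minimal sketch of that scheme, assuming a hypothetical run_repo(repo) helper that runs the grep and reports errors:

import concurrent.futures

def run_all(repos, jobs=8):
    if not repos:
        return
    # run_repo is assumed to be defined elsewhere
    run_repo(repos[0])  # serial first run: failures get a clean error message
    with concurrent.futures.ThreadPoolExecutor(max_workers=jobs) as ex:
        for _ in ex.map(run_repo, repos[1:]):
            pass  # consuming results re-raises any worker exception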

`all-repos-list-repos`: add `--output-paths`

The current output is a little clunky when trying to chain with xargs:

$ all-repos-list-repos | tail -5
Yelp/docker-push-latest-if-changed
Yelp/requirements-tools
Yelp/venv-update
Yelp/wsgi-mod-rpaf
Yelp/yelp_cheetah

For instance:

$ all-repos-list-repos | tail -5 | xargs --replace bash -c 'echo -n {}" " && git -C repos-asottile/{} rev-parse HEAD'
Yelp/docker-push-latest-if-changed 1d481bbbb30f1aa164bd965ba33ef61cbb11981c
Yelp/requirements-tools f94e0eb774bb4e37a58e98d5adbbe65a5520b8ef
Yelp/venv-update 02860446c983f8df10db221d66e22a7a912a550c
Yelp/wsgi-mod-rpaf d28a42fff8c77e5480bd6c166e4e53c2d308e81b
Yelp/yelp_cheetah 61ff00fd8944f19bc81b3f03801c322fc339ffeb

It would be nicer if there were this output:

$ all-repos-list-repos --output-paths | tail -5
repos-asottile/Yelp/docker-push-latest-if-changed
repos-asottile/Yelp/requirements-tools
repos-asottile/Yelp/venv-update
repos-asottile/Yelp/wsgi-mod-rpaf
repos-asottile/Yelp/yelp_cheetah

Thoughts on the current documentation

While all-repos is now a tool I do not want to miss anymore, >>I<< had a bit of a hard time wrapping my head around it at first.

From our Twitter conversation I sensed that from your perspective the documentation is sufficient.

These are some suggestions from my side, which would have helped me a lot when I started using all-repos.

At least it is no longer an "I missed something in your documentation", but some concrete things :-)

If you find something useful, I'd volunteer to create a pull request. If you do not like the suggestions, I am fine when you close the issue.

a) While the tagline (Clone all your repositories and apply sweeping changes.) is concise and to the point, I would add another paragraph to give a high-level overview of what the user can expect.

e.g. something like this

all-repos really shines when you have to manage a lot of git repositories. It offers command line tools to clone all of your repositories at once, supports you in finding files or text within those repositories, and, as a highlight, even lets you make changes across all repositories and automatically creates pull requests for you.

b) While you stated that a step by step guide does not make a lot of sense as everyone's setup is different, at least there are some common steps:

  • pip install all-repos
  • create a configuration
  • clone some repositories
  • act on them
    • read only
    • with modifications, possibly creating pull requests automatically

The current documentation jumps from the installation directly to the CLI reference.

I would possibly unite the first three steps as "preparations" or "setup", and then lead into the "actions".

When introducing the two kinds of actions (read only vs. with modification), I would mention that for the latter to really make sense you need to configure a push module, and explicitly mention that when executing an autofixer, branches, commits and pull requests get created automatically.

c) Speaking of the configuration, while the error messages are obvious, it would not hurt to add that the configuration file

  • should not be readable for groups / others (chmod 0600)
  • should be called all-repos.json so the all-repos commands automatically pick it up (you mention it elsewhere, but not in the configuring section)

d) Loosely following the documentation categories of https://www.writethedocs.org/videos/eu/2017/the-four-kinds-of-documentation-and-why-you-need-to-understand-what-they-are-daniele-procida/

  • the current documentation is a reference
  • my above suggestions correspond to a "tutorial" or an introduction
  • then maybe adding one or two use-cases would be great?

A use-case could be (from my past experience):

P.S.: I'll update the blog post in a second - sorry for the non-constructive criticism

How could I select github repos by topic?

In my org, we make heavy use of 'topics' to categorize repositories. These categories are not always encoded in the repo name (e.g., the language or platform might be added as a topic, or a topic might indicate that a repo is part of the main service versus a maintenance tool, etc.).

How might I select certain repos, to be able to operate on all repos with a specific topic, etc? Could the include/exclude regexes be used here? The README makes it sound like the regex is only applied to the repo name (but maybe I misunderstood)?

Or maybe a custom source is the way to go? Do you have any examples of this?

Re: #202

test_github_pull_request test broken with git 2.11.0

Mostly just for documentation:

============================================ FAILURES =============================================
____________________________________ test_github_pull_request _____________________________________

mock_requests = auto_namedtuple(get=<MagicMock name='get' id='140440998950448'>, post=<MagicMock name='post' id='140440998548144'>)
fake_github_repo = auto_namedtuple(src=local('/tmp/pytest-of-ckuehl/pytest-72/test_github_pull_request0/repo:user/slug'), dest=local('/tmp/pytest-of-ckuehl/pytest-72/test_github_pull_request0/dest'), settings=Settings(api_key='fake', username='user'))

>   ???

tests/push/github_pull_request_test.py:33:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
all_repos/push/github_pull_request.py:15: in push
    autofix_lib.run('git', 'push', 'origin', f'HEAD:{branch_name}', '--quiet')
all_repos/autofix_lib.py:101: in run
    return subprocess.run(cmd, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, timeout = None, check = True
popenargs = (('git', 'push', 'origin', 'HEAD:feature', '--quiet'),), kwargs = {}
process = <subprocess.Popen object at 0x7fbaf7dabf28>, stdout = None, stderr = None, retcode = 1

    def run(*popenargs, input=None, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.

        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.

        There is an optional argument "input", allowing you to
        pass a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.

        The other arguments are the same as for the Popen constructor.

        If universal_newlines=True is passed, the "input" argument must be a
        string and stdout/stderr in the returned object will be strings rather than
        bytes.
        """
        if input is not None:
            if 'stdin' in kwargs:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired:
                process.kill()
                stdout, stderr = process.communicate()
                raise TimeoutExpired(process.args, timeout, output=stdout,
                                     stderr=stderr)
            except:
                process.kill()
                process.wait()
                raise
            retcode = process.poll()
            if check and retcode:
                raise CalledProcessError(retcode, process.args,
>                                        output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '('git', 'push', 'origin', 'HEAD:feature', '--quiet')' returned non-zero exit status 1.

/usr/lib/python3.6/subprocess.py:418: CalledProcessError
======================================= 90 tests deselected =======================================
============================ 1 failed, 90 deselected in 850.10 seconds ============================

This is caused by: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=863423

Workaround: install a newer git from stretch-backports or some other source.

remote branches

After running the all-repos-clone command, I am not able to checkout any remote branches for the cloned repositories.

I have tried running git fetch --all though that does not do anything.

(discussed briefly on stream)

Support "two-phase autofix"

Currently, the autofix workflow requires that autofixing, merging, and pushing happen all in one step. This makes changes hard to reverse after everything is pushed.

It would be great if autofix happened in two phases (with separate commands): fixing then pushing.

  1. Run an autofixer that commits and merges to master.
  2. Manually verify that everything looks peachy.
  3. Run a command to push all the repos (all-repos-push?).

all-repos-clone doesn't respect default branch

I tried running a --dry-run of https://github.com/asottile/all-repos-incantations#travis-remove-sudo, but it's not being applied to all cloned repos.

I'm trying to apply it to all repos in https://github.com/marshmallow-code.

I'm using the following config

{
    "output_dir": ".",
    "source": "all_repos.source.github_org",
    "source_settings":  {
        "api_key": "snip",
        "org": "marshmallow-code"
    },
    "push": "all_repos.push.github_pull_request",
    "push_settings": {
        "api_key": "snip",
        "username": "sloria"
    }
}

But it's only being applied to three repos (that I didn't create): apispec-webframeworks, django-rest-marshmallow, marshmallow-oneofschema.

I've verified that all the org's repos are in repos.json and repos_filtered.json

marshmallow-code ❯ cat repos.json | jq
{
  "marshmallow-code/marshmallow": "[email protected]:marshmallow-code/marshmallow",
  "marshmallow-code/webargs": "[email protected]:marshmallow-code/webargs",
  "marshmallow-code/flask-marshmallow": "[email protected]:marshmallow-code/flask-marshmallow",
  "marshmallow-code/apispec": "[email protected]:marshmallow-code/apispec",
  "marshmallow-code/marshmallow-sqlalchemy": "[email protected]:marshmallow-code/marshmallow-sqlalchemy",
  "marshmallow-code/marshmallow-validators": "[email protected]:marshmallow-code/marshmallow-validators",
  "marshmallow-code/django-rest-marshmallow": "[email protected]:marshmallow-code/django-rest-marshmallow",
  "marshmallow-code/marshmallow-jsonapi": "[email protected]:marshmallow-code/marshmallow-jsonapi",
  "marshmallow-code/marshmallow-oneofschema": "[email protected]:marshmallow-code/marshmallow-oneofschema",
  "marshmallow-code/marshmallow-select": "[email protected]:marshmallow-code/marshmallow-select",
  "marshmallow-code/apispec-webframeworks": "[email protected]:marshmallow-code/apispec-webframeworks"
}

~/projects/marshmallow-code
marshmallow-code ❯ cat repos_filtered.json | jq
{
  "marshmallow-code/marshmallow": "[email protected]:marshmallow-code/marshmallow",
  "marshmallow-code/webargs": "[email protected]:marshmallow-code/webargs",
  "marshmallow-code/flask-marshmallow": "[email protected]:marshmallow-code/flask-marshmallow",
  "marshmallow-code/apispec": "[email protected]:marshmallow-code/apispec",
  "marshmallow-code/marshmallow-sqlalchemy": "[email protected]:marshmallow-code/marshmallow-sqlalchemy",
  "marshmallow-code/marshmallow-validators": "[email protected]:marshmallow-code/marshmallow-validators",
  "marshmallow-code/django-rest-marshmallow": "[email protected]:marshmallow-code/django-rest-marshmallow",
  "marshmallow-code/marshmallow-jsonapi": "[email protected]:marshmallow-code/marshmallow-jsonapi",
  "marshmallow-code/marshmallow-oneofschema": "[email protected]:marshmallow-code/marshmallow-oneofschema",
  "marshmallow-code/marshmallow-select": "[email protected]:marshmallow-code/marshmallow-select",
  "marshmallow-code/apispec-webframeworks": "[email protected]:marshmallow-code/apispec-webframeworks"
}

Any idea why it's not finding all the repos?

Pass -e to all-repos-sed

I tried out

all-repos-sed '/sudo:/d' .travis.yml \
    --commit-msg "$(echo -e 'remove sudo: in .travis.yml\n\nhttps://blog.travis-ci.com/2018-11-19-required-linux-infrastructure-migration')"

from https://github.com/asottile/all-repos-incantations#travis-remove-sudo

But got a failure:

sed: 1: ".travis.yml": invalid command code .
***Errored
Traceback (most recent call last):
  File "/Users/sloria/.pyenv/versions/3.7.0/envs/marshmallow-code/lib/python3.7/site-packages/all_repos/autofix_lib.py", line 162, in repo_context
    yield
  File "/Users/sloria/.pyenv/versions/3.7.0/envs/marshmallow-code/lib/python3.7/site-packages/all_repos/autofix_lib.py", line 221, in _fix_inner
    apply_fix()
  File "/Users/sloria/.pyenv/versions/3.7.0/envs/marshmallow-code/lib/python3.7/site-packages/all_repos/sed.py", line 40, in apply_fix
    autofix_lib.run(*sed_cmd, *filenames)
  File "/Users/sloria/.pyenv/versions/3.7.0/envs/marshmallow-code/lib/python3.7/site-packages/all_repos/autofix_lib.py", line 116, in run
    return subprocess.run(cmd, **kwargs)
  File "/Users/sloria/.pyenv/versions/3.7.0/lib/python3.7/subprocess.py", line 468, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '('sed', '-i', '/sudo:/d', '.travis.yml')' returned non-zero exit status 1.

This is because my .travis.yml files have multiple occurrences of sudo:. Therefore, the -e option to sed is necessary.

Is there a way to pass additional options to sed? @asottile

make clone removals interactive

while messing around with recreating a gitlab_org source i put my all-repos checkout into the target dir for an all-repos clone; it was fabulously destroyed

Allow setting reviewers when using github_pull_request

I would like to be able to set the reviewers for PRs created through all-repos.

all-repos-sed ... --reviewers mycolleague

I was going to implement this as a custom push module, but currently push modules can only receive Settings and branch_name.

One approach would be to implement push strategies as classes with a hook for adding additional parser arguments. Something like:

class GitHubPullRequest(BasePushStrategy):
    def add_arguments(self, parser: ParserType):
        parser.add_argument("--reviewers", nargs="*")

    # or "def handle", to match Django's management commands
    def __call__(self, settings: Settings, branch_name: str, parsed_args: Any) -> None:
        # ...
        resp = github_api.req(
            f"{settings.base_url}/repos/{repo_slug}/pulls",
            data=data,
            headers=headers,
            method="POST",
        )
        url = resp.json["url"]
        html_url = resp.json["html_url"]
        print(f"Pull request created at {html_url}")
        reviewers = parsed_args.reviewers
        if reviewers:
            resp = github_api.req(
                f"{url}/requested_reviewers",
                data=json.dumps({"reviewers": reviewers}).encode(),
                headers=headers,
                method="POST",
            )
            print(f"Requested review from {reviewers}")

The "push" config in all-repos.json would point to these classes instead of modules.

{
  "push": "all_repos.push.GitHubPullRequest"
}

Open questions:

  1. Is the above interface a good idea?
    1a. If so, is it OK to implement it as a breaking change?
  2. Should the "reviewer" functionality be included in all-repos core?

I don't have a strong feeling on 2. As long as there's a way to do it, I'm fine keeping it as custom code outside of core to start. Up to you, @asottile

Behavior when check_fix fails?

Currently when an autofixer's check_fix function exits with an exception in --interactive mode, the command exits without a way to fix issues. Is there a way to fix the issues found by check_fix before proceeding?

Support interactive cloning

Hi, could you imagine a feature where selecting repositories (for cloning) could be interactive?
My use-case is when filtering from a lot of repositories is not trivial (I can not decide easily with the collaborator/fork/private true-false options).
I found a python module for this: PyInquirer -> checkbox type

1, example all-repos.json:

    ...
    "source_settings":  {
        "api_key": "...",
        "username": "username",
    	"interactive": true,
        "other-options": ....
    },
    ...

2, In this case all-repos-clone should offer interactive selection for all repositories related to options: collaborator/fork/private

3, Selection:

= Organization1 =
[] repository1 (collaborator or fork or private) 
[] repository2 (collaborator or fork or private) 
[] repository3 (collaborator or fork or private) 
= Organization2 =
[] repository4 (collaborator or fork or private) 
[] repository5 (collaborator or fork or private) 
[] repository6 (collaborator or fork or private) 

4, Only the selected repositories will be cloned

Temporary repo context directory cannot be cleaned up on Windows

On Windows, all files in the .git/ dir are marked read-only, which means they cannot be deleted by the tempfile.TemporaryDirectory context manager ("Access is denied").

One explicit way to fix this is to do something like this on every file in .git/ prior to exiting the temp dir context manager:

os.chmod(path, stat.S_IWUSR)

See also https://gitpython.readthedocs.io/en/stable/reference.html?highlight=rmtree#git.util.rmtree for implementation.
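A common pattern for this (a sketch, not necessarily how all-repos should implement it) is an onerror handler for shutil.rmtree that clears the read-only bit and retries:

import os
import shutil
import stat

def handle_remove_readonly(func, path, exc_info):
    # clear the read-only bit, then retry the operation that failed
    os.chmod(path, stat.S_IWUSR)
    func(path)

shutil.rmtree('path/to/tmpdir', onerror=handle_remove_readonly)  # hypothetical path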

when git committer email is not set, committing fails even if author is passed

i have a particular setup where i don't set the git user email for folders outside of specific directories in my home folder
(to control which commits go by my opensource mail and my work mail)

when i ran all-repos-sed without an author, it errored about the author; once i gave the author, it errored about the committer, which can't be set

document if/how all_repos_clone repos can be used as full repos

i cloned a set of repos in a private organization using all-repos

i noted that i'm unable to fetch branches from origin

if i go in and make a new clone using the same remote, it works fine

as far as i understood, the primary difference is
fetch = +refs/heads/*:refs/remotes/origin/* in a self-made clone vs fetch = +refs/heads/master:refs/remotes/origin/master in an all-repos clone

Allow comments in config files

I know people have strong opinions about config formats. I don't. json5, toml, yaml--all are fine with me, as long as I can add comments 😄

test_grep_error is failing with newer git version

During my development I found that there are two failing tests on master related to grep.
The root cause is a change in git upstream to the no pattern given error message:
it was changed from no pattern given. to no pattern given (the trailing period was dropped).
Url for the change:
git/git@1a07e59
Search for the builtin/grep.c file

Reproduction:
$ python3 -m pytest tests/grep_test.py

Test error:

file_config_files = auto_namedtuple(output_dir=local('/tmp/pytest-of-micek/pytest-6/test_grep_error_args0_0/output'), cfg=local('/tmp/pyte...ep_error_args0_0/2'), rev1='09b8d2e4a13de36711107a7814bed289708aee3c', rev2='75c5862c260933404547b4b001ba39eaca130454')
capfd = <_pytest.capture.CaptureFixture object at 0x7fc80abaed30>, args = ()

    @pytest.mark.parametrize('args', ((), ('--repos-with-matches',)))
    def test_grep_error(file_config_files, capfd, args):
        ret = main(('-C', str(file_config_files.cfg), *args))
        assert ret == 128
        out, err = capfd.readouterr()
        assert out == ''
>       assert err == 'fatal: no pattern given.\n'
E       AssertionError: assert equals failed
E         'fatal: no pattern given\n'      'fatal: no pattern given.\n'

args       = ()
capfd      = <_pytest.capture.CaptureFixture object at 0x7fc80abaed30>
err        = 'fatal: no pattern given\n'
file_config_files = auto_namedtuple(output_dir=local('/tmp/pytest-of-micek/pytest-6/test_grep_error_args0_0/output'), cfg=local('/tmp/pytest-of-micek/pytest-6/test_grep_error_args0_0/config.json'), repos_json=local('/tmp/pytest-of-micek/pytest-6/test_grep_error_args0_0/repos.json'), dir1=local('/tmp/pytest-of-micek/pytest-6/test_grep_error_args0_0/1'), dir2=local('/tmp/pytest-of-micek/pytest-6/test_grep_error_args0_0/2'), rev1='09b8d2e4a13de36711107a7814bed289708aee3c', rev2='75c5862c260933404547b4b001ba39eaca130454')
out        = ''
ret        = 128

tests/grep_test.py:125: AssertionError
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> entering PDB >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> /home/micek/workspace/github-mitzkia/all-repos/tests/grep_test.py(125)test_grep_error()
-> assert err == 'fatal: no pattern given.\n'
(Pdb) q
Exit: Quitting debugger

my environment:

$ python3 --version --version
Python 3.6.7 (default, Oct 22 2018, 11:32:17) 
[GCC 8.2.0]
$ python3 -m pytest --version
This is pytest version 4.3.1, imported from /usr/local/lib/python3.6/dist-packages/pytest.py
setuptools registered plugins:
  pytest-xdist-1.26.1 at /usr/local/lib/python3.6/dist-packages/xdist/plugin.py
  pytest-xdist-1.26.1 at /usr/local/lib/python3.6/dist-packages/xdist/looponfail.py
  pytest-sugar-0.9.2 at /usr/local/lib/python3.6/dist-packages/pytest_sugar.py
  pytest-icdiff-0.0.4 at /usr/local/lib/python3.6/dist-packages/pytest_icdiff.py
  pytest-forked-1.0.1 at /usr/local/lib/python3.6/dist-packages/pytest_forked/__init__.py
  pytest-env-0.6.2 at /usr/local/lib/python3.6/dist-packages/pytest_env/plugin.py
  pytest-cov-2.6.1 at /usr/local/lib/python3.6/dist-packages/pytest_cov/plugin.py
  hypothesis-4.5.2 at /usr/local/lib/python3.6/dist-packages/hypothesis/extra/pytestplugin.py
$ git version 2.19.1

using keyring as credential source

while investigating putting an all_repos configuration into a git repo, i noted that credentials couldn't be externalized

so more than one person wouldn't be able to use it
this could be alleviated by having support for fetching api keys/passwords from keyring
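As a sketch, a custom source or push module could read the key via the keyring library instead of storing it in all-repos.json (the service and account names here are arbitrary):

import keyring

api_key = keyring.get_password('all-repos', 'github-api-key')
if api_key is None:
    raise SystemExit('no key stored under all-repos/github-api-key')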

Syncing forks

After I have cloned my repositories, and there are some forks, I would like to sync them with upstream.
As I see, after the cloning the upstream remote is not set (checking with git remote -v).
Is it my task to add the remote repository with git remote add [upstream] [git-url] in my forks, or can it be automated with all-repos?

My usual workflow is (maybe not the best):
1, clone my fork
2, add upstream repository as a remote
3, sync with upstream (git fetch --all, git merge upstream/master, git push origin master)
4, create my topic branch + add code + create a PR from it

If step 4 takes more than a short time, I would like to sync/rebase my topic branch with master periodically.
I have seen the push module; maybe it could help me in creating a PR, but will it help in syncing?

So my questions:
1, can all-repos-clone automatically add the repository I forked from as a remote? or is it a user's task?
2, can all-repos help me sync master (or other branches) with upstream from time to time?

Thank you

Make an `all-repos-sed`

Sometimes I don't want to write the full gamut of a fixer and such and really just want some nice sed -i.

I think the interface would be something like:

$ all-repos-sed 's/foo/bar/g' -- baz.f

And this tool would make a fixer which does essentially (for each repository):

git ls-files -- baz.f | xargs sed -i 's/foo/bar/g'

Allow a "read only" mode where `push_settings` are optional

Basically make the configuration lazily load the various parts for pushing / pulling

This would enable one to make a r/o all-repos

I guess this could just be implemented as a builtin noop push and then:

    "push": "all_repos.push.noop",
    "push_settings": {}

all-repos.json has too-permissive permissions

When running it on Windows, I'm getting the following error:

$ all-repos-clone
all-repos.json has too-permissive permissions, Expected 0o600, got 0o666
$ ls -l all-repos.json
-rw-r--r-- 1 bLeDy 197609 377 Apr  9 17:34 all-repos.json

Windows: Version 20H2 (OS Build 19042.906)
Python: 3.9.1
Pip: 21.0.1
all-repos: 1.21.0

Branch name prefix is fixed

all-repos always adds a branch name prefix all-repos_autofix_. At work, we use a branch naming standard which doesn't "approve" of the prefix unfortunately.

Would there be any appetite for changing that behaviour? For example making the prefix configurable?

sed -i needs an extension on osx

'sed -i s/HAI/BAI/g' fails on os x (bsd sed). there it needs to be 'sed -i '' s/HAI/BAI/g'

(noticed because it makes the test suite fail on os x)

Possible python bug when using ~regex in file pattern

When I played with all-repos-find-files I found a python crash.
I have also reproduced it with pure python, so my feeling is that this is not all-repos related, but I need help validating this (I can not decide).

Maybe the use-case is invalid?

Reproduction with all-repos-find-files:

$ all-repos-find-files --output-paths '*'
Traceback (most recent call last):
  File "/usr/local/bin/all-repos-find-files", line 10, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.6/dist-packages/all_repos/find_files.py", line 90, in main
    output_paths=args.output_paths, use_color=args.color,
  File "/usr/local/lib/python3.6/dist-packages/all_repos/find_files.py", line 60, in find_files_cli
    repo_files = find_files(config, pattern)
  File "/usr/local/lib/python3.6/dist-packages/all_repos/find_files.py", line 30, in find_files
    regex = re.compile(pattern.encode())
  File "/usr/lib/python3.6/re.py", line 233, in compile
    return _compile(pattern, flags)
  File "/usr/lib/python3.6/re.py", line 301, in _compile
    p = sre_compile.compile(pattern, flags)
  File "/usr/lib/python3.6/sre_compile.py", line 562, in compile
    p = sre_parse.parse(p, flags)
  File "/usr/lib/python3.6/sre_parse.py", line 855, in parse
    p = _parse_sub(source, pattern, flags & SRE_FLAG_VERBOSE, 0)
  File "/usr/lib/python3.6/sre_parse.py", line 416, in _parse_sub
    not nested and not items))
  File "/usr/lib/python3.6/sre_parse.py", line 616, in _parse
    source.tell() - here + len(this))
sre_constants.error: nothing to repeat at position 0

Reproduction with pure python:

Python 3.6.7 (default, Oct 22 2018, 11:32:17) 
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> "*".encode()
b'*'
>>> import re
>>> re.compile("*".encode())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.6/re.py", line 233, in compile
    return _compile(pattern, flags)
  File "/usr/lib/python3.6/re.py", line 301, in _compile
    p = sre_compile.compile(pattern, flags)
  File "/usr/lib/python3.6/sre_compile.py", line 562, in compile
    p = sre_parse.parse(p, flags)
  File "/usr/lib/python3.6/sre_parse.py", line 855, in parse
    p = _parse_sub(source, pattern, flags & SRE_FLAG_VERBOSE, 0)
  File "/usr/lib/python3.6/sre_parse.py", line 416, in _parse_sub
    not nested and not items))
  File "/usr/lib/python3.6/sre_parse.py", line 616, in _parse
    source.tell() - here + len(this))
sre_constants.error: nothing to repeat at position 0
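Since the pattern is compiled with re.compile, a bare * is indeed an invalid regex rather than a glob; the regex equivalents work fine, for example:

import re

re.compile(re.escape('*'))  # matches a literal asterisk
re.compile('.*')            # matches anything

So all-repos-find-files '.*' is the regex spelling of the glob *.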

merge_to_master: Add setting to allow fast-forward merge

Currently, merge_to_master forces a merge commit by passing --no-ff. I would prefer to do a fast-forward merge when doing autofixes.

It would be nice to have this behavior be configurable. Something like:

   "push_settings": {
        "fast_forward": true
    }

all-repos-grep -h option conflict

The grep option '-h' (don't show filenames) does not work, because it triggers the all-repos-grep help.
Example:

> all-repos-grep -h 'foo' '*.txt'

usage: all-repos-grep [options] [GIT_GREP_OPTIONS]
Similar to a distributed `git grep ...`.
optional arguments:
  -h, --help            show this help message and exit
  -C CONFIG_FILENAME, --config-filename CONFIG_FILENAME
                        use a non-default config file (default
                        `/home/beckerdi/.config/all-repos.json`).
  --color {auto,always,never}
                        use color in output (default `auto`).
  --repos-with-matches  only print repositories with matches.
  --output-paths        Use `/` as a separator instead of `:` in outputs
                        (often helpful for scripting).

github enterprise

afaik this should just be providing an api prefix.

Need to find a running instance to test this against though 🤔

Provide a way to blacklist or whitelist which pre-commit hooks to autoupdate

We have a pre-commit hook which requires manual intervention when bumping. If I'm running pre-commit autoupdate across many repositories, I would like to exclude that repository. pre-commit autoupdate already supports a --repo option which allows us to whitelist repositories.

Can we expose that option in the autopep8-wrapper auto-fixer?

Include pattern does not work.

I use the following config for cloning my GitHub repos:

{
    "output_dir": "output",
    "source": "all_repos.source.github",
    "source_settings":  {
        "forks": true,
        "private": true,
        "archived": true,
        "api_key": ""
        "username": "CodingSpiderFox"
    },
    "include": "^z.*",
    "push": "all_repos.push.github_pull_request",
    "push_settings": {
        "api_key": "...",
        "username": "CodingSpiderFox"
    },
    "all_branches": true
}

I would have expected that all repos starting with "z" would be cloned. But not a single one gets cloned.

Here is an excerpt from the resulting repos.json.

{"CodingSpiderFox/yara": "[email protected]:CodingSpiderFox/yara", "CodingSpiderFox/youtube-comment-suite": "[email protected]:CodingSpiderFox/youtube-comment-suite", "CodingSpiderFox/ytplaylistmanager": "[email protected]:CodingSpiderFox/ytplaylistmanager", "CodingSpiderFox/zfs": "[email protected]:CodingSpiderFox/zfs", "CodingSpiderFox/zfs-multi-mount": "[email protected]:CodingSpiderFox/zfs-multi-mount", "CodingSpiderFox/zfs-snap-diff": "[email protected]:CodingSpiderFox/zfs-snap-diff", "CodingSpiderFox/zfsbud": "[email protected]:CodingSpiderFox/zfsbud", "CodingSpiderFox/zfs_autobackup": "[email protected]:CodingSpiderFox/zfs_autobackup", "CodingSpiderFox/zinit": "[email protected]:CodingSpiderFox/zinit", "CodingSpiderFox/zplug": "[email protected]:CodingSpiderFox/zplug", "CodingSpiderFox/zsh-autopair": "[email protected]:CodingSpiderFox/zsh-autopair", "CodingSpiderFox/zsh-autosuggestions": "[email protected]:CodingSpiderFox/zsh-autosuggestions", "CodingSpiderFox/zsh-fast-alias-tips": "[email protected]:CodingSpiderFox/zsh-fast-alias-tips", "CodingSpiderFox/zsh-git-sync": "[email protected]:CodingSpiderFox/zsh-git-sync", "CodingSpiderFox/zsh-tmux-auto-title": "[email protected]:CodingSpiderFox/zsh-tmux-auto-title", "CodingSpiderFox/zsh-you-should-use": "[email protected]:CodingSpiderFox/zsh-you-should-use", "CodingSpiderFox/zsh-zsnapac": "[email protected]:CodingSpiderFox/zsh-zsnapac"}

However, the repos_filtered.json stays empty.

Cloning fails due to authentication error

I get errors on all my repos, even the public ones when running

all-repos-clone --config-filename all-repos.json --jobs 1

My config is

{
    "output_dir": "output",
    "source": "all_repos.source.github",
    "source_settings":  {
        "api_key": "hexadecimal-stuff-from-github-settings-page",
        "username": "CodingSpiderFox"
    },
    "include": "",
    "exclude": "^(gecko-dev)$",
    "push": "all_repos.push.github_pull_request",
    "push_settings": {
        "api_key": "hexadecimal-stuff-from-github-settings-page",
        "username": "CodingSpiderFox"
    }
}

Example error output is:

Initializing CodingSpiderFox/maintenance-tools
Initialized empty Git repository in /home/user/gitrepos/output/CodingSpiderFox/maintenance-tools/.git/
git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
Error fetching output/CodingSpiderFox/acx100-acx-mac80211
git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
Error fetching output/CodingSpiderFox/ardour-sessionparser
Warning: Permanently added the RSA host key for IP address '140.82.118.3' to the list of known hosts.
git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.
