artman's Introduction

Google APIs

This repository contains the original interface definitions of public Google APIs that support both REST and gRPC protocols. Reading the original interface definitions can provide a better understanding of Google APIs and help you to utilize them more efficiently. You can also use these definitions with open source tools to generate client libraries, documentation, and other artifacts.

Building

Bazel

The recommended way to build the API client libraries is through Bazel >= 4.2.2.

First, install bazel.

To build all libraries:

bazel build //...

To test all libraries:

bazel test //...

To build one library in all languages:

bazel build //google/example/library/v1/...

To build the Java package for one library:

bazel build //google/example/library/v1:google-cloud-example-library-v1-java

Bazel packages exist in all the libraries for Java, Go, Python, Ruby, Node.js, PHP and C#.

Overview

Google APIs are typically deployed as API services that are hosted under different DNS names. One API service may implement multiple APIs and multiple versions of the same API.

Google APIs use Protocol Buffers version 3 (proto3) as their Interface Definition Language (IDL) to define the API interface and the structure of the payload messages. The same interface definition is used for both REST and RPC versions of the API, which can be accessed over different wire protocols.

There are several ways of accessing Google APIs:

  1. JSON over HTTP: You can access all Google APIs directly over JSON/HTTP, using a Google API client library or a third-party API client library.

  2. Protocol Buffers over gRPC: You can access Google APIs published in this repository through gRPC, a high-performance binary RPC protocol over HTTP/2. It offers many useful features, including request/response multiplexing and full-duplex streaming.

  3. Google Cloud Client Libraries: You can use these libraries to access Google Cloud APIs. They are built on gRPC for better performance and provide an idiomatic client surface for a better developer experience.
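
For illustration, here is a minimal Python sketch of options 2 and 3 (the endpoint and stub names are placeholders, and per-call authentication is omitted):

import grpc

# Option 2: open a TLS channel to an API endpoint and attach a generated stub.
# Google APIs are served over TLS on port 443.
channel = grpc.secure_channel(
    'language.googleapis.com:443', grpc.ssl_channel_credentials())
# stub = library_pb2_grpc.LibraryServiceStub(channel)  # hypothetical generated stub

# Option 3: use a Google Cloud Client Library instead (assuming the
# google-cloud-language package is installed):
# from google.cloud import language_v1
# client = language_v1.LanguageServiceClient()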

Discussions

This repo contains copies of Google API definitions and related files. For discussions, or to raise issues about Google API client libraries, gRPC, or Google Cloud Client Libraries, please refer to the repos associated with each area.

Repository Structure

This repository uses a directory hierarchy that reflects the Google API product structure. In general, every API has its own root directory, and each major version of the API has its own subdirectory. The proto package names exactly match the directory path: this makes it easy to locate the proto definitions and ensures that the generated client libraries have idiomatic namespaces in most programming languages. For example, the protos under google/example/library/v1 declare the package google.example.library.v1. Alongside the API directories live the configuration files for the GAPIC toolkit.

NOTE: The major version of an API is used to indicate a breaking change to the API.

Generate gRPC Source Code

To generate gRPC source code for the Google APIs in this repository, first install both Protocol Buffers and gRPC on your local machine, then run:

make LANGUAGE=xxx all

You then need to integrate the generated source code into your application's build system.

NOTE: The Makefile is only intended to generate source code for the entire repository. It is not for generating a linkable client library for a specific API. Please see other repositories under https://github.com/googleapis for generating linkable client libraries.

Go gRPC Source Code

It is difficult to generate Go gRPC source code from this repository, since Go has a different directory structure. Please use this repository instead.

artman's Issues

Running JavaGrpcClientPipeline doesn't work properly

When running the following command, nothing happens:

python execute_pipeline.py --config '../googleapis/gapic/api/artman_yyyy.yaml:common|java,../googleapis/gapic/lang/common.yaml:default|java' JavaGrpcClientPipeline

Note: the string "yyyy" stands for an arbitrary API name.

IMPACT
It seems we are not able to generate the Java gRPC files at all.

EXPECTED RESULT
It should just work: we should get gRPC generated files in Java for a specific API.

ACTUAL RESULT
This is the output (notice that it stops when trying to locate the gRPC plugin for Java):

Final args:
auto_resolve : True
auto_merge : True
import_proto_path : ['../googleapis/gapic/api/../..']
gapic_api_yaml : ['../googleapis/gapic/api/../../google/yyyy/v1/yyyy_gapic.yaml']
toolkit_path : ../toolkit
final_repo_dir : ../gcloud-java/gcloud-java-yyyy
api_name : yyyy-v1
output_dir : ../artman/output
gapic_language_yaml : ['../googleapis/gapic/lang/java_gapic.yaml']
src_proto_path : ['../googleapis/gapic/api/../../google/yyyy/v1']
service_yaml : ['../googleapis/gapic/api/../../google/yyyy/v1/yyyy.yaml']
ignore_base : False
Create JavaGrpcClientPipeline instance.
Generating protos ../googleapis/gapic/api/../../google/yyy/v1
Searching for latest protobuf source
Running protoc with grpc plugin on ../googleapis/gapic/api/../../google/yyy/v1
Searching for latest protobuf source
start gradle process to locate GRPC Java plugin

Integrate Go generator into artman

Please include all dependencies needed by Go gRPC/GAPIC generation in the Dockerfile and make sure that artman can be used to generate Go gRPC and GAPIC libraries. This should mostly be integration work.

API naming variable changes

@garrettjonesgoogle @bjwatson @omaray

Background

Some recent changes have introduced inconsistencies in how we refer to the gRPC Python packages -- just wanted to get input on how we should resolve this. Renaming the packages at a future date is a breaking change, and it's linked to some changes I want to make around how we treat the concept of "API names" in our tooling in connection to googleapis/gapic-generator#270, so I wanted to get some feedback before I implement this.

Currently

  • Artman currently has a single "API name" concept, which is configurable in the artman config.
  • A sample API name might be google-cloud-logging-v2
  • Our packaging tools (Packman, and its successor MetadataGen) further introduce three more API name variables:
    • The "version" is the last word in the API name, e.g., "v2"
    • The "short name" is the second-to-last word in the API name, e.g., "logging"
    • The "title name" is Google {titlecased short name}, e.g., "Google Logging". This is used in the autogenerated README to refer to the service.
  • The gRPC Python package name is either grpc-google-{shortname}-{version} or grpc-{api name}, depending on where you look (i.e., either grpc-google-logging-v2 or grpc-google-cloud-logging-v2). That's a problem.

Proposal

  • I want to get rid of the "API name" configuration value in artman. The existing variables will be replaced as follows:
    • Version --> independent configuration value in artman config
    • Short name --> independent configuration value in artman config
    • Title name --> use title attribute from service config (example)
    • API Name --> google-cloud-{short name}-{version}
  • The Python package names will then be grpc-{API Name} and gapic-{API Name}, as sketched below.
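
For concreteness, using the logging example above, the derived values would compose like this (a sketch, not actual artman code):

short_name = 'logging'
version = 'v2'
api_name = 'google-cloud-{}-{}'.format(short_name, version)  # google-cloud-logging-v2
grpc_pkg = 'grpc-{}'.format(api_name)    # grpc-google-cloud-logging-v2
gapic_pkg = 'gapic-{}'.format(api_name)  # gapic-google-cloud-logging-v2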

Any objection or concerns with this?

Ruby gRPC client generation should be merged into gapic pipeline

The gcloud-ruby project actually bundles gRPC generated files as a part of the google-cloud libraries (for example, gcloud-ruby/google-cloud-pubsub contains the gRPC files at https://github.com/GoogleCloudPlatform/gcloud-ruby/tree/master/google-cloud-pubsub/lib/google/pubsub/v1).

For non-cloud APIs, I think either design (i.e., merging into a single package or separating the GAPIC client from the gRPC client) would be okay. But because of this gcloud-ruby design, I think all of Ruby's generated artifacts should follow this style.

The current design of Ruby's gRPC client generation pipeline doesn't fit this design. It should:

  • stop using packman and use the protoc command directly
  • introduce this protoc task as part of Ruby's GAPIC pipeline, so that the generated files reside in the same package

Remove default --jobboard_name from start_conductor.py

What:

Remove default --jobboard_name from start_conductor.py.

Why:

The default is remote, which should be used exclusively by our remote build server. I just ran start_conductor.py on my workstation, and it ate a bunch of messages from the ZooKeeper server.

Move Python gRPC packages to google.cloud.grpc namespace

What:

Change the Python path for gRPC-generated cloud APIs to google.cloud.grpc.{api}.{version}.

Why:

We have to do this for the Cloud ML libraries, because they currently define the google.cloud.{api} namespace, which interferes with the google-cloud-python team putting package members into google.cloud.{api} for user convenience. We also want to do the same for other gRPC packages for consistency.

How:

Copy the protos to the desired path before building the Python gRPC code. Some hand-edits are needed for dependencies between gRPC files in the same package, since the import path seems to be the only thing that uses the proto package (FYI @nathanielmanistaatgoogle). An example occurs in Cloud Vision.

Note that packman cares what the proto package is, so I had to use protoc directly to do this by hand. The function computePkg() looks up the proto package and uses it to set up its arguments when calling protoc. This trips up the checkDir calculation and the include-path calculation. We need to change this logic somehow. Should packman really care what the proto package says?

We also need a way to tell toolkit about the gRPC path, so that it generates its import statements accordingly.

Tests fail if TOOLKIT_HOME is set

What

If the environment variable TOOLKIT_HOME is set, the tests fail: they expect toolkit to be located in the fake test directory, but the environment variable overrides that location. The tests should ignore the environment so that they are deterministic.

Why

Tests should be deterministic, independent of the environment.

/cc @swcloud
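
One way to make the tests ignore the environment is pytest's monkeypatch fixture; a sketch (the test body is illustrative):

def test_toolkit_location(monkeypatch):
    # Remove the override so the test always uses the fake test directory.
    monkeypatch.delenv('TOOLKIT_HOME', raising=False)
    # ... run the pipeline setup against the fake toolkit location ...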

Emit better error if gen-api-package cannot be found

Steps to reproduce:

  1. In a complete repository checkout (including artman and toolkit), cd artman
  2. Confirm that gen-api-package is not in your PATH variable
  3. Run the JavaGrpcClientPipeline or PythonGrpcClientPipeline

Expected result:
A helpful error should indicate that the gen-api-package command cannot be found.

Actual results:
An obscure OSError: [Errno 2] No such file or directory error is emitted.
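
A sketch of the kind of guard that would give a better message (the call site and arguments are assumptions):

import subprocess

cmd = ['gen-api-package', '--api_name=logging/v2']  # illustrative invocation
try:
    subprocess.check_call(cmd)
except OSError:
    raise RuntimeError(
        'gen-api-package was not found; make sure it is installed '
        'and on your PATH before running this pipeline.')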

gRPC plugin structure

It looks like gRPC Python may be moving away from releasing a standalone plugin, and instead wrapping protoc as a Python executable and distributing it in the grpcio-tools package: https://pypi.python.org/pypi/grpcio-tools

We should see whether other languages are doing something similar. If we can rely on package dependencies rather than assuming the plugins are installed (I think artman just calls which python_grpc_plugin right now...), that would make setup and staying up-to-date easier.
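
With grpcio-tools, code generation needs no separately installed plugin, since protoc and the Python gRPC plugin ship in the pip package; for example (the proto path is illustrative):

python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. google/example/library/v1/library.proto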

Integration test failure

When running 'tox -e py27', errors appear as shown below:

$ tox -e py27
py27 installed: appdirs==1.4.3,automaton==1.8.0,Babel==2.4.0,cachetools==2.0.0,colorlog==2.10.0,configparser==3.5.0,contextlib2==0.5.4,coverage==4.3.4,debtcollector==1.13.0,decorator==4.0.11,enum34==1.1.6,extras==1.0.0,fasteners==0.14.1,fixtures==3.0.0,flake8==3.2.1,funcsigs==1.0.2,functools32==3.2.3.post2,futures==3.0.5,futurist==1.0.0,gcloud==0.15.0,google-apitools==0.5.8,googleapis-common-protos==1.5.2,grpcio==1.2.1,grpcio-tools==1.2.1,httplib2==0.10.3,iso8601==0.1.11,jsonschema==2.6.0,kazoo==2.2.1,linecache2==1.0.0,mccabe==0.5.3,mock==2.0.0,monotonic==1.3,mox3==0.21.0,msgpack-python==0.4.8,netaddr==0.7.19,netifaces==0.10.5,networkx==1.11,oauth2client==3.0.0,oslo.i18n==3.15.0,oslo.serialization==2.18.0,oslo.utils==3.25.0,packaging==16.8,pbr==2.0.0,prettytable==0.7.2,protobuf==3.2.0,py==1.4.33,pyasn1==0.2.3,pyasn1-modules==0.0.8,pycodestyle==2.2.0,pyfakefs==2.9,pyflakes==1.3.0,pyparsing==2.2.0,pytest==3.0.7,pytest-cov==2.4.0,pytest-timeout==1.2.0,python-mimeparse==1.6.0,pytz==2017.2,PyYAML==3.12,requests==2.13.0,rsa==3.4.2,six==1.10.0,stevedore==1.21.0,taskflow==1.25.0,testtools==2.2.0,traceback2==1.4.0,unittest2==1.1.0,wrapt==1.10.10,yapf==0.16.1
py27 runtests: PYTHONHASHSEED='3329163374'
py27 runtests: commands[0] | py.test -rxs --timeout=30 --cov --cov-report= --cov-append /Users/songwang/IdeaProjects/vtk/artman/
========================================================= test session starts ==========================================================
platform darwin -- Python 2.7.10, pytest-3.0.7, py-1.4.33, pluggy-0.4.0
rootdir: /Users/songwang/IdeaProjects/vtk/artman, inifile:
plugins: timeout-1.2.0, cov-2.4.0
timeout: 30.0s method: signal
collected 33 items
test/test_gapic_conductor.py F
test/test_goimport_task.py .
test/test_io_tasks.py ..
test/test_package_metadata_task.py .
test/test_pipeline_baseline.py ........................
test/test_python_package_change_task.py ...
test/utils/test_github_utils.py .
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/main.py", line 98, in wrap_session
INTERNALERROR> session.exitstatus = doit(config, session) or 0
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/main.py", line 133, in _main
INTERNALERROR> config.hook.pytest_runtestloop(session=session)
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in call
INTERNALERROR> return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in
INTERNALERROR> _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 613, in execute
INTERNALERROR> return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 250, in _wrapped_call
INTERNALERROR> wrap_controller.send(call_outcome)
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/pytest_cov/plugin.py", line 232, in pytest_runtestloop
INTERNALERROR> self.cov_controller.finish()
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/pytest_cov/engine.py", line 150, in finish
INTERNALERROR> self.cov.combine()
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/coverage/control.py", line 767, in combine
INTERNALERROR> self.get_data()
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/coverage/control.py", line 795, in get_data
INTERNALERROR> self.collector.save_data(self.data)
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/coverage/collector.py", line 360, in save_data
INTERNALERROR> covdata.add_arcs(abs_file_dict(self.data))
INTERNALERROR> File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/coverage/data.py", line 365, in add_arcs
INTERNALERROR> raise CoverageException("Can't add arcs to existing line data")
INTERNALERROR> CoverageException: Can't add arcs to existing line data
================================================= 1 failed, 32 passed in 25.12 seconds =================================================
ERROR: InvocationError: '/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/bin/py.test -rxs --timeout=30 --cov --cov-report= --cov-append /Users/songwang/IdeaProjects/vtk/artman/'
_______________________________________________________________ summary ________________________________________________________________
ERROR: py27: commands failed

artman seems to expect toolkit under reporoot, in spite of explicit override in config

Using configure-artman, I set up my toolkit dir to point to the place where I had cloned the toolkit repo, which was not under reporoot. I then tried to generate the GAPIC for pubsub as per artman's README instructions, and it failed.

I then made a symlink such that $REPOROOT/toolkit pointed to the actual directory where I cloned the toolkit repo, and the GAPIC generation succeeded.

From this I conclude that, contrary to configure-artman's assurances, artman does expect toolkit to be under reporoot.

httplib2 dependency problem

I just did sudo pip install googleapis-artman, and then when I try to run artman --api logging --language java --publish local, it says:

pkg_resources.DistributionNotFound: httplib2>=0.9.2,<1dev

And when I rerun the install, I see a line that says

Requirement already satisfied: httplib2>=0.8 in /usr/lib/python2.7/dist-packages (from google-apitools->googleapis-artman)

is_output_gcloud() function might miss some output directory

Recently the gcloud-ruby and gcloud-node repositories were renamed to google-cloud-ruby and google-cloud-node.
The Ruby and Node pipelines change their behavior based on is_output_gcloud(), and this function needs to handle the new repository names.
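
A possible shape for the fix (a sketch; the real function may take different arguments):

import os

def is_output_gcloud(output_dir):
    # Recognize both the old (gcloud-*) and renamed (google-cloud-*) repos.
    name = os.path.basename(os.path.normpath(output_dir))
    return name.startswith('gcloud-') or name.startswith('google-cloud-')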

Confusing failure in remote mode for GAPIC config pipeline

Expected behavior

If a GAPIC config already exists and the pipeline is running in remote mode, either

  • artman fails and returns a meaningful error message, or
  • artman overwrites the config in the archive that is downloaded

Observed behavior

Artman fails with no information. Logs indicate only that the config was generated successfully, since the failure occurs at the copying stage -- after the config is actually generated.

Context

Artman does not automatically overwrite existing configs because of the risk of overwriting uncommitted manual edits, which occur frequently in the development process. However, this risk exists only when running Artman in local mode, since in remote mode, nothing is ever overwritten on the local machine. We can therefore safely overwrite.

If detecting the environment (local/remote) is complicated, a clear error message is a good short-term solution.

skip_packman isn't a great config name

When a user writes their own artman configuration YAML, they will normally refer to an existing file, and many of the existing package YAMLs contain 'skip_packman' for Ruby and Node.

The problem is that nobody outside of our team would understand what packman is, so they will copy the 'skip_packman: true' line without understanding it.

It would be better to rename this.

GapicConfigPipeline can overwrite file in remote environment

What:

GapicConfigPipeline currently raises a ValueError if the target GAPIC config file already exists. This check can be disabled in a remote environment.

Why:

This check is justified in the local environment, because GapicConfigPipeline can overwrite the GAPIC config in a developer's googleapis sandbox that might have unstaged changes. Not having the check would be potentially surprising and frustrating for developers using the local environment.

For developers using a remote environment, there is no such danger: GapicConfigPipeline does not touch an active sandbox in this mode, and instead delivers a tarball of the googleapis repository with the generated changes overlaid. For them, having the check is potentially surprising and frustrating, because they have to know to temporarily modify their artman config in order to re-generate a GAPIC config file.

How:

The check is in pipeline.tasks.gapic_tasks.GapicConfigMoveTask._move_to().
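
A sketch of the proposed behavior (the remote flag is an assumption; the real task may detect its environment differently):

import os
import shutil

def _move_to(src, dest, remote):
    if os.path.exists(dest):
        if remote:
            # Remote mode only ever writes into a downloaded copy of the
            # repo, so overwriting is safe.
            os.remove(dest)
        else:
            raise ValueError('GAPIC config already exists: %s' % dest)
    shutil.move(src, dest)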

GCloud dependency conflicts

Steps to Reproduce

  1. virtualenv venv
  2. source venv/bin/activate
  3. pip install -e git+https://github.com/googleapis/artman#egg=remote
  4. start_conductor.py

Expected Result

The conductor successfully starts.

Actual Result

The following dependency conflict occurs:

Traceback (most recent call last):
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/bin/start_conductor.py", line 4, in <module>
    __import__('pkg_resources').require('googleapis-artman==0.1.0')
  File "/usr/local/buildtools/current/sitecustomize/sitecustomize.py", line 181, in SetupPathsAndImport
    return real_import(name, globals, locals, fromlist, level)
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2985, in <module>
    @_call_aside
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2971, in _call_aside
    f(*args, **kwargs)
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2998, in _initialize_master_working_set
    working_set = WorkingSet._build_master()
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages/pkg_resources/__init__.py", line 662, in _build_master
    return cls._build_from_requirements(__requires__)
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages/pkg_resources/__init__.py", line 675, in _build_from_requirements
    dists = ws.resolve(reqs, Environment())
  File "/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages/pkg_resources/__init__.py", line 859, in resolve
    raise VersionConflict(dist, req).with_context(dependent_req)
pkg_resources.ContextualVersionConflict: (grpc-google-logging-v2 0.9.3 (/usr/local/google/home/brianwatson/src/gapi-dev/googleapis/venv/lib/python2.7/site-packages), Requirement.parse('grpc-google-logging-v2<0.9.0,>=0.8.1'), set(['gax-google-logging-v2']))

Analysis

This is ultimately because the last release of gcloud (0.18.1) does not set upper bounds on its gax-google-* and grpc-google-* dependencies. Due to the rename of gax-google-* to gapic-google-*, the latest gax-google-* version is only 0.8.x, which is incompatible with the latest 0.9.x versions of grpc-google-*.

FYI @tseaver and @dhermes, if you could push a 0.18.2 version of gcloud that sets an upper bound of <0.9.0 on the gax-google-* and grpc-google-* dependencies, then this issue will be resolved.

The other fix is to migrate artman to google-cloud 0.19.x (the rename of gcloud), which I began in #101 and is captured as a feature request in #103.
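
Concretely, the 0.18.2 fix amounts to adding upper bounds in gcloud's setup.py, along these lines (the requirement strings are a sketch based on the versions above):

install_requires = [
    'gax-google-logging-v2 >= 0.8.1, < 0.9.0',
    'grpc-google-logging-v2 >= 0.8.1, < 0.9.0',
    # ... same pattern for the other gax-google-* / grpc-google-* packages
]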

Support artman config without REPOROOT

REPOROOT causes confusion when running artman in remote mode: on the local machine, it corresponds to a directory that contains both the real googleapis repo and a "local_repo", whereas remotely it refers to a single directory named "googleapis" that corresponds to the merged contents of both.

The artman config refers only to the remote setup; a path such as {REPOROOT}/googleapis/foo/bar.py may refer locally either to googleapis/foo/bar.py or to {LOCAL_REPO}/foo/bar.py. This is particularly confusing because the googleapis in {REPOROOT}/googleapis/foo/bar.py may not exist on the local path if the file referred to is actually {LOCAL_REPO}/foo/bar.py.

Instead, we could support relative paths in the artman config, like foo/bar.py. This does not need to reference googleapis, and so solves the issue of having invalid paths in the config on the local machine.

Add option to skip packman task

For Ruby and Node.js APIs, due to the effort to integrate with the existing hand-written veneer, some artifacts such as package metadata (README.md, package.json, or gemspec) are not needed, so we can skip the packman task when generating the GAPIC artifacts. However, for other APIs such as ErrorReporting, where there is no existing hand-written veneer code, this package metadata still needs to be generated.

Configuration merge semantics

From @lukesneeringer's comment on #171:

However, should we consider trying to have a standard format in our YAML for when we have common parameters with potential language overrides? This format seems to be distinct from the format used in artman_{api}.yaml.

I think, of the two, I prefer the other format, which comes down to:

---
common:
  foo: bar
  baz:
    spam: eggs
python:
  foo: bacon
  baz:
    hello: goodbye
node:
  baz:
    spam: weeee!

Which Artman amalgamates together so that in Python, you get:

---
foo: bacon
baz:
  spam: eggs
  hello: goodbye

And in Node you get:

---
foo: bar
baz:
  spam: weeee!
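
The amalgamation described above is a recursive dictionary merge; a minimal sketch (assuming the YAML has already been loaded into plain dicts):

def deep_merge(common, override):
    # Return a copy of `common` with `override` laid on top, recursing
    # into nested dicts so sibling keys survive.
    merged = dict(common)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

config = {
    'common': {'foo': 'bar', 'baz': {'spam': 'eggs'}},
    'python': {'foo': 'bacon', 'baz': {'hello': 'goodbye'}},
}
print(deep_merge(config['common'], config['python']))
# {'foo': 'bacon', 'baz': {'spam': 'eggs', 'hello': 'goodbye'}}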

Stop using packman for Ruby gRPC client generation

Derived from #69:

Currently the Ruby gRPC client pipeline uses packman to generate gRPC files. However, that does not fit the design of gcloud-ruby, where a package contains all of the hand-written layers, GAPIC-generated files, and gRPC-generated files.
That is Cloud APIs only, but there are no strong reasons to split packages for GAPIC files and gRPC files in Ruby either.

Because of that, we should stop using packman for Ruby's client pipeline. Instead, similar to Java, we should use the protoc_tasks we already have.

Also, from #69 (comment),

Currently we get protoc from linuxbrew here: https://github.com/grpc/homebrew-grpc
That site indicates that the linuxbrew method is deprecated for all languages except C/C++.

Therefore, Ruby will use grpc-tools rubygem (https://rubygems.org/gems/grpc-tools).

Cc: @bjwatson @geigerj

Fix build failures, lint test/ directory

After #44 or #45 is merged, we will not be pylint'ing the test directory, in order to fix a build failure. In fact, we may not have been pylint'ing the test directory previously. In any case, we should fix this.

Remote execution fails when local_repo is too big

Observed behavior

When the directory specified by local_repo is too big, remote execution fails with the confusing message "taskflow.exceptions.StorageFailure: Storage backend internal error". This is a frequent error case when the directory pointed to is actually a git repo (i.e., it also contains the associated git metadata).

Expected behavior

Artman should validate that the size of local_repo is < 1 MB.

Action

Accumulate the size of local_repo here:

files_dict[_normalize_path(rel_path)] = base64.b64encode(f.read())

and throw a meaningful exception if it is too big.
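
A sketch of that validation, wrapped around the quoted line (the 1 MB cap and the surrounding loop are assumptions; files_dict and _normalize_path come from the existing code):

import base64

MAX_LOCAL_REPO_BYTES = 1024 * 1024  # the 1 MB limit suggested above

total_bytes = 0
files_dict = {}
for rel_path, f in walk_local_repo(local_repo):  # hypothetical iteration
    contents = f.read()
    total_bytes += len(contents)
    if total_bytes > MAX_LOCAL_REPO_BYTES:
        raise ValueError(
            'local_repo is larger than %d bytes; is it a full git checkout '
            'with metadata?' % MAX_LOCAL_REPO_BYTES)
    files_dict[_normalize_path(rel_path)] = base64.b64encode(contents)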

Failing jobs linger on zookeeper

What

Failing jobs are not currently removed from the jobboard on ZooKeeper, but they should be.

Why

Stale, failing jobs still get claimed by the remote conductor, and continue to fail. This has a negative impact on performance and is a waste of resources.

Ruby gRPC build does not work.

The task was previously referring to a variable pkg_dir, which raised a NameError (the variable was not defined).

As part of #193, I added the variable to the task's method signature, but it is not being provided by the tasks that precede it:

Traceback (most recent call last):
  File "/usr/local/bin/artman", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/artman/cli/main.py", line 86, in main
    engine.run()
  File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 159, in run
    for _state in self.run_iter():
  File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 185, in run_iter
    self.validate()
  File "/usr/local/lib/python3.5/dist-packages/fasteners/lock.py", line 306, in wrapper
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 309, in validate
    cause=last_cause)
taskflow.exceptions.MissingDependencies: 'taskflow.patterns.linear_flow.Flow: CodeGenerationPipeline(len=8)' requires ['pkg_dir'] but no other entity produces said requirements
  MissingDependencies: 'RubyGrpcCopyTask-ruby-video-intelligence-v1beta1==1.0' requires ['pkg_dir'] but no other entity produces said requirements

This is orthogonal to #193, so filing here for tracking.

PythonChangePackageTask temporary files referenced in gRPC generated code

What

PythonChangePackageTask creates temporary proto files; these temporary files are referenced in the source comment of the generated gRPC Python code rather than the true source proto (example).

Why

It's confusing to have final, end-user-visible code reference an intermediate, temporary file. Note that this affects only comments in the generated code; it causes no change in behavior.

Use Python GAPIC package name to determine Python gRPC package name

What

The Python GAPIC package name is currently configured in the GAPIC config. The Python gRPC package name is determined by combining the prefix google.cloud.grpc with the proto package name. For consistency, it would be good to use the same package name structure for both GAPIC and gRPC; that is, any GAPIC package {prefix}.gapic.{suffix} will depend on the gRPC package {prefix}.grpc.{suffix}.
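
In other words (a hypothetical helper illustrating the rule):

def grpc_package_for(gapic_package):
    # e.g. 'google.cloud.gapic.vision.v1' -> 'google.cloud.grpc.vision.v1'
    return gapic_package.replace('.gapic.', '.grpc.', 1)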

Why

This avoids the verbosity of some proto package names while ensuring a consistent naming strategy across Python packages. See @bjwatson's comment in googleapis/api-client-staging#154

Concerns

The simplest implementation requires artman to read the GAPIC config to obtain the GAPIC package name and pass it to the gRPC package renaming task. I'm wary about introducing a dependency on the GAPIC config for a gRPC pipeline, since none currently exists, and such a dependency is counterintuitive (GAPIC should depend on gRPC, not the other way around).

An alternative is to move package name configuration to the artman config. This has the drawback that the package name must still be propagated to the GAPIC config (so it's duplicated across configs) and introduces the possibility of inconsistency if the GAPIC config is manually edited, but doesn't introduce any new dependencies to the gRPC pipeline.

cc: @bjwatson, @lukesneeringer

Baseline tests don't capture gRPC plugin

#118 fixes a breakage in the Ruby gRPC pipeline, but the baseline tests don't reflect that; the actual pipeline passes the gRPC plugin parameter value, but the test pipeline doesn't. Investigate why not.

Original report from @swcloud:

Hey Jacob, I just updated artman to the latest, and when I run the pipeline with task 'GrpcClientPipeline' for Ruby and the 'logging' API, I get the following error:
Final args:
package_defaults_yaml : ../googleapis/gapic/packaging/api_defaults.yaml
repo_root : ..
package_dependencies_yaml : ../googleapis/gapic/packaging/dependencies.yaml
language : ruby
stage_output : False
import_proto_path : ['../googleapis']
enable_batch_generation : True
gapic_api_yaml : ['../googleapis/google/logging/v2/logging_gapic.yaml']
common_protos_yaml : ../googleapis/gapic/packaging/common_protos.yaml
toolkit_path : ../toolkit
skip_packman : True
api_name : google-cloud-logging-v2
output_dir : ../artman/output
final_repo_dir : ../google-cloud-ruby/google-cloud-logging
staging_repo_dir : ../api-client-staging/generated
src_proto_path : ['../googleapis/google/logging/v2']
service_yaml : ['../googleapis/google/logging/logging.yaml']
gapic_language_yaml : ['../googleapis/gapic/lang/ruby_gapic.yaml']
proto_gen_pkg_deps : ['google-common-protos']
Create GrpcClientPipeline instance.
Searching for latest protobuf source
/Users/songwang/homebrew/bin/grpc_ruby_plugin: File does not reside within any path specified using --proto_path (or -I). You must specify a --proto_path which encompasses this file. Note that the proto_path must be an exact prefix of the .proto file names -- protoc is too dumb to figure out when two paths (e.g. absolute and relative) are equivalent (it's harder than you think).

Traceback (most recent call last):
File "execute_pipeline.py", line 239, in
main(sys.argv[1:])
File "execute_pipeline.py", line 87, in main
engine.run()
File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 159, in run
for _state in self.run_iter():
File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 223, in run_iter
failure.Failure.reraise_if_any(it)
File "/Users/songwang/IdeaProjects/vtk/artman/.tox/py27/lib/python2.7/site-packages/taskflow/types/failure.py", line 292, in reraise_if_any
failures[0].reraise()
