
tap-chargebee's People

Contributors

arvindenvoy, cb-aravindh, cb-nandita, cb-svel, dbshah1212, dmosorast, dwallace0723, hpatel41, kallan357, leslievandemark, luandy64, nick-mccoy, nitingaikwad1, rushit0122, savan-chovatiya, sgandhi1311, zachharris1

tap-chargebee's Issues

CRITICAL can't compare offset-naive and offset-aware datetimes

When running the tap-chargebee in sync mode, it fails with the following traceback:

INFO METRIC: {"type": "counter", "metric": "record_count", "value": 100, "tags": {"endpoint": "events"}}
ERROR can't compare offset-naive and offset-aware datetimes
ERROR Failed to sync endpoint events, moving on!
CRITICAL can't compare offset-naive and offset-aware datetimes
Traceback (most recent call last):
  File "./tap-chargebee/bin/tap-chargebee", line 8, in <module>
    sys.exit(main())
  File "/home/brylie/Code/MaaS_Global/singer/tap-chargebee/lib/python3.8/site-packages/singer/utils.py", line 192, in wrapped
    return fnc(*args, **kwargs)
  File "/home/brylie/Code/MaaS_Global/singer/tap-chargebee/lib/python3.8/site-packages/tap_chargebee/__init__.py", line 27, in main
    runner.do_sync()
  File "/home/brylie/Code/MaaS_Global/singer/tap-chargebee/lib/python3.8/site-packages/tap_framework/__init__.py", line 81, in do_sync
    raise e
  File "/home/brylie/Code/MaaS_Global/singer/tap-chargebee/lib/python3.8/site-packages/tap_framework/__init__.py", line 71, in do_sync
    stream.sync()
  File "/home/brylie/Code/MaaS_Global/singer/tap-chargebee/lib/python3.8/site-packages/tap_framework/streams.py", line 142, in sync
    return self.sync_data()
  File "/home/brylie/Code/MaaS_Global/singer/tap-chargebee/lib/python3.8/site-packages/tap_chargebee/streams/base.py", line 193, in sync_data
    max_date = max(
TypeError: can't compare offset-naive and offset-aware datetimes
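The failure happens where the tap computes max() over a bookmark datetime that carries a UTC offset and an API timestamp that does not (or vice versa). A minimal sketch of the usual remedy, normalizing both values to timezone-aware UTC before comparing; the helper below is illustrative, not the tap's actual code:

from datetime import datetime, timezone

def as_utc(dt):
    # Attach UTC to offset-naive datetimes; convert offset-aware ones to UTC.
    if dt.tzinfo is None:
        return dt.replace(tzinfo=timezone.utc)
    return dt.astimezone(timezone.utc)

bookmark = datetime(2021, 3, 1)                        # offset-naive
record_ts = datetime(2021, 3, 2, tzinfo=timezone.utc)  # offset-aware

# max(bookmark, record_ts) would raise:
#   TypeError: can't compare offset-naive and offset-aware datetimes
max_date = max(as_utc(bookmark), as_utc(record_ts))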

Repeatedly synchronizing entire Invoices table

We are using the Stitch-managed version of Singer with tap-chargebee enabled and noticed that we were hitting our row limit.

The Chargebee tap is replicating all of the rows in the Invoices table repeatedly, instead of only the new rows. What can we do to prevent tap-chargebee from replicating the entire table, and all related tables, each night?
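For background: Singer taps decide what counts as "new" from a bookmark kept in the state passed between runs, and if that state is lost or never fed back in, every run becomes a full historical sync. A sketch of what the tap's state roughly looks like, inferred from the "bookmarks.<stream>.bookmark_date" entries visible in logs elsewhere on this page (exact keys may vary by version):

# state.json contents, shown as a Python dict for illustration; pass the
# file back on the next run so only records updated after the bookmark sync:
#   tap-chargebee --config config.json --catalog catalog.json --state state.json
state = {
    "bookmarks": {
        "invoices": {"bookmark_date": "2021-03-24T00:00:00Z"}
    }
}

If the bookmark never advances, or the runner never passes the state back, the tap has no choice but to start from the configured start date each night.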

"Expecting value: line 1 column 1 (char 0)" - Stitch integration

Hi there,

I'm using Stitch to sync some of our Chargebee data with Google BigQuery datasets.

The extractions run every 6 hours, and the initial 5 were successful. We then started receiving this error:

Expecting value: line 1 column 1 (char 0)

The first time the error occurred, I re-ran the extraction manually and it succeeded. The next day, the error occurred again, and now it happens on 100% of retries. I've pasted a portion of the logs below. From a quick scan, it seems to be an issue with the invoice extraction from Chargebee.

Does anyone have any ideas on how to investigate the issue further? Thank you!

2021-03-24 15:21:27,538Z    tap - INFO [smart-services] event successfully sent to kafka: com.stitchdata.streamRecordCount [2] at offset None
2021-03-24 15:21:27,538Z    tap - INFO replicated 100 records from "invoices" endpoint
2021-03-24 15:21:27,538Z    tap - INFO Advancing by one offset.
2021-03-24 15:21:27,538Z    tap - INFO Making GET request to https://screencastify.chargebee.com/api/v2/invoices
2021-03-24 15:22:37,498Z    tap - ERROR Expecting value: line 1 column 1 (char 0)
2021-03-24 15:22:37,498Z    tap - ERROR Failed to sync endpoint invoices, moving on!
2021-03-24 15:22:37,498Z    tap - CRITICAL Expecting value: line 1 column 1 (char 0)
2021-03-24 15:22:37,499Z    tap - Traceback (most recent call last):
2021-03-24 15:22:37,499Z    tap -   File "tap-env/bin/tap-chargebee", line 33, in <module>
2021-03-24 15:22:37,499Z    tap -     sys.exit(load_entry_point('tap-chargebee==1.0.1', 'console_scripts', 'tap-chargebee')())
2021-03-24 15:22:37,499Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/singer/utils.py", line 192, in wrapped
2021-03-24 15:22:37,499Z    tap -     return fnc(*args, **kwargs)
2021-03-24 15:22:37,499Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/tap_chargebee/__init__.py", line 27, in main
2021-03-24 15:22:37,500Z    tap -     runner.do_sync()
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/tap_framework/__init__.py", line 81, in do_sync
2021-03-24 15:22:37,500Z    tap -     raise e
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/tap_framework/__init__.py", line 71, in do_sync
2021-03-24 15:22:37,500Z    tap -     stream.sync()
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/tap_framework/streams.py", line 142, in sync
2021-03-24 15:22:37,500Z    tap -     return self.sync_data()
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/tap_chargebee/streams/base.py", line 156, in sync_data
2021-03-24 15:22:37,500Z    tap -     params=params)
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/backoff.py", line 286, in retry
2021-03-24 15:22:37,500Z    tap -     ret = target(*args, **kwargs)
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/singer/utils.py", line 62, in wrapper
2021-03-24 15:22:37,500Z    tap -     return func(*args, **kwargs)
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/tap_chargebee/client.py", line 69, in make_request
2021-03-24 15:22:37,500Z    tap -     response_json = response.json()
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/requests/models.py", line 892, in json
2021-03-24 15:22:37,500Z    tap -     return complexjson.loads(self.text, **kwargs)
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/simplejson/__init__.py", line 516, in loads
2021-03-24 15:22:37,500Z    tap -     return _default_decoder.decode(s)
2021-03-24 15:22:37,500Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/simplejson/decoder.py", line 370, in decode
2021-03-24 15:22:37,501Z    tap -     obj, end = self.raw_decode(s)
2021-03-24 15:22:37,501Z    tap -   File "/code/orchestrator/tap-env/lib/python3.5/site-packages/simplejson/decoder.py", line 400, in raw_decode
2021-03-24 15:22:37,501Z    tap -     return self.scan_once(s, idx=_w(s, idx).end())
2021-03-24 15:22:37,501Z    tap - simplejson.scanner.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
2021-03-24 15:22:37,539Z target - INFO Serializing batch with 300 messages for table invoices
2021-03-24 15:22:37,545Z target - INFO Sending batch of 299633 bytes to https://api.stitchdata.com/v2/import/batch
2021-03-24 15:22:37,551Z target - INFO [smart-services] event successfully sent to kafka: com.stitchdata.streamRecordCount [6] at offset None
2021-03-24 15:22:37,551Z target - INFO replicated 300 records from "invoices" endpoint
2021-03-24 15:22:37,842Z target - INFO Requests complete, stopping loop
2021-03-24 15:22:37,899Z   main - INFO Target exited normally with status 0
2021-03-24 15:22:38,330Z   main - INFO [smart-services] event successfully sent to kafka: com.stitchdata.extractionJobFinished [1] at offset None
2021-03-24 15:22:38,331Z   main - INFO No tunnel subprocess to tear down
2021-03-24 15:22:38,331Z   main - INFO Exit status is: Discovery succeeded. Tap failed with code 1 and error message: "Expecting value: line 1 column 1 (char 0)". Target succeeded.
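The tap-side traceback shows response.json() failing inside client.py, which means Chargebee returned a non-JSON body (an empty response, a gateway error page, a rate-limit response) and the client decoded it unconditionally. A minimal sketch of a more defensive wrapper, assuming the requests library; the function name mirrors the traceback but the body is illustrative:

import requests

def make_request(url, params, auth):
    response = requests.get(url, params=params, auth=auth, timeout=300)
    # Surface HTTP errors (429, 5xx, ...) instead of parsing whatever body
    # came back; a retry decorator such as backoff can then re-attempt.
    response.raise_for_status()
    if "application/json" not in response.headers.get("Content-Type", ""):
        raise ValueError("Expected JSON from %s, got: %.200s"
                         % (url, response.text))
    return response.json()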

Credit Note Line Item Tax Rate Stored as Integer

I'm synchronizing to RDS.

In the invoices__line_item_taxes table, the tax_rate column is stored as a bigint.

This means that tax rates are truncated to the nearest whole percent (e.g., Quebec QST is truncated from 9.975 to 9).
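The truncation follows from the stream's JSON schema: a field typed as an integer becomes a bigint column, and fractional rates are cut off on load. A hedged sketch of the schema change such a fix would involve; the exact schema file in the tap may differ:

# Illustrative only. With {"type": ["null", "integer"]} the warehouse
# column becomes a bigint, so 9.975 loads as 9 (int(9.975) == 9).
# Typing the field as a JSON Schema "number" preserves fractional rates:
tax_rate_schema = {"type": ["null", "number"]}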

Draft invoices persist in the synced data warehouse despite being published

Chargebee invoices can be generated as drafts. In this case, their id is something like 'draft_inv_*', and they remain in that state until the invoice is published and sent to the customer. We have noticed cases where our 'invoices' table retains the draft invoice even after the proper invoice has been published and paid. We would expect the draft id to at least carry a "deleted"=True flag, but that's not the case.
So far we had been ignoring draft invoices in calculations, but now we want to create an operational check for invoices that are stuck in the draft state, which is how we noticed this.
We are using the Chargebee v1 catalog, with Stitch on a BigQuery project.

Is anyone still actively maintaining this repository?

Hi all,

I'm wondering whether this repository is still actively maintained. If it isn't, are there any plans to share the maintenance rights with someone else, so the community can keep improving this library? 😄

There are open issues and PRs that no one seems to have reacted to for over a year, which suggests there may not be anyone here to maintain it.

Among the open issues, for instance, tax_rate is stored as an integer even though it's a double. This makes the plugin unusable when you need access to US city taxes and US special taxes, which can be under 1%.

See the Chargebee docs:
https://apidocs.chargebee.com/docs/api/invoices?prod_cat_ver=2#invoice_line_item_taxes_tax_rate

The issue has been reported (#81) and a fix has been made (#80); however, it hasn't been reviewed or merged in over 1.5 years.

Thanks in advance! 🙏

Subscriptions not ordered correctly during historical load

The latest version inserted in the warehouse (Snowflake) is not the right one during a historical sync.
Chargebee's resource_version field is correctly incremented; however, when Stitch loads the data, the ordering is not correct.
I'm wondering if the tap uses the resource_version field during historical sync to correctly order the events?
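Whatever the tap emits, a consumer-side guard is to treat resource_version as the tiebreaker and keep only the highest value per subscription id. A minimal sketch in Python, assuming rows arrive as dicts carrying the fields Chargebee's API provides:

rows = [
    {"id": "sub_1", "resource_version": 1617000000100, "status": "active"},
    {"id": "sub_1", "resource_version": 1617000000200, "status": "cancelled"},
]

latest = {}
for row in rows:
    # Keep the row with the highest resource_version per subscription.
    seen = latest.get(row["id"])
    if seen is None or row["resource_version"] > seen["resource_version"]:
        latest[row["id"]] = row

assert latest["sub_1"]["status"] == "cancelled"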

Transactions/Subscriptions/Orders Data not coming through

We've been successful with Customers and Invoices, but the three schemas above aren't working. We get the following error for Orders, and are unsure whether the same error is affecting Subscriptions and Transactions:
tap - ERROR 400 Client Error: Bad Request for url: https://acloudguru-us.chargebee.com/api/v2/orders?include_deleted=True&limit=100&updated_at%5Bafter%5D=1532822400

It looks like the others aren't getting populated because the tap process stops and exits after this error.

Worth noting that I'm coming from the Stitch Beta, if that helps.

Did we stop importing the latest row?

Was there a recent change that would cause zero rows to be imported when there are no changes in the API?

We were always getting a single row from unchanged plans, coupons, and credit_notes, and were using that to confirm that the integration was working.

Recently we have been getting 0 rows for those types.

Stitch Integration: Missing Business Entity Details from Chargebee customer details

Hi,

We are using the Stitch integration to get data from Chargebee to Snowflake.
We need the "business_entity_id" attribute from the customer details; however, the warning message below appears in the logs and this attribute is not loaded into Snowflake.
WARNING Removed paths list: ['business_entity_id', 'cf_people_id', 'tax_providers_fields'].
Why is this attribute not copied over from Chargebee to Snowflake?
What should we do to get this attribute?
The Stitch team has redirected us to this channel. Attached is the email response from the Stitch team.
Your response is much appreciated.
Stitch Integration- Missing Business Entity Details from Chargebee customers details.pdf
Thank You!
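For context, the "Removed paths list" warning comes from the Singer transform step: before a record is emitted, it is filtered against the stream's catalog schema, and any property the schema does not declare is dropped. A rough sketch of that behavior (not the actual singer-python implementation):

def filter_to_schema(record, schema):
    # Drop top-level keys the catalog schema does not declare.
    allowed = schema.get("properties", {})
    removed = sorted(key for key in record if key not in allowed)
    if removed:
        print("WARNING Removed paths list: %s" % removed)
    return {key: record[key] for key in record if key in allowed}

record = {"id": "cust_1", "business_entity_id": "be_1"}
schema = {"properties": {"id": {"type": "string"}}}
filter_to_schema(record, schema)  # business_entity_id is dropped

So the attribute is missing because the deployed tap's customers schema does not declare business_entity_id; getting it through requires a tap version whose schema does.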

Endless 'Error persisting data to Stitch: 400' error

Hey everyone

I'm getting this 400 error on all extractions:

Error persisting data to Stitch: 400: {'error': 'Record 130 for table payment_sources did not conform to schema:\n#: #: no subschema matched out of the total 2 subschemas\n#: expected: null, found: JSONObject\n#/reference_id: #: no subschema matched out of the total 2 subschemas\n#/reference_id: expected: null, found: String\n#/reference_id: expected maxLength: 50, actual: 52\n'}

In the logs themselves I'm getting:

2021-04-24 04:30:34,093Z tap - WARNING Removed paths list: ['billing_address.object', 'billing_address.validation_status', 'cf_guid', 'cf_is_migrated', 'cf_password_hash', 'cf_verified', 'payment_method.gateway_account_id']

2021-04-24 04:30:34,114Z tap - WARNING Removed paths list: ['dunning_attempts', 'line_item_taxes.0.object', 'line_item_taxes.1.object', 'line_items.0.customer_id', 'line_items.1.customer_id', 'shipping_address.object', 'shipping_address.validation_status', 'taxes.0.object']

And lots more of the same error, just with different attributes.

Setup is CB-Singer/Stitch-BigQuery

I tried to set up a second connection and create fresh tables, but the same error persists even with a clean setup.

Update: could it be related to this? (singer-io/tap-xero#17)
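The error message itself names the conflict: the payment_sources schema caps reference_id at 50 characters, and some gateways issue longer ids (a later issue on this page reports Checkout.com reference_ids of length 61). A hedged sketch of the fragment involved and the obvious relaxation; the exact schema file in the tap may differ:

# As rejected: a 52-character value fails
#   "expected maxLength: 50, actual: 52"
# against {"type": ["null", "string"], "maxLength": 50}.
# Dropping the length cap accepts gateway-issued ids of any length:
reference_id_schema = {"type": ["null", "string"]}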

JSON Decode error

I am using tap-chargebee together with the PostgreSQL target. While retrieving the information I get a lot of warnings that I don't understand how to resolve, and when exporting the data to the target I get the JSONDecodeError below.

command:

tap-chargebee --config config.json --catalog catalog.json | ~/.virtualenvs/soclose-postgresql/bin/target-postgres --config target_config.json >> state.json

output:

WARNING Removed paths list: ['content.card', 'content.customer.cf_checkbox_confirm', 'content.customer.cf_instagram_username', 'content.customer.payment_method.gateway_account_id', 'webhooks']
WARNING Removed 8 paths during transforms:
	content.card
	content.customer.cf_checkbox_confirm
	content.customer.cf_instagram_username
	content.customer.payment_method.gateway_account_id
	content.invoice.dunning_attempts
	content.invoice.line_items.0.customer_id
	content.invoice.line_items.1.customer_id
	webhooks
WARNING Removed paths list: ['content.card', 'content.customer.cf_checkbox_confirm', 'content.customer.cf_instagram_username', 'content.customer.payment_method.gateway_account_id', 'content.invoice.dunning_attempts', 'content.invoice.line_items.0.customer_id', 'content.invoice.line_items.1.customer_id', 'webhooks']
WARNING Removed 4 paths during transforms:
	content.invoice.dunning_attempts
	content.invoice.line_items.0.customer_id
	content.invoice.line_items.1.customer_id
	webhooks
WARNING Removed paths list: ['content.invoice.dunning_attempts', 'content.invoice.line_items.0.customer_id', 'content.invoice.line_items.1.customer_id', 'webhooks']
WARNING Removed 8 paths during transforms:
	content.card
	content.customer.cf_checkbox_confirm
	content.customer.cf_instagram_username
	content.customer.payment_method.gateway_account_id
	content.invoice.dunning_attempts
	content.invoice.line_items.0.customer_id
	content.invoice.line_items.1.customer_id
	webhooks
WARNING Removed paths list: ['content.card', 'content.customer.cf_checkbox_confirm', 'content.customer.cf_instagram_username', 'content.customer.payment_method.gateway_account_id', 'content.invoice.dunning_attempts', 'content.invoice.line_items.0.customer_id', 'content.invoice.line_items.1.customer_id', 'webhooks']
INFO METRIC: {"type": "counter", "metric": "record_count", "value": 86, "tags": {"endpoint": "events"}}
INFO Final offset reached. Ending sync.
INFO Updating state.
INFO Updating state.
INFO Mapping: databasechangelog to None
INFO Mapping: databasechangeloglock to None
INFO Mapping: databasechangeloglock_pkey to None
INFO Mapping: core_user_id_seq to None
INFO Mapping: core_user_pkey to None
INFO Mapping: core_user_email_key to None
INFO Mapping: metabase_database_id_seq to None
INFO Mapping: metabase_database_pkey to None
INFO Mapping: metabase_table_id_seq to None
INFO Mapping: core_user to None
INFO Mapping: metabase_database to None
INFO Mapping: metabase_field_id_seq to None
INFO Mapping: metabase_field_pkey to None
INFO Mapping: metabase_table_pkey to None
INFO Mapping: idx_field_table_id to None
INFO Mapping: idx_table_db_id to None
INFO Mapping: metabase_field to None
INFO Mapping: metabase_table to None
INFO Mapping: metabase_fieldvalues_id_seq to None
INFO Mapping: metabase_fieldvalues_pkey to None
INFO Mapping: metabase_fieldvalues to None
INFO Mapping: idx_fieldvalues_field_id to None
INFO Mapping: report_card_id_seq to None
INFO Mapping: report_card_pkey to None
INFO Mapping: idx_card_creator_id to None
INFO Mapping: report_cardfavorite_id_seq to None
INFO Mapping: report_cardfavorite_pkey to None
INFO Mapping: report_cardfavorite to None
INFO Mapping: idx_unique_cardfavorite_card_id_owner_id to None
INFO Mapping: idx_cardfavorite_card_id to None
INFO Mapping: idx_cardfavorite_owner_id to None
INFO Mapping: report_card to None
INFO Mapping: report_dashboard_id_seq to None
INFO Mapping: report_dashboard_pkey to None
INFO Mapping: idx_dashboard_creator_id to None
INFO Mapping: report_dashboardcard_id_seq to None
INFO Mapping: report_dashboardcard_pkey to None
INFO Mapping: report_dashboardcard to None
INFO Mapping: idx_dashboardcard_card_id to None
INFO Mapping: idx_dashboardcard_dashboard_id to None
INFO Mapping: revision to None
INFO Mapping: core_session to None
INFO Mapping: idx_revision_model_model_id to None
INFO Mapping: core_session_pkey to None
INFO Mapping: setting to None
INFO Mapping: setting_pkey to None
INFO Mapping: revision_id_seq to None
INFO Mapping: activity_pkey to None
INFO Mapping: revision_pkey to None
INFO Mapping: activity_id_seq to None
INFO Mapping: activity to None
INFO Mapping: idx_activity_timestamp to None
INFO Mapping: idx_activity_user_id to None
INFO Mapping: idx_activity_custom_id to None
INFO Mapping: view_log_id_seq to None
INFO Mapping: view_log_pkey to None
INFO Mapping: view_log to None
INFO Mapping: idx_view_log_user_id to None
INFO Mapping: idx_view_log_timestamp to None
INFO Mapping: data_migrations to None
INFO Mapping: data_migrations_pkey to None
INFO Mapping: idx_data_migrations_id to None
INFO Mapping: pulse_id_seq to None
INFO Mapping: pulse_pkey to None
INFO Mapping: pulse to None
INFO Mapping: idx_pulse_creator_id to None
INFO Mapping: pulse_card_id_seq to None
INFO Mapping: pulse_card_pkey to None
INFO Mapping: pulse_card to None
INFO Mapping: idx_pulse_card_pulse_id to None
INFO Mapping: idx_pulse_card_card_id to None
INFO Mapping: pulse_channel_id_seq to None
INFO Mapping: pulse_channel_pkey to None
INFO Mapping: pulse_channel to None
INFO Mapping: idx_pulse_channel_pulse_id to None
INFO Mapping: idx_pulse_channel_schedule_type to None
INFO Mapping: pulse_channel_recipient_id_seq to None
INFO Mapping: pulse_channel_recipient_pkey to None
INFO Mapping: pulse_channel_recipient to None
INFO Mapping: segment_id_seq to None
INFO Mapping: dependency_pkey to None
INFO Mapping: segment_pkey to None
INFO Mapping: idx_segment_creator_id to None
INFO Mapping: idx_segment_table_id to None
INFO Mapping: dependency_id_seq to None
INFO Mapping: dependency to None
INFO Mapping: idx_dependency_model to None
INFO Mapping: idx_dependency_model_id to None
INFO Mapping: idx_dependency_dependent_on_model to None
INFO Mapping: idx_dependency_dependent_on_id to None
INFO Mapping: metric_id_seq to None
INFO Mapping: metric_pkey to None
INFO Mapping: idx_metric_creator_id to None
INFO Mapping: idx_metric_table_id to None
INFO Mapping: dashboardcard_series_id_seq to None
INFO Mapping: dashboardcard_series_pkey to None
INFO Mapping: dashboardcard_series to None
INFO Mapping: idx_dashboardcard_series_dashboardcard_id to None
INFO Mapping: idx_dashboardcard_series_card_id to None
INFO Mapping: label_id_seq to None
INFO Mapping: label_pkey to None
INFO Mapping: label_slug_key to None
INFO Mapping: idx_label_slug to None
INFO Mapping: card_label_id_seq to None
INFO Mapping: card_label_pkey to None
INFO Mapping: card_label to None
INFO Mapping: label to None
INFO Mapping: unique_card_label_card_id_label_id to None
INFO Mapping: idx_card_label_card_id to None
INFO Mapping: idx_card_label_label_id to None
INFO Mapping: idx_report_dashboard_show_in_getting_started to None
INFO Mapping: metric to None
INFO Mapping: idx_metric_show_in_getting_started to None
INFO Mapping: segment to None
INFO Mapping: idx_segment_show_in_getting_started to None
INFO Mapping: metric_important_field_id_seq to None
INFO Mapping: metric_important_field_pkey to None
INFO Mapping: metric_important_field to None
INFO Mapping: unique_metric_important_field_metric_id_field_id to None
INFO Mapping: idx_metric_important_field_metric_id to None
INFO Mapping: idx_metric_important_field_field_id to None
INFO Mapping: permissions_group_id_seq to None
INFO Mapping: permissions_group_pkey to None
INFO Mapping: unique_permissions_group_name to None
INFO Mapping: idx_permissions_group_name to None
INFO Mapping: permissions_group_membership_id_seq to None
INFO Mapping: permissions_group_membership_pkey to None
INFO Mapping: permissions_group to None
INFO Mapping: permissions_group_membership to None
INFO Mapping: unique_permissions_group_membership_user_id_group_id to None
INFO Mapping: idx_permissions_group_membership_group_id to None
INFO Mapping: idx_permissions_group_membership_user_id to None
INFO Mapping: idx_permissions_group_membership_group_id_user_id to None
INFO Mapping: permissions_id_seq to None
INFO Mapping: permissions_pkey to None
INFO Mapping: permissions to None
INFO Mapping: idx_permissions_group_id to None
INFO Mapping: idx_permissions_object to None
INFO Mapping: idx_permissions_group_id_object to None
INFO Mapping: permissions_group_id_object_key to None
INFO Mapping: permissions_revision_id_seq to None
INFO Mapping: idx_metabase_table_show_in_getting_started to None
INFO Mapping: idx_metabase_table_db_id_schema to None
INFO Mapping: collection_id_seq to None
INFO Mapping: permissions_revision_pkey to None
ERROR Exception writing records
Traceback (most recent call last):
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/postgres.py", line 236, in write_batch
    self.setup_table_mapping_cache(cur)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/postgres.py", line 223, in setup_table_mapping_cache
    table_path = json.loads(raw_json).get('path', None)
  File "/usr/local/Cellar/[email protected]/3.9.5/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/[email protected]/3.9.5/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/Cellar/[email protected]/3.9.5/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
CRITICAL ('Exception writing records', JSONDecodeError('Expecting value: line 1 column 1 (char 0)'))
Traceback (most recent call last):
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/postgres.py", line 236, in write_batch
    self.setup_table_mapping_cache(cur)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/postgres.py", line 223, in setup_table_mapping_cache
    table_path = json.loads(raw_json).get('path', None)
  File "/usr/local/Cellar/[email protected]/3.9.5/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/[email protected]/3.9.5/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/Cellar/[email protected]/3.9.5/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/bin/target-postgres", line 8, in <module>
    sys.exit(cli())
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/__init__.py", line 45, in cli
    main(args.config)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/__init__.py", line 39, in main
    target_tools.main(postgres_target)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/target_tools.py", line 28, in main
    stream_to_target(input_stream, target, config=config)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/target_tools.py", line 77, in stream_to_target
    raise e
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/target_tools.py", line 70, in stream_to_target
    state_tracker.flush_streams(force=True)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/stream_tracker.py", line 47, in flush_streams
    self._write_batch_and_update_watermarks(stream)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/stream_tracker.py", line 67, in _write_batch_and_update_watermarks
    self.target.write_batch(stream_buffer)
  File "/Users/Nacho/.virtualenvs/soclose-postgresql/lib/python3.9/site-packages/target_postgres/postgres.py", line 309, in write_batch
    raise PostgresError(message, ex)
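The target-side traceback is the real clue here: setup_table_mapping_cache appears to read the JSON metadata that target-postgres keeps in Postgres table comments, calling json.loads on every comment it finds in the schema. The long "Mapping: ... to None" list shows it scanning Metabase's tables in the same database, so a comment on any of those objects that is not valid JSON is enough to crash the sync. A guarded variant of the failing line (illustrative, not the actual target-postgres code); pointing the target at a dedicated schema sidesteps the problem entirely:

import json

def parse_table_path(raw_json):
    # Comments written by other tools (Metabase, DBAs, ...) may not be
    # JSON at all; treat those as "no mapping" instead of crashing.
    if raw_json is None:
        return None
    try:
        return json.loads(raw_json).get("path", None)
    except json.JSONDecodeError:
        return None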

No synchronisation for "orders" endpoint

I'm trying to sync the "orders" endpoint to BigQuery. I've been successful with all other endpoints except this one.
The log doesn't show any error, even though calling subdomain.chargebee.com/api/v2/orders from Postman returns all my orders.
Does anyone know how to solve this?

Checkout.com integration issue

Checkout.com payment sources have a reference_id of length 61, which breaks the tap because of schema limits.

Kindly have a look at the PR with a fix for this: #27

Missing Addons imported to Redshift

I have the Chargebee integration running through Stitch. I just noticed that the addons imported to Redshift do not include the full list available in Chargebee: I have 10 addons in the warehouse and 30 addons in Chargebee. What is the potential root cause of this issue?

Deleted invoices since sync

Hi,
If include_deleted was false, but at the time of a sync the invoice was 'active' and it was deleted afterwards, what happens?

Does the invoice get updated via the tap in whatever destination was configured, to mark it as deleted?
Or do we need include_deleted set to true for updates to occur?
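For reference, the tap's list requests elsewhere in these logs already pass the flag (see the orders URL above: include_deleted=True&limit=100&updated_at[after]=...). With it enabled, a deleted invoice should come through on a later incremental run flagged as deleted; with it disabled, the destination simply keeps the last state it saw before deletion. A sketch of the request shape, assuming the requests library and Chargebee's basic auth (API key as the username):

import requests

response = requests.get(
    "https://<site>.chargebee.com/api/v2/invoices",
    params={
        "include_deleted": "true",
        "limit": 100,
        "updated_at[after]": 1532822400,  # bookmark as a Unix timestamp
    },
    auth=("<api_key>", ""),  # Chargebee takes the key as the basic-auth user
)
invoices = response.json()["list"]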

tiers:price attribute missing from addons integration

I am seeing a consistent error within the 'tiers' attribute of the addon object. Specifically, the attribute (which is itself JSON) is missing the 'price' attribute. This is the case for all relevant addons with 'tiered', 'volume', and 'stairstep' pricing.

Stitch Transactions & Invoices to Redshift come over as several tables with no keys back to each other

Transaction data is coming over as four separate tables: transactions plus transactions_linked_credits, transactions_linked_invoices, and transactions_linked_refunds. This isn't allowing me to link them back to invoices or to subscriptions:

[Screenshot: Redshift table list, 2021-02-17]

You can see that there is a transactions_linked_invoices table; however, it contains no transaction id for me to link back to transactions, only invoice references. Has anyone else experienced this using Chargebee, Stitch & Redshift?

Thanks

Invoice status not syncing correctly via Stitch

I'm currently using Stitch as our data extractor for Chargebee, and I noticed that the status of some invoices may not be showing correctly.

This invoice data is extracted via Stitch (with this integration):
https://drive.google.com/file/d/12CfnATGmurQt5kGApyToKieiW0CQljz6/view?usp=sharing

The exact same invoice on Chargebee:
https://drive.google.com/file/d/12CfnATGmurQt5kGApyToKieiW0CQljz6/view?usp=sharing

You can see that the invoice status in Chargebee shows "voided", but via Stitch it's still "not paid". While the majority of the invoices are syncing correctly, a good number of invoices still behave like the example above, which can be quite troublesome.

I'm pretty sure this is a bug; I'm just not sure how to fix it.

Coupons Not Aligned with Subscriptions

My coupons are not being properly attributed to subscriptions in Stitch. In the majority of cases where a coupon should appear in the coupon column of the subscriptions table, it shows up as null. The rest of the record is present and accurate; only this field is incorrect.

Is there some setting or change needed in Stitch to get all the values to appear?

Events Schema Blocks Negative Round Off Values

We have negative round-off values for some of our Chargebee event-based credit notes. For example, rounding from 13.10 to 13.00 results in a -0.10 round-off, which is represented as -10 in the event data, resulting in the following error:

Failed validating 'minimum' in schema['properties']['content']['properties']['credit_note']['properties']['round_off_amount']:
    {'minimum': 0, 'type': ['null', 'integer']}

On instance['content']['credit_note']['round_off_amount']:
    -10
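The minimum: 0 constraint in the events schema is what rejects the record; round-off amounts are legitimately negative when rounding down. A minimal reproduction with the jsonschema package, together with the relaxed schema that accepts the value (illustrative, not the tap's actual schema file):

from jsonschema import ValidationError, validate

strict = {"minimum": 0, "type": ["null", "integer"]}
relaxed = {"type": ["null", "integer"]}  # drop the minimum constraint

try:
    validate(instance=-10, schema=strict)
except ValidationError as err:
    print(err.message)  # -10 is less than the minimum of 0

validate(instance=-10, schema=relaxed)  # passes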

Not All Events Synched Over

There are instances where a subscription has events in Chargebee, yet some of those events do not get brought over into the events table in Redshift.

Is there a way to identify why these records were not picked up and sent to Redshift?

Missing Item Price custom fields

Disclaimer: I'm not a dev 😅

Hello, I noticed Chargebee changed the way they structure custom fields on their product catalog v2.

Previously these were structured as they currently are on "customers", i.e. stored in a custom_fields object on Plans.
On PC v2, it looks like these are directly available as fields on the Item Price.

Is there a way to support this new behavior in the tap?
Thanks!

No updates for subscriptions & transactions

We have not seen updates to our Chargebee subscription & transaction data since 2019-06-28, despite the fact that there have been new subscriptions & transactions since then. Looking at the logs, we can see that the bookmark_date for these tables is not updated:

2019-07-31 10:48:10,285Z main - INFO State update: adding bookmarks.subscriptions.bookmark_date = "2019-06-28T16:38:07Z"
2019-07-31 10:48:10,285Z main - INFO State update: adding bookmarks.transactions.bookmark_date = "2019-06-28T16:38:00Z"

How can we fix this?

Found a difference between data arriving in Snowflake/Bigquery vs Postgres

Hi,

I was migrating to Snowflake from a Postgres-based analytics setup and found a difference.
The Postgres data is correct, after checking in Chargebee.
I thought this might be a Snowflake error, so I created another Stitch account and sent my data to BigQuery.
However, there I have exactly the same data as in Snowflake.

I'm talking about the customer table. I don't know how often it happens, but here is an example record for which I made a field-by-field comparison: https://docs.google.com/spreadsheets/d/19pjWtwEjfOY8s0gJdVEa1gBwopclo8CI68xJpURX_xM/edit#gid=0

Strangely, for Snowflake/BigQuery the updated_at field equals the created_at field, even though that customer has received many changes since its creation date. All fields seem to correspond to the state they had at creation time, while in Postgres the updated_at field is this month and the other fields match what I find in Chargebee. Another strange thing is that the DELETED field is set to TRUE in the Snowflake/BigQuery version, while I cannot find any customer_deleted event in the customer's event history.

If you have any idea what's wrong, let me know.

Florian

Chargebee-Stitch Integration Issue

I'm having an issue with the Chargebee-Stitch integration in my database. I dropped a table and then reloaded it, filtering out some of the columns, but now I have only 1 row of data in the table that I dropped.

How can I solve this issue?

Chargebee Upsell & Downgrade Metrics

Hi,

We are trying to retrieve upsell & downgrade data via the Stitch integration. Chargebee & Stitch first referred us to the events table, but we are running into some limitations: how do we retrieve the MRR value of the subscription changes and the type of change (addons, coupons, renewal frequency)? Did anyone run into this issue as well? Were you able to solve it?

Thanks in advance.
Jonas

Missing Addons into BQ

Hi, I'm transferring data from Chargebee to BigQuery, and all tables are importing correctly except for Addons. It's only importing 1 row, while in CB there are around 30. I would appreciate any help.
