
domo-python-sdk's People

Contributors

aldifahrezi, bradleyhurley, brandontysonmoss, gurfuzle, jboliv01, jeremydmorris, ldacey, mijns, murphysean, nareshkp, rswingler, samrands, statianzo, swenseng-domo, thattolleyguy, yunruiwatts


domo-python-sdk's Issues

No matching distribution found for pydomo

Hello,

I'm trying to install pydomo on Debian 9 Stretch.

Command:

$ sudo pip3 install pydomo

Output:

Collecting pydomo
  Could not find a version that satisfies the requirement pydomo (from versions: )
No matching distribution found for pydomo

Stream upload_part question.

Should the following example from pydomo.streams raise an error when an upload fails (e.g. the server refuses the connection)? I only see the problem when multi-threading uploads. My test streams arrive short of data, yet no exceptions surface.

execution = streams.upload_part(stream.id, execution.id, part, csv)

If this is something that should be taken up with the Support team, let me know.

Phillip Irwin
Bass Pro Shops
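One thing worth checking in the multi-threaded case: concurrent.futures only surfaces a worker's exception when future.result() is called, so a fire-and-forget pool can silently drop failed parts. A minimal sketch of collecting failures explicitly — upload_part here is a stub standing in for streams.upload_part, not the SDK call:

```python
from concurrent.futures import ThreadPoolExecutor

def upload_part(part_num, csv):
    # Stub for streams.upload_part; fails for a None payload to
    # simulate a refused connection.
    if csv is None:
        raise ConnectionError("server refused connection for part %d" % part_num)
    return part_num

def upload_all(parts):
    failures = []
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(upload_part, n, c): n for n, c in parts}
        for fut, n in futures.items():
            try:
                fut.result()  # re-raises any exception from the worker thread
            except Exception:
                failures.append(n)
    return failures

failed = upload_all([(1, "a,b"), (2, None), (3, "c,d")])
print(failed)
```

If the workers' results are never collected like this, a short stream with no visible exception is exactly the symptom you would see.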

Domo won't accept schema stored as a variable

I have a script that loops through all the files in a directory, discovers their schema, creates the dataset in Domo, and pushes the data to Domo. However, Domo's API won't accept my schema when it is passed as a variable. If I print the variable and paste the printed result into dsr.schema, it works fine. Can this be changed so that a schema held in a variable is accepted?
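A schema built in a variable should be interchangeable with a pasted literal; when it is not, the element types usually differ (e.g. plain tuples or dicts where the SDK expects Column objects). A runnable sketch with the SDK's Column type stubbed out:

```python
# Column is a stub mirroring the shape of pydomo's schema column
# (a type string plus a name), so this runs without the SDK installed.
class Column:
    def __init__(self, type_, name):
        self.type, self.name = type_, name

def discover_schema(header, types):
    # Build the schema programmatically, as the looping script would.
    return [Column(t, n) for n, t in zip(header, types)]

schema = discover_schema(["id", "amount"], ["LONG", "DECIMAL"])
print([(c.type, c.name) for c in schema])
```

If printing the variable and pasting it works but assigning the variable does not, comparing the type of each element against what the working literal produces is the first thing to check.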

DataSetClient references self.log and not self.logger

DataSetClient inherits from DomoAPIClient, which initializes self.logger, but it uses self.log, which raises an AttributeError, as seen in this traceback:

Traceback (most recent call last):
File "test/cli.py", line 167, in
domo.load_datasets()
File "/Users/test/domo_target.py", line 111, in load_datasets
csv = self.domo.datasets.data_export(d['id'], True)
File "/Users/test/.virtualenvs/test/lib/python3.6/site-packages/pydomo/datasets/DataSetClient.py", line 115, in data_export
self.log.debug("Error downloading data from DataSet: " + self.transport.dump_response(response))
AttributeError: 'DataSetClient' object has no attribute 'log'
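A sketch of the reported bug and a runtime workaround: the parent class sets `self.logger`, but the method body references `self.log`; aliasing one to the other on the instance avoids the AttributeError until the attribute name is fixed in the SDK. These classes are stand-ins, not the real pydomo code:

```python
import logging

class DomoAPIClient:
    def __init__(self):
        self.logger = logging.getLogger("pydomo")  # what the parent sets

class DataSetClient(DomoAPIClient):
    def data_export(self):
        self.log.debug("Error downloading data")   # bug: attribute is `logger`
        return "csv"

client = DataSetClient()
client.log = client.logger   # workaround: alias the misnamed attribute
print(client.data_export())
```

Without the alias line, calling data_export() raises the same AttributeError shown in the traceback above.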

Integer Overflow Error on String Field

If you have a dataset in Domo with a field of type string that contains numeric data with values larger than your system's signed integer maximum (i.e. some sort of 12+ digit numeric id value), the ds_get function will throw an integer overflow error.

This occurs because the Domo API exports this field in the csv with quotation marks to denote that it is a string, not a number, causing the pandas read_csv function to create the field with a dtype of "object". The ds_get function then (on line 195) attempts to interpret all object type fields as dates. Then, down the stack, the Pandas datetime parser tries to read the entire string as an integer. Because this happens in Cython, it throws an overflow error.

Here's the stack trace (minus my code):

[ERROR] OverflowError: signed integer is greater than maximum
Traceback (most recent call last):
File "/var/task/pydomo/__init__.py", line 195, in ds_get
df[col] = to_datetime(df[col])
File "/var/task/pandas/core/tools/datetimes.py", line 805, in to_datetime
values = convert_listlike(arg._values, format)
File "/var/task/pandas/core/tools/datetimes.py", line 472, in _convert_listlike_datetimes
allow_object=True,
File "/var/task/pandas/core/arrays/datetimes.py", line 2081, in objects_to_datetime64ns
require_iso8601=require_iso8601,
File "pandas/_libs/tslib.pyx", line 364, in pandas._libs.tslib.array_to_datetime
File "pandas/_libs/tslib.pyx", line 591, in pandas._libs.tslib.array_to_datetime
File "pandas/_libs/tslib.pyx", line 726, in pandas._libs.tslib.array_to_datetime_object
File "pandas/_libs/tslib.pyx", line 717, in pandas._libs.tslib.array_to_datetime_object
File "pandas/_libs/tslibs/parsing.pyx", line 243, in pandas._libs.tslibs.parsing.parse_datetime_string
File "/var/task/dateutil/parser/_parser.py", line 1374, in parse
return DEFAULTPARSER.parse(timestr, **kwargs)
File "/var/task/dateutil/parser/_parser.py", line 655, in parse
ret = self._build_naive(res, default)
File "/var/task/dateutil/parser/_parser.py", line 1241, in _build_naive
naive = default.replace(**repl)
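A defensive version of the date-inference step described above would probe one value before converting the whole column, and keep the column as strings when the probe fails. This is a pure-Python stand-in for what ds_get does with pandas.to_datetime, not the SDK's code:

```python
from datetime import datetime

def maybe_dates(values, fmt="%Y-%m-%d"):
    # Cheap probe on a single value before touching the whole column.
    try:
        datetime.strptime(values[0], fmt)
    except (ValueError, OverflowError):
        return values                        # not dates; leave as strings
    return [datetime.strptime(v, fmt) for v in values]

ids = ["123456789012345", "999999999999999"]  # 15-digit numeric id strings
print(maybe_dates(ids) is ids)                 # column left untouched
```

The same idea applied inside ds_get (attempt conversion only when a sample parses, or catch OverflowError alongside the usual parse errors) would avoid crashing on long numeric-id string columns.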

Incremental refreshes?

I am using this approach to append data with Workbench:

SELECT *
FROM schema.view
WHERE row_key > '!{lastvalue:row_key}!'

Is there a way I can do something similar using the API? Sometimes I do not want to replace a dataset which has millions of rows, but appending can be challenging.

Also, I assume that there is no way to upsert data, right? Sometimes I have data which eventually changes so I can't append it, but the dataset might be 10 million rows, so replacing the whole thing is wasteful.
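A client-side analogue of the Workbench query above is to remember the highest row_key already uploaded and send only newer rows as an append (the Stream API's APPEND update method covers the upload side). The upload itself is omitted here; this sketch shows only the filtering step:

```python
def rows_to_append(rows, last_key):
    # Equivalent of WHERE row_key > '!{lastvalue:row_key}!' done locally.
    return [r for r in rows if r["row_key"] > last_key]

rows = [{"row_key": 1}, {"row_key": 2}, {"row_key": 3}]
new_rows = rows_to_append(rows, last_key=1)
print(len(new_rows))
```

The last_key value would need to be persisted between runs (a small state file or table), since the API has no built-in equivalent of Workbench's lastvalue token.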

Unable to delete user using python api

I have a script that deletes users programmatically. But when I try to delete a user, it says "this user owns datasets - please reassign the datasets before deleting", which is not true. The user's role is Participant, and these users do not own any datasets. Can we fix this, please, ASAP?

ds_query returning status 400

I am having an issue with ds_query. When running the following query, I get an error (status code 400).

Query:
data_set_id = '**********'
query = {"sql": "SELECT * FROM table LIMIT 2"}
ds = domo.ds_query(data_set_id, query)

Error:
{"status":400, "statusReason":"BadRequest","message":"Cannot deserialize instance of java.lang.String out of START_OBJECT token"}.

ds_get works fine and returns data as expected.

The max number of users to return in the list is 500

Limiting the max number of users to return in the list to 500 is very inconvenient. We are planning to automate some of our user management through Python, and we have more than 800 users, so this makes it very difficult to fully automate.

Are you guys planning on increasing this in the future?
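In the meantime, the cap can be worked around by paging with limit/offset until a short page comes back. `fetch` below is a stub for the list call; pydomo's users.list takes limit and offset parameters, though that is worth confirming against the SDK version in use:

```python
def fetch(limit, offset, _all=tuple(range(1234))):
    # Stub for domo.users.list(limit, offset) over 1234 fake users.
    return list(_all[offset:offset + limit])

def list_all(limit=500):
    users, offset = [], 0
    while True:
        page = fetch(limit, offset)
        users.extend(page)
        if len(page) < limit:   # short page means we've reached the end
            return users
        offset += limit

print(len(list_all()))
```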

Publication Groups

Is there a way to manage publication groups?

We need to manage the following:

  • Pages
  • Data Sets
  • Access Groups

Command "python setup.py egg_info" failed

Hello,

I installed Python 3 and tried to download the pydomo package with pip but got the following error.

$ pip install pydomo --no-cache-dir
Collecting pydomo
  Downloading pydomo-0.2.2.1.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "c:\cygwin64\tmp\pip-build-2gkndm\pydomo\setup.py", line 13, in <module>
        except FileNotFoundError as fnfe:
    NameError: name 'FileNotFoundError' is not defined

    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in c:\cygwin64\tmp\pip-build-2gkndm\pydomo\

pydomo get error

Good day. I'm using the datasets.data_import api to upload records into a Domo dataset. When I ran the update, it appeared to run successfully, but the dataset was not changed. How can I trap the error from datasets.data_import?

Thanks in advance for your attention.

ds_create not uploading the full dataset

Hi (again), ds_create is working but it is not uploading the full data frame. There are 974,007 rows in my dataframe, and ds_create only uploads about 940K rows to Domo. When I add an extra column, even fewer rows are uploaded. The size of the uploaded data is always around 130 MB. When I tried uploading a smaller dataset of about 400 KB, all data was uploaded successfully. Is there a limit on the size of the dataframe that can be uploaded?

Thanks!

Kind regards
Heidi

Enable streaming to disk for "data_export_to_file()"

Currently, pydomo pulls csv http responses into memory, which can be arbitrarily large. Improve the data export functions to write csv data directly to disk, avoiding a memory error on large files.

The "data_export_to_file()" function was tested by downloading a 1.82GB csv file, featuring 12.9M rows and 14 columns. The download completed successfully multiple times in a row. Despite this, further defenses can be made against memory errors with large files.
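The streaming approach described above can be sketched with a stubbed response object; with the real requests library this corresponds to passing stream=True and iterating Response.iter_content so peak memory stays near the chunk size rather than the file size:

```python
import os
import tempfile

class FakeResponse:
    """Stub with a requests-style iter_content interface (4000 bytes of CSV)."""
    def iter_content(self, chunk_size):
        data = b"a,b\n" * 1000
        for i in range(0, len(data), chunk_size):
            yield data[i:i + chunk_size]

def export_to_file(response, path, chunk_size=8192):
    # Write each chunk as it arrives instead of buffering the whole body.
    with open(path, "wb") as f:
        for chunk in response.iter_content(chunk_size):
            f.write(chunk)
    return os.path.getsize(path)

path = os.path.join(tempfile.gettempdir(), "export_sketch.csv")
size = export_to_file(FakeResponse(), path)
print(size)
```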

Example script fails on export

I've been using DOMO for 6 months now and I want to learn python so I'm making this my first project. When I run the example script (with my api credentials), it successfully creates a dataset in my instance and updates it, but seems to fail on export. Since I just started with Python, I'm not sure what the problem is.
run data.txt

Create a Dataset that will Append

Hey there,

I know in the R documentation you can create a dataset and set the update method to 'append'. I don't see that functionality in the Python SDK. Is there a workaround for this?

Thanks!
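One avenue to check: the Stream API (rather than the plain DataSet create call) takes an update method, which is how append-style loads are configured in the other SDKs. This sketch only builds the request body shape; the field names follow the public Stream API docs, but verify them before relying on this:

```python
def make_stream_request(name, columns, update_method="APPEND"):
    # Shape of a CreateStreamRequest: a dataset definition plus an
    # updateMethod of APPEND or REPLACE (assumed from the Stream API docs).
    assert update_method in ("APPEND", "REPLACE")
    return {
        "dataSet": {
            "name": name,
            "schema": {"columns": [{"type": t, "name": n} for n, t in columns]},
        },
        "updateMethod": update_method,
    }

req = make_stream_request("Leads", [("id", "LONG"), ("amount", "DOUBLE")])
print(req["updateMethod"])
```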

run_examples example: pages link in docstring leads to 404

https://github.com/domoinc/domo-python-sdk/blob/master/run_examples.py

In the docstring of function pages:

def pages(self, domo):
'''Page Docs: https://developer.domo.com/docs/domo-apis/pages
'''

This link when put into the browser leads to a 404.

Curl example:

$ curl -I https://developer.domo.com/docs/domo-apis/pages
HTTP/2 404
server: nginx/1.13.7
date: Mon, 21 May 2018 15:10:49 GMT
content-type: text/html; charset=UTF-8
vary: Accept-Encoding
set-cookie: SESS10694d8e4f0d8a48662661b2ab8039f7=7d51703215357c3d4dd8d64e42577ec8; expires=Mon, 21-May-2018 15:10:48 GMT; Max-Age=0
set-cookie: SSESS10694d8e4f0d8a48662661b2ab8039f7=5908eefd674b6cd0c7242ea43b234b03; path=/; domain=.developer.domo.com; secure; HttpOnly
expires: Wed, 11 Jan 1984 05:00:00 GMT
cache-control: no-cache, must-revalidate, max-age=0
link: https://developer.domo.com/wp-json/; rel="https://api.w.org/"
strict-transport-security: max-age=15724800; includeSubDomains;

Thanks!

Delete a specific Execution ID

Would it be feasible to add an API feature to delete a specific Execution ID? I can list executions and abort specific ones, but I can only delete one through the web interface (on the history page, though it is not labeled).

In certain cases I would prefer to use 'append' datasets instead of 'replace'. My workaround is to send parts over and then commit the execution once the parts have loaded, but this is wasteful when I only need to replace a few calendar dates of data rather than entire months or a year.

[{'createdAt': '2018-05-11T18:26:28Z',
  'currentState': 'SUCCESS',
  'endedAt': '2018-05-11T18:26:37Z',
  'id': 1,
  'modifiedAt': '2018-05-11T18:26:37Z',
  'startedAt': '2018-05-11T18:26:28Z',
  'updateMethod': 'APPEND'},
 {'createdAt': '2018-05-11T18:26:44Z',
  'currentState': 'SUCCESS',
  'endedAt': '2018-05-11T18:26:54Z',
  'id': 2,
  'modifiedAt': '2018-05-11T18:26:54Z',
  'startedAt': '2018-05-11T18:26:44Z',
  'updateMethod': 'APPEND'},

I think that it would be helpful to associate each Execution ID with a calendar date in my workflow. If I rerun tasks for that date, I would be able to delete the associated Execution ID, and then replace the data for just that date as a new execution.

The import has been cancelled because a new request has been submitted.

Hello,

I am trying to upload datasets to Domo, but the upload fails while the dataset on Domo shows "Storing", and it keeps showing that continuously. If I execute the program again, it fails as well, and the dataset history shows "The import has been cancelled because a new request has been submitted", with the current upload again in the "Storing" state.

Customer's voice on improving examples

Today is my first day using Domo's Python SDK. My goal is to pull data from Domo so that I can analyze it using Python and pandas.

I installed it and set up the client ID and secret easily. However, it is not easy to work out how to pull data from Domo. For example, how do I retrieve a list of the datasets available to me?

It would be great if the code examples clearly stated which user-facing problem each one solves. Thank you!
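A sketch of the asked-for starting point — listing the datasets you can see. The client is a stub here; with a real connection, pydomo exposes a generator via domo.datasets.list() with much this shape (treat the exact method name as something to confirm against the SDK docs):

```python
class FakeClient:
    """Stub standing in for domo.datasets, yielding dataset records."""
    def list(self):
        yield {"id": "a1", "name": "Sales", "rows": 10}
        yield {"id": "b2", "name": "Leads", "rows": 5}

def dataset_index(datasets):
    # Reduce the listing to an id -> name lookup, a common first step
    # before picking a dataset id to export for analysis.
    return {d["id"]: d["name"] for d in datasets}

index = dataset_index(FakeClient().list())
print(index)
```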

Encoding Issues

I am receiving some encoding issues on certain characters when using the Python API (which is awesome by the way, thanks so much!), but the same data sent through workbench does not have any errors.

datablock = datablock.encode("iso-8859-1")
UnicodeEncodeError: 'latin-1' codec can't encode character '\u0160' in position 5104: ordinal not in range(256)

Some other characters that did not work are '\u2013', '\u2019', and '\u2018'. In each case, my workaround was to loop through my dimension columns and replace the characters with blanks or something similar.

df[column].str.encode('utf-8')
Do some replacements, and then
df[column].str.decode('latin-1')

Mainly raising this as an issue just because Workbench seems to handle it fine and the data gets imported to Domo with no errors.
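A gentler workaround than blanking characters is to encode with errors="replace", so unmappable characters (Š, en dashes, curly quotes) degrade to '?' instead of raising. Whether keeping the data in UTF-8 end to end avoids the problem entirely is worth testing, since Workbench evidently handles these characters:

```python
# \u0160 is the Š that triggered the reported UnicodeEncodeError;
# \u2013 is one of the en dashes that also failed.
text = "\u0160koda \u2013 2019"
safe = text.encode("iso-8859-1", errors="replace").decode("iso-8859-1")
print(safe)
```

This keeps row counts and column positions intact, at the cost of lossy '?' substitutions in the affected fields.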

UTF bomb when using pydomo's Domo constructor

Trying to use a specific domo instance (e.g. acme.domo.com)

client_id = 'XXXX'
api_host = 'XXXX.domo.com'
client_secret = 'XXXXXXXXXXXXXX'
(redacted the client id and client secret)

domo = Domo(client_id, client_secret, logger_name='foo', log_level=logging.INFO, api_host=api_host)

Gives back this output:

---------------------------------------------------------------------------
JSONDecodeError                           Traceback (most recent call last)
<ipython-input-17-f0743cd73e81> in <module>()
      5 
      6 # equivelent of DomoR::init('customer', 'token') i think
----> 7 domo = Domo(client_id, client_secret, logger_name='foo', log_level=logging.INFO, api_host=api_host)

/usr/local/lib/python3.6/site-packages/pydomo/__init__.py in __init__(self, client_id, client_secret, api_host, **kwargs)
     74         self.logger.debug("\n" + DOMO + "\n")
     75 
---> 76         self.transport = DomoAPITransport(client_id, client_secret, api_host, kwargs.get('use_https', True), self.logger)
     77         self.datasets = DataSetClient(self.transport, self.logger)
     78         self.groups = GroupClient(self.transport, self.logger)

/usr/local/lib/python3.6/site-packages/pydomo/Transport.py in __init__(self, client_id, client_secret, api_host, use_https, logger)
     19         self.clientSecret = client_secret
     20         self.logger = logger
---> 21         self._renew_access_token()
     22 
     23     @staticmethod

/usr/local/lib/python3.6/site-packages/pydomo/Transport.py in _renew_access_token(self)
     78         response = requests.post(url=url, auth=HTTPBasicAuth(self.clientId, self.clientSecret))
     79         if response.status_code == requests.codes.OK:
---> 80             self.access_token = response.json()['access_token']
     81         else:
     82             self.logger.debug('Error retrieving access token: ' + self.dump_response(response))

/usr/local/lib/python3.6/site-packages/requests/models.py in json(self, **kwargs)
    890                     # used.
    891                     pass
--> 892         return complexjson.loads(self.text, **kwargs)
    893 
    894     @property

/usr/local/lib/python3.6/site-packages/simplejson/__init__.py in loads(s, encoding, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, use_decimal, **kw)
    516             parse_constant is None and object_pairs_hook is None
    517             and not use_decimal and not kw):
--> 518         return _default_decoder.decode(s)
    519     if cls is None:
    520         cls = JSONDecoder

/usr/local/lib/python3.6/site-packages/simplejson/decoder.py in decode(self, s, _w, _PY3)
    368         if _PY3 and isinstance(s, binary_type):
    369             s = s.decode(self.encoding)
--> 370         obj, end = self.raw_decode(s)
    371         end = _w(s, end).end()
    372         if end != len(s):

/usr/local/lib/python3.6/site-packages/simplejson/decoder.py in raw_decode(self, s, idx, _w, _PY3)
    398             elif ord0 == 0xef and s[idx:idx + 3] == '\xef\xbb\xbf':
    399                 idx += 3
--> 400         return self.scan_once(s, idx=_w(s, idx).end())

JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Which I think is a UTF-8 decoding issue.

It is interesting to note a couple of things.

First, this version of the code:

client_id = 'XXXX'
api_host = 'api.domo.com'
client_secret = 'XXXXXXXXXXXXXXXXXXX'

domo = Domo(client_id, client_secret, logger_name='foo', log_level=logging.INFO, api_host=api_host)

Will return

Exception: Error retrieving a Domo API Access Token: {"status":401,"statusReason":"Unauthorized","path":"/oauth/token","message":"Bad credentials","toe":"M6OEHSGBBH-S6C2Q-AGM3V"}

So this case is not suffering from a json decoding error.

The other is that, when using the correct credentials with the domoR package:

DomoR::init( . <client_id> , 'XXXXXXXXX')
DomoR::fetch(...)

this code works. domoR uses a v2 version of the API (/api/data/v2/datasources/), whereas pydomo uses URL_BASE = '/v1/datasets'.

I was also wondering why there are different endpoints being used.
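Consistent with the two observations above, pointing api_host at an instance host (acme.domo.com) likely returns an HTML page rather than JSON from /oauth/token, while api.domo.com returns real JSON errors. A sketch of checking the response's content type before calling .json() — FakeResponse stands in for a requests response object:

```python
class FakeResponse:
    """Stub mimicking an HTML page coming back from a misconfigured host."""
    headers = {"Content-Type": "text/html; charset=UTF-8"}
    text = "<html>login page</html>"

def extract_token(response):
    # Guard before response.json(): non-JSON means a configuration
    # problem (wrong api_host), not a credentials problem.
    if "application/json" not in response.headers.get("Content-Type", ""):
        return None
    return "access-token"

print(extract_token(FakeResponse()))
```

With a guard like this, the failure would surface as "token endpoint returned non-JSON (check api_host)" instead of an opaque JSONDecodeError.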

Exception when creating pages with API

Not critical, but I was attempting to make some pages with the API and I was not successful. I noticed that there were no imports listed on the example page as well, so I attempted:

from pydomo import PageClient

This seemed to work, but when I tried to create some pages:

domo = Domo(DOMO_CLIENT, DOMO_SECRET, DOMO_API, use_https=True, logger_name='domo', logger_level=logging.ERROR)
page = {'name': 'Telephony Dashboard'}
new_page = domo.pages.create(**page)
collections = [
        domo.pages.create_collection(new_page['id'], 'Telephony | Previous Day Performance'),
        domo.pages.create_collection(new_page['id'], 'Telephony | Week to Date Performance'),
        domo.pages.create_collection(new_page['id'], 'Telephony | Month to Date Performance'),
        domo.pages.create_collection(new_page['id'], 'Telephony | Historical Performance'),
    ]

I ran into this exception:

Exception: Page Error: {"error":"insufficient_scope","error_description":"Insufficient scope for this resource","scope":"dashboard"}

Likewise when I ran list(domo.pages.list())
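The error names the missing OAuth scope ("dashboard"). A sketch of asking for it explicitly in the token request, assuming the standard client-credentials shape; whether the client is actually allowed that scope is governed by how the client was registered in the developer portal, so this only builds the URL rather than calling the API:

```python
from urllib.parse import urlencode

# Hypothetical scope list for illustration: data plus the dashboard
# scope the error message says is missing.
params = {"grant_type": "client_credentials", "scope": "data dashboard"}
url = "https://api.domo.com/oauth/token?" + urlencode(params)
print(url)
```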

Dataset Extract - Quoting Issue

I have noticed an issue with text data being modified when it is downloaded using the Python API.

I have a column in Domo defined as Text in the web UI and as STRING in the JSON response. In some rows the value is just a hyphen (-), but when I extract the data to CSV it is written as '- (a single quote followed by the hyphen).

Here is how I am downloading the file:
domo.datasets.data_export_to_file(dataset_id='my-id', file_path='local_path', include_csv_header=True)

If I use postman to call the api directly the response looks correct.
https://api.domo.com/v1/datasets/query/execute/<< dataset id >>

Body:

{"sql": "SELECT * FROM table limit 1"}
pip list
Package           Version
----------------- ---------
pydomo            0.3.0.2

PDP for stream

When creating a stream, what is the best way to enable PDP?

In the examples, the PDP settings appear only within the 'Datasets' section, not the streams section. I was able to create a streaming dataset and then manually add PDP using the UI at domo.com.

pdp = datasets.create_pdp(dataset['id'], pdp_request)

I tried to replace datasets with streams, but I received an error.

pdp = streams.create_pdp(stream['id'], pdp_request)

AttributeError: 'StreamClient' object has no attribute 'create_pdp'
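One likely route, given that StreamClient has no create_pdp: a stream wraps a dataset, and the stream response embeds that dataset, so PDP would be applied through the dataset client using the stream's embedded dataSet id rather than the stream id. The objects here are stubs mirroring that assumed response shape:

```python
# Stub of a stream-creation response: the stream carries the dataset
# it feeds under a dataSet key (assumed shape; confirm against a real
# response before relying on it).
stream = {"id": 42, "dataSet": {"id": "abc-123"}}

def create_pdp(dataset_id, pdp_request):
    # Stub for datasets.create_pdp: attach the policy to a dataset id.
    return {"dataset": dataset_id, **pdp_request}

policy = create_pdp(stream["dataSet"]["id"], {"name": "Only CA"})
print(policy["dataset"])
```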

'TypeError: can't concat bytes to str' when calling domo.users.update()

I started exploring the Domo API today and am having problems when calling domo.users.update(). Please let me know what might be causing this issue, as I believe I satisfied the system requirements that are listed.

Code

import logging
from pydomo import Domo
from pydomo.users import CreateUserRequest

# Build an SDK configuration
client_id = 'my client id'
client_secret = 'my client secret'
api_host = 'api.domo.com'

# Check if connected successfully by trying to display my name
domo = Domo(client_id, client_secret, logger_name='foo', log_level=logging.INFO, api_host=api_host)
my_id = 12345678
user = domo.users.get(my_id)
print(user['name']) # correctly prints my name

# Now try changing my name
user_update = CreateUserRequest()
user_update.name = 'Max Lawnboy'
user = domo.users.update(my_id, user_update)
print(user['name'])

Error Message

Traceback (most recent call last):
  File "test.py", line 18, in <module>
    user = domo.users.update(my_id, user_update)
  File "C:\Users\user\Desktop\projects\domo_api_test\env\lib\site-packages\pydomo\users\UserClient.py", line 45, in update
    return self._update(self._base(user_id), HTTPMethod.PUT, requests.codes.ok, user_update, self.userDesc)
  File "C:\Users\user\Desktop\projects\domo_api_test\env\lib\site-packages\pydomo\DomoAPIClient.py", line 50, in _update
    + self.transport.dump_response(response))
  File "C:\Users\user\Desktop\projects\domo_api_test\env\lib\site-packages\pydomo\Transport.py", line 86, in dump_response
    data = dump.dump_all(response)
  File "C:\Users\user\Desktop\projects\domo_api_test\env\lib\site-packages\requests_toolbelt\utils\dump.py", line 193, in dump_all
    dump_response(response, request_prefix, response_prefix, data)
  File "C:\Users\user\Desktop\projects\domo_api_test\env\lib\site-packages\requests_toolbelt\utils\dump.py", line 154, in dump_response
    proxy_info=proxy_info)
  File "C:\Users\user\Desktop\projects\domo_api_test\env\lib\site-packages\requests_toolbelt\utils\dump.py", line 65, in _dump_request_data
    bytearr.extend(prefix + method + b' ' + request_path + b' HTTP/1.1\r\n')
TypeError: can't concat bytes to str

System Info

Fresh virtualenv
Python 3.6.1
pydomo (0.2.1)
Windows 10, 64-bit
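A sketch of the underlying failure: on Python 3, concatenating bytes and str in one expression raises exactly this TypeError, and the dump helper in the installed requests_toolbelt built its log line from both. Encoding the str side first is the shape of the fix; in practice, upgrading pydomo and requests_toolbelt is the likely remedy, since this looks like a bug in those versions rather than in the calling code:

```python
# The failing line mixed a str method name into a bytes concatenation.
prefix, method, path = b"> ", "PUT", b"/v1/users/1"

# method + b" " would raise TypeError; encoding first makes every
# operand bytes, which is what the later toolbelt versions do.
line = prefix + method.encode("ascii") + b" " + path + b" HTTP/1.1\r\n"
print(line.decode("ascii").strip())
```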

JSONDecodeError

Hi, I am not able to connect.
Whenever I try to make a connection, the following error occurs. I am using Anaconda 3.

File "E:/Python projects/DOMO_Project/domo_init.py", line 17, in <module>
domo = Domo(client_id, client_secret, api_host=api_host)
File "E:\Anaconda3\lib\site-packages\pydomo\__init__.py", line 76, in __init__
self.transport = DomoAPITransport(client_id, client_secret, api_host, kwargs.get('use_https', True), self.logger)
File "E:\Anaconda3\lib\site-packages\pydomo\Transport.py", line 21, in __init__
self._renew_access_token()
File "E:\Anaconda3\lib\site-packages\pydomo\Transport.py", line 80, in _renew_access_token
self.access_token = response.json()['access_token']
File "E:\Anaconda3\lib\site-packages\requests\models.py", line 892, in json
return complexjson.loads(self.text, **kwargs)
File "E:\Anaconda3\lib\json\__init__.py", line 319, in loads
return _default_decoder.decode(s)
File "E:\Anaconda3\lib\json\decoder.py", line 339, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "E:\Anaconda3\lib\json\decoder.py", line 357, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Page API

I am using the Python API to pull a list of all the pages and their subpages in our Domo instance. I am an Admin in our instance, but when I run the script, it only pulls the pages that I own or that are shared with me.
I would like to be able to pull all pages regardless of whether they are shared with me.

Is this intentional?

SSL Error

Hi,

I've been trying to use this script but am running into an SSL issue. Can anyone help? I'm trying to do a string import into Domo.

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/contrib/pyopenssl.py", line 441, in wrap_socket
    cnx.do_handshake()
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/OpenSSL/SSL.py", line 1716, in do_handshake
    self._raise_ssl_error(self._ssl, result)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/OpenSSL/SSL.py", line 1456, in _raise_ssl_error
    _raise_current_error()
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/OpenSSL/_util.py", line 54, in exception_from_error_queue
    raise exception_type(errors)
OpenSSL.SSL.Error: [('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/connectionpool.py", line 601, in urlopen
    chunked=chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/connectionpool.py", line 346, in _make_request
    self._validate_conn(conn)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/connectionpool.py", line 850, in _validate_conn
    conn.connect()
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/connection.py", line 326, in connect
    ssl_context=context)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/util/ssl_.py", line 329, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/contrib/pyopenssl.py", line 448, in wrap_socket
    raise ssl.SSLError('bad handshake: %r' % e)
ssl.SSLError: ("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/adapters.py", line 440, in send
    timeout=timeout
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/connectionpool.py", line 639, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/util/retry.py", line 388, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='api.domo.com', port=443): Max retries exceeded with url: /oauth/token?grant_type=client_credentials (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "DomoScript.py", line 95, in <module>
    DomoSDKExamples()  # Execute the script
  File "DomoScript.py", line 41, in __init__
    domo = self.init_domo_client(CLIENT_ID, CLIENT_SECRET)
  File "DomoScript.py", line 55, in init_domo_client
    return Domo(client_id, client_secret, logger_name='foo', log_level=logging.INFO, api_host=API_HOST, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pydomo/__init__.py", line 76, in __init__
    self.transport = DomoAPITransport(client_id, client_secret, api_host, kwargs.get('use_https', True), self.logger)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pydomo/Transport.py", line 21, in __init__
    self._renew_access_token()
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/pydomo/Transport.py", line 78, in _renew_access_token
    response = requests.post(url=url, auth=HTTPBasicAuth(self.clientId, self.clientSecret))
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/api.py", line 112, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/api.py", line 58, in request
    return session.request(method=method, url=url, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/sessions.py", line 508, in request
    resp = self.send(prep, **send_kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/sessions.py", line 618, in send
    r = adapter.send(request, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/adapters.py", line 506, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.domo.com', port=443): Max retries exceeded with url: /oauth/token?grant_type=client_credentials (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))

Error: dataset_request missing

I'm trying to run the file, but it gives me the following error:

  File "domo_api.py", line 20, in <module>
    domo.datasets.create()
TypeError: create() missing 1 required positional argument: 'dataset_request'

'Domo' has no attribute 'ds_create'

Hi, I am trying to upload a pandas dataframe to Domo using ds_create, but it returns that the Domo object has no attribute 'ds_create'.

my domo is like:
domo = Domo(client_id, client_secret, api_host=api_host, logger_name=logger_name, log_level=log_level)

and I am trying to create a dataset and upload it to DOMO:
domo.ds_create(Output,'Python | historical data','Python | Generated during demo')

Thanks!

Limit of user that can be pulled per group

I was wondering whether there is a limit on how many users can be pulled per group. We have several groups with more than 50 users, but when I try to pull the list of users for each group, only 50 users are returned.

Here is the script below:

data_export is not returning the full data consistently

Hi there,

Context:
We have code exporting domo dataset (~155MB) on an hourly basis for over a year and everything works fine.

Issue:
Starting around September 16, 2021, we've observed that the CSV data export is missing ~700 rows every now and then.

Attempt to Solve:

  • reduced the dataset, thinking it might be a size issue, but no luck there.
  • checked the domo-python-sdk changelog for changes that could have affected the datasets.data_export function, but found nothing either.

GZIP support

The API only supports CSV files. Is there going to be support for GZIP files?

From the Domo website:
Note: Streams support gzip compression; all CSV data parts should be uploaded as gzip files (application/gzip). This will greatly reduce upload time. Plain CSV uploads are also supported, but are only recommended for very small DataSets.

A part should be no smaller than 20MB and no larger than 100MB of data
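The compression side of the quoted note can be sketched in the standard library; the actual upload call is omitted, and this only shows producing application/gzip bytes from a CSV part and verifying the round-trip:

```python
import gzip
import io

csv_part = "id,amount\n1,9.99\n2,19.99\n"

# Compress the part in memory; `compressed` is what would be sent
# as the application/gzip body.
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
    gz.write(csv_part.encode("utf-8"))
compressed = buf.getvalue()

# Round-trip check: decompressing recovers the original CSV exactly.
print(gzip.decompress(compressed).decode("utf-8") == csv_part)
```

In real use, the part would also be sized against the 20–100 MB guidance quoted above before compressing and sending.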

Code documentation is incorrect for ds_query.

The documentation for ds_query says to pass the query as a dictionary with an "sql" key; however, when the query is passed down to dataset.query, that method expects a string and packages it into a dictionary itself. The documentation example should be updated to use a string, not a dict.
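A stub mirroring the described behavior, which also explains the earlier "Cannot deserialize instance of java.lang.String out of START_OBJECT token" report: the SDK adds the {"sql": ...} wrapper itself, so passing a dict double-wraps it:

```python
import json

def ds_query(dataset_id, sql):
    # Stub for the SDK's wrapping step: the caller supplies a plain
    # SQL string, and the wrapper is added here, not by the caller.
    body = {"sql": sql}
    return json.dumps(body)

payload = ds_query("abc-123", "SELECT * FROM table LIMIT 2")
print(payload)
```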

Pydomo package cannot be resolved using Poetry

I noticed the Poetry dependency resolver cannot find the latest pydomo (0.3.0.3).
To work around it, I had to install my packages with Poetry first and then install pydomo separately with pip. Maybe something in the package structure is unexpected?

Domo Connection Issue

Hi,
I am trying to run the script through Python and it is now giving this error. It was working fine a few days ago. Any ideas why this error is coming up?

SSLError: HTTPSConnectionPool(host='api.domo.com', port=443): Max retries exceeded with url: /oauth/token?grant_type=client_credentials (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])")))

Domo Maintenance

Hi,

I have some scripts that grab a bunch of data from an FTP server and uploads it to Domo.
The data is formatted strangely, which is why we needed a custom solution.

As I upload these CSV files using data_import_from_file(), I often run into this Domo Maintenance issue. I wrote the script to handle it and retry, but this happens often enough that I don't want to upload files that are too big, in case I have to start over multiple times.

The questions are:

  • What is the recommended file size to upload to Domo?
  • Should I try a stream instead?
  • Is this a common issue that others are experiencing?
  • Is there any way to avoid this?

See my log output below:

2019-05-23 08:44:13,662 - logger - INFO - File size: 93.875116 MB. 
2019-05-23 08:47:49,270 - logger - ERROR -  Failed to append 201901-Orders_part4.csv to Domo. Error uploading DataSet: <!DOCTYPE html>

<html>
    <head>
        <meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
        <meta http-equiv="Pragma" content="no-cache" />
        <meta http-equiv="Expires" content="0" />
        <title>Domo Maintenance</title>
    </head>
    <body>
			<style type="text/css">
				body {
					background-color: #54585a;
					color: #d4d5d6;
					font-size: 14px;
					font-family: Helvetica Neue, Arial, Helvetica, sans-serif;
					padding-top: 94px;
					line-height: 20px;
				}

				.error-page {
					width: 825px;
				}

				.domobot {
					float: left;
					height: 390px;
					width: 285px;
					padding-right: 40px;
					padding-top: 6px;
					text-align: right;
				}

				h1 {
					color: white;
					font-size: 32px;
					line-height: 38px;
					font-weight: 500;
				}

				h2 {
					font-size: 20px;
					font-weight: 200;
					line-height: 27px;
				}

				h3 {
					font-weight: 500;
					font-size: 15px;
				}

				hr {
					border-top: 1px solid #404445;
					border-bottom: 1px solid #7F8283;
					border-left: none;
					border-right: none;
					margin: 28px 0 25px 0;
				}

				a {
					color: #9acbeb;
					text-decoration: none;
				}

				a:hover, a:active {
					text-decoration: underline;
				}

			</style>
			<div class="error-page">
				<div class="domobot"><img src="/error/domo-logo-200.png" width="200" height="200" alt="Domo"/></div>
				<div class="content">
					<h1>Domo is currently undergoing<br/> maintenance.</h1>
					<h2><br/>Thank you for your patience.</h2>
					<hr />
					<h3>Have any questions? Feel free to contact us.</h3>
					<p>Call us toll-free: 800.899.1000<br/>Email: <a href="mailto:[email protected]">[email protected]</a></p>
				</div>
			</div>
    </body>
</html>

2019-05-23 08:47:49,271 - logger - INFO - Domo is undergoing Maintenance. Waiting 60 seconds and trying again. 1 attempt(s)
2019-05-23 08:48:49,274 - logger - INFO - Attempting to APPEND 201901-Orders_part4.csv to Domo datasetID: d4f43122-8add-40be-b905-. 
2019-05-23 08:48:49,275 - logger - INFO - File size: 93.875116 MB. 
2019-05-23 08:51:51,909 - logger - INFO - Appended 201901-Orders_part4.csv to Domo.
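On the retry question: an exponential backoff (rather than the fixed 60-second wait shown in the log) tends to ride out maintenance windows with fewer wasted attempts on large files. A sketch, with `upload` standing in for a call like `domo.datasets.data_import_from_file(dataset_id, path, update_method='APPEND')` and the sleep function injectable so the loop can be tested without waiting:

```python
import time

def upload_with_backoff(upload, max_attempts=5, base_delay=60, sleep=time.sleep):
    """Retry an upload, doubling the wait after each failure.

    upload() stands in for data_import_from_file or a stream part upload;
    sleep is injectable for testing.
    """
    for attempt in range(max_attempts):
        try:
            return upload()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the last attempt
            sleep(base_delay * (2 ** attempt))  # 60s, 120s, 240s, ...

# Simulate two maintenance failures followed by success.
calls = {'n': 0}
def flaky_upload():
    calls['n'] += 1
    if calls['n'] < 3:
        raise RuntimeError('Domo Maintenance')
    return 'ok'

delays = []
result = upload_with_backoff(flaky_upload, sleep=delays.append)
```

For files near the 100 MB range, splitting into stream parts (as the Streams docs recommend) also limits how much work a single maintenance hit can undo.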

TypeError when trying to export Dataset

I am trying to export a dataset from my company's Domo instance and am getting the following error when running the code below (I have removed the actual client_id, client_secret, and data_set_id):

from pydomo import Domo
import pandas as pd

domo = Domo('client_id', 'client_secret', api_host='api.domo.com')
domo.ds_get('data_set_id')

results in `TypeError: <class 'bool'> is not convertible to datetime`

I checked the GitHub repo, and it looks like the ds_get function attempts the following on each column of the dataset:

for col in df.columns:
    if df[col].dtype == 'object':
        try:
            df[col] = to_datetime(df[col])
        except ValueError:
            pass
This seems to be where the TypeError originates: the try-except block only handles ValueError, so a TypeError raised by to_datetime propagates. Is there any way to get around this?
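Until the SDK's except clause is broadened to `except (ValueError, TypeError)`, one workaround is to skip the SDK's conversion and post-process the DataFrame yourself with the wider clause. The pattern is shown below with plain `strptime` so it runs without pandas; strptime raises ValueError on non-date strings and TypeError on non-string values, which are exactly the two cases `to_datetime` can hit:

```python
from datetime import datetime

def try_parse_date(value, fmt='%Y-%m-%d'):
    """Return a datetime if value parses, otherwise the value unchanged.

    Catching both ValueError and TypeError mirrors the fix needed in
    ds_get's column-conversion loop.
    """
    try:
        return datetime.strptime(value, fmt)
    except (ValueError, TypeError):
        return value

parsed = try_parse_date('2021-09-16')   # parses cleanly
not_a_date = try_parse_date('hello')    # ValueError swallowed, value kept
boolean = try_parse_date(True)          # TypeError swallowed, value kept
```

With pandas, the same idea applies verbatim: wrap `to_datetime(df[col])` and catch both exception types instead of ValueError alone.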

Getting Dataset Type & Accounts List

I cannot find documentation on using pydomo to get the dataset type or an accounts list. I have also tried this with curl without success. I can get my datasets, query them, etc., but there is no indication of what type each dataset is, so I tried to approach it from the accounts side and could not access the accounts using curl either, as I could not find a way to get them with pydomo.

Can you help?

When I try to curl the accounts list, I get a 401 response: "Full authentication is required to access this resource".
