davidmuller / aws-requests-auth
AWS signature version 4 signing process for the python requests module
License: Other
Can't get this to work. Trying the README example I get the following error:
AttributeError: 'AWSRequestsAuth' object has no attribute 'split'
It seems like both Urllib3HttpConnection and RequestsHttpConnection require http_auth to be either a ':'-separated string or a tuple. Any ideas?
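For reference, a minimal sketch of the README-style wiring that avoids the string split, with placeholder host and keys (assuming elasticsearch-py's RequestsHttpConnection passes a non-string http_auth straight through to requests):

from aws_requests_auth.aws_auth import AWSRequestsAuth
from elasticsearch import Elasticsearch, RequestsHttpConnection

# placeholder credentials and endpoint
auth = AWSRequestsAuth(aws_access_key='...',
                       aws_secret_access_key='...',
                       aws_host='search-foo.us-east-1.es.amazonaws.com',
                       aws_region='us-east-1',
                       aws_service='es')

# connection_class=RequestsHttpConnection hands the auth object to requests
# instead of splitting it like a 'user:pass' string
es = Elasticsearch(host='search-foo.us-east-1.es.amazonaws.com',
                   port=443,
                   use_ssl=True,
                   connection_class=RequestsHttpConnection,
                   http_auth=auth)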
When using AWSRequestsAuth with e.g. the execute-api service, there are issues with the URL encoding of spaces. The requests lib uses quote_plus, and when a space is encoded as + instead of %20 this causes errors like:
The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details.
The canonical string for this request should have been
What do you think of an optional parameter that allows specifying the quote function? The elasticsearch lib handles this by itself, but it would be nice for other services.
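To illustrate the difference in stdlib terms (the path is a made-up example):

from urllib.parse import quote, quote_plus

path = '/stage/some resource'
print(quote_plus(path))  # %2Fstage%2Fsome+resource  (space becomes '+')
print(quote(path))       # /stage/some%20resource    (space becomes '%20', which SigV4 expects)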
I note the comment in the README that botocore must be installed in order to import BotoAWSRequestsAuth, but with the current versions of botocore and aws-requests-auth on Python 3.6.4, this import fails with ImportError: cannot import name 'BotoAWSRequestsAuth'.
I have:
aws-requests-auth==0.3.3
boto3==1.7.83
botocore==1.10.83
My Python script is:
import requests
from aws_requests_auth.boto_utils import BotoAWSRequestsAuth
It would be great if this could be extended to use STS temporary credentials, which include a token as well as the access key and secret key.
I may do a PR at some point but I'm not sure when I will have time to get to it.
Thanks for the library!
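For what it's worth, the Lambda example later on this page suggests the constructor already takes an aws_token argument; a minimal sketch with a placeholder role ARN and host:

import boto3
from aws_requests_auth.aws_auth import AWSRequestsAuth

# fetch temporary credentials from STS (placeholder role ARN)
creds = boto3.client('sts').assume_role(
    RoleArn='arn:aws:iam::123456789012:role/example',
    RoleSessionName='example-session')['Credentials']

auth = AWSRequestsAuth(aws_access_key=creds['AccessKeyId'],
                       aws_secret_access_key=creds['SecretAccessKey'],
                       aws_token=creds['SessionToken'],  # ends up in x-amz-security-token
                       aws_host='search-foo.us-east-1.es.amazonaws.com',
                       aws_region='us-east-1',
                       aws_service='es')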
From the AWS official documentation I saw that the header name X-Amz-Date is written with each word capitalized, but in your code the header is x-amz-date.
Referencing this: this call requires host, region, and service to be provided. Since it uses botocore functionality, it should be able to capture region from the environment if not explicitly supplied. It does have to be override-able, but it shouldn't be required by default.
Making region optional is a potentially breaking change, since the argument order would likely also change.
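A minimal sketch of the kind of lookup being suggested, assuming the standard botocore config chain:

import botocore.session

# botocore resolves AWS_DEFAULT_REGION and ~/.aws/config for us
session = botocore.session.get_session()
region = session.get_config_variable('region')  # None if nothing is configured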
Currently, the version of aws-requests-auth published on PyPI is a source tarball, but this poses issues if you're building a lambda that has binary dependencies, such as, say, PyYAML or the cryptography library.
To give you an example, say you do your development on a Mac or Windows box: you might install all your libraries into a directory to be zipped up like so:
pip3 install \
    --ignore-installed \
    --compile \
    --platform linux_x86_64 \
    --only-binary :all: \
    --implementation cp \
    --requirement "$requirements" \
    --target .
The use of --only-binary :all: is unfortunately required if you're also using --platform, to guarantee that any binary dependencies downloaded are for the correct target platform. This means that pip won't bother downloading source tarballs if that's all that's available. Instead, you get an error like the following:
ERROR: Could not find a version that satisfies the requirement aws-requests-auth (from -r ../requirements.txt (line 1)) (from versions: none)
ERROR: No matching distribution found for aws-requests-auth (from -r ../requirements.txt (line 1))
The likes of requests, boto3, &c., don't have this particular issue because they're packaged as wheels.
The additional flags are unavoidable, unfortunately, and there seems to be no way to tell pip that source distributions are OK as a fallback.
The fix is simple enough. Add this to setup.cfg:
[bdist_wheel]
universal = 1
That will mean the wheel works with both Python 2 and Python 3, and indicates that it will also work on any platform. Also, in setup.py, replace from distutils.core import setup with from setuptools import setup. This also fixes a bug: distutils doesn't actually support the install_requires distribution option, so the dependency on requests will now work properly.
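A sketch of what the top of setup.py looks like after the change (the dependency spec is illustrative, remaining metadata elided):

# setuptools, unlike distutils, understands install_requires
from setuptools import setup

setup(
    name='aws-requests-auth',
    install_requires=['requests'],
    # ... remaining metadata unchanged ...
)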
Finally, rather than just doing python3 setup.py sdist, ensure the wheel package is installed and do python3 setup.py sdist bdist_wheel instead.
You should end up with something like this:
$ ls dist
aws-requests-auth-0.4.2.tar.gz
aws_requests_auth-0.4.2-py2.py3-none-any.whl
Trying with Python 3.6, and version 0.4.1:
import requests
from aws_requests_auth.aws_auth import AWSRequestsAuth

# let's talk to our AWS Elasticsearch cluster
auth = AWSRequestsAuth(aws_access_key='****',
                       aws_secret_access_key='****',
                       aws_host='awis.us-west-1.amazonaws.com',
                       aws_region='us-west-1',
                       aws_service='awis')
r = requests.get('https://awis.amazonaws.com/api?Action=CategoryListings&Count=10&Path=Top/Arts', auth=auth)
I got:
The request signature we calculated does not match the signature you provided.
whereas it works fine with another library. Am I missing something?
I'm switching some of my HTTP calls from requests to aiohttp, which uses BasicAuth - is there any way of using this library with aiohttp?
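One workaround, as a sketch rather than an official API: it assumes this library's get_aws_request_headers_handler only reads the method, url and body attributes off the request object, so we can compute the SigV4 headers up front and hand them to aiohttp ourselves.

import asyncio
from types import SimpleNamespace

import aiohttp
from aws_requests_auth.aws_auth import AWSRequestsAuth

auth = AWSRequestsAuth(aws_access_key='...', aws_secret_access_key='...',
                       aws_host='search-foo.us-east-1.es.amazonaws.com',
                       aws_region='us-east-1', aws_service='es')

async def main():
    url = 'https://search-foo.us-east-1.es.amazonaws.com/_search'
    # fake just enough of a requests.PreparedRequest for the signer
    fake = SimpleNamespace(method='GET', url=url, body=None, headers={})
    headers = auth.get_aws_request_headers_handler(fake)
    async with aiohttp.ClientSession() as session:
        async with session.get(url, headers=headers) as resp:
            print(resp.status)

asyncio.run(main())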
I tried "Using Boto To Automatically Gather AWS Credentials" and get:
File "/var/task/aws_requests_auth/aws_auth.py", line 165, in get_aws_request_headers
self.service)
File "/var/task/aws_requests_auth/aws_auth.py", line 27, in getSignatureKey
kDate = sign(('AWS4' + key).encode('utf-8'), dateStamp)
TypeError: must be str, not NoneType
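For context, that NoneType is the secret key: the boto utilities resolved no credentials at all. A quick way to see what botocore finds (a sketch):

import botocore.session

creds = botocore.session.get_session().get_credentials()
# None means the usual chain (env vars, ~/.aws/credentials, instance role)
# found nothing, which is how a None key reaches getSignatureKey
print(creds if creds is None else (creds.access_key, creds.method))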
Could you please provide some way to append something to SignedHeaders?
Just wanted to thank you for your work with this module. This saved me quite a bit of time and effort.
FWIW (if you want to note in the readme), when using Lambda to access AWS ES, you can simply use the environment variables and therefore the role that Lambda is assuming:
import os
from aws_requests_auth.aws_auth import AWSRequestsAuth
from elasticsearch import Elasticsearch, RequestsHttpConnection

es_host = 'search-XXXXX-YYYYYYYYYYY.us-west-2.es.amazonaws.com'  # your domain endpoint

def lambda_handler(event, context):
    # Lambda injects the execution role's temporary credentials into these env vars
    auth = AWSRequestsAuth(aws_access_key=os.environ['AWS_ACCESS_KEY_ID'],
                           aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
                           aws_token=os.environ['AWS_SESSION_TOKEN'],
                           aws_host=es_host,
                           aws_region='us-west-2',
                           aws_service='es')
    es = Elasticsearch(host=es_host, port=443, use_ssl=True,
                       connection_class=RequestsHttpConnection,
                       http_auth=auth)
Can we add support for regenerating the STS token, since the token expires after some time? (For those using the REST APIs only, without the boto packages.)
For example: when downloading a large file from S3, if the token expires mid-transfer, the downloaded file would be incomplete.
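A hedged sketch of the kind of behaviour being asked for; refresh_credentials is a hypothetical stand-in for whatever REST/STS call issues the temporary keys, and the 403 retry is only illustrative:

import requests
from aws_requests_auth.aws_auth import AWSRequestsAuth

def refresh_credentials():
    # hypothetical: call your STS/REST endpoint and return fresh keys
    return {'AccessKeyId': '...', 'SecretAccessKey': '...', 'SessionToken': '...'}

def fresh_auth():
    creds = refresh_credentials()
    return AWSRequestsAuth(aws_access_key=creds['AccessKeyId'],
                           aws_secret_access_key=creds['SecretAccessKey'],
                           aws_token=creds['SessionToken'],
                           aws_host='example-bucket.s3.amazonaws.com',
                           aws_region='us-east-1',
                           aws_service='s3')

url = 'https://example-bucket.s3.amazonaws.com/big-file'
resp = requests.get(url, auth=fresh_auth(), stream=True)
if resp.status_code == 403:  # token expired mid-way: re-sign and retry
    resp = requests.get(url, auth=fresh_auth(), stream=True)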
I'm deploying CSR 1000v on an EC2 instance in AWS.
This is my python code for authentication in order to use RESTCONF which is already enabled in the router.
import requests
import pprint
from aws_requests_auth.aws_auth import AWSRequestsAuth

def get_json(interface):
    authaws = AWSRequestsAuth(aws_access_key='AWS_ACCESS_KEY',
                              aws_secret_access_key='AWS_SECRET_ACCESS_KEY',
                              aws_host='ec2-xx-xx-xx-xx.us-west-2.compute.amazonaws.com',
                              aws_region='us-west-2',
                              aws_service='compute')
    source = 'https://ec2-xx-xx-xx-xx.us-west-2.compute.amazonaws.com/restconf/data/'
    module = 'ietf-interfaces:'
    container = 'interfaces'
    leaf = '/interface=' + interface
    options = ''
    url = source + module + container + leaf + options
    headers = {'Content-type': 'application/yang-data+json', 'Accept': 'application/yang-data+json'}
    r = requests.get(url, auth=authaws, headers=headers, verify=False)
    return r.json()

if __name__ == '__main__':
    interface = 'GigabitEthernet1'
    pprint.pprint(get_json(interface))
Here's what I got after execution.
server@zsz:~/shared_files$ python get_one_interface.py
{u'errors': {u'error': [{u'error-tag': u'access-denied',
                         u'error-type': u'protocol'}]}}
Obviously, the authentication cannot be done.
As for aws_access_key and aws_secret_access_key, I got them from the IAM console. I even generated new ones, but it still does not work.
For people using multiple AWS profiles, it makes life much easier to be able to specify them programmatically.
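A sketch of what that could look like today with boto3 (profile name and host are placeholders):

import boto3
from aws_requests_auth.aws_auth import AWSRequestsAuth

# pull keys from a named profile in ~/.aws/credentials
creds = boto3.Session(profile_name='my-profile').get_credentials()

auth = AWSRequestsAuth(aws_access_key=creds.access_key,
                       aws_secret_access_key=creds.secret_key,
                       aws_token=creds.token,
                       aws_host='search-foo.us-east-1.es.amazonaws.com',
                       aws_region='us-east-1',
                       aws_service='es')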
Following the example in the readme, I tried creating the es_client object as follows:
es_host = "search-XXXXX-YYYYYYYYYYY.us-west-2.es.amazonaws.com"
auth = AWSRequestsAuth(
aws_host=es_host,
aws_region="us-west-2",
aws_service="es",
**get_credentials()
)
es = Elasticsearch(
host=es_host,
port=443, # also tried port=80, same error
connect_class=RequestsHttpConnection,
http_auth=auth
)
However, the above results in a TypeError being raised with the following traceback:
Traceback (most recent call last):
  File "utils/wrappers/elastic.py", line 41, in <module>
    http_auth=auth
  File "/home/hjpotter92/.virtualenvs/venv/local/lib/python2.7/site-packages/elasticsearch/client/__init__.py", line 171, in __init__
    self.transport = transport_class(_normalize_hosts(hosts), **kwargs)
  File "/home/hjpotter92/.virtualenvs/venv/local/lib/python2.7/site-packages/elasticsearch/transport.py", line 108, in __init__
    self.set_connections(hosts)
  File "/home/hjpotter92/.virtualenvs/venv/local/lib/python2.7/site-packages/elasticsearch/transport.py", line 161, in set_connections
    connections = map(_create_connection, hosts)
  File "/home/hjpotter92/.virtualenvs/venv/local/lib/python2.7/site-packages/elasticsearch/transport.py", line 160, in _create_connection
    return self.connection_class(**kwargs)
  File "/home/hjpotter92/.virtualenvs/venv/local/lib/python2.7/site-packages/elasticsearch/connection/http_urllib3.py", line 57, in __init__
    self.headers.update(urllib3.make_headers(basic_auth=http_auth))
  File "/home/hjpotter92/.virtualenvs/venv/local/lib/python2.7/site-packages/urllib3/util/request.py", line 65, in make_headers
    b64encode(b(basic_auth)).decode('utf-8')
  File "/usr/lib/python2.7/base64.py", line 54, in b64encode
    encoded = binascii.b2a_base64(s)[:-1]
TypeError: b2a_base64() argument 1 must be convertible to a buffer, not AWSRequestsAuth
I am using
aws-requests-auth==0.3.3
boto3==1.4.5
elasticsearch==5.3.0
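A possible culprit, hedged since only the snippet above is visible: elasticsearch-py's keyword is connection_class, and an unrecognized connect_class would leave the default Urllib3HttpConnection in place, which tries to base64 http_auth as basic auth exactly as in the traceback. The corrected construction would be:

es = Elasticsearch(
    host=es_host,
    port=443,
    connection_class=RequestsHttpConnection,  # note: connection_class, not connect_class
    http_auth=auth
)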
When using it with elasticsearch, if you have a character outside the ASCII range, the library fails with:
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe4 in position 0: ordinal not in range(128)
To reproduce, try to index a simple record with a single field containing ä.
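A minimal repro sketch (host, index name, and the elasticsearch 5.x-style index call are placeholders/assumptions):

from elasticsearch import Elasticsearch, RequestsHttpConnection
from aws_requests_auth.aws_auth import AWSRequestsAuth

auth = AWSRequestsAuth(aws_access_key='...', aws_secret_access_key='...',
                       aws_host='search-foo.us-east-1.es.amazonaws.com',
                       aws_region='us-east-1', aws_service='es')

es = Elasticsearch(host='search-foo.us-east-1.es.amazonaws.com', port=443,
                   use_ssl=True, connection_class=RequestsHttpConnection,
                   http_auth=auth)

# the non-ASCII 'ä' in the body is what triggers the UnicodeDecodeError during signing
es.index(index='test-index', doc_type='doc', body={'field': 'ä'})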
It's not always possible to sign the payload - especially when uploading large files to S3. That's why UNSIGNED-PAYLOAD exists. Unfortunately, AWSRequestsAuth.get_aws_request_headers does not support it.
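For context, in SigV4 for S3 the payload-hash slot of the canonical request may carry the literal string UNSIGNED-PAYLOAD instead of a SHA-256 digest. A sketch of what supporting it might look like; the use_unsigned_payload flag is hypothetical:

import hashlib

def payload_hash(body, use_unsigned_payload=False):
    # hypothetical option: S3 accepts the literal 'UNSIGNED-PAYLOAD'
    # in both the canonical request and the x-amz-content-sha256 header
    if use_unsigned_payload:
        return 'UNSIGNED-PAYLOAD'
    if isinstance(body, str):
        body = body.encode('utf-8')
    return hashlib.sha256(body or b'').hexdigest()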
First off, thanks for a great library!
I'm writing some unit tests for a package that makes use of your library, and I've got a method called validate_auth that simply pings an endpoint at which authentication is required. I can mock out the endpoint's response, but I'm having trouble knowing how to mock out BotoAWSRequestsAuth. Could you offer any guidance on what the best approach might be? I'm wondering if perhaps using boto's Stubber is the way to go?
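One common pattern, as a sketch: patch the class at the location where your own module imports it, so constructing it never touches the real credential chain. Here 'mypackage.client' and validate_auth are hypothetical stand-ins for the code under test:

from unittest import mock

# 'mypackage.client' is a hypothetical module that does
# `from aws_requests_auth.boto_utils import BotoAWSRequestsAuth`
# and defines validate_auth()
@mock.patch('mypackage.client.requests.get')
@mock.patch('mypackage.client.BotoAWSRequestsAuth')
def test_validate_auth(mock_auth_cls, mock_get):
    mock_get.return_value.status_code = 200  # fake the authenticated endpoint
    from mypackage.client import validate_auth
    assert validate_auth() is True
    mock_auth_cls.assert_called_once()       # the auth object was built, never exercised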
We used to authenticate Lambda using this method; however, since last Friday it has been failing. Are there any new changes that I am not aware of, or is this the correct way of doing it? Please see the below error I get when I run my Lambda.
{"message":"The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details.
The Canonical String for this request should have been
'PUT
/_template/global
host:search-comments-logs-fl-lknqw2m7h3c2235kiv5mdchxd4.us-east-1.es.amazonaws.com
x-amz-date:20171009T204043Z
x-amz-security-token:<token>
host;x-amz-date;x-amz-security-token
<hash>'
The String-to-Sign should have been
'AWS4-HMAC-SHA256
20171009T204043Z
20171009/us-east-1/es/aws4_request
<hash>'
"}
This is to request support for matrix parameters. urlparse() places these parameters in the params attribute, so they are being ignored. They should be part of the path when creating the canonical request.
This is only an issue when the URL ends with the matrix parameters. So:
This is fine: https://x.com/user;name=Piet;surname=Pompies/details
This does NOT work: https://x.com/user;name=Piet;surname=Pompies
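A quick illustration of the stdlib behaviour in question:

from urllib.parse import urlparse

# ';' params are only split off the *last* path segment
ok = urlparse('https://x.com/user;name=Piet;surname=Pompies/details')
print(ok.path, '|', ok.params)    # /user;name=Piet;surname=Pompies/details |
bad = urlparse('https://x.com/user;name=Piet;surname=Pompies')
print(bad.path, '|', bad.params)  # /user | name=Piet;surname=Pompies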
Getting TypeError: unicode-objects must be encoded before hashing. I believe it's something to do with this line (I know you're using the AWS signing example):
payload_hash = hashlib.sha256(body).hexdigest()
Was hoping you knew of a way to encode the body.
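The usual fix, as a sketch (whether the library should do this internally is the question here), is to encode str bodies before hashing:

import hashlib

body = '{"field": "value"}'
if isinstance(body, str):  # hashlib only accepts bytes
    body = body.encode('utf-8')
payload_hash = hashlib.sha256(body).hexdigest()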
I am confused about this because in some requests made with boto3, I saw the header name X-Amz-Date written with each word capitalized, but in our code the header is x-amz-date:
https://github.com/DavidMuller/aws-requests-auth/blob/master/aws_requests_auth/aws_auth.py#L182
Is there anything I am missing?
https://www.python-httpx.org/compatibility/#request-urls
Accessing response.url will return a URL instance, rather than a string.
Use str(response.url) if you need a string instance.
I'm assuming the incident ticket process is for actual bugs - is there a discussion board or forum to help get the plugin working? I'm so close now and must be missing something.
I'm getting an error: Check your AWS Secret Access Key and signing method.
It took me a while to realize it was the AccessKey and not the AccessKeyId (returned by the STS call) I need to use. I'm using the SecretAccessKey returned by the STS auth.
Any guidance on how I can get assistance to take this forward would be much appreciated.
How would I use this to post JSON data?
import requests
from aws_requests_auth.aws_auth import AWSRequestsAuth

auth = AWSRequestsAuth(aws_access_key=access_key,
                       aws_secret_access_key=secret_key,
                       aws_token=security_token,
                       aws_host='g1ik5r2vf5.execute-api.eu-west-1.amazonaws.com',
                       aws_region='eu-west-1',
                       aws_service='execute-api')
r = requests.post(url, headers=entry_headers, json=entry_data, auth=auth, verify=True)
I am using PyCharm as my IDE with Python 3.6. When I try to install aws-requests-auth with PyCharm's pip, it does not find the package. However, when I manually moved the package into my env folder, everything worked fine (and I can find the package if I use Python 2.7).
I wonder if the download is available for Python 3.6.
Hi, I just found out about this project, which seems to be more recent. However, there is the existing project https://github.com/sam-washington/requests-aws4auth, which is supposed to do the same thing and is also used by elasticsearch-curator.
Could you please tell me if there are any differences or improvements over the mentioned project, which also seems not to be maintained?
I'm using this library in EC2 to connect to elasticsearch via the python elasticsearch library. This works fine for several hours, but then all requests start failing with 403 "The security token included in the request is expired" errors. It seems like whatever token this library is using is expiring, and the library isn't handling requesting a new one. I'm using credentials pulled from the boto3 session.
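One hedged workaround, assuming a recent enough version of this library in which BotoAWSRequestsAuth looks up credentials at request time rather than freezing them at startup: botocore's refreshable instance-profile credentials then keep working after rotation.

from aws_requests_auth.boto_utils import BotoAWSRequestsAuth
from elasticsearch import Elasticsearch, RequestsHttpConnection

# credentials are resolved per request via botocore, not snapshotted once
auth = BotoAWSRequestsAuth(aws_host='search-foo.us-east-1.es.amazonaws.com',
                           aws_region='us-east-1',
                           aws_service='es')

es = Elasticsearch(host='search-foo.us-east-1.es.amazonaws.com', port=443,
                   use_ssl=True, connection_class=RequestsHttpConnection,
                   http_auth=auth)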