
shred's Introduction

Overview

Shred is an HTTP client that wraps HTTP interfaces so you can easily create CoffeeScript or JavaScript clients.

Define your client:

github = resource "https://api.github.com/",
  repo: (resource) ->
    resource "repos/{owner}/{repo}/",
      issues: (resource) ->
        resource "issues",
          create:
            method: "post"
            headers:
              accept: "application/vnd.github.v3.raw+json"
            expect: 201
          list:
            method: "get"
            headers:
              accept: "application/vnd.github.v3.raw+json"
            expect: 200

And then use it:

try
  {data} = yield github
    .repo {owner, repo}
    .issues
    .list()
  console.log number, title for {number, title} in (yield data)
catch error
  console.error error

Installation

npm install shred

Status

Shred 1.0.x is a complete reboot of the project. The goal is the same, but the interface has changed drastically. Shred 1.0.x is not backwards compatible with earlier versions.

Shred is presently alpha quality code, as designated by the version suffix.

Description

HTTP is a rich protocol, but the low-level details of setting headers and checking response codes muck up our code. So we either ignore these nuances or write wrapper functions to hide the details.

Shred makes it easy to declaratively create API wrapper functions. Shred also features support for URL templates, response compression, authorization, and streaming responses. And more features are coming soon!

API

  • resource: Define a resource. Takes a URL, path, or URL template, and an interface declaration.

Interface Definitions

An interface definition is an object whose properties define subresources or request methods on the resource being defined.

If a property contains an object, it will be defined as a request method described by that object.

If it contains a function, the property will define a subresource for which the function is the initializer.
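
The two rules above amount to a type dispatch on each property. A minimal sketch in plain JavaScript (illustrative only, not Shred's internals):

```javascript
// Illustrative dispatch rule: object-valued properties describe request
// methods; function-valued properties are subresource initializers.
function classify(definition) {
  const result = {};
  for (const name of Object.keys(definition)) {
    result[name] =
      typeof definition[name] === "function" ? "subresource" : "request method";
  }
  return result;
}

classify({
  issues: function (resource) { /* subresource initializer */ },
  list: {method: "get", expect: 200}
});
// → { issues: "subresource", list: "request method" }
```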

Request Method Descriptions

Request method descriptions can include three properties:

  • method: The HTTP method used to make the request, ex: GET, PUT.

  • headers: An object describing the HTTP headers to send with the request, ex: content-type, accept.

  • expect: The expected HTTP status code, ex: 200. Other status codes will be treated as errors.

Subresource Initializer Functions

The signature for resource initializers is:

(resource) ->

The resource function passed into an initializer works just like the resource function exported by Shred. You can use it to define the subresource.

Typically, you'll want to use paths or URL templates to define subresources, which are concatenated with the parent resource's URL.

Subresources defined using URL templates will be functions, allowing you to pass in an object whose key-value pairs will instantiate the template.
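
Template instantiation itself can be sketched like this (an illustration of the idea, not Shred's actual implementation; only simple `{name}` placeholders are assumed):

```javascript
// Replace each {name} placeholder with the URL-encoded value from the
// parameters object; complain if a parameter is missing.
function expand(template, params) {
  return template.replace(/\{(\w+)\}/g, function (match, name) {
    if (!(name in params)) throw new Error("missing parameter: " + name);
    return encodeURIComponent(params[name]);
  });
}

expand("repos/{owner}/{repo}/", {owner: "dyoder", repo: "shred"});
// → "repos/dyoder/shred/"
```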

Shred Responses

When you invoke a Shred request function, you get back a response context. This contains both the HTTP response object and a data object. The data object is a promise that resolves when the response body has finished streaming. If the content-type appears to be JSON-based, Shred parses the response body as JSON and returns an object; otherwise, it returns a string.
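
The content-type check might look something like this (an assumption about the behavior described above, not Shred's exact logic):

```javascript
// Treat any media type ending in "/json" or "+json" as JSON-based:
// parse the body into an object; otherwise return the raw string.
function parseBody(contentType, body) {
  const mediaType = (contentType || "").split(";")[0].trim();
  return /[/+]json$/.test(mediaType) ? JSON.parse(body) : body;
}

parseBody("application/vnd.github.v3.raw+json", '{"number": 1}');
// → { number: 1 }
parseBody("text/html", "<p>hi</p>");
// → "<p>hi</p>"
```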

Explicit Invocation

You can explicitly invoke Shred request functions using the invoke command. This is particularly useful when using helper functions defined on the request function, such as authorize.

Authorization With Shred

You can use the authorize method defined on request functions to pass authorization information into the request. The authorize function takes an object describing the authorization scheme. Currently, basic and bearer are supported.

Example

try
  yield github
    .repo {owner, repo}
    .issues
    .create
    .authorize basic: {username: token, password: ""}
    .invoke {title, body}

shred's People

Contributors

andyburke, aredridel, automatthew, ayoung, brbrady, cwholt, dyoder, fehguy, freeformflow, g23, jmcintire, jnugh, kapouer, kaven276, lancelakey, nlacasse, relictm, thedaniel


shred's Issues

Leaking listeners for socket event "timeout"

warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.

at shred/lib/shred/request.js:400:25

Is there an easy way to fix this? Should intense shred usage call setMaxListeners(0) by default?

Should allow registering custom processors

Adding the following line at the end of shred.js is enough:

  Shred.registerProcessor = require('./shred/content.js').registerProcessor;

Then you can register a processor like this:

var qs = require('querystring');
Shred.registerProcessor(
   ["application/x-www-form-urlencoded"],
   { parser : qs.parse, stringify : qs.stringify });

Then you can make a request like this:

headers['Content-Type'] = "application/x-www-form-urlencoded";
shred.post({
  url: loginURL,
  headers: headers,
  content: {
    phoneNum: username,
    password: password,
    preLoginUrl: ''
  },
  on: {
    // ...
  }
});
Seeing weirdness with Buffers in latest code...

I used to be able to do:

console.log( response.content.body );

And get a string version of the body, which was useful for debugging. Now with the buffer change, I'm seeing a lot of:

<Buffer 22 59 6f 75 ... 72 6c 2e 22>

What's the proper way to get:

  1. A string representation of the response body?
  2. A parsed representation of the response body? (Sometimes response.content.data seems to work, other times, it seems just to be a buffer...)

socket hang up on 302 response

When making a request to a Rails app, it responds with a 302, but shred bails with a "Request failed: socket hang up" error.

Adding a "request_error" handler doesn't trap the error.

If I call the Rails app directly, I get these headers:

X-Ua-Compatible IE=Edge
Location http://localhost:3000/auth/admin
Connection Keep-Alive
Content-Type text/html; charset=utf-8
Date Wed, 21 Dec 2011 01:51:04 GMT
Server WEBrick/1.3.1 (Ruby/1.8.7/2011-02-18)
X-Runtime 0.025210
Content-Length 98
Cache-Control no-cache

Not sure where the problem is... I would think Shred would handle this as a redirect?

Cannot handle malformed cookies

I'm getting an exception when using shred, on line 40:

/app/node_modules/shred/node_modules/cookiejar/cookiejar.js:72                   
                  , key=pair[1].trim().toLowerCase()                             
TypeError: Cannot read property '1' of null                                      
    at Cookie.parse (/app/node_modules/shred/node_modules/cookiejar/cookiejar.js:72:17)
    at new Cookie (/app/node_modules/shred/node_modules/cookiejar/cookiejar.js:28:16)
    at Cookie (/app/node_modules/shred/node_modules/cookiejar/cookiejar.js:32:16)
    at CookieJar.setCookies (/app/node_modules/shred/node_modules/cookiejar/cookiejar.js:217:16)
    at Object.<anonymous> (/app/node_modules/shred/lib/shred/response.js:40:23)  
    at ClientRequest.<anonymous> (/app/node_modules/shred/lib/shred/request.js:340:16)
    at ClientRequest.emit (events.js:67:17)                                      
    at HTTPParser.onIncoming (http.js:1225:11)
    at HTTPParser.onHeadersComplete (http.js:102:31)
    at Socket.ondata (http.js:1124:24)

It's running in a script with thousands of URLs, so I couldn't tell you the URL unfortunately.

The exception occurs because the pattern the cookie is looking for isn't matched, which means the cookie must be malformed.
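
A defensive fix along the lines the reporter implies might look like this (hypothetical parseCookiePair helper, not cookiejar's actual code):

```javascript
// Return null for a malformed Set-Cookie header instead of indexing
// into a null match result (the cause of the TypeError above).
function parseCookiePair(header) {
  const match = /^\s*([^=;\s]+)=([^;]*)/.exec(header);
  if (!match) return null; // malformed: no name=value pair
  return {name: match[1], value: match[2].trim()};
}

parseCookiePair("sid=abc123; Path=/");   // → { name: "sid", value: "abc123" }
parseCookiePair("garbage without pair"); // → null
```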

Browserify Fails to Run - Cannot find shred.js

I'm having problems getting Browserify to bundle Shred. See below (on Windows 7) after npm install:

D:\shred>cake bundle
path.existsSync is now called `fs.existsSync`.

D:\shred\node_modules\browserify\lib\wrap.js:352
            throw new Error('Cannot find module ' + JSON.stringify(mfile)
                  ^
Error: Cannot find module "D:\\shred\\lib\\shred.js" from directory "D:\\shred\\
lib"
  at EventEmitter.Wrap.require (D:\shred\node_modules\browserify\lib\wrap.js:352
:19)
  at include (D:\shred\node_modules\browserify\lib\wrap.js:312:14)
  at Array.forEach (native)
  at EventEmitter.Wrap.requireMultiple (D:\shred\node_modules\browserify\lib\wra
p.js:290:16)
  at EventEmitter.Wrap.require (D:\shred\node_modules\browserify\lib\wrap.js:324
:21)
  at module.exports (D:\shred\node_modules\browserify\index.js:117:7)
  at Object.TaskHelpers.makeBundle (D:\shred\Cakefile:39:14, <js>:64:16)
  at Object.action (D:\shred\Cakefile:10:3, <js>:21:24)
  at helpers.extend.invoke (C:\Users\kevin.deenanauth\AppData\Roaming\npm\node_m
odules\coffee-script\lib\coffee-script\cake.js:44:26)
  at Object.exports.run (C:\Users\kevin.deenanauth\AppData\Roaming\npm\node_modu
les\coffee-script\lib\coffee-script\cake.js:70:21)
  at Object.<anonymous> (C:\Users\kevin.deenanauth\AppData\Roaming\npm\node_modu
les\coffee-script\bin\cake:7:38)
  at Module._compile (module.js:456:26)
  at Object.Module._extensions..js (module.js:474:10)
  at Module.load (module.js:356:32)
  at Function.Module._load (module.js:312:12)
  at Function.Module.runMain (module.js:497:10)
  at startup (node.js:119:16)
  at node.js:901:3


D:\shred>

I've validated shred.js exists in d:\shred\lib\

Thanks!

Remove support for multiple listeners for single event

Viz.

  A GET request with multiple "response" handlers
    ✓ should be able to have multiple handlers
    ✓ should run all of the handlers

  A request using status names and status codes
    ✓ should run both the status code handler and the status name handler
    ✓ should only run lowercased handlers

gzip support

If a response comes back with "Content-Encoding: gzip", then the body of the response should be gunzipped.

This will require node 0.6, so that we can use the native zlib library:
http://nodejs.org/docs/latest/api/zlib.html

In response.js lines 26-32 we set the body of the response by calling "new Content" and passing in the Content-Type. We should also pass in the "Content-Encoding".

Inside content.js, we need to check the content-encoding, and do the gunzipping if it is "gzip".

You can make a request that asks for gzip content like this:
shred.get({
  url: "http://api.spire.com",
  headers: {
    accept: "application/json",
    "Accept-Encoding": "gzip"
  },
  on: {
    response: function (d) { console.log(d); }
  }
});

Right now that request logs a bunch of gzipped garbage. Once this ticket is fixed, that should log text.

We should make a test for this too.

Make Request an EventEmitter

You have this api where you can do

shred.request({
  ...
  on: { ... }
});

It would be nice if you could do

shred.request({
  ...
}).on(...).on(...);

instead. This would require making request an EventEmitter.

This basically means not having an .emitter property, but instead having Request inherit from EventEmitter.

Port should default to window.location.port, not 80, if not specified

The port should not default to port 80 if not specified in a request. Instead it should default to the value obtained from window.location.port unless it is undefined in which case it should default to 80.

This snippet:

port: {
    get: function() {
      if (!this._port) {
        switch(this.scheme) {
          case "https": return this._port = 443;
          case "http":
          default: return this._port = 80;
        }
      }
      return this._port;
    },
    set: function(value) { this._port = value; return this; },
    enumerable: true
  },

Conflicts with this snippet:

http.request = function (params, cb) {
    if (!params) params = {};
    if (!params.host) params.host = window.location.host.split(':')[0];
    if (!params.port) params.port = window.location.port;

    var req = new Request(new xhrHttp, params);
    if (cb) req.on('response', cb);
    return req;
};

When issuing a request using a relative path that omits the scheme, host, and port, the library correctly obtains the host from window.location.host but does not obtain the port from window.location.port.
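
The proposed fallback order can be sketched as (hypothetical defaultPort helper; the location argument stands in for window.location):

```javascript
// Prefer the port the page was served from; only fall back to the
// scheme default when window.location.port is empty or absent.
function defaultPort(scheme, location) {
  if (location && location.port) return Number(location.port);
  return scheme === "https" ? 443 : 80;
}

defaultPort("http", {host: "localhost", port: "8888"}); // → 8888
defaultPort("http", {host: "example.com", port: ""});   // → 80
defaultPort("https", undefined);                        // → 443
```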

response.content._body.length

... is the only (wicked) way I found to access the real length property of the response. Doing anything else returns an unusable length.

This matters when I accept JSON but the response is empty. If I do

response.content.data

it triggers an exception, because it calls JSON.parse on an empty string.

wrong content-length with accentuated characters

Hi,
this will eat the last char of the stringified request body :

shred.put({
    url: someurl,
    content: {
        test: "testé"
    },
    headers: {
        accept: "application/json",
        content_type: "application/json"
    }
});

Because (new Buffer('testé')).length == 6 and 'testé'.length == 5
A crude fix is replacing in content.js

get: function() { return this.body.length; }

by

get: function() { return new Buffer(this.body).length; }

Regards.
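
The discrepancy the reporter describes is easy to check directly:

```javascript
// String#length counts UTF-16 code units; Content-Length must count
// bytes. Buffer.byteLength (or building a Buffer, as the fix above
// does) gives the byte count.
const body = JSON.stringify({test: "testé"});

body.length;             // 16 characters
Buffer.byteLength(body); // 17 bytes ("é" is two bytes in UTF-8)
```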

Shred is broken by [email protected]

We have a Node.js project that was working but broke after an "npm update". Further investigation revealed that the only thing that changed is that shred's dependency on "ax >= 0.1.5" means ax updated to [email protected], and now even simple things break, e.g.:

var shred = require('shred');

var s = new shred();

var req = s.get({
  url: "http://www.google.co.uk/",
  on: {
    // If we get a successful response
    200: function (response) {
      console.log(response);
    }
  }
});

/Users/gbird/projects/wolf/node_modules/shred/node_modules/ax/logger.coffee:1
ction (exports, require, module, __filename, __dirname) { {sprintf} = require
^
SyntaxError: Unexpected token =
at Module._compile (module.js:437:25)
at Object.Module._extensions..js (module.js:467:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:362:17)
at require (module.js:378:17)
at Object. (/Users/gbird/projects/wolf/node_modules/shred/lib/shred.js:10:10)
at Module._compile (module.js:449:26)
at Object.Module._extensions..js (module.js:467:10)
at Module.load (module.js:356:32)

I suspect the bug is in "ax", but it makes Shred unusable.

Attach original request to shred error

This way, if an error bubbles up from within some nested Shred requests, we can easily tell which request caused the error.

current:

{ message: 'ResponseError: 406',
  response: 
   <Shred Response>  406
   - Headers:
   ,
  status: 406 }

proposed:

{ message: 'ResponseError: 406',
  response: 
   <Shred Response>  406
   - Headers:
   ,
  status: 406,
  request: <original Shred request>
 }

I'd like to use your library but....

You show a simple HTTP GET. However, you don't mention having to provide a CookieJar with that request, so it is not simple. I've been going over this stuff trying to make a "simple" HTTP request, but to no avail, since I keep getting this obnoxious error: ReferenceError: CookieAccessInfo is not defined. If this cookie stuff is needed, then your "simple" HTTP request becomes NOT simple.

Problem using a proxy

Submitted by Fuli via email

Hi,

I have used your Node.js module "shred" for a while; it is pretty good. Recently, I wanted to add proxy support to my project, but I found it does not work when I set the "proxy" attribute in the options: the request is sent to the proxy server, but no response comes back. In a network monitor, the HTTP request sent to the proxy looks unreadable.

The proxy server I used is CCProxy.

I logged the request headers in shred/lib/request.js just before it calls http.request(); they are as follows:

reqParams = {
    host: '10.10.7.252',
    port: '808',
    method: 'get',
    path: 'http://10.10.7.64:2000/lib/defaultsite/publish/50dd19a9f11bc57013000012/playlist_20130828_20130828.publ',
    headers: {
        'User-Agent': 'Shred',
        Connection: 'keep-alive',
        Accept: '*/*',
        'If-Modified-Since': '',
        Cookie: 'connect.sid=s%3A5ufL%2BBYOw41Gtnz71YalDECz.MC%2FXf8wWZe12bzF1Pm7q4RmvMrgaGk6HymceW3AQvgU'
    },
    scheme: '',
    agent: undefined
}

With my test code, which just calls http.request with the above request data, I get the correct data in the response event. The source code is:

this.sendRequestToServer = function (urlString, cookie, proxy, callback) {
    var req = null;
    var urlObj = {};
    var postString = '';
    var that = this;
    var authBuffer = null;
    var bufferArray = [];
    var bufferIndex = 0;
    var finalBuffer = null;
    var status = false;
    var returncookie = '';
    var bufferSize = 0;

    if (!urlString || (urlString.indexOf('http') !== 0)) {
        return callback(4,
            '');
    }
    if (proxy && proxy.host && proxy.port) {
        urlObj.host = proxy.host;
        urlObj.port = proxy.port;
        urlObj.headers = {};
        urlObj.path = urlString;
    } else {
        urlObj = url.parse(urlString);
        urlObj.headers = {};
    }

    urlObj.method = 'get';
    urlObj.headers.connection = 'keep-alive';
    urlObj.headers.accept = 'application/json, text/javascript, */*; q=0.01';
    urlObj.headers['Content-Type'] = 'application/x-www-form-urlencoded';
    urlObj.headers.Cookie = cookie;

    req = http.request(urlObj, function (res) {
        res.setEncoding('utf8');
        res.on('data', function (chunk) {
            if (res.statusCode === 200) {
                bufferArray[bufferIndex] = chunk;
                bufferIndex++;
                bufferSize += chunk.length;

                if (res.headers['set-cookie']) {
                    returncookie = res.headers['set-cookie'];
                }
                status = true;
            } else {
                status = false;
            }
        });

        res.on('end', function () {
            if (status) {
                if (bufferIndex === 0) { //never download at all
                    return callback(11, '', returncookie);
                } else {
                    if (bufferIndex === 1) {
                        finalBuffer = bufferArray[0];
                    } else {
                        finalBuffer = Buffer.concat(bufferArray, bufferSize);
                    }
                    return callback(0, finalBuffer, returncookie);
                }
            } else {
                return callback(11, res.statusCode, '');
            }
        });
        res.on('close', function (err) {
            console.log('receive close event.' + err);
        });
    });

    req.on('error', function (e) {
        console.log('receive error event.');
        console.log(e);
        return callback(11, '', '');
    });

    req.end();
}

I do not know where the problem is in shred; can you help me?

error in browser

shred throws the following error when making a request from the browser:

TypeError: 'undefined' is not an object (evaluating 'STATUS_CODES[response.status]')

looks like this is because http-browserify doesn't define status codes which the node core lib does.
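
A defensive lookup along those lines (a hypothetical fix, not the project's actual patch):

```javascript
// Fall back to the numeric status when the environment (e.g.
// http-browserify) does not provide http.STATUS_CODES.
function statusMessage(STATUS_CODES, status) {
  return (STATUS_CODES && STATUS_CODES[status]) || String(status);
}

statusMessage({200: "OK"}, 200); // → "OK"
statusMessage(undefined, 404);   // → "404"
```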

response.content.body is still confusing to me...

Today, I tried to use response.content.body from a response that had a content type of json. I expected that to be the json string (and I would expect response.content.data to be the object). Instead, it seemed to be a stream or large array of bytes. Eventually, I relented and just used .data (and fixed up the code that was expecting some json to parse).

Either I'm missing something that's not made clear in the docs or there's something not working when getting response.content.body from a response with a content type of json.

include port in relative URLs

right now a relative url maps to the hostname without the port

http.get
  url: "/content/hello-world.json"
  on:
    200: (response) -> 

yields

GET "http://localhost/content/hello-world.json"

instead of

GET "http://localhost:8888/content/hello-world.json"

Duplicate request headers shown in debug output.

Duplicate case-insensitive headers are being sent. For example, a request sends both "Accept" and "accept" headers when the headers option is set with lower-case header names. I suspect the changes that resolved Issue #54 are causing the problem.

shred = new Shred ({ logCurl: true }); // to log the headers being sent

shred.get({....
headers: {accept: "text/html"}

---- log:
curl -X GET http://.... -H "accept: text/html" -H "Accept: text/html"

Support charset in Content object

Right now we assume the charset is UTF-8. It would be cool if you could set the charset of the Content object when you constructed it.
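
A sketch of the suggested option (hypothetical decodeContent helper; real support would also have to parse the charset out of the Content-Type header):

```javascript
// Decode the body buffer with the caller-supplied charset, defaulting
// to UTF-8 as Shred does today.
function decodeContent(buffer, charset) {
  return buffer.toString(charset || "utf8");
}

decodeContent(Buffer.from([0x74, 0x65, 0x73, 0x74, 0xe9]), "latin1");
// → "testé"
```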

Introduce a request error event type

We don't emit an event presently for the case where the request errors out before generating a response. On the one hand, we don't want error handlers to have to check for a null value for the response. On the other hand, we should generate a catchable event here, with the idea that this would never be an expected result. So right now we simply log the error. Other options would be to raise an exception or generate a different kind of event.

no effect from timeout

node v0.8.8, shred 0.8.3

shred.get({
  url: url,
  headers: headers,
  timeout: 2000,
  on: {
    error: function (response) {
      log.info("error cb", response);
    }
  }
});

I have yet to see that log line print, and instead I get neverending hangs. #28 seems like it was trying to fix and test the timeout system.

Make corset-case conversions optional?

Hi, great project! I'm currently integrating shred into swagger.js so the swagger client library is portable between node.js and the browser. So far everything with shred has gone great.

I did hit an issue today where the conversion to corsetCase applies to ALL headers, which means names like "api_key" become "Api-Key". This causes some clients to break since they expect the underscore and lower-case convention. While I can patch this for browser use, my goal is to include shred as an npm dependency, which means it will behave differently when pulled directly from npmjs.org.

Is there a way that this conversion can be configured away, or made optional?

CookieJar 1.3.1 breaks Shred

CookieAccessInfo was exposed in 1.3.0 as a global, and Shred is using it thusly. The variable was made local in CookieJar 1.3.1, which causes Shred to fail.
