
lua-http

HTTP library for Lua.

Features

  • Optionally asynchronous (including DNS lookups and TLS)
  • Supports HTTP(S) version 1.0, 1.1 and 2
  • Functionality for both client and server
  • Cookie Management
  • Websockets
  • Compatible with Lua 5.1, 5.2, 5.3, 5.4 and LuaJIT

Documentation

Can be found at https://daurnimator.github.io/lua-http/


Installation

It's recommended to install lua-http using LuaRocks; this will automatically install run-time Lua dependencies for you.

$ luarocks install http
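A minimal client request, sketched from the API used throughout the issues below (request.new_from_uri, :go, get_body_as_string); example.com is a placeholder host and network access is required:

```lua
-- Minimal blocking GET request; "http://example.com" is a placeholder URI.
local request = require "http.request"

local req = request.new_from_uri("http://example.com")
local headers, stream = assert(req:go(10)) -- 10 second timeout
local body = assert(stream:get_body_as_string())
print(headers:get(":status"), #body)
```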

Dependencies

To use gzip compression you need one of:

To check cookies against a public suffix list:

If using Lua < 5.3 you will need:

If using Lua 5.1 you will need:

For running tests:

Development

Getting started

  • Clone the repo:

    $ git clone https://github.com/daurnimator/lua-http.git
    $ cd lua-http
    
  • Install dependencies

    $ luarocks install --only-deps http-scm-0.rockspec
    
  • Lint the code (check for common programming errors)

    $ luacheck .
    
  • Run tests and view coverage report (install tools first)

    $ busted -c
    $ luacov && less luacov.report.out
    
  • Install your local copy:

    $ luarocks make http-scm-0.rockspec
    

Generating documentation

Documentation is written in markdown and intended to be consumed by pandoc. See the doc/ directory for more information.

lua-http's People

Contributors

aleclarson, daurnimator, leafo, mehgugs, pspacek, russellhaley, vavrusa


lua-http's Issues

Should we use less strict URI parsing?

At the moment URI parsing is strict: it doesn't allow forbidden characters like {.

Should things like request.new_from_uri normalise to the encoded form?
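One possible meaning of "normalise to the encoded form", sketched as a standalone helper (encode_forbidden is hypothetical, not lua-http API): percent-encode any byte outside RFC 3986's unreserved and reserved sets.

```lua
-- Hypothetical sketch: percent-encode characters that RFC 3986 forbids in a
-- URI, leaving unreserved and reserved characters untouched.
local function encode_forbidden(uri)
  return (uri:gsub("[^%w%-%.%_%~%!%$%&%'%(%)%*%+%,%;%=%:%/%?%#%[%]%@%%]", function(c)
    return string.format("%%%02X", string.byte(c))
  end))
end

print(encode_forbidden("http://example.com/{id}"))
-- "http://example.com/%7Bid%7D"
```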


HTTP2: Check for malformed stream due to content-length

h2spec failure:

  8.1. HTTP Request/Response Exchange
    8.1.2. HTTP Header Fields
      8.1.2.6. Malformed Requests and Responses
        × Sends a HEADERS frame that contains the "content-length" header field which does not equal the sum of the DATA frame payload lengths
          - The endpoint MUST respond with a stream error of type PROTOCOL_ERROR.
            Expected: GOAWAY frame (ErrorCode: PROTOCOL_ERROR)
                      RST_STREAM frame (ErrorCode: PROTOCOL_ERROR)
                      Connection close
              Actual: DATA frame (Length: 3, Flags: 1)
        × Sends a HEADERS frame that contains the "content-length" header field which does not equal the sum of the multiple DATA frame payload lengths
          - The endpoint MUST respond with a stream error of type PROTOCOL_ERROR.
            Expected: GOAWAY frame (ErrorCode: PROTOCOL_ERROR)
                      RST_STREAM frame (ErrorCode: PROTOCOL_ERROR)
                      Connection close
              Actual: DATA frame (Length: 3, Flags: 1)

HTTP2: dynamic table size update MUST occur at the beginning of the first header block following the change to the dynamic table size

h2spec failure:

   4.3. Header Compression and Decompression
    × Encodes Dynamic Table Size Update (RFC 7541, 6.3) after common header fields
      - The endpoint MUST terminate the connection with a connection error of type COMPRESSION_ERROR.
        Expected: GOAWAY frame (ErrorCode: COMPRESSION_ERROR)
                  Connection close
          Actual: DATA frame (Length: 3, Flags: 1)

HPACK Section 4.2:

This dynamic table size update MUST occur at the beginning of the first header block following the change to the dynamic table size. In HTTP/2, this follows a settings acknowledgment.
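The check a decoder needs can be sketched in plain Lua over a decoded list of HPACK opcodes ("dtsu" is a made-up marker for a dynamic table size update; this is illustrative, not lua-http's decoder):

```lua
-- Sketch: a dynamic table size update is only valid at the start of a
-- header block; one appearing after any header field is a COMPRESSION_ERROR.
local function dtsu_position_ok(ops)
  local seen_field = false
  for _, op in ipairs(ops) do
    if op == "dtsu" then
      if seen_field then return false end -- must terminate the connection
    else
      seen_field = true
    end
  end
  return true
end

assert(dtsu_position_ok({"dtsu", "field", "field"}))
assert(not dtsu_position_ok({"field", "dtsu"}))
```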

luaossl build fails on FreeBSD 11 rc2

russell@freebsd11rc2:~/git% sudo luarocks install --server=http://luarocks.org/dev http
Installing http://luarocks.org/dev/http-scm-0.rockspec
Missing dependencies for http scm-0:
   luaossl >= 20150727 (not installed)
   basexx >= 0.2.0 (not installed)
   lpeg_patterns >= 0.2 (not installed)
   fifo (not installed)

http scm-0 depends on luaossl >= 20150727 (not installed)
Installing https://luarocks.org/luaossl-20151221-0.src.rock
cc -O2 -fPIC -I/usr/local/include -c src/openssl.c -o src/openssl.o -D_REENTRANT -D_THREAD_SAFE -D_GNU_SOURCE -DLUA_COMPAT_APIINTCASTS -I/usr/local/include -I/usr/local/include
cc -shared -o _openssl.so -L/usr/local/lib src/openssl.o -L/usr/local/lib -L/usr/local/lib -lssl -lcrypto -lpthread -ldl -lm
/usr/bin/ld: cannot find -ldl
cc: error: linker command failed with exit code 1 (use -v to see invocation)

Error: Failed installing dependency: https://luarocks.org/luaossl-20151221-0.src.rock - Build error: Failed compiling module _openssl.so

Linker problem? I think I had to take a -ldl out of something else I was building once. I don't know where luarocks puts the source so I can't check the makefile. The makefile from wahern is pretty advanced for me, I couldn't find any linker flags.

Start making releases

Could you please start making releases (git tags and LuaRocks)? I’d like to make an Alpine package for this module, but it’s kinda problematic to make a package for something that doesn’t have any version number.

Segfault when using HTTPS on requestb.in

This example, which works with cURL:

  local uri = "https://requestb.in/XXXXXX"
  local req_body = "ljmsjkfdm"
  local req_timeout = 10

  local request = require "http.request"

  local req = request.new_from_uri(uri)
  if req_body then
    req.headers:upsert(":method", "POST")
    req:set_body(req_body)
  end

  print("# REQUEST")
  print("## HEADERS")
  for k, v in req.headers:each() do
    print(k, v)
  end
  print()
  if req.body then
    print("## BODY")
    print(req.body)
    print()
  end

  print("# RESPONSE")
  local headers, stream = req:go(req_timeout)

Upserting a Content-Type header causes an error with HTTP/2 and HTTPS websites

Tried with multiple https/http2 websites, including http2.golang.org and my own website

package.path = package.path .. ";./libs/?.lua;./libs/?/init.lua"
package.cpath = package.cpath .. ";./libs/?.so"
local request = require "http.request"

-- This endpoint returns a never-ending stream of chunks containing the current time
local req = request.new_from_uri("https://web.site/login")
req.headers:upsert(":method", "POST")
req.headers:upsert("Content-Type", "application/x-www-form-urlencoded") -- adding this makes the example crash
req:set_body("Name=Bla&LastName=Bla")
local _, stream = assert(req:go())
local body, err = stream:get_body_as_string()
print(body)

I worked around this by appending the header server-side; here is why I needed the header in the first place:
http://stackoverflow.com/questions/39122458/get-post-data-using-golang-http-package

cqueues and luaossl rockspecs ignore LDFLAGS, broken on OS X

Couldn't find the project dedicated to luaossl/cqueues rockspec dependencies, apologies if this is the wrong place.

Problem

The rockspecs for luaossl/cqueues link against a mismatched OpenSSL version on OS X.

Why

OS X since El Capitan doesn't ship OpenSSL headers, so they have to be installed separately.
The rockspec accepts OPENSSL_DIR and CRYPTO_DIR variables to set CFLAGS/LDFLAGS, but it only sets CFLAGS. As a result, the module is built with mismatched header and library versions and crashes.

How to reproduce

(Same thing for cqueues/openssl)

$ cd cloned_cqueues
$ make CC=cc CFLAGS=-I/usr/local/opt/openssl/include
$ otool -L src/5.1/cqueues.so
src/5.1/cqueues.so:
    /usr/lib/libssl.0.9.8.dylib (compatibility version 0.9.8, current version 0.9.8)
    /usr/lib/libcrypto.0.9.8.dylib (compatibility version 0.9.8, current version 0.9.8)

How to build it right

$ make CC=cc CFLAGS=-I/usr/local/opt/openssl/include LDFLAGS="-L/usr/local/opt/openssl/lib"
$ otool -L src/5.1/cqueues.so
src/5.1/cqueues.so:
    /usr/local/opt/openssl/lib/libssl.1.0.0.dylib (compatibility version 1.0.0, current version 1.0.0)
    /usr/local/opt/openssl/lib/libcrypto.1.0.0.dylib (compatibility version 1.0.0, current version 1.0.0)

lbase64 for Lua 5.0-5.2, not 5.3

Can we use a Lua 5.3-compatible fork of lbase64 or use something else? If not, could we include installation instructions for Lua 5.3 compatibility?

Able to send multipart/form-data

As requested by @fur-q

2016-01-27 13:29:47 daurnimator furq: so the other day I played with multipart form encoding
2016-01-27 13:29:53 daurnimator furq: I'm wondering how you envision the API
2016-01-27 13:31:49 furq er
2016-01-27 13:32:19 furq i guess the same as a normal post but allowing for file handles as values
2016-01-27 13:33:31 furq i take it the api for a form-urlencoded post is request(url, { key = "value" })
2016-01-27 13:33:34 daurnimator furq: so this is where I got up to: https://github.com/daurnimator/lua-http/compare/WIP-multipart-form-data?expand=1
2016-01-27 13:33:34 furq or something similar
2016-01-27 13:34:04 daurnimator furq: the thing about those is that you can provide all sort of fun things in the parts: they all have their own headers section
2016-01-27 13:34:32 daurnimator a section usually has a least content-type. but also often content-disposition
2016-01-27 13:34:40 furq oh yeah
2016-01-27 13:34:58 furq it's been ages since i've touched any of this
2016-01-27 13:35:32 daurnimator furq: so the current encoding function I've got there, you provide an iterator that returns headers, body pairs
2016-01-27 13:36:00 furq it would be nice if you could just do a post with a file handle and have it automatically generate multipart with the file set to application/octet-stream
2016-01-27 13:36:09 daurnimator where body can be a string, a file object, or another iterator...
2016-01-27 13:36:19 furq as far as manual control goes you probably know better than i do
2016-01-27 13:36:58 daurnimator (because multipart/form-data sometimes contain another multipart/form-data as one of their parts.... its often very nested)
2016-01-27 13:37:28 furq i can see why so many http libraries don't bother with this
2016-01-27 13:38:23 furq also isn't the content-disposition always form-data
2016-01-27 13:38:54 daurnimator furq: no. it can be e.g. attachment
2016-01-27 13:38:56 furq oh never mind
2016-01-27 13:39:20 furq for forms it's always form-data but it can also be 'form-data; name="foo"' or whatever
2016-01-27 13:39:38 daurnimator furq: except for file uploads.
2016-01-27 13:40:01 furq that's not what this w3 page says
2016-01-27 13:40:17 daurnimator in which case it can be attachment; name="foo"; filename=somethingascii; filename*=someunicodeencodingmess
2016-01-27 13:40:38 furq maybe the rfc says different
2016-01-27 13:41:06 furq actually the rfc doesn't mention that either
2016-01-27 13:41:31 daurnimator furq: there's a whole IANA registry for it
2016-01-27 13:41:38 furq nice
2016-01-27 13:41:41 daurnimator furq: https://www.iana.org/assignments/cont-disp/cont-disp.xhtml
2016-01-27 13:42:22 furq "attachment" says it's for emails
2016-01-27 13:43:35 daurnimator furq: splitting hairs here; but nothing says that you can't serve an email over HTTP.
2016-01-27 13:43:39 zash You can do attachment in HTTP too, I think browsers ask you were to save it then
2016-01-27 13:43:42 furq "multipart/form-data" contains a series of parts. Each part is expected to contain a content-disposition header [RFC 2183] where the disposition type is "form-data", and where the disposition contains an (additional) parameter of "name", where the value of that parameter is the original field name in the form.
2016-01-27 13:43:43 daurnimator furq: perhaps you're confusing HTML vs HTTP?
2016-01-27 13:43:45 daurnimator zash: yep.
2016-01-27 13:44:21 furq i'm only interested in multipart/form-data
2016-01-27 13:44:33 daurnimator furq: e.g. https://support.microsoft.com/en-us/kb/260519
2016-01-27 13:44:37 zash Uploading files uses some disposition in a form-data thing too
2016-01-27 13:44:40 furq i wasn't aware any other multipart http requests existed, but you can always count on http to have some fucking dismal hole
2016-01-27 13:45:11 furq er
2016-01-27 13:45:18 furq isn't that for responses?
2016-01-27 13:45:26 daurnimator furq: well that link is.
2016-01-27 13:45:31 TheCycoONE a bunch of multiparts
2016-01-27 13:45:43 furq that has nothing to do with multipart though
2016-01-27 13:45:48 daurnimator furq: but why does that matter? this code path is hit for both client and server.
2016-01-27 13:46:02 zash https://en.wikipedia.org/wiki/MIME#Mixed-Replace is fun btw
2016-01-27 13:46:45 zash Server push from 1998 :)
2016-01-27 13:46:49 furq oh christ, multipart responses?
2016-01-27 13:46:55 furq that sounds really awful
2016-01-27 13:47:01 zash furq: No it's awesome
2016-01-27 13:47:17 zash Especially the mixed-replace one, which means you can animate ASCII and stuff
2016-01-27 13:47:36 TheCycoONE saves a lot of tcp sessions
2016-01-27 13:47:51 daurnimator furq: so at this point; I've very tempted to say "get a MIME library; not lua-http's problem"
2016-01-27 13:48:14 furq form-data posts are pretty common though
2016-01-27 13:48:15 zash daurnimator: luasocket includes one...
2016-01-27 13:48:25 furq i don't care if you ignore all the other edge cases that seemed like a good idea in 1998
2016-01-27 13:48:27 daurnimator zash: luasocket also includes an SMTP library.
2016-01-27 13:48:56 zash and base64 encoding
2016-01-27 13:49:08 TheCycoONE ... why not a luacurl that wraps everything curl supports?
2016-01-27 13:49:10 furq apparently browsers don't even support multipart/mixed etc any more
2016-01-27 13:49:18 furq at least firefox dropped it ages ago
2016-01-27 13:49:22 daurnimator infact, IIRC the luasocket MIME routines are locked away in the SMTP section, and can't readily be used by HTTP clients.
2016-01-27 13:49:42 TheCycoONE I think the WAP Push spec requires multipart/mixed
2016-01-27 13:49:43 zash daurnimator: eh, just require"mime"?
2016-01-27 13:49:53 zash WAP? Hahahahaha
2016-01-27 13:50:01 daurnimator TheCycoONE: WAP is something else entirely
2016-01-27 13:50:10 furq oh nvm that's in XHR
2016-01-27 13:51:07 furq even so i doubt it's worth your time supporting that on the server side
2016-01-27 13:51:15 zash Yet I still have no idea how to respond to a form POST with 2xx without the browser re-posting if you refresh the page...
2016-01-27 13:51:20 furq even if you've already written a multipart encoder
2016-01-27 13:51:46 daurnimator zash: the multipart stuff is locked up behind smtp.message()
2016-01-27 13:51:51 daurnimator zash: use a 303.
2016-01-27 13:52:30 TheCycoONE zash, PRG pattern (https://en.wikipedia.org/wiki/Post/Redirect/Get) - what daurnimator said
2016-01-27 13:52:34 zash daurnimator: but that's not very RESTful
2016-01-27 13:52:38 TheCycoONE ah
2016-01-27 13:52:56 rjek What is this, hipster central? Get the fuck out with your REST
2016-01-27 13:53:00 zash also already did that http://hg.prosody.im/issue-tracker/rev/cd49d6c23da3
2016-01-27 13:53:10 TheCycoONE use PUT in rest for things that you don't want duplicated
2016-01-27 13:53:24 TheCycoONE and then handle it with the id
2016-01-27 13:53:54 zash rjek: I'm using RESTful ironically!
2016-01-27 13:54:04 TheCycoONE but why is a web browser interacting with REST directly?
2016-01-27 13:54:49 furq anyway
2016-01-27 13:55:23 furq daurnimator: beyond request(url, { file = handle }) and request(url, { file = { handle = handle, type = "image/png" }) i don't care what you do
2016-01-27 13:55:32 furq +}
2016-01-27 13:56:11 furq i'm pretty sure that will cover 99% of use cases anyway
2016-01-27 13:57:09 furq and by 99% i of course mean by number of requests made, not number of potential bullshit rfc-says-SHOULD uses
2016-01-27 13:57:13 daurnimator furq: how about: myrequest:set_multipart_body(function() local h = new_headers(); h:append("content-type", "image/png"); h:append("content-disposition", "form-data; name=file"); coroutine.yield(h, handle) end)
2016-01-27 13:57:41 furq that seems like the kind of thing you could magic up for me when i pass a file handle
2016-01-27 13:58:46 daurnimator furq: I can't inger the file name; and I don't want to infer the mime type :p
2016-01-27 13:58:49 daurnimator s/inger/infer/
2016-01-27 13:59:04 furq the mime type should default to application/octet-stream
2016-01-27 13:59:13 furq and the name should be whatever the key is
2016-01-27 14:00:04 daurnimator furq: the alternative is some sort of multipart/form-data construction class..... local mb = new_multipart_body(); mb:add_form_data("myname", "image/png", handle); request:set_body(mb);
2016-01-27 14:00:41 furq is that how you add form data anyway
2016-01-27 14:00:52 furq minus the new_multipart_body ofc
2016-01-27 14:01:52 furq also if you do it that way then new_multipart_body should take a table of initial form-data values
2016-01-27 14:02:06 furq as well as add_form_data
2016-01-27 14:02:22 daurnimator ehhhhhh
2016-01-27 14:03:05 furq new_multipart_body({ name = { handle, "image/png" } }) is much less hassle
2016-01-27 14:03:34 daurnimator furq: problem is I want to imply order (cause the order large things are sent tends to matter)
2016-01-27 14:03:58 furq if you want a defined order then use add_form_data
2016-01-27 14:04:03 furq and by you i mean the user
2016-01-27 14:05:00 furq http://docs.python-requests.org/en/latest/user/quickstart/#post-a-multipart-encoded-file
2016-01-27 14:05:11 furq it would be nice if it was this simple, although i'm aware this is a high-level api wrapper
2016-01-27 14:05:26 furq maybe you could separate that stuff into a helper libary
2016-01-27 14:05:47 daurnimator furq: helper libraries are hard when you want them to be methods on existing objects....
2016-01-27 14:09:07 furq also you should definitely infer the content-type as "text/plain" for string form-data values
2016-01-27 14:09:22 furq that is the official(tm) default
2016-01-27 14:10:17 furq actually
2016-01-27 14:10:44 furq As with all multipart MIME types, each part has an optional "Content-Type", which defaults to text/plain. If the contents of a file are returned via filling out a form, then the file input is identified as the appropriate media type, if known, or "application/octet-stream".
2016-01-27 14:11:00 furq there are defaults defined for files and strings
2016-01-27 14:11:49 furq also nice grammar "l. masinter"
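A pure-Lua sketch of the encoding the discussion converges on: form-data parts with the defaults furq cites (text/plain for strings, application/octet-stream for files). This is illustrative only, not lua-http's API; the field-table shape is made up.

```lua
-- Illustrative multipart/form-data encoder; not lua-http's API.
-- Table values are treated as files: { content = ..., type = ..., filename = ... }.
local function encode_multipart(fields, boundary)
  local out = {}
  for name, value in pairs(fields) do
    table.insert(out, "--" .. boundary .. "\r\n")
    if type(value) == "table" then
      table.insert(out, string.format(
        'content-disposition: form-data; name="%s"; filename="%s"\r\n', name, value.filename))
      table.insert(out, "content-type: " .. (value.type or "application/octet-stream") .. "\r\n\r\n")
      table.insert(out, value.content .. "\r\n")
    else
      table.insert(out, string.format('content-disposition: form-data; name="%s"\r\n', name))
      table.insert(out, "content-type: text/plain\r\n\r\n")
      table.insert(out, tostring(value) .. "\r\n")
    end
  end
  table.insert(out, "--" .. boundary .. "--\r\n")
  return table.concat(out)
end
```

Note that pairs() gives no defined order; a real encoder would keep parts in insertion order, as discussed above.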

Support OCSP-Must-Staple

OCSP-Must-Staple makes certificate revocation work and scale: it is a standardized X.509 extension that specifies that the user agent must do a hard-fail revocation check, using a stapled OCSP response.

This requires:

  • supporting OCSP stapling;
  • supporting RFC 7633 TLS Features Extensions (might belong in luaossl);
  • adding logic to handle this specific extension.

SOCKS5 Proxy

Support SOCKS5 encapsulation for outgoing connections.
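For reference, the first bytes a client sends per RFC 1928 can be built in plain Lua (a sketch of the wire format, not an implementation proposal):

```lua
-- RFC 1928 client greeting: version 5, one auth method offered,
-- method 0x00 ("no authentication required").
local function socks5_greeting()
  return string.char(0x05, 0x01, 0x00)
end

-- CONNECT request for a domain-name target (ATYP = 0x03):
-- VER, CMD=CONNECT, RSV, ATYP, name length, name, port (big-endian).
local function socks5_connect(host, port)
  return string.char(0x05, 0x01, 0x00, 0x03, #host) .. host
      .. string.char(math.floor(port / 256), port % 256)
end
```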

HTTP status lines ending directly after the status code are not accepted

As sent by http://lion.rusfur.net/ (reported by @torhve)

read(lion.rusfur.net./[91.200.224.200]:80): rcvd 188 bytes
  000000  48 54 54 50 2f 31 2e 31  20 34 30 34 0d 0a 53 65  |HTTP/1.1.404..Se|
  000010  72 76 65 72 3a 20 6c 69  6f 6e 68 6f 73 74 0d 0a  |rver:.lionhost..|
  000020  43 6f 6e 6e 65 63 74 69  6f 6e 3a 20 43 6c 6f 73  |Connection:.Clos|
  000030  65 0d 0a 44 61 74 65 3a  20 53 75 6e 2c 20 32 35  |e..Date:.Sun,.25|
  000040  20 53 65 70 20 32 30 31  36 20 30 32 3a 35 32 3a  |.Sep.2016.02:52:|
  000050  31 33 20 47 4d 54 0d 0a  53 74 61 74 75 73 3a 20  |13.GMT..Status:.|
  000060  34 30 34 20 4e 6f 74 20  46 6f 75 6e 64 0d 0a 58  |404.Not.Found..X|
  000070  2d 50 6f 77 65 72 65 64  2d 42 79 3a 20 50 48 50  |-Powered-By:.PHP|
  000080  2f 35 2e 34 2e 36 0d 0a  43 6f 6e 74 65 6e 74 2d  |/5.4.6..Content-|
  000090  74 79 70 65 3a 20 74 65  78 74 2f 68 74 6d 6c 0d  |type:.text/html.|
  0000a0  0a 0d 0a                                          |...

Reading RFC 7230, this isn't allowed:

status-line = HTTP-version SP status-code SP reason-phrase CRLF

nor was it allowed by RFC 2616.

In the original HTTP 1.0 spec the trailing SP was explicitly called out.
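A lenient parse that also accepts a status line ending directly after the status code might look like this (parse_status_line is a hypothetical helper, not lua-http's internal parser):

```lua
-- Lenient status-line parser: accepts "HTTP/1.1 404" as well as
-- "HTTP/1.1 404 Not Found" (the reason phrase and its leading SP optional).
local function parse_status_line(line)
  local version, code, reason = line:match("^HTTP/(%d%.%d) (%d%d%d) ?(.*)$")
  if not version then return nil, "invalid status line" end
  return version, tonumber(code), reason
end
```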

Add HTTP2 debugging

e.g. a connection.debug or stream.debug flag. It should be initialised from an environment variable.

SSL key logging

HTML docs should allow collapse/expand of modules

I'm thinking of something like the Windows file explorer: a little + that expands all the functions under a module.

Implementation constraints:

  • You can't edit the html directly (only the template) as it's generated by pandoc
  • Must work offline (i.e. no pulling in stylesheet/js from an external URL)

banned cipher check doesn't work

The banned cipher list is currently in HTTP/2 spec form, but it is checked against OpenSSL cipher identifiers.
This means the comparison never matches, so the check always passes.

Read proxy settings from environment variables

Many users require an HTTP proxy to make requests outside of their local network.
Provide a way to pick up the common environment variables that specify proxies.

From curl:

The lower case version has precedence. http_proxy is an exception as it is only available in lower case.

  • http_proxy: Sets the proxy server to use for HTTP.
  • HTTPS_PROXY: Sets the proxy server to use for HTTPS.
  • ALL_PROXY: Sets the proxy server to use if no protocol-specific proxy is set.
  • NO_PROXY: list of host names that shouldn't go through any proxy. If set to an asterisk '*' only, it matches all hosts.
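The precedence rules above can be sketched as a pure function over a table of environment variables (proxy_for_scheme is a hypothetical helper; NO_PROXY handling is omitted):

```lua
-- curl-style proxy resolution: the lower-case variable wins, and
-- http_proxy has no upper-case variant.
local function proxy_for_scheme(env, scheme)
  if scheme == "http" then
    return env.http_proxy or env.all_proxy or env.ALL_PROXY
  end
  local name = scheme .. "_proxy"
  return env[name] or env[name:upper()] or env.all_proxy or env.ALL_PROXY
end
```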

See also:

One of the dependencies (cqueues) is not compatible with Lua 5.1

In this project's README

Compatible with Lua 5.1, 5.2, 5.3 and LuaJIT

My system has LuaJIT and Lua 5.1, but it wants Lua 5.3 to be installed:

Missing dependencies for cqueues:
lua == 5.3

Complete error log with the command

$ sudo luarocks install --server=http://luarocks.org/dev http
Installing http://luarocks.org/dev/http-scm-0.rockspec...
Using http://luarocks.org/dev/http-scm-0.rockspec... switching to 'build' mode

Missing dependencies for http:
cqueues >= 20150907
lpeg_patterns >= 0.2
luaossl >= 20150727
bit32 

Using http://luarocks.org/repositories/rocks/cqueues-20160316.53-0.src.rock... switching to 'build' mode
Archive:  /tmp/luarocks_luarocks-rock-cqueues-20160316.53-0-4753/cqueues-20160316.53-0.src.rock
  inflating: cqueues-20160316.53-0.rockspec  
  inflating: rel-20160316.tar.gz     

Missing dependencies for cqueues:
lua == 5.3


Error: Failed installing dependency: http://luarocks.org/repositories/rocks/cqueues-20160316.53-0.src.rock - Could not satisfy dependency: lua == 5.3

file descriptor not closed after socket compat http request

local cqueues = require("cqueues")
local cq = cqueues.new()

local http = require("http.compat.socket")

cq:wrap(function()
  while true do
    assert(http.request("http://leafo.net"))
  end
end)

assert(cq:loop())

Run this code while monitoring open file descriptors.

Garbage collection does not happen fast enough to clean up all the resources created, so eventually you'll hit your system file descriptor limit and the program will crash. Calling collectgarbage() in the loop fixes the problem, but it should not be necessary.
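The underlying mechanism can be seen with a plain __gc finalizer (Lua 5.2+, since tables only gained __gc there): cleanup tied to garbage collection runs at collection time, not when the object becomes unreachable.

```lua
-- Demonstrates why relying on GC for cleanup lags: the finalizer runs only
-- during a collection cycle, not when the object becomes unreachable.
local closed = 0
do
  local obj = setmetatable({}, { __gc = function() closed = closed + 1 end })
  obj = nil
end
-- The object is unreachable here, but its "close" has not run yet...
collectgarbage("collect")
collectgarbage("collect") -- ...until full collection cycles finalize it.
assert(closed == 1)
```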

HTTP2: priority frames should not bump known highest stream ids

From the HTTP/2 spec, section 5.1.1:

The identifier of a newly established stream MUST be numerically greater than all streams that the initiating endpoint has opened or reserved. This governs streams that are opened using a HEADERS frame and streams that are reserved using PUSH_PROMISE
The first use of a new stream identifier implicitly closes all streams in the "idle" state that might have been initiated by that peer with a lower-valued stream identifier. For example, if a client sends a HEADERS frame on stream 7 without ever sending a frame on stream 5, then stream 5 transitions to the "closed" state when the first frame for stream 7 is sent or received.

h2spec failure when using examples/server_hello.lua:

5.1. Stream States
 6.3. PRIORITY
   × Sends a PRIORITY frame for an idle stream, then send a HEADER frame for a lower stream id
     - The endpoint MUST respond to the HEADER request with no connection error.
         Actual: RST_STREAM frame (Length: 4, Flags: 0, ErrorCode: STREAM_CLOSED)
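The bookkeeping implied by the spec text can be sketched in plain Lua (hypothetical data structures, not lua-http's internals): HEADERS and PUSH_PROMISE advance the peer's highest-seen stream id and implicitly close lower idle streams, while PRIORITY, which is allowed on idle streams, must not.

```lua
-- Sketch: track the highest stream id the peer has actually opened.
local conn = { highest_peer_stream = 0 }

local function on_frame(frame_type, stream_id)
  if frame_type == "HEADERS" or frame_type == "PUSH_PROMISE" then
    if stream_id < conn.highest_peer_stream then
      return nil, "STREAM_CLOSED" -- stream was implicitly closed
    end
    conn.highest_peer_stream = stream_id
  end
  -- PRIORITY frames leave the high-water mark alone
  return true
end
```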

Don't send userinfo in referer

It's insecure to send userinfo in the Referer header.

From RFC 7231 section 5.5.2:

A user agent MUST NOT include the fragment and userinfo components of the URI reference [RFC3986], if any, when generating the Referer field value.

Introduced in 5d068d8
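A sketch of the required behaviour as a standalone helper (sanitize_referer is hypothetical; it is not the actual fix):

```lua
-- Strip the fragment ("#...") and userinfo ("user:pass@") components from a
-- URI before using it as a Referer value (RFC 7231 section 5.5.2).
local function sanitize_referer(uri)
  uri = uri:gsub("#.*$", "")                       -- drop fragment
  uri = uri:gsub("^(%a[%w+.-]*://)[^/@]*@", "%1")  -- drop userinfo
  return uri
end
```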

Streaming doesn't work in my particular case.

I'm following one of your examples for HTTP streaming (shown here) with only a few things changed: the URL and variable names:

--[[
Makes a request to an HTTP2 endpoint that has an infinite length response.
Usage: lua examples/h2_streaming.lua
]]

local request = require "http.request"

-- This endpoint returns a never-ending stream of chunks containing the current time
local req = request.new_from_uri("https://http2.golang.org/clockstream")
local _, stream = assert(req:go())
for chunk in stream:each_chunk() do
    io.write(chunk)
end

It was working great up until about this weekend; now whenever I call this, it only returns one chunk and stops. I run the example code with the example URL and it works. I have no idea what's going on, as I have not changed a single thing in my API (which serves the stream). Nor have I changed my code...

Currently my code is this:

local request = require "http.request"

local channelHokuyo = love.thread.getChannel("channelHokuyo")

local streamHokuyo = request.new_from_uri("http://localhost:5000/hokuyo_stream")
local _, stream = assert(streamHokuyo:go())
for chunk in stream:each_chunk() do
    channelHokuyo:push(chunk)
end

What I expect is an endless stream of this:
stream_output.txt

However, I only get one chunk. Like I said before, it used to work.

Any help would be great!

Note: I am using Love2D and this code is in a separate thread. But when I isolate the code and run it with Lua 5.1 as its own program, it still does the same thing.

Client connection pooling

Should probably be done at the http.client level?

Need a 'multi-key' cache structure that allows us to index by a tuple of host, port, version, tls context, etc.

Need a way to know when a connection is unused so it can be added back to a pool. EDIT: Now have connection:onidle

Misc things:

  • When grabbing a http2 connection from the pool, may need to update settings.
  • connections can be shared: http2 should be by default; but don't do http1 pipelining by default.
  • Interaction with proxies?
  • Should allow multiple connections per tuple
    • low and high water marks?
  • tie expiry of connection to expiry of cert? expiry of dns entry?
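The 'multi-key' cache mentioned above could be sketched as nested tables, one level per component of the tuple (illustrative only; real keys would also include the TLS context and more):

```lua
-- Sketch of a multi-key connection pool indexed by (host, port, version).
local pool = {}

local function pool_key(t, host, port, version)
  t[host] = t[host] or {}
  t[host][port] = t[host][port] or {}
  t[host][port][version] = t[host][port][version] or {}
  return t[host][port][version]
end

local function checkin(host, port, version, conn)
  table.insert(pool_key(pool, host, port, version), conn)
end

local function checkout(host, port, version)
  return table.remove(pool_key(pool, host, port, version))
end
```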
