Micro — Async ES6 HTTP microservices

Features

  • Easy. Designed for usage with async and await (see async & await below).
  • Fast. Ultra-high performance (even JSON parsing is opt-in).
  • Micro. The whole project is ~100 lines of code.
  • Agile. Super easy deployment and containerization.
  • Simple. Oriented towards single-purpose modules (one function per service).
  • Explicit. No middleware. Modules declare all dependencies.
  • Standard. Just HTTP!
  • Lightweight. The package is small and the async transpilation is fast and transparent.

Usage

First, install it:

npm install --save micro

Then add a start script to your package.json like this:

{
  "main": "index.js",
  "scripts": {
    "start": "micro"
  }
}

Then create an index.js file and populate it with a function that accepts the standard http.IncomingMessage and http.ServerResponse objects:

module.exports = (req, res) => { res.end('Welcome to Micro') }

Micro provides useful helpers but also handles return values – so you can write it even shorter!

module.exports = () => 'Welcome to Micro'
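
Return values other than strings are handled too: a returned object goes through send(res, 200, ...) and is serialized as JSON. A minimal sketch:

// The returned object is serialized as JSON by micro's send helper
module.exports = () => ({message: 'Welcome to Micro'})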

Once all of that is done, just start the server:

npm start

And go to this URL: http://localhost:3000 - 🎉
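
If you prefer the command line, a quick check with curl (assuming the default port 3000) looks like this:

curl http://localhost:3000
# Welcome to Micro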

Now make sure to check out awesome-micro - a collection of plugins for Micro!

async & await

Examples

Micro is built for usage with async/await. You can read more about async/await here.

const sleep = require('then-sleep')

module.exports = async (req, res) => {
  await sleep(500)
  return 'Ready!'
}

Body parsing

Examples

For parsing the incoming request body, we include the async functions buffer, text and json:

const {buffer, text, json} = require('micro')

module.exports = async (req, res) => {
  const buf = await buffer(req)
  console.log(buf)
  // <Buffer 7b 22 70 72 69 63 65 22 3a 20 39 2e 39 39 7d>
  const txt = await text(req)
  // '{"price": 9.99}'
  const js = await json(req)
  // { price: 9.99 }
  console.log(js.price)
  return ''
}

API

buffer(req, { limit = '1mb', encoding = 'utf8' })
text(req, { limit = '1mb', encoding = 'utf8' })
json(req, { limit = '1mb', encoding = 'utf8' })
  • Buffers and parses the incoming body and returns it.
  • Exposes an async function that can be run with await.
  • Can be called multiple times, as it caches the raw request body the first time.
  • limit is the maximum amount of data aggregated before parsing; it can be a Number of bytes or a string like '1mb'. If the limit is exceeded, an Error is thrown with statusCode set to 413 (see Error Handling).
  • If JSON parsing fails, an Error is thrown with statusCode set to 400 (see Error Handling).

For other types of data, check the examples.
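
The options object is how you adjust those defaults. As a minimal sketch (the 8mb limit and the form payload are illustrative, not defaults), you can raise the size limit and handle a urlencoded body yourself with text plus Node's built-in querystring module:

const {text} = require('micro')
const {parse} = require('querystring')

module.exports = async (req, res) => {
  // Read the raw body as text, allowing up to 8mb instead of the default 1mb
  const body = await text(req, {limit: '8mb'})

  // Parse an application/x-www-form-urlencoded payload, e.g. 'price=9.99'
  const data = parse(body)

  // The parsed object is serialized back to JSON by micro
  return data
}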

Sending a different status code

So far we have used return to send data to the client. return 'Hello World' is the equivalent of send(res, 200, 'Hello World').

const {send} = require('micro')

module.exports = async (req, res) => {
  const statusCode = 400
  const data = { error: 'Custom error message' }

  send(res, statusCode, data)
}

API

send(res, statusCode, data = null)
  • Use require('micro').send.
  • statusCode is a Number with the HTTP status code, and must always be supplied.
  • If data is supplied it is sent in the response. Different input types are processed appropriately, and Content-Type and Content-Length are automatically set.
    • Stream: data is piped as an octet-stream. Note: it is your responsibility to handle the error event in this case (usually, simply logging the error and aborting the response is enough).
    • Buffer: data is written as an octet-stream.
    • object: data is serialized as JSON.
    • string: data is written as-is.
  • If JSON serialization fails (for example, if a cyclical reference is found), a 400 error is thrown. See Error Handling.
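
For instance, a minimal sketch of streaming a file with send; the file path is hypothetical, and handling the stream's error event is the handler's responsibility, as noted above:

const fs = require('fs')
const {send} = require('micro')

module.exports = async (req, res) => {
  // Hypothetical file; the stream is piped to the response as an octet-stream
  const stream = fs.createReadStream('./report.pdf')

  // Log the error and abort the response if the stream fails
  stream.on('error', err => {
    console.error(err)
    res.end()
  })

  send(res, 200, stream)
}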

Programmatic use

You can use Micro programmatically by requiring Micro directly:

const micro = require('micro')
const sleep = require('then-sleep')

const server = micro(async (req, res) => {
  await sleep(500)
  return 'Hello world'
})

server.listen(3000)

API

micro(fn)
  • This function is exposed as the default export.
  • Use require('micro').
  • Returns a http.Server that uses the provided function as the request handler.
  • The supplied function is run with await, so it can be async.

Error handling

Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.

If an error is thrown and not caught by you, the response will automatically be 500. Important: error stacks are printed with console.error, and in development mode (when the env variable NODE_ENV is 'development') they are also included in the responses.

If the Error object that's thrown contains a statusCode property, that's used as the HTTP code to be sent. Let's say you want to write a rate limiting module:

const rateLimit = require('my-rate-limit')

module.exports = async (req, res) => {
  await rateLimit(req)
  // ... your code
}

If the API endpoint is abused, it can throw an error with createError like so:

if (tooMany) {
  throw createError(429, 'Rate limit exceeded')
}

Alternatively, you can create the Error object yourself:

if (tooMany) {
  const err = new Error('Rate limit exceeded')
  err.statusCode = 429
  throw err
}

The nice thing about this model is that the statusCode is merely a suggestion. The user can override it:

try {
  await rateLimit(req)
} catch (err) {
  if (err.statusCode === 429) {
    // perhaps send 500 instead?
    send(res, 500)
  }
}

If the error is based on another error that Micro caught, like a JSON.parse exception, then originalError will point to it.

If a generic error is caught, the status will be set to 500.

In order to set up your own error handling mechanism, you can use composition in your handler:

const {send} = require('micro')

const handleErrors = fn => async (req, res) => {
  try {
    return await fn(req, res)
  } catch (err) {
    console.log(err.stack)
    send(res, 500, 'My custom error!')
  }
}

module.exports = handleErrors(async (req, res) => {
  throw new Error('What happened here?')
})

API

sendError(req, res, error)
  • Use require('micro').sendError.
  • Used as the default handler for errors thrown.
  • Automatically sets the status code of the response based on error.statusCode.
  • Sends the error.message as the body.
  • Stacks are printed out with console.error and during development (when NODE_ENV is set to 'development') also sent in responses.
  • Usually, you don't need to invoke this method yourself, as you can use the built-in error handling flow with throw.
createError(code, msg, orig)
  • Use require('micro').createError.
  • Creates an error object with a statusCode.
  • Useful for easily throwing errors with HTTP status codes, which are interpreted by the built-in error handling.
  • orig sets error.originalError which identifies the original error (if any).
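
A minimal sketch combining the two; the wrapper and the error messages here are illustrative:

const {createError, sendError} = require('micro')

const handleErrors = fn => async (req, res) => {
  try {
    return await fn(req, res)
  } catch (err) {
    // Reuse the built-in formatting (status code, message, dev-mode stack)
    sendError(req, res, err)
  }
}

module.exports = handleErrors(async (req, res) => {
  // Wrap a lower-level failure; it is exposed as error.originalError
  const cause = new Error('Upstream timed out')
  throw createError(503, 'Service unavailable', cause)
})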

Testing

Micro makes tests compact and a pleasure to read and write. We recommend ava, a highly parallel test framework with built-in support for async tests:

const micro = require('micro')
const test = require('ava')
const listen = require('test-listen')
const request = require('request-promise')

test('my endpoint', async t => {
  const service = micro(async (req, res) => {
    micro.send(res, 200, {
      test: 'woot'
    })
  })

  const url = await listen(service)
  const body = await request(url)

  t.deepEqual(JSON.parse(body).test, 'woot')
})

Look at test-listen for a function that returns a URL with an ephemeral port every time it's called.

Transpilation

We use is-async-supported combined with async-to-gen, so that we only convert async and await to generators when needed.

If you want to do it manually, you can! micro(1) is idempotent and should not interfere.

micro exclusively supports Node 6+ to avoid a big transpilation pipeline. async-to-gen is fast and can be distributed with the main micro package due to its small size.

To use native async/await on Node v7.x, run micro as follows:

node --harmony-async-await node_modules/.bin/micro .

Deployment

You can use the micro CLI for npm start:

{
  "name": "my-microservice",
  "dependencies": {
    "micro": "x.y.z"
  },
  "main": "microservice.js",
  "scripts": {
    "start": "micro"
  }
}

Then simply run npm start!

Port based on environment variable

When you want to set the port using an environment variable, you can use:

micro -p $PORT

Optionally you can add a default if it suits your use case:

micro -p ${PORT:-3000}

${PORT:-3000} falls back to port 3000 when $PORT is not defined.
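
For example, a sketch of wiring the fallback into the start script (the package name is illustrative; npm runs scripts through a POSIX shell on Unix-like systems, so the ${PORT:-3000} expansion works there):

{
  "name": "my-microservice",
  "scripts": {
    "start": "micro -p ${PORT:-3000}"
  }
}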

Contribute

  1. Fork this repository to your own GitHub account and then clone it to your local device
  2. Link the package to the global module directory: npm link
  3. Transpile the source code and watch for changes: npm start
  4. Within the module where you want to test your local development instance of Micro, link it into the dependencies: npm link micro. Node will now use your clone of Micro instead of the default one from npm!

As always, you can run the AVA and ESLint tests using: npm test

Credits

Thanks to Tom Yandell and Richard Hodgson for donating the micro npm name.
