PLEASE NOTE: THIS PROJECT IS A WORK IN PROGRESS
Before we get this application running locally and fruitlessly attempt to dethrone George Costanza from the #1 spot in Frogger, let's make sure we have the following installed:
Already done? Great! Now let's clone the repo, install packages, run migrations/seeds, get the server running and take things for a test spin!
- Clone the repo: In your terminal, run `git clone git@github.com:dannytannertantrum/merchant-mill-arcade.git`
- Install and use nvm: This project uses `nvm`. If you need to install it on your machine, follow the instructions outlined here. Then, inside of the root directory, run `nvm use` and follow the commands to install the correct node version if you do not have it.
- Install packages: Inside of root, run `npm i`
- Start the Postgres server: Open Postgres and click the "Start" button.
- Set local environment variables: Create a `.env` file in the root directory and add the following keys:

```
POSTGRES_CONNECTION_STRING=postgres://glc@localhost:5432/merchant_mill_arcade
TEST_POSTGRES_CONNECTION_STRING=postgres://glc@localhost:5432/test_merchant_mill_arcade
```
In your terminal, run the rest of the commands below in the root directory:
- Create the database schema: `npm run create-db-schema`
- Create the test database schema: `npm run create-test-db-schema`
- Run migrations: `npm run migrate:up`
- Run migrations for test: `npm run migrate-test:up`
- Seed data: `npm run seed`
- Seed test data: `npm run seed-test`
- Start the server: `npm run dev`
At this point, the server should be running, and both the local database and the test database should be populated with some data. Feel free to use Postman or any REST client of your choice, but if you're using VS Code, check out the `.requests.http` file in root. In order to make use of it, install the REST Client extension for VS Code. This allows us to send requests right from VS Code. Take note of the little "Send Request" link above each HTTP method and try it out!
You can also take a peek at what request and response bodies should look like via Swagger: http://localhost:7000/docs
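If you'd rather script a request than click through a REST client, a small helper along the lines of the hypothetical sketch below can build one. The route name (`/games`) and payload shape are assumptions for illustration only - check `.requests.http` or the Swagger docs for the real endpoints.

```typescript
// Hypothetical request builder -- the /games route and { title } payload
// are assumptions; verify the real routes in .requests.http or /docs.
const BASE_URL = 'http://localhost:7000'

const buildCreateGameRequest = (title: string) => ({
  url: `${BASE_URL}/games`,
  options: {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ title }),
  },
})

// Usage (with the API server from the steps above running):
//   const { url, options } = buildCreateGameRequest('Frogger')
//   const response = await fetch(url, options)
```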
First things first, create a `.env` file in the `client` directory and add the following key/value pairs:

```
BASE_URL=http://localhost:7000
CUSTOM_SEARCH_API_KEY=
CUSTOM_SEARCH_ENGINE_ID=
```
This application uses Google's Programmable Search Engine. A fallback exists in the UI, so feel free to skip this section if you do not wish to set it up. Otherwise, follow these instructions to create an instance of a PSE and generate an API key. Once setup is complete, plug your API key and Search Engine ID into their corresponding values in the `.env` file.
In your terminal, run the following commands in the `client` directory:
- Install packages: `npm i`
- Start the server: `npm start`
Note: From the root directory, we can also run `npm run start-client` to serve up the frontend.
If both the server and the client are running, the UI can be accessed at http://localhost:1234/
We are using Slonik because it promotes writing raw SQL while still baking in basic protections, such as guarding against SQL injection.
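To see why that works, the toy `sql` tag below (a simplified illustration of the idea, not Slonik's actual implementation) collects interpolated values into bound parameters instead of concatenating them into the query string:

```typescript
// Simplified illustration of parameterization via a tagged template --
// NOT Slonik's real implementation, just the underlying idea.
const sql = (strings: TemplateStringsArray, ...values: unknown[]) => ({
  // Join the literal chunks with positional placeholders ($1, $2, ...)
  text: strings.reduce((acc, chunk, i) => acc + (i ? `$${i}` : '') + chunk, ''),
  values, // handed to the driver separately, never spliced into the text
})

const title = "Robert'); DROP TABLE games;--"
const query = sql`SELECT * FROM games WHERE title = ${title}`
// query.text is 'SELECT * FROM games WHERE title = $1' -- the malicious
// string rides along in query.values and cannot change the statement.
```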
We're using @slonik/migrator for migrations. The `migrate.js` file in the `database` directory sets up our ability to use it. The migrator auto-generates a `down` and `up` file; we do the rest by writing raw SQL.
Running `up` will take us to the latest migration. Per the docs:

> It is also possible to migrate up or down "to" a specific migration. For example, if you have run migrations `one.sql`, `two.sql`, `three.sql` and `four.sql`, you can revert `three.sql` and `four.sql` by running `node migrate down --to three.sql`. Note that the range is inclusive. To revert all migrations in one go, run `node migrate down --to 0`... Conversely, `node migrate up` runs all up migrations by default. To run only up to a certain migration, run `node migrate up --to two.sql`. This will run migrations `one.sql` and `two.sql` - again, the range is inclusive of the name.
We have scripts set up to migrate up and down:
$ npm run migrate:up
$ npm run migrate:down
$ npm run migrate:fully-down
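For orientation, a `migrate.js` bootstrap in the spirit of @slonik/migrator's docs might look like the sketch below. Treat every identifier here as an assumption - the exported class, option names, and `runAsCLI()` vary across versions of the library, so defer to the actual file in the `database` directory.

```typescript
// Hedged sketch of a migrator bootstrap -- the names below come from
// @slonik/migrator's docs, not from this repo's real migrate.js.
import { createPool } from 'slonik'
import { SlonikMigrator } from '@slonik/migrator'

const slonik = createPool(process.env.POSTGRES_CONNECTION_STRING!)

const migrator = new SlonikMigrator({
  migrationsPath: __dirname + '/migrations', // holds the generated up/down .sql files
  migrationTableName: 'migration',           // tracking table in Postgres
  slonik,
  logger: console,
})

// Enables `node migrate up`, `node migrate down --to three.sql`, etc.
migrator.runAsCLI()
```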
As outlined above, we can run separate seed scripts for local and test data:
- `npm run seed` - This script seeds our local database.
- `npm run seed-test` - This script seeds our test database.
We have a test database for integration tests to avoid polluting our dev database (in case we forget to clean something up, there's an error, etc.). Just like the commands above, we can create the test schema, migrate, seed, etc. with the following:
$ npm run create-test-db-schema
$ npm run migrate-test:up
$ npm run seed-test
We use Jest with `ts-jest` so we can get TypeScript support. To run tests:
- `npm t` - run all tests
- `npm run test-watch` - run tests in watch mode
- `npm t /pattern/` - run a subset of tests based on a matching pattern. E.g. if you just want to run tests in `add-game.integration.ts`, you can run `npm run test-watch add-game.i` - this also works in watch mode.
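For reference, wiring Jest to `ts-jest` takes only a couple of config keys. The sketch below is an assumption about what such a config looks like, not necessarily this repo's actual file:

```typescript
// jest.config.ts -- a minimal sketch; the repo's real config may differ.
import type { Config } from '@jest/types'

const config: Config.InitialOptions = {
  preset: 'ts-jest',       // compile TypeScript test files on the fly
  testEnvironment: 'node', // integration tests talk to Postgres, not a DOM
}

export default config
```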
Sometimes, we get errors. Sometimes these errors are from not taking enough code breaks (bad!) and they leave us feeling silly. Here are some common ones and what to look out for:
- Just starting out and seeing the error below? We're probably missing local environment variables. Take a look at step 5 under Getting Started.

```
...api/node_modules/slonik/dist/src/utilities/parseDsn.js:10
    if (dsn.trim() === '') {
            ^
TypeError: Cannot read properties of undefined (reading 'trim')
```
- `connect ECONNREFUSED` - Is Postgres running?
- `Exceeded timeout of 5000 ms for a hook` - Is Postgres running?
- `connect ECONNREFUSED 127.0.0.1:80` - Are you seeing this while running/adding tests or trying to hit routes? If so, make sure routes are valid, e.g. prefixed with a forward slash: `/game/:id` and not `game/:id`
- `Error: Cannot find module 'fs/promises'` or `code: 'MODULE_NOT_FOUND'` - Did you run `nvm use`?
- Are processes hanging? If so, check and make sure all our hooks are calling `done()` and passing to the next handler.
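To make that last point concrete, here is a self-contained illustration of the pattern (not Fastify's or any framework's actual internals): each hook receives a `done` callback, and the chain only advances when it is called.

```typescript
// Toy hook chain -- illustrates why forgetting done() hangs a request.
type Hook = (done: () => void) => void

const runHooks = (hooks: Hook[], handler: () => void): void => {
  const next = (i: number): void => {
    if (i === hooks.length) {
      handler() // every hook passed control along; run the handler
    } else {
      hooks[i](() => next(i + 1)) // hand this hook a done() that advances
    }
  }
  next(0)
}

let reachedHandler = false
runHooks(
  [
    (done) => { /* e.g. validate the request */ done() },
    (done) => { /* e.g. attach a db connection */ done() },
  ],
  () => { reachedHandler = true },
)
// reachedHandler is now true; if either hook skipped done(), the chain
// would stall there and the process would appear to hang.
```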