This system allows you to manage products.
Name | Type | Description |
---|---|---|
destination | object | |
address | object | The destination’s street address details. |
number | string | The number component of this address; it may also contain letters. |
street | string | The name of the street. |
apartment | string | The suite or apartment number, or any additional relevant information. |
city | string | The name of the municipality. |
state | string | The name of the state, province or jurisdiction. |
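For reference, a payload matching the fields above might look like this (all values are illustrative, not taken from the API):

```json
{
  "destination": {
    "address": {
      "number": "123-B",
      "street": "Av. Insurgentes Sur",
      "apartment": "Apt 4",
      "city": "Mexico City",
      "state": "CDMX"
    }
  }
}
```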
- Clone this repository.
- Install Docker.
- Install docker-compose.
- Open a terminal and run the following commands:
  - Go to the project root directory
  - Build the services
    - sudo docker-compose -f docker-compose-local.yml build
  - Start the application
    - sudo docker-compose -f docker-compose-local.yml up
- This process sets up the services for this environment:
  - Django
  - Postgres
  - Redis
- Note: if you add new models, it is necessary to generate the migrations with the following command, after docker-compose build and before docker-compose up:
  - sudo docker-compose -f docker-compose-local.yml run --rm django python manage.py makemigrations
- Open another terminal and run sudo docker ps; it should show 3 containers:
  - django
  - postgres
  - redis
- Go to http://localhost:8000/admin/ in your browser, and you should see the admin login page.
- Now create the first user:
  - sudo docker-compose -f docker-compose-local.yml run --rm django python manage.py createsuperuser
  - Note: every superuser created with this method is an administrator (not an anonymous user).
- Congratulations, you can now enter the admin!
For the local environment
- Build and run the services with the docker-compose-local.yml file
- Environment variables are defined at the .env.local path
- Emails are sent with the Django email backend
- If you want to run AWS SES in the local environment, add the AWS variables at the ./envs/.local/.aws/* path and change ENV=local to ENV=prod in ./envs/.local/.django
For the production environment
- Build and run the services with the docker-compose-production.yml file
- Environment variables are defined at the .env.production path
- Emails are sent with AWS SES
Use the superuser created previously in the setup steps.
Endpoint: {{host}}/users/v1/login/
HTTP Verb: POST
Header:
- Content-Type: application/json
- Accept: application/json
Body:
{
"email":"[email protected]",
"password":"zxcvbnm12345"
}
Response
{
"user": {
"email": "[email protected]",
"username": "jmendoza",
"first_name": "Jonathan",
"last_name": "Mendoza",
"phone_number": "5523097299",
"is_admin": true
},
"access_token": "cdb93a784426fa5f7fffbaacd4709d5999c80a51"
}
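As a sketch, the login request above can be issued with Python's standard library alone. The host assumes the local stack from the setup steps, and the credentials are the example values from the body above:

```python
import json
import urllib.request

HOST = "http://localhost:8000"  # assumes the local stack from the setup steps

payload = {"email": "[email protected]", "password": "zxcvbnm12345"}
req = urllib.request.Request(
    f"{HOST}/users/v1/login/",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    method="POST",
)
# With the services up, send the request and keep the token for the
# Authorization header used by the other endpoints:
# with urllib.request.urlopen(req) as resp:
#     access_token = json.load(resp)["access_token"]
```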
Endpoint: {{host}}/users/v1/signup/
HTTP Verb: POST
Header:
- Authorization: Token {{access_token}}
- Content-Type: application/json
- Accept: application/json
Body:
{
"email":"[email protected]",
"username":"jmendoza",
"phone_number":"5523097299",
"password":"zxcvbnm12345",
"password_confirmation":"zxcvbnm12345",
"first_name":"Jonathan",
"last_name":"Mendoza",
"is_admin": true
}
Response
{
"email": "[email protected]",
"username": "jmendoza",
"first_name": "Jonathan",
"last_name": "Mendoza",
"phone_number": "5523097299",
"is_admin": true
}
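The signup call is the same pattern, plus the Authorization header; REPLACE_WITH_TOKEN stands for the access_token returned by the login endpoint:

```python
import json
import urllib.request

HOST = "http://localhost:8000"       # local stack from the setup steps
ACCESS_TOKEN = "REPLACE_WITH_TOKEN"  # token returned by the login endpoint

payload = {
    "email": "[email protected]",
    "username": "jmendoza",
    "phone_number": "5523097299",
    "password": "zxcvbnm12345",
    "password_confirmation": "zxcvbnm12345",
    "first_name": "Jonathan",
    "last_name": "Mendoza",
    "is_admin": True,
}
req = urllib.request.Request(
    f"{HOST}/users/v1/signup/",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Token {ACCESS_TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    method="POST",
)
# With the services up:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```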
Endpoint: {{host}}/users/v1/jmendoza/ (jmendoza is username and lookup field)
HTTP Verb: GET
Header:
- Authorization: Token {{access_token}}
- Accept: application/json
Response
{
"email": "[email protected]",
"username": "jmendoza",
"first_name": "Jonathan",
"last_name": "Mendoza",
"phone_number": "5523097299",
"is_admin": true
}
Endpoint: {{host}}/users/v1/jmendoza/ (jmendoza is username and lookup field)
HTTP Verb: PUT
Header:
- Authorization: Token {{access_token}}
- Content-Type: application/json
- Accept: application/json
Body:
{
"email":"[email protected]",
"username":"dmendoza",
"phone_number":"5523097291",
"first_name":"Yonathan",
"last_name":"Mendez",
"is_admin": false
}
Response
{
"email": "[email protected]",
"username": "dmendoza",
"first_name": "Yonathan",
"last_name": "Mendez",
"phone_number": "5523097291",
"is_admin": false
}
Endpoint: {{host}}/users/v1/jmendoza/ (jmendoza is username and lookup field)
HTTP Verb: DELETE
Header:
- Authorization: Token {{access_token}}
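Deletion needs only the Authorization header; a minimal stdlib sketch (token placeholder is illustrative):

```python
import urllib.request

HOST = "http://localhost:8000"       # local stack from the setup steps
ACCESS_TOKEN = "REPLACE_WITH_TOKEN"  # token returned by the login endpoint

req = urllib.request.Request(
    f"{HOST}/users/v1/jmendoza/",    # jmendoza is the username lookup field
    headers={"Authorization": f"Token {ACCESS_TOKEN}"},
    method="DELETE",
)
# With the services up:
# urllib.request.urlopen(req)
```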
Endpoint: {{host}}/products/v1/
HTTP Verb: POST
Header:
- Authorization: Token {{access_token}}
- Content-Type: application/json
- Accept: application/json
Body:
{
"name":"Playera",
"sku":"P0001",
"price": 100.01,
"brand": "Patito",
"is_public": true
}
Response
{
"name": "Playera",
"sku": "P0001",
"price": "100.01",
"brand": "Patito",
"is_public": true
}
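A product can be created with the same stdlib pattern; note that "price" is sent as a number but, as the response above shows, comes back as a string:

```python
import json
import urllib.request

HOST = "http://localhost:8000"       # local stack from the setup steps
ACCESS_TOKEN = "REPLACE_WITH_TOKEN"  # token returned by the login endpoint

payload = {
    "name": "Playera",
    "sku": "P0001",
    "price": 100.01,
    "brand": "Patito",
    "is_public": True,
}
req = urllib.request.Request(
    f"{HOST}/products/v1/",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Token {ACCESS_TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    method="POST",
)
# With the services up:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))  # "price" is returned as a string
```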
Endpoint: {{host}}/products/v1/
HTTP Verb: GET
Header:
- Authorization: Token {{access_token}}
- Accept: application/json
Response
{
"count": 2,
"next": null,
"previous": null,
"results": [
{
"name": "Playera",
"sku": "P0001",
"price": "100.01",
"brand": "Patito",
"is_public": true
},
{
"name": "Pantalon",
"sku": "P0002",
"price": "100.01",
"brand": "Patito",
"is_public": true
}
]
}
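Listing products works the same way; the response is paginated (count/next/previous/results), so iterate over "results" for the product objects:

```python
import json
import urllib.request

HOST = "http://localhost:8000"       # local stack from the setup steps
ACCESS_TOKEN = "REPLACE_WITH_TOKEN"  # token returned by the login endpoint

req = urllib.request.Request(
    f"{HOST}/products/v1/",
    headers={"Authorization": f"Token {ACCESS_TOKEN}", "Accept": "application/json"},
)
# With the services up:
# with urllib.request.urlopen(req) as resp:
#     page = json.load(resp)
#     for product in page["results"]:
#         print(product["sku"], product["name"])
```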
Endpoint: {{host}}/products/v1/P0001/ (P0001 is sku and lookup field)
HTTP Verb: GET
Header:
- Accept: application/json
Response
{
"name": "Playera",
"sku": "P0001",
"price": "100.01",
"brand": "Patito",
"is_public": true
}
Endpoint: {{host}}/products/v1/P0002/ (P0002 is sku and lookup field)
HTTP Verb: DELETE
Header:
- Authorization: Token {{access_token}}
- Open a terminal and run the containers:
  - sudo docker-compose -f docker-compose-local.yml up
- Open another terminal and get the container ID of the postgres service:
  - sudo docker ps
- Go to the environment variables path (.envs/.local/.postgres) and get POSTGRES_USER and POSTGRES_DB
- Run the following command:
  - sudo docker exec -it CONTAINER_ID psql -U POSTGRES_USER -a POSTGRES_DB
- For example:
  - sudo docker exec -it f22dae79a480 psql -U sBLRWyyPsInwHftmHAWmYJURGWBGFpLs -a catalog
-
Some useful psql commands:
- Show tables
  - \dt
- Show a table definition
  - \d TABLE_NAME
- Quit
  - \q
You only need to run the django service (from the docker-compose-local.yml file); the command below generates the migrations, and --rm removes the container when it finishes.
- sudo docker-compose -f docker-compose-local.yml run --rm django python manage.py makemigrations
Note: Deleting volumes will wipe out their data. Back up any data that you need before deleting a container.
- Stop the containers:
  - If the stack is running in a console (that is, you executed sudo docker-compose -f docker-compose-local.yml up), stop the processes with ctrl + c
  - If the stack is running in detached mode (that is, you executed sudo docker-compose -f docker-compose-local.yml up -d), stop the processes with:
    - sudo docker-compose -f docker-compose-local.yml down
- Delete all containers using the following command:
  - sudo docker rm -f $(sudo docker ps -a -q)
- Delete all volumes using the following command:
  - sudo docker volume rm $(sudo docker volume ls -q)
- Restart the containers using the following command:
  - sudo docker-compose -f docker-compose-local.yml up
Sometimes it is also necessary to delete the images.
- Delete all images using the following command:
  - sudo docker rmi $(sudo docker images -q)
- Build the services:
  - sudo docker-compose -f docker-compose-local.yml build
- Restart the containers using the following command:
  - sudo docker-compose -f docker-compose-local.yml up