- Docker
- Ruby 3.2.2
- Rails 7.1.1
- PostgreSQL
- Elasticsearch
This assessment has been broken into the list of tasks below. The MVC pattern is the primary pattern the assessment is built on, and it provides a JSON-based API that supports nested resources. Elasticsearch is used to implement a standardized search, and OOP concepts are at the core.
For the ease of the reviewer, I have included a Postman collection.
- Build a Docker container which includes the above stack
  - Set up the initial Ruby image-based container
  - Integrate PostgreSQL
  - Integrate Elasticsearch
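The stack above can be wired together with a Compose file along these lines. This is a minimal sketch; the image versions, service names, ports, and credentials are assumptions, not the project's actual configuration:

```yaml
version: "3.8"
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: password     # placeholder credential
    volumes:
      - pg_data:/var/lib/postgresql/data
  elasticsearch:
    image: elasticsearch:8.11.1
    environment:
      - discovery.type=single-node    # single-node cluster for development
      - xpack.security.enabled=false
  web:
    build: .
    command: bash -c "rm -f tmp/pids/server.pid && bin/rails server -b 0.0.0.0"
    ports:
      - "3000:3000"
    depends_on:
      - db
      - elasticsearch
volumes:
  pg_data:
```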
- Develop `verticals`
  - Develop model
  - Develop controller / API endpoints
  - Specs
- Develop `categories`
  - Develop model
  - Develop controller / API endpoints
  - Specs
- Develop `courses`
  - Develop model
  - Develop controller / API endpoints
  - Specs
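The nested-resource layout implied by the tasks above could be declared roughly as follows. This is a sketch: the exact nesting depth and the `search` route are assumptions based on the task list, not the project's actual `routes.rb`:

```ruby
# config/routes.rb -- hypothetical sketch of the nested resources
Rails.application.routes.draw do
  resources :verticals do
    resources :categories do
      resources :courses
    end
  end
  # Standalone search endpoint handled by search#search
  get "search", to: "search#search"
end
```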
- Integrate Elasticsearch with `search#search`
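The `search#search` integration could look roughly like this, assuming the `elasticsearch-model` gem is used; the indexed model and the controller shape are assumptions, not the project's actual code:

```ruby
# app/models/course.rb -- hypothetical sketch using elasticsearch-model
class Course < ApplicationRecord
  include Elasticsearch::Model
  include Elasticsearch::Model::Callbacks  # reindex on save/destroy
end

# app/controllers/search_controller.rb -- hypothetical sketch
class SearchController < ApplicationController
  # GET /search?q=ruby
  def search
    # .records maps Elasticsearch hits back to ActiveRecord objects
    results = Course.search(params[:q]).records
    render json: results
  end
end
```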
- Update seeds
- Implement authentication layer
  - Set up Devise with tokens for authentication; signup and login return a bearer authorization token, which is then used with subsequent API calls.
  - Authorize the API endpoints
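The "authorize the API" step can be sketched with Doorkeeper's controller filter. This is a minimal sketch; the `current_user` lookup is an assumption about how the token maps back to a user:

```ruby
# app/controllers/application_controller.rb -- hypothetical sketch
class ApplicationController < ActionController::API
  # Rejects any request without a valid bearer token (responds 401).
  before_action :doorkeeper_authorize!

  private

  # Doorkeeper exposes the validated token; look up the owning user.
  def current_user
    @current_user ||= User.find(doorkeeper_token.resource_owner_id)
  end
end
```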
Implementing the full third-party (social) OAuth flow had to be dropped; continuing with it would have prevented me from completing this assessment, for the following reasons:
- Implementing the entire flow requires a proper frontend.
- The frontend would carry the major portion of the OAuth-integration work.
- The backend would only store the returned information (uid, provider, name, and avatar) and expose a few calls.
Since Docker has been configured to set up the environment, use the following command:

```shell
docker-compose up --build
```
Since OAuth has been implemented, you will first need the application credentials (a client application is created by the seeds), which you can obtain as follows:
- Access the environment using `rails c`
- Get the `client_id` using `Doorkeeper::Application.first.uid`
- Get the `client_secret` using `Doorkeeper::Application.first.secret`
Use the `client_id` and `client_secret` along with the user credentials to log in; please refer to the following image:

Use the `access_token` to access the rest of the API.
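As an illustration, the token request body (Doorkeeper's standard password grant at `POST /oauth/token`) and the bearer header for subsequent calls can be built as below. All concrete values are placeholders, and the credential field names (`email` vs. `username`) depend on how the resource-owner lookup is configured:

```ruby
require "json"

# Builds the JSON body for an OAuth2 password-grant token request.
# Field names follow the standard password grant; adjust to the app's
# actual resource-owner configuration.
def token_request_body(client_id:, client_secret:, email:, password:)
  {
    grant_type: "password",
    client_id: client_id,
    client_secret: client_secret,
    email: email,
    password: password
  }.to_json
end

# Builds the Authorization header used on every later API call.
def bearer_header(access_token)
  { "Authorization" => "Bearer #{access_token}" }
end
```

Send the body to `/oauth/token`, read `access_token` from the JSON response, and pass `bearer_header(...)` with each subsequent request.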
A Postman collection has been included for easier access.
The code and the database have been optimized, and indexes are created where necessary. However, there are further opportunities to improve the solution, such as refactoring the author into a separate class.
Currently the database has been designed to support indexing, and with Elasticsearch the search uses a NoSQL store to improve performance. If the solution has to handle a large amount of information, the hardware layer can be scaled to improve speed; and since Docker is already in place, the scaling can be automated with Kubernetes.
- Create an `author` model and move the author out of the `course` model
- A `course` could have multiple `categories`
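The author extraction suggested above could start with a migration and association along these lines; this is a sketch, and the column names are assumptions:

```ruby
# db/migrate/xxxx_create_authors.rb -- hypothetical sketch
class CreateAuthors < ActiveRecord::Migration[7.1]
  def change
    create_table :authors do |t|
      t.string :name, null: false
      t.timestamps
    end
    # Replace the inline author column on courses with a foreign key.
    add_reference :courses, :author, foreign_key: true
  end
end

# app/models/course.rb
class Course < ApplicationRecord
  belongs_to :author
end
```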
- Optimize the serializers to include only the required fields
- Optimize Elasticsearch so that enums such as the state are properly worded