techrail / bark
A small and easy-to-use tool that uses PostgreSQL for collecting logs from multiple sources
Home Page: https://techrail.in/projects/bark/what-is-bark
License: MIT License
The Bark client now uses the log/slog package, which is part of Go 1.21 and above. We need to update the required Go version in the README.
In production, we typically don't need debug level logs. The client should allow us to enable or disable debug level logs being sent from the client to whatever the destination is (server, stdout, file, etc.).
We need to create one more function in the client, like client.NewClientWithServer, which can take the logs and, instead of making a network call, directly start the routine that saves logs to the DB (the server-side routine) - bypassing the network call and REST API and sending the logs directly to the channel from which they are written to the DB.
Develop a LogManager (or LogClient) package that can be used by an end user (any Go developer).
Some relevant examples from other libraries that we can consider.

Java's logging API creates a logger object:

static Logger logObj = Logger.getLogger(Random.class.getName());

and then uses logObj to write logs to the console or standard output.

Go's zap creates a logger:

logger, _ := zap.NewProduction()

and then uses logger's various methods to write the logs to the console or standard output.

Both of these libraries then have multiple methods to write various log levels.
We have to add more documentation in the README file.
The field session_name in the database (and the corresponding struct fields in the code) actually refers to the instance of the service that sent the log message to the server. The current name session_name is ambiguous, especially when we are talking about user login sessions and similar things. We should rename it to something which makes more sense, e.g. service_instance_name would be a good choice.
We need to create a docker container for bark so that the server can be started easily by others.
docker-compose.yml file: We are still referring to an older Docker image in the README.
Load the database URL from the environment variable instead of hardcoding it.
NOTE: A new issue (#67) was created to better capture the needs according to the existing structure.
Just like we have .error, .info, and .warn, we can also have a .webhook that will insert a log into the database and also call the webhook URL with that log. You can find more information about webhooks here.
These can be used when a critical part of your application fails. Instead of just logging to the database, you can also call a webhook to notify you immediately or run an automation.
The function may look something like this:
func (c *Config) Webhook(webhookURL, logMsg string) {
// ..logic
}
Currently, the project supports single inserts into the DB through endpoints. However, a set of logs needs to be appended in the most optimized way possible through batch inserts such as during graceful exits, multiple inserts, etc.
Adding a connection pool to reduce the overhead of opening and closing a connection. Database connections are cached so that any future request can reuse an existing connection.
We can use a simple boxed diagram for this!
We get an empty response body while trying to insert a log.
Here's the curl command for the above request:
curl --request POST \
--url http://127.0.0.1:8080/insertSingle \
--data '{
"id": -100,
"logLevel": "info",
"serviceName": "TestPostman",
"sessionName": "testSession",
"code": "1KE0H8",
"msg": "Shows up in DB or not?",
"moreData": {
"name":"vaibhav"
}
}'
This is true for all endpoints we have right now.
We should modify docker-compose.yml file to add Bark service.
The current implementation relies on the reflect package, which is suboptimal due to its impact on performance and maintainability. This issue proposes refactoring the existing code to replace reflection with sqlx, a more efficient and structurally sound approach that improves both performance and code maintainability.
Right now, we just have functions which accept a line of text. We also need a way for the user to send a full log struct (including the moreData field) to the server. We need to create that function in the client.
The MoreData member of BarkLog takes JSON as its value. We need to create functionality to allow the user to set this to any value of their choice. By default, it will be empty.
Something like:
more_data: {
    "errorFileName": "",
    "errorFunctionName": "",
    "errorLine": ""
}
Or anything relevant to the user's code base; this is just a random example.
We are performing way too many allocations in the StartWritingLogs function in the server log writer file. We need to optimize it so that high bursts of incoming log traffic can be ingested. Things we can try: inlining the BarkLogDao.InsertBatch function into the StartWritingLogs function - that should avoid the memory copy required for making the function call and the memory allocations happening inside the InsertBatch function.

Problem: Right now Bark exposes a simple HTTP endpoint to insert data into the database, but doesn't have a package that can be imported into another project to be used directly.
Proposed behavior: Include a package logger which exposes a method GetLogger, which can be used to obtain a logger function that can be readily used in other projects just by importing the logger package.
Example:
log := logger.GetLogger(serviceName)
Then log is used like this:
bark(INT_LEVEL, "code", "message", some_json_data)
In the world of PostgreSQL, we have two very famous connectors:
Now, bark is supposed to be fast. So we need to know which one is fast and by what margin. There are other questions around PostgreSQL connections, pooling, support for goroutines, error handling etc. but for now we need to just check the performance difference between the two libraries!
We want to implement the slog interface to handle structured logging in Bark.
A webhook is a function that is to be supplied by the user. The usage is tied to the ALERT log level, so that when the client (and maybe someday the server too) encounters an ALERT level log, it will call the supplied webhook function. The purpose of this webhook is to call an external service, primarily for alerting the user of a near-fatal event, something that needs to be looked into urgently.
Now, different people have different needs. My alerting mechanism might be to send the event to Slack, while another person or team might want to send the alert to PagerDuty, while someone else might want to trigger an email to their SRE team, or whatever. We can't determine what the user will do.
So the best we can do is to leave the option open - let the user write the function that does what they need. The authentication mechanism, the method to call (HTTP/GRPC/send to another channel etc.) is open to their choices. Hence that choice.
That is the reason the webhook type was defined (and is currently commented out) in the client code as:
type webhook func(models.BarkLog) error
and was added to the Client struct type as:
AlertWebhook webhook
and the user function to hook their method to the client was defined like this:
func (c *Config) SetAlertWebhook(f webhook) {
c.AlertWebhook = f
}
We need to write the mechanism to call this function when an ALERT level log is received.
We need to set up GitHub Actions for automated testing in our repository to ensure code quality and reliability. Automated testing is a crucial part of our development workflow and will help catch issues early, reduce manual testing efforts, and maintain the overall health of our project.
The name of this variable might easily clash with env vars of other projects. The goal of this task is to prefix that variable with BARK_.
The default flow of bark after collecting logs is to send them to the app_log table. However, if the user requires the logs to be written to a file, they should be able to configure it.
Using a simple variable while creating the logger, we can ask for user input: WriteToDB? Yes/No.
If it is no, then the logs, after being sent to the channel, will be written to a .txt file.
We should write an injector which will be able to read a log file that was output by Uber's zap library (running in Production mode) and insert the logs into bark's database.
This will enable anyone with Bark codebase to see the request signature for Bark easily.
We need to create the Dockerfile for bark.
Addition of a contributing markdown file in the project.
Let's see if it fires!
Modify endpoint / to provide current information including, but not limited to, app version, app name, etc.
We need to create a new script to release new versions. It should:
- Tag the latest commit on the main branch with the new version number and push it to GitHub
- For the binaries: the output name for each binary should reflect the platform and
- For the Docker images: the images are supposed to have the right tag (same as the version number)
Then it should be able to tag the builds and push to Docker Hub!
It currently sets the default log level to constants.DefaultLogLevel, whereas it must set it to what was set by the user when creating a new log client (using the client.New function).
We need to create a demo application that will showcase the capability of Bark. The application may highlight the following features:
This demo application will serve as a powerful example for developers, demonstrating the benefits and advantages of using Bark as a logging library and utilizing PostgreSQL for log storage.