
ecoindex_api's Introduction

⚠️ This project is being replaced by ecoindex_python_fullstack and will be archived soon. Please refer to the new project for any contribution or issue. ⚠️

Ecoindex-Api

This tool provides an easy way to analyze websites with Ecoindex on a remote server. You have the ability to:

  • Make a page analysis
  • Define screen resolution
  • Save results to a DB
  • Retrieve results
  • Limit the number of requests per day for a given host
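As an illustration of the first two capabilities, here is a minimal sketch of building an analysis request body. The field names follow the width/height/url parameters used elsewhere on this page, but the exact schema should be checked against the OpenAPI specification:

```python
import json

def build_analysis_request(url: str, width: int = 1920, height: int = 1080) -> str:
    """Build a JSON body for a page-analysis request.

    Field names are assumptions based on the parameters mentioned in
    this README; refer to the OpenAPI spec for the actual schema.
    """
    return json.dumps({"url": url, "width": width, "height": height})

body = build_analysis_request("https://www.example.com")
print(body)
```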

This API is built on top of ecoindex-scraper with FastAPI and Celery.

OpenAPI specification

The API specification can be found in the documentation. You can also access it with Redoc.

Requirements

Installation

This Docker setup starts 6 services that together provide a complete working stack:

  • db: A MySQL instance
  • api: The API instance running the FastAPI application
  • worker: The Celery task worker that runs the ecoindex analyses
  • redis (optional): The Redis instance used by the Celery worker
  • flower (optional): The Celery monitoring interface
  • db-backup (optional): A utility that handles automatic database backups

First start

cp docker-compose.yml.dist docker-compose.yml && \
docker compose up -d --build

Then you can go to:

Upgrade

To upgrade your server version, you have to:

  1. Checkout the source version you want to deploy
  2. Re-build the server
  3. Re-start the server
git pull && \
docker compose up -d --build

We use Alembic to handle database migrations. Migrations are automatically applied at instance startup.

Configuration

Here are the environment variables you can configure:

  • WAIT_BEFORE_SCROLL (API, default: 3): Wait time of the scenario, once a page is loaded, before it scrolls down to the bottom of the page
  • WAIT_AFTER_SCROLL (API, default: 3): Wait time of the scenario, once a page is loaded, after having scrolled down to the bottom of the page
  • CORS_ALLOWED_CREDENTIALS (API, default: True): See MDN web doc
  • CORS_ALLOWED_HEADERS (API, default: *): See MDN web doc
  • CORS_ALLOWED_METHODS (API, default: *): See MDN web doc
  • CORS_ALLOWED_ORIGINS (API, default: *): See MDN web doc
  • DAILY_LIMIT_PER_HOST (API, Worker, default: 0): When this variable is set, a given host cannot make more requests per day than the configured value, to avoid overload. When set, responses include a header such as x-remaining-daily-requests: 6. It applies to the POST methods. Once you reach your authorized request quota for the day, further requests get a 429 response. If the variable is set to 0, no limit is applied
  • DATABASE_URL (API, Worker, default: sqlite+aiosqlite:///./sql_app.db): If you run your MySQL instance on a dedicated server, configure it here with your credentials. By default, an SQLite database is used when running locally
  • WORKER_BROKER_URL (API, Worker, default: redis://localhost:6379/0): The URL of the Redis broker used by Celery
  • WORKER_BACKEND_URL (API, Worker, default: redis://localhost:6379/1): The URL of the Redis backend used by Celery
  • ENABLE_SCREENSHOT (Worker, default: False): If screenshots are enabled, the image is generated during the analysis in the ./screenshot directory, named after the analysis ID, and is available at the path /{version}/ecoindexes/{id}/screenshot
  • CHROME_VERSION (Worker, default: 107.0.5304.121-1): The version of Chrome to download and run. Can be removed if you want to install the latest version of Chrome
  • CHROME_VERSION_MAIN (Worker, default: 107): The major version of Chrome, used for chromedriver. It must match CHROME_VERSION. If you remove only one of CHROME_VERSION and CHROME_VERSION_MAIN, or if they do not match, chromedriver will fail
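The DAILY_LIMIT_PER_HOST behaviour can be sketched as a simple in-memory counter. This is illustrative only: the actual API counts requests in the database, and the class and method names below are invented:

```python
from collections import defaultdict
from datetime import date

class DailyHostQuota:
    """Illustrative per-host daily quota. limit=0 disables the check,
    matching the DAILY_LIMIT_PER_HOST=0 default described above."""

    def __init__(self, limit: int = 0):
        self.limit = limit
        self.counts = defaultdict(int)  # (host, day) -> request count

    def remaining(self, host: str):
        if self.limit == 0:
            return None  # no limit configured
        return self.limit - self.counts[(host, date.today())]

    def register(self, host: str):
        """Record one request; raise once the quota is exhausted
        (the API would answer 429 in that case)."""
        left = self.remaining(host)
        if left is not None and left <= 0:
            raise RuntimeError("429 Too Many Requests")
        if left is not None:
            self.counts[(host, date.today())] += 1
        return self.remaining(host)

quota = DailyHostQuota(limit=2)
print(quota.register("example.com"))  # 1 request left today
```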

Local development

You can use docker-compose.override.yml to override the default configuration of the docker-compose.yml file. For example, you can use it to mount your local code in the container and run the server in debug mode.

cp docker-compose.override.yml.dist docker-compose.override.yml

Then you can run the server:

docker-compose up -d --build

Testing

We use Pytest to run unit tests for this project. The test suite is in the tests folder. Just execute:

poetry run pytest --cov-report term-missing:skip-covered --cov=. --cov-config=.coveragerc tests

This runs pytest and also generates a coverage report (terminal and HTML)

Disclaimer

The LCA values used by ecoindex_api to evaluate environmental impacts are not under a free license - ©Frédéric Bordage. Please also refer to the mentions provided in the code files for specifics on the IP regime.

ecoindex_api's People

Contributors

dependabot[bot], github-actions[bot], vvatelot


ecoindex_api's Issues

Endpoint for min, max and median values

In order to

Retrieve the minimum, maximum and median values of page sizes, numbers of elements per page and numbers of requests per page

I want to

Create a new endpoint GET /{version}/ecoindexes/statistics that will return the following structure:

{
  "dom": {
    "min": 1,
    "max": 10000,
    "median": 450
  },
  "request": {
    "min": 1,
    "max": 10000,
    "median": 450
  },
  "size": {
    "min": 10,
    "max": 10000,
    "median": 450
  }
}

Infos

  • This endpoint must allow retrieving these figures per version
  • The data could be cached for some time (one day?) to avoid unnecessary database calls
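A sketch of the computation behind such a statistics endpoint, using Python's statistics module over in-memory rows. The field names nodes/requests/size follow the result model visible in the logs on this page; the real endpoint would query the database:

```python
import statistics

def ecoindex_statistics(rows):
    """Compute the min/max/median structure proposed above from
    analysis rows holding 'nodes', 'requests' and 'size' fields."""
    out = {}
    for key, field in (("dom", "nodes"), ("request", "requests"), ("size", "size")):
        values = [row[field] for row in rows]
        out[key] = {
            "min": min(values),
            "max": max(values),
            "median": statistics.median(values),
        }
    return out

sample = [
    {"nodes": 100, "requests": 10, "size": 500.0},
    {"nodes": 450, "requests": 30, "size": 900.0},
    {"nodes": 9000, "requests": 80, "size": 4000.0},
]
print(ecoindex_statistics(sample)["dom"])  # {'min': 100, 'max': 9000, 'median': 450}
```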

Error code from the API (425 Too Early)

Hello team,

I get error codes from the server or the API when I try to query the results of a page.

On Flower, the status is "success" and I can see the result, but through the API it is impossible to get a result, and I get this status code: "425 Too Early"

Can anyone help me?

Maxence.

Page ranking

In order to

Situate our page analysis within a panel of analyses

I want to

Indicate the "ranking" of this analysis within the total sample

Infos

  • I want to modify the data model of an analysis result for the GET /ecoindexes/{id} request to add an initial_ranking field
  • Store this initial_ranking value in the database as JSON, of the form
{
  "rank": 1234,
  "total": 987654
}
  • Should we also add a ranking field to return the ranking of this analysis at a given point in time?
  • An initialization command must also be created to handle all the analyses already stored in the database
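The initial_ranking value could be computed as follows. This is a sketch under the assumption that a higher score ranks better; the function name is illustrative:

```python
def initial_ranking(score, all_scores):
    """Rank an analysis among all recorded scores (higher score = better
    rank). Produces the {"rank": ..., "total": ...} JSON proposed above."""
    better = sum(1 for s in all_scores if s > score)
    return {"rank": better + 1, "total": len(all_scores)}

print(initial_ranking(72.0, [90.0, 72.0, 55.0, 40.0]))  # {'rank': 2, 'total': 4}
```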

[Bug]: Some overly long URLs are not stored in the database

What happened?

Hello,

I am reporting the bug I had mentioned to you during the video call.

The URL seems far too long for the MySQL column (272 characters).

Thanks.
Maxence.

Version

^3.6

What OS do you use?

Linux

urls

https://www.edf.gp/edf-en-guadeloupe/l-actualite-dans-l-archipel-guadeloupe/toutes-les-actualites/edf-archipel-guadeloupe-partenaire-de-la-10eme-edition-nationale-des-semaines-de-sensibilisation-des-jeunes-a-l-entrepreneuriat-portee-par-l-association-100-000-entrepreneurs

Relevant log output

File "/usr/local/lib/python3.11/site-packages/aiomysql/connection.py", line 641, in _read_packet
    packet.raise_for_error()
  File "/usr/local/lib/python3.11/site-packages/pymysql/protocol.py", line 221, in raise_for_error
    err.raise_mysql_exception(self._data)
  File "/usr/local/lib/python3.11/site-packages/pymysql/err.py", line 143, in raise_mysql_exception
    raise errorclass(errno, errval)
sqlalchemy.exc.DataError: (pymysql.err.DataError) (1406, "Data too long for column 'url' at row 1")
[SQL: INSERT INTO apiecoindex (width, height, url, size, nodes, requests, grade, score, ges, water, ecoindex_version, date, page_type, id, host, version, initial_ranking, initial_total_results) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)]
[parameters: (1920, 1080, AnyHttpUrl('https://www.edf.gp/edf-en-guadeloupe/l-actualite-dans-l-archipel-guadeloupe/toutes-les-actualites/edf-archipel-guadeloupe-partenaire-de-la-10eme-edition-nationale-des-semaines-de-sensibilisation-des-jeunes-a-l-entrepreneuriat-portee-par-l-association-100-000-entrepreneurs', ), 831.317, 390, 33, 'B', 72.0, 1.56, 2.34, '5.4.1', datetime.datetime(2023, 3, 24, 8, 49, 25, 671085), 'article', 'eaa315f9ac35405190ac923be9e84aa8', 'www.edf.gp', 1, 316, 1085)]
(Background on this error at:

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]:

What happened?

Hello,

after an API call to /v1/tasks/ecoindexes, my task goes to FAILURE status. Using Flower I noticed a typing error on the Chrome version. To get past this, I added the env var CHROME_VERSION_MAIN to my .env file. This allowed me to move on to the next stage and obtain a SUCCESS. Unfortunately, the result is still a failure:

'{"status": "FAILURE", "detail": null, "error": {"detail": {"args": [], "exception": "WebDriverException", "message": "unknown error: cannot connect to chrome at 127.0.0.1:59747\nfrom chrome not reachable"}, "exception": "WebDriverException", "message": "unknown error: cannot connect to chrome at 127.0.0.1:59747\nfrom chrome not reachable", "status_code": 500, "url": "https://www.apple.com"}}'

Could you help me?

Thank you

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Get screenshot of the analyzed page

In order to

Get proof that the analyzed page has really been loaded

I want to

Get a screenshot when the page is loaded and save it to the server in a public path

Infos

  • Name the image {id}.jpg using the save_screenshot method from the undetected chrome driver
  • Store it in a public folder
  • Create a volume mount in docker-compose.yml.dist
  • Create a FastAPI static mount point named /static
  • Add the URL to the response object

Add notification on error

In order to

Be notified when something goes wrong

I want to

Be notified, through some channel, when a system error occurs

Ideas

  • Could be a Mattermost webhook triggered on Exception

Set user agent parameter for selenium

In order to

Run analyses with a valid user-agent

I want to

Allow users to define their own user-agent in the POST /v1/ecoindexes request

Technical details

  • Read the x-user-agent header from the request
  • Set a default user agent
  • Pass this user-agent to ecoindex get_page_analysis
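The steps above can be sketched as follows; the default user-agent string and the helper name are invented for illustration:

```python
DEFAULT_USER_AGENT = "EcoindexBot/1.0 (illustrative default)"

def resolve_user_agent(headers):
    """Pick the user agent for the analysis: use the x-user-agent
    request header when present, else fall back to a default
    (header lookup is case-insensitive, as in HTTP)."""
    for name, value in headers.items():
        if name.lower() == "x-user-agent" and value:
            return value
    return DEFAULT_USER_AGENT

print(resolve_user_agent({"X-User-Agent": "MyBot/1.0"}))  # MyBot/1.0
```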

[Bug]: The API returns the error "You can't write against a read only replica"

What happened?

After installing the service on Scaleway (the problem was the same on Digital Ocean)

After a few days of use,
a call to the API returns the following message:
"You can't write against a read only replica"

(full response: { ["detail"]=> object(stdClass)#15113 (3) { ["args"]=> array(1) { [0]=> string(44) "You can't write against a read only replica." } ["exception"]=> string(16) "OperationalError" ["message"]=> NULL } } )

Apparently the problem is related to Redis and the public accessibility of port 6379

One identified solution would be:

Remediation of this issue will take just a few minutes and is relatively straightforward. You will need to open /etc/redis/redis.conf and uncomment (remove the “#”) or modify the line beginning with:

#bind 127.0.0.1 ::1

Afterwards, restart redis with:

sudo systemctl restart redis

Thanks for your help

Available if needed

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Enhancement]: Return a result when we get a 429 code

What happened?

Return the last known result instead of a 429 error.
Or return a 429 error while still sending a result.

Version

^3.6

What OS do you use?

No response

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: url length limit

What happened?

Hello, I am contacting you to report a bug in your API: the official maximum URL length is 2083 characters, and your API does not accept that size (below is a URL of 379 characters).

height 1960
url "https://geo.captcha-delivery.com/captcha/?initialCid=AHrlqAAAAAMAy5ctdzFrZe4Awsd1kw%3D%3D&hash=05B30BD9055986BD2EE8F5A199D973&cid=u.Olw7yzo_CdpULe.-yA3S28H_64TvF2acb1NxBopd-ZAqXMMRD1TeiM69GGPHAkklgHIdirA-qQIxvbJhrjeVy9IE1rOwH9imodn~PBuwB1FZdOWpp3xL_ocqweuG_&t=fe&referer=https%3A%2F%2Fwww.leboncoin.fr%2F&s=2089&e=8302bf5d5a4c3eef1bc6f25ed604931c7605f831275f3bcaa1d07e423e14d75a"
width 1080

detail Object { exception: "DataError", args: […], message: null }
args [ "(pymysql.err.DataError) (1406, "Data too long for column 'url' at row 1")" ]
exception "DataError"
message null

Version

^3.6

What OS do you use?

Windows

urls

https://geo.captcha-delivery.com/captcha/?initialCid=AHrlqAAAAAMAy5ctdzFrZe4Awsd1kw%3D%3D&hash=05B30BD9055986BD2EE8F5A199D973&cid=u.Olw7yzo_CdpULe.-yA3S28H_64TvF2acb1NxBopd-ZAqXMMRD1TeiM69GGPHAkklgHIdirA-qQIxvbJhrjeVy9IE1rOwH9imodn~PBuwB1FZdOWpp3xL_ocqweuG_&t=fe&referer=https%3A%2F%2Fwww.leboncoin.fr%2F&s=2089&e=8302bf5d5a4c3eef1bc6f25ed604931c7605f831275f3bcaa1d07e423e14d75a

Relevant log output

detail	Object { exception: "DataError", args: […], message: null }
args	[ "(pymysql.err.DataError) (1406, \"Data too long for column 'url' at row 1\")" ]
exception	"DataError"
message	null

Code of Conduct

  • I agree to follow this project's Code of Conduct

Add endpoint to list all hosts

In order to

Know on which domains an analysis has already been made

I want to

Provide an endpoint /hosts to retrieve the list of hosts recorded in the DB

Technical details

  • Method: GET
  • Filters: q (query filter), date_from and date_to
  • Order by Ascending name
  • Activate pagination
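A sketch of the filtering, ordering, and pagination described above, over in-memory rows. The real endpoint would express this as a database query; the names are illustrative:

```python
from datetime import date

def list_hosts(rows, q=None, date_from=None, date_to=None, page=1, size=50):
    """Illustrative /hosts listing: filter analysis rows by substring (q)
    and date range, return distinct hosts sorted ascending, paginated."""
    hosts = set()
    for row in rows:
        if q and q not in row["host"]:
            continue
        if date_from and row["date"] < date_from:
            continue
        if date_to and row["date"] > date_to:
            continue
        hosts.add(row["host"])
    ordered = sorted(hosts)  # ascending name order
    start = (page - 1) * size
    return ordered[start:start + size]

rows = [
    {"host": "www.example.com", "date": date(2023, 1, 2)},
    {"host": "blog.example.com", "date": date(2023, 2, 3)},
    {"host": "other.org", "date": date(2023, 3, 4)},
]
print(list_hosts(rows, q="example"))  # ['blog.example.com', 'www.example.com']
```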

[Bug]: Too many URLs kill the analysis

What happened?

It seems that the Chrome windows do not close during scraping when there are errors on the API or scraping side. I already reported an error in issue #316, but it is becoming problematic when analyzing many URLs (around 5000).

My docker-compose.override.yml:

x-logging: &default-logging
  options:
    max-size: "12m"
    max-file: "5"
  driver: json-file
services:
  worker:
    entrypoint: |
      bash -c  "celery -A worker.tasks worker -P threads --concurrency=1"
    deploy:
      replicas: 3
  api:
    logging: *default-logging
  flower:
    logging: *default-logging

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Healthcheck chromedriver

In order to

Have a check on the chromedriver

I want to

Add chromedriver information to the healthcheck endpoint

Info

If the healthcheck is OK, the value of the chromedriver field is true, else false

Endpoint to get remaining daily analysis

In order to

Know how many analyses I can run for a given host today

I want to

Have a dedicated endpoint

Details

Create an endpoint /v1/hosts/{host}/remaining that replies with an integer
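The endpoint's reply can be reduced to one computation. This is a sketch; the real value would come from counting today's requests for the host in the database:

```python
def remaining_quota(daily_limit, requests_today):
    """Integer returned by the proposed /v1/hosts/{host}/remaining
    endpoint: how many analyses are still allowed today (never negative)."""
    return max(daily_limit - requests_today, 0)

print(remaining_quota(10, 4))  # 6
```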

[Bug]: Error during deployment - ERROR [worker requirements-stage 6/9]

Hello!

What happened?

From the main branch

During the build: docker compose up -d --build

This error appears and blocks the deployment:

=> ERROR [worker requirements-stage 6/9] RUN poetry export --with=worker --output=requirements.txt --without-hashes


[worker requirements-stage 6/9] RUN poetry export --with=worker --output=requirements.txt --without-hashes:
#0 0.890
#0 0.891 Group(s) not found: worker (via --with)


failed to solve: process "/bin/sh -c poetry export --with=worker --output=requirements.txt --without-hashes" did not complete successfully: exit code: 1

I did not have this error on previous deployments.

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

=> ERROR [worker requirements-stage 6/9] RUN poetry export --with=worker --output=requirements.txt --without-hashes

------
 > [worker requirements-stage 6/9] RUN poetry export --with=worker --output=requirements.txt --without-hashes:
#0 0.890 
#0 0.891 Group(s) not found: worker (via --with)
------
failed to solve: process "/bin/sh -c poetry export --with=worker --output=requirements.txt --without-hashes" did not complete successfully: exit code: 1

Thank you very much for your code and your help

Code of Conduct

  • I agree to follow this project's Code of Conduct

Stats of queued analysis

In order to

Follow analysis results

I want to

Be able to retrieve statistics and details about analyses in the Celery queue

Infos

  • Get the count of tasks / successful tasks / failed tasks
  • Get the count of failed tasks by reason
  • Expose an API endpoint? A CLI?

[Bug]: Pagination is slow

What happened?

When requesting GET /v1/ecoindexes we use the plugin fastapi-pagination

Problem: this fetches all the results from the DB and then sends the filtered response. When you have a lot of records in your DB, the process is resource-intensive
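One common fix, shown here as a sketch with SQLite, is to push LIMIT/OFFSET into the SQL query so that only one page is fetched from the database. Table and column names are simplified from the INSERT statement visible elsewhere on this page:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE apiecoindex (id INTEGER PRIMARY KEY, url TEXT)")
conn.executemany(
    "INSERT INTO apiecoindex (url) VALUES (?)",
    [(f"https://example.com/page-{i}",) for i in range(250)],
)

def get_page(page, size=50):
    """Fetch only one page from the database instead of loading every
    row and slicing in Python (the behaviour reported above)."""
    return conn.execute(
        "SELECT id, url FROM apiecoindex ORDER BY id LIMIT ? OFFSET ?",
        (size, (page - 1) * size),
    ).fetchall()

print(len(get_page(1)), get_page(3)[0][0])  # 50 rows per page; page 3 starts at id 101
```

A total count for the pagination envelope can be obtained with a separate `SELECT COUNT(*)`, which is still far cheaper than materializing every row.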

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Add a check on the chrome browser in healthcheck

In order to

Improve the healthcheck coverage

I want to

Add a check on the availability of the chrome browser on the system

Technical details

  • https://github.com/Kludex/fastapi-health is already implemented in api/main.py
  • You have to take into account the env var REMOTE_CHROME_URL: if it is defined, check that the remote Chrome is up; otherwise, check that the Chrome binary is present on the system
  • You also have to check that the chromedriver-binary version == the Chrome binary version
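The chromedriver/chrome version check can be sketched as a major-version comparison; the helper name is illustrative:

```python
def versions_match(chrome_version, chromedriver_version):
    """Healthcheck helper: compare the major versions of the Chrome
    binary and chromedriver (e.g. '107.0.5304.121' vs '107.0.5304.62')."""
    return chrome_version.split(".")[0] == chromedriver_version.split(".")[0]

print(versions_match("107.0.5304.121", "107.0.5304.62"))  # True
```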

[Bug]: Error 500 when the daily quota is reached

What happened?

When the daily quota is reached, we get a 500 error with a QuotaExceededException

Version

^3.6

What OS do you use?

Linux

urls

https://novagaia.fr

Relevant log output

raise QuotaExceededException(common.exception.QuotaExceededException: You have already reached the daily limit of 10 requests for host novagaia.fr today)
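A sketch of mapping the exception to a 429 instead of a 500; this is a pure-Python stand-in, not the project's actual FastAPI exception handler:

```python
class QuotaExceededException(Exception):
    """Stand-in for common.exception.QuotaExceededException."""

def handle(exc):
    """Map the quota error to a 429 response instead of letting it
    bubble up as a 500 (illustrative handler, not the project's code)."""
    if isinstance(exc, QuotaExceededException):
        return 429, {"detail": str(exc)}
    return 500, {"detail": "Internal server error"}

status, body = handle(QuotaExceededException("daily limit of 10 reached"))
print(status)  # 429
```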

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Enhancement]: Improve the API responses

No measurement available for https://www.yourdomain.tld

https://bff.ecoindex.fr/api/results/?url=https://novagaiaa.fr/test
The API returns a 404 error, but with an empty data tree.

{
  "count": 0,
  "latest-result": {
    "id": "",
    "grade": "",
    "score": 0,
    "date": "",
    "requests": 0,
    "size": 0,
    "nodes": 0,
    "url": "",
    "color": ""
  },
  "older-results": null,
  "host-results": null
}

Proposal

  • Use code 204
  • Do not return an empty tree, but a message.
{
  "code": 204,
  "message": "No measurement available for https://www.yourdomain.tld"
}

No measurement available for the page https://www.yourdomain.tld/test

https://bff.ecoindex.fr/api/results/?url=https://novagaia.fr/test
The API returns a 200, but with an empty latest-result.

{
  "count": 17,
  "latest-result": {
    "id": "",
    "grade": "",
    "score": 0,
    "date": "",
    "requests": 0,
    "size": 0,
    "nodes": 0,
    "url": "",
    "color": ""
  },
  "older-results": null,
  "host-results": [
    {
      "id": "46df881f-0cc3-4153-9fa9-871f65b22362",
      "grade": "B",
      "score": 74,
      "date": "2023-03-21T22:30:57",
      "requests": 39,
      "size": 1210.65,
      "nodes": 264,
      "url": "https://novagaia.fr/offres/creez-un-site-web-performant-et-moins-polluant/",
      "color": "#51B84B"
    },
    {
      "id": "430b0d3c-0bd2-4b75-bdea-8d410f2615d3",
      "grade": "B",
      "score": 72,
      "date": "2023-03-21T22:29:10",
      "requests": 55,
      "size": 779.939,
      "nodes": 273,
      "url": "https://novagaia.fr/",
      "color": "#51B84B"
    },
    // ...
  ]
}

Proposal

  • Return a 204 code;
  • A message;
  • Do not return latest-result.

I don't know how to handle this, other than testing that the id is empty...

{
  "code": 204,
  "message": "No measurement available for the page https://www.yourdomain.tld/test",
  "count": 17,
  "latest-result": {
    "id": "",
    "grade": "",
    "score": 0,
    "date": "",
    "requests": 0,
    "size": 0,
    "nodes": 0,
    "url": "",
    "color": ""
  },
  "older-results": null,
  "host-results": [
    {
      "id": "46df881f-0cc3-4153-9fa9-871f65b22362",
      "grade": "B",
      "score": 74,
      "date": "2023-03-21T22:30:57",
      "requests": 39,
      "size": 1210.65,
      "nodes": 264,
      "url": "https://novagaia.fr/offres/creez-un-site-web-performant-et-moins-polluant/",
      "color": "#51B84B"
    },
    {
      "id": "430b0d3c-0bd2-4b75-bdea-8d410f2615d3",
      "grade": "B",
      "score": 72,
      "date": "2023-03-21T22:29:10",
      "requests": 55,
      "size": 779.939,
      "nodes": 273,
      "url": "https://novagaia.fr/",
      "color": "#51B84B"
    },
    // ...
  ]
}

Reaching the limit of 10 tests/day/domain

I get a 500 error, no information, and the Chrome extension cannot handle it, it just spins on nothing...

Proposal

  • error code: 429
  • message: The domain https://www.yourdomain.tld/test has reached the n measurements offered by Ecoindex for the day DD/MM/YY
  • number of tests remaining for this domain for the day.
{
  "code": 429,
  "message": "The domain https://www.yourdomain.tld/test has reached the `n` measurements offered by Ecoindex for the day DD/MM/YY",
  "count": 17,
  "latest-result": {
    "id": "",
    "grade": "",
    "score": 0,
    "date": "",
    "requests": 0,
    "size": 0,
    "nodes": 0,
    "url": "",
    "color": ""
  },
  "older-results": null,
  "host-results": [
    {
      "id": "46df881f-0cc3-4153-9fa9-871f65b22362",
      "grade": "B",
      "score": 74,
      "date": "2023-03-21T22:30:57",
      "requests": 39,
      "size": 1210.65,
      "nodes": 264,
      "url": "https://novagaia.fr/offres/creez-un-site-web-performant-et-moins-polluant/",
      "color": "#51B84B"
    },
    {
      "id": "430b0d3c-0bd2-4b75-bdea-8d410f2615d3",
      "grade": "B",
      "score": 72,
      "date": "2023-03-21T22:29:10",
      "requests": 55,
      "size": 779.939,
      "nodes": 273,
      "url": "https://novagaia.fr/",
      "color": "#51B84B"
    },
    // ...
  ]
}

Requesting a measurement of a URL

Why receive a task number (it should be JSON), rather than wait and return the tree?

"35c64f8e-9bca-4c0d-b5db-9dda4d1c6fe5"

Proposal

  • return code 201
  • return the tree filled by the current test in result;
  • indicate the number of remaining tests in testLeftToday.
{
  "code": 201,
  "testLeftToday": 4,
  "count": 17,
  "result": {
      "id": "46df881f-0cc3-4153-9fa9-871f65b22362",
      "grade": "B",
      "score": 74,
      "date": "2023-03-21T22:30:57",
      "requests": 39,
      "size": 1210.65,
      "nodes": 264,
      "url": "https://novagaia.fr/offres/creez-un-site-web-performant-et-moins-polluant/",
      "color": "#51B84B"
  }
}

Migration management

Use Alembic to manage database changes and automate deployments:

  • Install Alembic
  • Configure it to generate migration files
  • Generate an initialization migration
  • Modify the Docker build to automatically apply any pending migrations

[Bug]: Out of memory: killed process xxxxx (chrome)

What happened?

After some scans (about 15-20) the server overloads.
The API runs on an ESX server with 4 GB of RAM and 4 vCPUs. I increased it to 12 GB of RAM and 8 vCPUs, but the problem occurs again after some scans. If I connect to the Flower interface (localip:5555) I see the tasks running well and completing in 5-10 seconds; then the last one takes 400+ seconds, results in a failure, and the interface becomes inaccessible.
The only way to fix it is to restart the virtualized machine.

I also did a fresh install, a clean Debian with only the needed dependencies, but the behavior is the same.

The tasks/scans are submitted one by one: after the last one finishes and gets a result, a new one is submitted. This error first appeared 2 months ago and is still present today.

Attached is the output from: ps -aux

What output should/can I attach to give you more feedback?

Thanks in advance
ps -aux.txt

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: Handle `ERR_SSL_PROTOCOL_ERROR`

What happened?

When an error occurs on a website with a certificate issue, we get an ERR_SSL_PROTOCOL_ERROR

Version

^3.6

What OS do you use?

Linux

urls

https://kliste.fr

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: CORS

What happened?

CORS error when calling the API from a frontend on a different origin than the API URL

Version

^3.6

What OS do you use?

Linux

urls

No response

Relevant log output

Blocage d’une requête multiorigines (Cross-Origin Request) : la politique « Same Origin » ne permet pas de consulter la ressource distante située sur https://ecoindex.my.hosting:8001/v0/ecoindexes?page=1&size=50. Raison : l’en-tête CORS « Access-Control-Allow-Origin » est manquant. Code d’état : 200.

Code of Conduct

  • I agree to follow this project's Code of Conduct
