
cyhy-core's Introduction

CISA: Cyber Hygiene Core Libraries

This project contains the core libraries and executables for the CISA Cyber Hygiene program. It coordinates the multiple scanners and allows the creation of pretty reports.

CyHy configuration

The cyhy-core library requires a configuration file to be created. The default location for this file is /etc/cyhy/cyhy.conf. An example configuration is shown below.

/etc/cyhy/cyhy.conf

[DEFAULT]
default-section = cyhy-ops-ssh-tunnel-docker-mac
report-key = master-report-password

[cyhy-ops-ssh-tunnel-docker-mac]
database-uri = mongodb://<MONGO_USERNAME>:<MONGO_PASSWORD>@host.docker.internal:27017/cyhy
database-name = cyhy

[cyhy-ops-ssh-tunnel-docker]
database-uri = mongodb://<MONGO_USERNAME>:<MONGO_PASSWORD>@localhost:27017/cyhy
database-name = cyhy

[cyhy-ops-staging-read]
database-uri = mongodb://<MONGO_USERNAME>:<MONGO_PASSWORD>@database1:27017/cyhy
database-name = cyhy

Using CyHy commands with Docker

The CyHy commands implemented in the Docker container can be aliased into the host environment by using the procedure below.

Alias the container commands to the local environment:

eval "$(docker run cisagov/cyhy-core)"

To run a CyHy command:

cyhy-tool status NASA

Caveats and gotchas

Whenever an aliased CyHy command is executed, it will use the current working directory as its home volume. This limits your ability to use absolute paths as parameters to commands, or relative paths that reference parent directories, e.g., ../foo. That means all path parameters to a CyHy command must be in the current working directory or a subdirectory.

Do this?  Command                                 Reason
Yes       cyhy-import NASA.json                   parameter file is in the current directory
Yes       cyhy-import space_agencies/NASA.json    parameter file is in a subdirectory
NO!       cyhy-import ../WH.json                  parameter file is in a parent directory
NO!       cyhy-import /tmp/SPACE_FORCE.json       parameter file is an absolute path

Advanced configuration

By default, the container will look for your CyHy configurations in /etc/cyhy. This location can be changed by setting the CYHY_CONF_DIR environment variable to point to your CyHy configuration directory. The commands will also attempt to run using the cisagov/cyhy-core image. A different image can be used by setting the CYHY_CORE_IMAGE environment variable to the image name.

Example:

export CYHY_CONF_DIR=/private/etc/cyhy
export CYHY_CORE_IMAGE=cisagov/cyhy-core

Building the cyhy-core container

If you want to include a MaxMind GeoIP2 database in the Docker image, you must provide a license key, and optionally the type of key, to Docker. The default type is "lite", but if you have a subscription license you can instead use the "full" type.

The following commands show how to build the Docker container for cyhy-core.

No MaxMind license

docker build --tag cisagov/cyhy-core .

MaxMind GeoLite2 license

docker build --tag cisagov/cyhy-core \
             --build-arg maxmind_license_key=<license key> .

MaxMind GeoIP2 license

docker build --tag cisagov/cyhy-core \
             --build-arg maxmind_license_type="full" \
             --build-arg maxmind_license_key=<license key> .

WARNING: Be careful: you really don't want to push a Docker image to the public Docker Hub if it was built with your MaxMind license key!

Building a cyhy-core image for distribution

The helper script generate_cyhy_docker_image.sh can be used to automate building and saving a Docker image.

Usage:
  generate_cyhy_docker_image.sh [options]

Options:
  -k, --maxmind-key KEY  The MaxMind GeoIP2 key to use (defaults to
                         retrieval from AWS SSM)
  -n, --image-name NAME  Image name to use [default: cisagov/cyhy-core].
  -t, --image-tag TAG    Image tag to use [default: latest].
  -h, --help             Display this message.

Notes:
- Requires Docker and optionally the AWS CLI to run.

Use default name and tag with MaxMind license key from AWS

./generate_cyhy_docker_image.sh

Use default name and tag with a provided MaxMind license key

./generate_cyhy_docker_image.sh --maxmind-key <license key>

Use specified name and default tag

./generate_cyhy_docker_image.sh --image-name cisagov/cyhy-env

Use default name and specified tag

./generate_cyhy_docker_image.sh  --image-tag testing

Use specified name and tag

./generate_cyhy_docker_image.sh --image-name cisagov/cyhy-env --image-tag testing

Manual installation

Required third party libraries can be installed using:

pip install --requirement requirements.txt

Required configuration: The commander will read /etc/cyhy/commander.conf. If you do not have this file, please create one (even if it is empty).

IP Address Geolocation Database: The geolocation database is not included in the source tree due to size and licensing. Please cd into the var directory and run the get-geo-db.sh script to get the latest database.

Development installation

If you are developing the source, the following installation will allow in-place editing with live updates to the libraries and command-line utilities.

sudo pip install numpy
sudo pip install geos
sudo pip install --requirement requirements.txt

cyhy-core's People

Contributors

dav3r, felddy, hillaryj, jeffkause, jsf9k, mcdonnnj, st0rmbl3ss3d


cyhy-core's Issues

Improve scan window validation in cyhy-import

Recently, a request document JSON file was imported using cyhy-import and it had an incorrect value in the hour field of one of the scan windows:

{ 
    "duration" : 9, 
    "start" : "57:00:00", 
    "day" : "Saturday"
}

cyhy-import happily imported this JSON to the database, but the commander immediately choked on it:

2020-05-23 14:16:47,427 CRITICAL cyhy_commander.commander - hour must be in 0..23: Saturday 57:00:00
2020-05-23 14:16:47,427 CRITICAL cyhy_commander.commander - Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/cyhy_commander/commander.py", line 459, in do_work
    self.__ch_db.balance_ready_hosts()
  File "/usr/local/lib/python2.7/dist-packages/cyhy/db/chdatabase.py", line 97, in balance_ready_hosts
    limits = self.request_limits()
  File "/usr/local/lib/python2.7/dist-packages/cyhy/db/chdatabase.py", line 71, in request_limits
    request["windows"], when
  File "/usr/local/lib/python2.7/dist-packages/cyhy/db/time_calc.py", line 19, in in_windows
    dt = parser.parse(parse_me)
  File "/usr/local/lib/python2.7/dist-packages/dateutil/parser/_parser.py", line 1374, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/dateutil/parser/_parser.py", line 657, in parse
    six.raise_from(ParserError(e.args[0] + ": %s", timestr), e)
  File "/usr/lib/python2.7/dist-packages/six.py", line 718, in raise_from
    raise value
ParserError: hour must be in 0..23: Saturday 57:00:00

We should update cyhy-import to validate the scan window JSON so that we can avoid problems like this in the future.
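
A minimal validation sketch (illustrative only, not the actual cyhy-import code; the field names follow the example above, and the duration bounds are an assumption):

    import re

    VALID_DAYS = {"Sunday", "Monday", "Tuesday", "Wednesday", "Thursday",
                  "Friday", "Saturday"}

    def validate_window(window):
        """Raise ValueError if a scan window document is malformed."""
        if window.get("day") not in VALID_DAYS:
            raise ValueError("invalid day: %r" % window.get("day"))
        start = window.get("start", "")
        match = re.match(r"^(\d{2}):(\d{2}):(\d{2})$", start)
        if not match:
            raise ValueError("start must look like HH:MM:SS: %r" % start)
        hour, minute, second = (int(g) for g in match.groups())
        if hour > 23 or minute > 59 or second > 59:
            raise ValueError("start time out of range: %r" % start)
        duration = window.get("duration")
        if not isinstance(duration, int) or not 1 <= duration <= 24:
            raise ValueError("duration must be a whole number of hours: %r" % duration)

    # The scan window above would be rejected before it ever reaches the commander:
    # validate_window({"duration": 9, "start": "57:00:00", "day": "Saturday"})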

For reference, see CYHYOPS-5536.

Fix GNIS_data_import.py script

When load_places.sh is run, it in turn runs GNIS_data_import.py three times to import two GNIS data files and then a custom places file. When run, the GNIS_data_import.py script checks whether a minimum number of places (currently 200,000) are already in the database and, if so, does not load the provided file.

This results in two issues:

  1. The custom places file is never loaded, because one of the GNIS files alone has enough entries to exceed the minimum.
  2. The database becomes stale, because subsequent runs with newly retrieved GNIS data insert nothing once the minimum number of entries has been reached by the initial population of the database.

New python module request for cyhy-core docker image

Requesting that BeautifulSoup (bs4) and lxml be added to the cyhy-core image.

This will assist scripts that need to perform web scraping, as bs4 is a useful module for parsing HTML results.

There is an immediate need for this, as I am developing a script to get a pulse on how a change from CVSS2 to CVSS3 would affect our Federal customer base with regard to BOD 18-01, and I need to be able to both pull from the cyhy database and perform some web scraping.

lxml would also be nice, if it is not already included, as it has a lot of useful XML parsing utility and can be used as a parser for bs4, which is faster than the built-in HTML parser.

Geoipupdate.sh error

🐛 Summary

While creating a new docker image, geoipupdate.sh is failing due to a MaxMind error.

To reproduce

Steps to reproduce the behavior:

  1. Running ./generate_cyhy_docker_image.sh --maxmind-key <key> results in the following error:
ERROR [15/18] RUN var/geoipupdate.sh full (key)                                                                                        0.4s
------                                                                                                                                                  
 > [15/18] RUN var/geoipupdate.sh full (key):                                                                                                    
0.200   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current                                                                   
0.202                                  Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
0.331 
0.331 gzip: stdin: unexpected end of file
0.331 tar: Child returned status 1
0.331 tar: Error is not recoverable: exiting now
------

ERROR: failed to solve: process "/bin/sh -c var/geoipupdate.sh $maxmind_license_type $maxmind_license_key" did not complete

Expected behavior

We would have expected to successfully create a new docker image.

Notes

@dav3r observed that this was failing due to an HTTP 302 response when attempting to fetch the database tarball via curl. Adding the --location flag to the curl command remedies the problem.

Flatten GNIS data that contains Unicode characters

💡 Summary

The GNIS_data_import.py script should flatten any Unicode characters in the GNIS data being imported so that it only contains ASCII.

Motivation and context

This would resolve cisagov/cyhy-system#109.

Implementation notes

I believe that processing a raw row using the Unidecode package, before splitting it up into a document for import to the database, would resolve the issue. Once it is updated to flatten Unicode data, the GNIS_data_import.py script would need to be run with the --force switch to propagate the desired changes to the database.
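
One way to do this (a sketch under the assumption that rows are processed as text before being split into fields; not the actual GNIS_data_import.py code):

    from unidecode import unidecode

    def flatten_row(raw_row):
        """Return a raw GNIS row with its Unicode characters transliterated to ASCII."""
        # e.g. u"Cañon City" becomes "Canon City"; the flattened row can then be
        # split into fields and built into a document for import as usual.
        return unidecode(raw_row)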

Acceptance criteria

How do we know when this work is done?

  • There are no documents containing non-ASCII characters in the places collection.

Migrate from MongoKit to MongoEngine for MongoDB ODM Needs

🚀 Feature Proposal

Migrate the MongoDB ODM we use from MongoKit to MongoEngine.

Motivation

MongoKit does not support Python 3 and has been archived, making it no longer suitable for our purposes. After the team discussion documented in #52, we concluded that we should migrate to MongoEngine. Modern MongoEngine versions only support Python 3, but the v0.19.1 release from 2020-01-04 still supported Python 2. This is recent enough to allow us to migrate from MongoKit to MongoEngine without having to migrate to Python 3 at the same time.

We can always integrate a Python 3 migration mid-stream if needed, but doing an ODM migration in this way will make it easier to separate out breaking changes.
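
For illustration, a minimal MongoEngine (v0.19.1-era) document definition might look like the sketch below; the class, field, and collection names are hypothetical and are not the actual cyhy-core schema:

    from mongoengine import Document, StringField, connect

    class RequestDoc(Document):
        """Hypothetical request document, shown only to illustrate the ODM style."""
        meta = {"collection": "requests"}
        owner = StringField(primary_key=True)
        agency_name = StringField()

    connect("cyhy", host="mongodb://localhost:27017/cyhy")
    print(RequestDoc.objects(owner="EXAMPLE_ORG").first())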

This work will take place on the improvement/switch_to_mongoengine_from_mongokit branch.

Add entity enrollment date

💡 Summary

Add an enrollment date for every CyHy entity.

Motivation and context

Our analysts have requested that a true enrollment date be added for every entity in CyHy to improve the quality of data reporting.

Implementation notes

  • For new entities, the enrollment date/time will be based on when the request document is first created. This will require modification of the cyhy-import script.
  • Once the enrollment date has been set, it should not be overwritten, even if a new request document is imported via cyhy-import --force. If the enrollment date must be changed, it will have to be changed via a manual database command.
  • For existing entities, create a script to backfill the enrollment date using the start_time timestamp in their first snapshot, if one exists (see the sketch after this list). This script will only need to be run once.
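
A rough backfill sketch using pymongo (the collection and field names here, including the "enrolled" field, are assumptions for illustration; the real schema is defined in cyhy-core):

    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017/")["cyhy"]

    # For every request document without an enrollment date, copy the start_time
    # of the owner's earliest snapshot into the new field.
    for request in db.requests.find({"enrolled": {"$exists": False}}):
        first_snapshot = db.snapshots.find_one(
            {"owner": request["_id"]}, sort=[("start_time", 1)]
        )
        if first_snapshot:
            db.requests.update_one(
                {"_id": request["_id"]},
                {"$set": {"enrolled": first_snapshot["start_time"]}},
            )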

Acceptance criteria

  • cyhy-import has been modified to add the enrollment date to the request document for new entities.
  • cyhy-import will NOT overwrite the enrollment date, even if run with the --force flag.
  • A script has been run which added an enrollment date to the request document of every existing entity with at least one snapshot in the database.
  • The data dictionary has been updated to include the new enrollment date field.

Check for non-ASCII characters in cyhy-import

🚀 Feature Proposal

cyhy-import should check that the document(s) to import do not contain ASCII-incompatible Unicode characters and throw an error if it detects any.

Motivation

A document was imported which contained an en-dash (\u2013), which is a Unicode character that is not ASCII-compatible. This caused an error when generating the snapshots and reports and had to be manually fixed.

Example

When attempting to import a document that contains non-ASCII Unicode characters, the tool should throw an error that indicates what line the incompatible character is on and preferably what character it is (e.g. \u2013).
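
A small sketch of such a check (illustrative only, not the actual cyhy-import code), reporting the line number and the escaped character:

    import io

    def find_non_ascii(path):
        """Yield (line_number, escaped_char) for every non-ASCII character in a file."""
        with io.open(path, encoding="utf-8") as f:
            for line_number, line in enumerate(f, start=1):
                for char in line:
                    if ord(char) > 127:
                        yield line_number, "\\u%04x" % ord(char)

    problems = list(find_non_ascii("NASA.json"))
    if problems:
        raise ValueError("non-ASCII characters found: %s" % problems)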

Pitch

To save the time and effort of manual fixes, and to prevent copy/paste problems from sources such as MS Word.

Fix load_places.sh script

The cyhy-core/var/load_places.sh script no longer works correctly, due to a change in the URL where the GNIS (places) data resides. The new URLs for this data are:

In case you need it, the page that links to those URLs above is here: https://www.usgs.gov/core-science-systems/ngp/board-on-geographic-names/download-gnis-data

Also, while you're at it, please update and clean up this script so that it passes shellcheck and uses the same conventions as our shell scripts in cisagov/cyhy_amis (e.g. deploy_new_reporter_ami.sh).

geoipupdate.sh Script Fails Without MaxMind License

🐛 Bug Report

When building a cyhy-core Docker image, the var/geoipupdate.sh script fails if no license key is provided.

To Reproduce

Steps to reproduce the behavior:

  • docker build -t ncats/cyhy-core . in a cloned cyhy-core repository.

Expected behavior

Docker image is successfully built with no errors.

Any helpful log output

When Docker gets to the var/geoipupdate.sh script it fails with:

curl: (6) Could not resolve host: geolite.maxmind.com

Expand Testing for the cyhy.db.database Submodule

🚀 Feature Proposal

Add more thorough testing for the cyhy.db.database submodule to the existing file.

Motivation

As we migrate to MongoEngine from MongoKit in #54, and to Python 3 in the future, it will be helpful to have robust testing in place to ensure that changes are not breaking existing functionality. There is currently only the barest testing for this essential piece of this library, which makes it difficult to test for regressions and matching functionality.

Work for this will take place on the improvement/expand_cyhy.db.database_testing branch.

[Research] Find Python 3 Compatible Replacement for MongoKit

This project currently relies on the MongoKit package to provide ODM functionality. That project was never made Python 3 compatible, nor was it updated for later versions of PyMongo. As it is now archived, that is likely to never happen. A suitable replacement needs to be picked to replace it, and the code in this repository updated, so that Python 3 migration can progress.

A compare and contrast needs to be done between options like MongoKat and the projects listed on the MongoDB tools page. It would be ideal if the replacement operated in a fashion similar to MongoKit while retaining similar features, to ease the transition.

Check that email addresses look valid in `cyhy-simple` and `cyhy-import`

💡 Summary

We should verify email addresses in cyhy-simple and cyhy-import, or possibly just when the request document is saved to the database.

Motivation and context

I got this error for an organization when sending out the CyHy reports this week:

cyhy-mailer-mailer-1  | botocore.exceptions.ClientError: An error occurred (InvalidParameterValue) when calling the SendRawEmail operation: Domain contains illegal character

The issue turned out to be a duplicated at sign (@) in the distribution email address. A simple regex would prevent errors like this from happening in the future.
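
A lightweight sanity check along these lines (a sketch; the pattern is intentionally permissive and is an illustration, not the exact regex we would ship) would have caught the duplicated at sign:

    import re

    # One @, something before it, and a dot-separated domain after it.
    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def looks_like_email(address):
        """Return True if the address has the rough shape of an email address."""
        return bool(EMAIL_RE.match(address))

    assert looks_like_email("analyst@example.gov")
    assert not looks_like_email("analyst@@example.gov")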

Check for Acronym when on-boarding a new stakeholder

🐛 Summary

When importing a new stakeholder, cyhy-simple and cyhy-import do not check for a blank id or acronym.

To reproduce

Steps to reproduce the behavior:

If the Service Support team leaves the id blank during request template creation, the imported JSON file will have no information in the stakeholder id or acronym field.

Expected behavior

During the import of the JSON file into the database, an error should be generated if the id or acronym field is blank.

Any helpful log output or screenshots

cyhy-ip check
AlXXXXX. enter (): 1
XX.XX.XX.1/32

Update the restricted countries list

💡 Summary

CISA has released a new restricted countries list which needs to be incorporated into cyhy-core.

Motivation and context

This will increase our security and notify us of any restricted IP addresses during the on-boarding process.

Implementation notes

Update the RESTRICTED_COUNTRIES list inside of geoloc.py:

RESTRICTED_COUNTRIES = ["China", "Iran", "North Korea", "Russia"]

Updated Countries List:
Will be sent to the DEV team.

Acceptance criteria

How do we know when this work is done?

  • The list has been updated
  • A sample IP list demonstrates that IPs from the updated restricted list are being flagged.

Perform some code base cleanup

💡 Summary

Just some small, general cleanup on the code base including:

  • Run isort against the code base to consistently organize imports.
  • Re-run black to ensure that the code base is still formatted in the preferred manner.
  • Remove unnecessary iPython breakpoints per #78 (comment).

Motivation and context

Improving code quality is always good, and due to the nature of this code base (Python 2) we cannot trivially automate these auto-formatters.

Implementation notes

An appropriate .isort.cfg file will need to be added, and afterwards the isort and black tools should run without changing any files.

Update cyhy-nvdsync to use JSON feed instead of XML feed

According to this post by NIST, the XML National Vulnerability Database (NVD) feed is being replaced by a JSON feed:

The NVD highly encourages all users of its data feeds to transition to use of the 1.0 JSON feeds as soon as possible! The JSON feeds are intended to supplant the XML 2.0 and XML 1.2.1 feeds entirely. The NVD will be providing a grace period for this transition that will last until April, 2019. After that date the NVD will permanently disable and remove the XML 2.0 and XML 1.2.1 feeds. To reiterate, all users will no longer be able to use the XML 2.0 and XML 1.2.1 data feeds after April, 2019!

We need to update cyhy-nvdsync to use the JSON feed before April 2019.

CC: @felddy @jsf9k @KyleEvers

Private IPs Get Added With New Stakeholder Imports!

🐛 Summary

While adding some new stakeholders (if one overlooks that the request has private IPs in the text template, which gets converted to JSON and imported to the DB), we found that private IPs are allowed to be added. It doesn't even require the --force option to complete, but it does cause a float error when initializing/syncing.

Error when attempting to initialize scope that contains private/reserved IPs:
float() argument must be a string or a number (None, None)

Error when attempting to sync scope that contains private/reserved IPs:

Traceback (most recent call last):
  File "/usr/local/bin/cyhy-tool", line 271, in <module>
    main()
  File "/usr/local/bin/cyhy-tool", line 234, in main
    status(db, args["OWNER"], args["--sync"])
  File "/usr/local/bin/cyhy-tool", line 147, in status
    sync_tallies(db, owner)
  File "/usr/local/bin/cyhy-tool", line 117, in sync_tallies
    if SCAN_TYPE.CYHY not in db.RequestDoc.get_by_owner(owner_id)["scan_types"]:
TypeError: 'NoneType' object has no attribute '__getitem__'

We've found that to resolve this improper add, you can remove all IPs from the stakeholder's scope and add back only the public IPs, both with the cyhy-ip command (removing just the private ranges would probably also be sufficient), and then finish setting them up afterward. Feel free to reference my solution walkthrough in CYHYOPS-7760 if needed!

To reproduce

You can see this as early as CYHYOPS-4884 and as recently as CYHYOPS-7760. CYHYOPS-6190 is a good show of the error output.

Expected behavior

When private or reserved IPs are included in a JSON that is getting imported to the DB, it should error out similarly to when IPs are already assigned to existing stakeholders.
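
A sketch of such a check using Python's ipaddress module (illustrative only; cyhy-core's own network handling may use a different library, and the ipaddress backport would be needed under Python 2):

    import ipaddress

    def assert_routable(networks):
        """Raise ValueError if any network in the scope is private or reserved."""
        bad = [
            str(net)
            for net in map(ipaddress.ip_network, networks)
            if net.is_private or net.is_reserved
        ]
        if bad:
            raise ValueError("private/reserved networks are not allowed: %s" % bad)

    assert_routable([u"93.184.216.0/24"])  # public space, passes silently
    # assert_routable([u"10.0.0.0/8"])     # private space, would raise ValueError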

Any helpful log output or screenshots

See comments in above listed Jira tickets or output in above "Summary" section.

Allow restricted IPs to be added with an override flag

💡 Summary

We need to provide a flag that will allow users of the cyhy-ip tool to add restricted IPs if desired.

Motivation and context

Although generally undesirable, there are specific scenarios where this is necessary. The most common is if a restricted IP is removed while a scan for the IP is in progress. When the scan completes, it may cause host information for the remote IP to be updated (including re-opening tickets). The cleanest way to resolve this is to re-add and then remove the affected IP to ensure any re-opened tickets are once again closed. Currently this is not readily possible using the provided tools.

Implementation notes

This should only bypass the final check that fails if restricted IPs are provided. It should still output all of the logging information that informs the user that a restricted IP is included, and warn that a restricted IP will be added to the database. We should also add the same flag to cyhy-tool to ensure parity between the tools.
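
A rough sketch of the gating logic (the function, flag name, and lookup below are hypothetical illustrations, not the actual cyhy-ip interface):

    def filter_for_import(networks, restricted_lookup, allow_restricted=False):
        """Warn about restricted networks; only let them through when explicitly allowed.

        restricted_lookup maps a network to the restricted country it geolocates
        to, or None; in cyhy-core this information would come from the geolocation
        database.
        """
        restricted = {n: restricted_lookup(n) for n in networks if restricted_lookup(n)}
        for net, country in restricted.items():
            print("WARNING: %s geolocates to restricted country %s" % (net, country))
        if restricted and not allow_restricted:
            raise SystemExit("refusing to add restricted networks (use the override flag)")
        return list(networks)

    # Only the second call lets the restricted network through.
    lookup = {"198.51.100.0/24": "Example Country"}.get
    filter_for_import(["203.0.113.0/24"], lookup)
    filter_for_import(["198.51.100.0/24"], lookup, allow_restricted=True)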

Acceptance criteria

  • There is a flag available that will allow restricted IPs to be added to the database.

Update Scripts in bin to be Python 3 Compatible

The scripts in bin need to be updated to be as Python 3 compatible as possible. Since we are dropping support for Python 2 completely, we should avoid bridging tools like six and focus on converting everything we can to a Python 3 style as a first phase. The only things that should be left unchanged are Python 2 specific features required for functionality. Those changes should be saved for the "last" PR that completes the migration and officially breaks Python 2 compatibility.
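
For illustration, this is the kind of mechanical conversion involved (a generic example, not code taken from the bin scripts):

    # Python 2 style commonly found in older scripts:
    #   print >> sys.stderr, "error: %s" % msg
    #   for key, value in counts.iteritems():
    #       print key, value
    #
    # Python 3 style equivalent:
    import sys

    def report(msg, counts):
        print("error: %s" % msg, file=sys.stderr)
        for key, value in counts.items():
            print(key, value)

    report("example", {"hosts": 3})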

Add Ability to Update Org IDs to cyhy-tool

🚀 Feature Proposal

Add the ability to update organization ids to the cyhy-tool script.

Motivation

We currently have a one-off script to update org IDs. This requires the CyHy team to contact the dev team so we can run this script as desired. It would be better for all parties if this functionality was integrated into the cyhy-tool script (which the CyHy team has access to as part of the CyHy Docker image).

Example

cyhy-tool update-id OLDID NEWID

Pitch

Similar functionality is already handled by the cyhy-tool script, and the Docker image built from this project is part of the CyHy team's toolkit already.

Migrate `cyhy-nvdsync` to the NIST NVD API 2.0

💡 Summary

Update the cyhy-nvdsync script to use the NIST NVD API 2.0.

Motivation and context

Currently we rely on the NIST NVD data feeds to get CVE information. These feeds are being retired in September of 2023. We will need to migrate to the API 2.0 to continue to get the CVE data we need after this point.

Implementation notes

We will have to determine whether or not we can get appropriate functionality without an API key.
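
A minimal sketch of paging through the API 2.0 with requests (the endpoint, parameters, and response fields follow NIST's public API 2.0 documentation; an apiKey header is optional but raises the rate limits):

    import requests

    NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

    def fetch_cves(api_key=None, page_size=2000):
        """Yield CVE entries from the NVD API 2.0, one page at a time."""
        headers = {"apiKey": api_key} if api_key else {}
        start = 0
        while True:
            resp = requests.get(
                NVD_API,
                params={"startIndex": start, "resultsPerPage": page_size},
                headers=headers,
            )
            resp.raise_for_status()
            data = resp.json()
            for item in data.get("vulnerabilities", []):
                yield item["cve"]
            start += page_size
            if start >= data.get("totalResults", 0):
                break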

Acceptance criteria

  • The cyhy-nvdsync script uses the NIST NVD API 2.0 for functionality

Decouple the Tools in the bin Directory from the Library

💡 Summary

We should decouple the scripts in bin/ and break them out into their own project.

Motivation

The tools should be their own installation so the library performs only the functionality it needs when included in other projects. As an example, when cyhy-core is installed on an instance that uses cisagov/cyhy-reports, the scripts that would be installed are unneeded (and possibly unwanted) additions to that instance's environment. This project should be focused on being a library, because that is its primary purpose.

I believe this paring down to essential functionality is important for the long-term health of this library. In a future move to a serverless setup, having this package not install tools would be preferable. This would also allow us to break apart testing in a saner manner.

Implementation notes

This would require the creation of a new project, cyhy-tools or something similar, and the removal of the scripts and related installation from this project. This would help toward the work for cisagov/cyhy-system#14.

Acceptance criteria

  • The bin/ directory, the scripts, and any installation requirements for them are removed from this project.
  • A new project, tentatively cyhy-tools, would have the scripts included here as their own project.
  • A new Ansible role to handle installing this new project would need to be created.
  • Test that a deployment that would otherwise have only required cisagov/ansible-role-cyhy-core has the same functionality when this new Ansible role is used.

Create Testing for the cyhy.db.chdatabase Submodule

🚀 Feature Proposal

Add testing for the cyhy.db.chdatabase submodule.

Motivation

As we migrate to MongoEngine from MongoKit in #54, and to Python 3 in the future, it will be helpful to have robust testing in place to ensure that changes are not breaking existing functionality. There is currently no testing for this essential piece of this library, which makes it difficult to test for regressions and matching functionality.

Work for this will take place on the improvement/implement_database.db.chdatabase_testing branch.

Sporadic failures from `cyhy-nvdsync`

🐛 Summary

The cyhy-nvdsync process occasionally fails due to a ValueError exception thrown from json.load(). When run later the command succeeds.

To reproduce

I am unable to reproduce this bug, but you can see instances of it in /var/log/syslog on the database1 instance:

$ sudo zgrep ValueError /var/log/syslog|less
Feb 23 08:16:55 database1 cyhy-nvdsync:     raise ValueError("No JSON object could be decoded")
Feb 23 08:16:55 database1 cyhy-nvdsync: ValueError: No JSON object could be decoded

Expected behavior

If the file published by NIST is corrupted, then replaced with an uncorrupted version, or if there are multiple download nodes and one of them contains a bad copy, then that would explain the behavior. If that is not the case then the observed behavior is unexplained.

Any helpful log output or screenshots

Here is a full stacktrace from a failure in /var/log/syslog:

ERROR:root:Unexpected exception
Feb 23 08:16:55 database1 cyhy-nvdsync: Traceback (most recent call last):
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/local/bin/cyhy-nvdsync", line 97, in main
Feb 23 08:16:55 database1 cyhy-nvdsync:     process_url(db, url)
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/local/bin/cyhy-nvdsync", line 77, in process_url
Feb 23 08:16:55 database1 cyhy-nvdsync:     parse_json(db, f)
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/local/bin/cyhy-nvdsync", line 39, in parse_json
Feb 23 08:16:55 database1 cyhy-nvdsync:     data = json.load(json_stream)
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/lib/python2.7/json/__init__.py", line 291, in load
Feb 23 08:16:55 database1 cyhy-nvdsync:     **kw)
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/lib/python2.7/json/__init__.py", line 339, in loadsFeb 23 08:16:55 database1 cyhy-nvdsync:     return _default_decoder.decode(s)
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/lib/python2.7/json/decoder.py", line 364, in decodeFeb 23 08:16:55 database1 cyhy-nvdsync:     obj, end = self.raw_decode(s, idx=_w(s, 0).end())
Feb 23 08:16:55 database1 cyhy-nvdsync:   File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
Feb 23 08:16:55 database1 cyhy-nvdsync:     raise ValueError("No JSON object could be decoded")
Feb 23 08:16:55 database1 cyhy-nvdsync: ValueError: No JSON object could be decoded
Feb 23 08:16:55 database1 cyhy-nvdsync:  . . . . . . . . . . . . . . . . . . . . . . x . . . . . . . . . . . . x x x . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . x . x . . x 
Feb 23 08:16:55 database1 cyhy-nvdsync: 
Feb 23 08:16:55 database1 cyhy-nvdsync: 
Feb 23 08:16:55 database1 cyhy-nvdsync: ---------- https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-2024.json.gz ----------

Note that there is likely some interleaving due to both standard output and standard error being redirected to the same file.

Bug in `check_restricted_ip()` error-handling code

🐛 Summary

When adding new IP addresses using cyhy-ip add TEST_ENTITY <IP address>, some IPs are returning an error.

To reproduce

cyhy-ip add TEST_ENTITY <IP address>
Traceback (most recent call last):
  File "/usr/local/bin/cyhy-ip", line 506, in <module>
    main()
  File "/usr/local/bin/cyhy-ip", line 484, in main
    add(db, args["OWNER"], nets)
  File "/usr/local/bin/cyhy-ip", line 175, in add
    country = geo_loc_db.check_restricted_ip(ip)
  File "/usr/local/lib/python2.7/dist-packages/cyhy/core/geoloc.py", line 81, in check_restricted_ip
    print >> sys.stderr, "CIDR %s not found in geolocation database" % cidr
NameError: global name 'cidr' is not defined

Expected behavior

We would expect the code to return:

$ cyhy-ip add TEST_ENTITY <IP address>
IPs added to request, and initialized.  Tally sync required to start scan of new IPs.
▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶▶| ETA:  0:00:00

Any helpful log output or screenshots

To fix this bug, change this line from:

print >> sys.stderr, "CIDR %s not found in geolocation database" % cidr

to this:

print >> sys.stderr, "IP %s not found in geolocation database" % ip
