
Daily Stock Price Scraper


Overview

A web scraper built with Scrapy that collects live stock prices from the Nairobi Securities Exchange (NSE). After each scrape, the prices are saved to a MongoDB database; pymongo is used to connect to MongoDB Atlas, and Atlas Charts is then used to visualize the data.
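The persistence step described above can be sketched as a Scrapy item pipeline. This is an illustrative sketch, not the repository's actual pipeline code: the class and field names are assumptions, and the collection object is injected so the flow works with any object exposing `insert_one()` (such as a pymongo `Collection` obtained from `MongoClient(uri)["db"]["collection"]`).

```python
# Hedged sketch of a Scrapy item pipeline that writes each scraped
# stock price into MongoDB. Class and field names are illustrative.

class MongoStockPipeline:
    """Insert every scraped item into the configured collection."""

    def __init__(self, collection):
        # `collection` would normally be a pymongo Collection; any
        # object with an insert_one() method works, which keeps the
        # sketch testable without a running database.
        self.collection = collection

    def process_item(self, item, spider=None):
        # Store a copy of the item and pass it along the pipeline,
        # as Scrapy pipelines are expected to do.
        self.collection.insert_one(dict(item))
        return item
```

In the real project the pipeline would be registered in `settings.py` under `ITEM_PIPELINES` so Scrapy invokes it for every yielded item.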

The accompanying article can be found here

Screenshots

App Screenshot

Atlas DB

Charts Dashboard

The platform actually being scraped is the afx website.

Getting Started

Prerequisites

  • Python and pip (this guide uses 3.9.2; any version above 3.7 should work).
  • An Africa's Talking account.
    • An API key and username from your account. Create an app and take note of the API key.
  • A MongoDB Atlas account; create a free account here.
    • Create a cluster and take note of the connection string.

Installation

Clone this repo

  git clone https://github.com/KenMwaura1/nse-stock-scraper

Step 1

Change into the directory

cd nse-stock-scraper

Step 2

Create a virtual environment (venv) to hold all the required dependencies. Here we use the built-in venv module.

python -m venv env

Activate the virtual environment

source env/bin/activate

Alternatively, if you are using pyenv:

pyenv virtualenv nse_scraper
pyenv activate nse_scraper

Step 3

Install the required dependencies:

pip install -r requirements.txt

Step 4

Change into the nse_scraper folder and create an environment file.

cd nse_scraper
touch .env 

Add your credentials as specified in the example file.

OR

Copy the provided example and edit as required:

cp .env-example .env
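The exact variable names live in the repository's `.env-example` file; the fragment below is only an illustrative guess at the kind of values it expects (Africa's Talking credentials and the Atlas connection string), not a copy of that file:

```
# Hypothetical variable names -- check .env-example for the real ones
MONGODB_URI=mongodb+srv://<user>:<password>@<cluster>.mongodb.net/
AT_USERNAME=sandbox
AT_API_KEY=your_api_key
MOBILE_NUMBER=+2547XXXXXXXX
```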

Step 5

Navigate back up to the main folder nse-stock-scraper, then run the scraper and check the logs for any errors.

cd .. 
scrapy crawl afx_scraper

Or run the scraper and have it write its output to a JSON file for previewing:

scrapy crawl afx_scraper -o test.json 

Tweak the project name as necessary.

Ensure you add your configuration variables under 'Settings' → 'Reveal Config Vars'. This allows Heroku to get and set the environment configuration required for the web scraper to run.

scrapy crawl afx_scraper
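Locally the credentials come from the `.env` file; on Heroku they come from Config Vars. Either way, the code can read them uniformly through `os.environ`. A minimal sketch follows; the variable name `MONGODB_URI` and the helper function are assumptions for illustration, not necessarily the project's exact keys:

```python
import os

def load_mongo_uri(default="mongodb://localhost:27017"):
    """Return the Atlas connection string from the environment,
    falling back to a local MongoDB instance for development."""
    return os.environ.get("MONGODB_URI", default)
```

Because both the local `.env` loader and Heroku populate the same environment variables, no code changes are needed between the two deployments.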

Scheduling Text Notifications

Now we need to add a scheduler on Heroku to run our notification script, which will in turn send us texts. Since our app already has a dyno running the scraper, we need an alternative scheduler add-on. Advanced Scheduler is a good option: it offers a free trial, with a $5 per month plan if an upgrade is needed.

  1. Setup: Inside the daily-nse-scraper app, search for the Advanced Scheduler add-on. Select the free-trial plan and submit the order form.

  2. Configuration: Click on the Advanced Scheduler add-on to open its overview page, then click the Create trigger button. The free trial allows up to 3 triggers. Set a trigger for 11:00 am each day and specify the command python nse_scraper/stock_notification.py. Remember to select the correct timezone (in this case Africa/Nairobi) and save the changes.

  3. Testing: To ensure the scheduler executes as expected, run the trigger manually: on the overview page, click the More button and select Execute trigger.

If everything went as expected, you should now have received a notification text.
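The scheduled command runs nse_scraper/stock_notification.py. Its actual contents aren't reproduced here, but the flow it implements can be sketched as below; `fetch_prices` and `send_sms` are hypothetical stand-ins for the pymongo query and the Africa's Talking SDK call, injected so the sketch runs without network access:

```python
# Hedged sketch of the notification flow: gather scraped prices and
# send one summary SMS. Function and field names are assumptions.

def notify(fetch_prices, send_sms, threshold=0.0):
    """Build a summary of scraped prices and send it as one text.

    fetch_prices: callable returning a list of {"ticker", "price"} dicts
    send_sms:     callable taking the message string to dispatch
    threshold:    only include prices strictly above this value
    """
    prices = fetch_prices()
    lines = [f"{p['ticker']}: {p['price']}"
             for p in prices if p["price"] > threshold]
    message = "Daily NSE prices\n" + "\n".join(lines)
    send_sms(message)
    return message
```

In the real script, `send_sms` would wrap the Africa's Talking SMS service initialized with the username and API key from the environment.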


License

MIT
