
kafka_spark_structured_streaming's Introduction

Hi! I'm Dogukan

Data Engineer

About Me:


I work as a Data Engineer and have developed many data engineering projects using the tools and frameworks below. I approach every project and job with a can-do attitude, and I continually learn best practices to increase efficiency and apply them while handling tasks. I am a strong believer in sustainability, and I think we, as data engineers, should complete our tasks as sustainably as possible. I also believe the theoretical side of data is quite important, which is why I keep myself up to date on the principles of distributed computing, big data applications, handling streaming data, scheduling and orchestrating pipelines, the working principles of cloud environments, and many other topics. Alongside this, I keep improving my coding skills, especially in Python and SQL, writing many complex Python scripts as well as SQL queries. You can find lots of data engineering projects here. Welcome to my GitHub :)

Connect with me:

Tech Stack:

Python, SQL, AWS, Metabase, Spark, Snowflake, Kafka, Airflow, Docker, PostgreSQL, MySQL, Hadoop, dbt, Cassandra, Elasticsearch, Looker, MongoDB, Terraform, Flink, Jenkins, Kubernetes

kafka_spark_structured_streaming's People

Contributors

dogukannulu


kafka_spark_structured_streaming's Issues

Ran into issue as soon as I started to follow along.

Hi Dogukan,

I was wondering if you could help with this.

Basically, I just did git clone and ran: docker build --rm --build-arg AIRFLOW_DEPS="datadog,dask" --build-arg PYTHON_DEPS="flask_oauthlib>=0.9" -t puckel/docker-airflow .

Then I get the error below after it runs for about 10 minutes.

...
482.6   Building wheel for JPype1 (pyproject.toml): started
499.3   Building wheel for JPype1 (pyproject.toml): finished with status 'done'
499.3   Created wheel for JPype1: filename=JPype1-1.5.0-cp37-cp37m-linux_aarch64.whl size=435875 sha256=77f2cbf570d347210b7f6f0ff41c2473ac3d796cb8a0d08330873ac60e150453
499.3   Stored in directory: /root/.cache/pip/wheels/64/84/2d/2de63cbea3d5322bee5ac095f7a148ffc882cab8466eabcead
499.4 Successfully built flask-admin flask-swagger json-merge-patch pendulum termcolor flask-login future mysqlclient psutil pyhive pysftp thrift tornado tzlocal unicodecsv Flask-JWT-Extended JPype1
499.4 Failed to build pandas
499.4 ERROR: Could not build wheels for pandas, which is required to install pyproject.toml-based projects
------
Dockerfile:31
--------------------
  30 |     
  31 | >>> RUN set -ex \
  32 | >>>     && buildDeps=' \
  33 | >>>         freetds-dev \
  34 | >>>         libkrb5-dev \
  35 | >>>         libsasl2-dev \
  36 | >>>         libssl-dev \
  37 | >>>         libffi-dev \
  38 | >>>         libpq-dev \
  39 | >>>         git \
  40 | >>>     ' \
  41 | >>>     && apt-get update -yqq \
  42 | >>>     && apt-get upgrade -yqq \
  43 | >>>     && apt-get install -yqq --no-install-recommends \
  44 | >>>         $buildDeps \
  45 | >>>         freetds-bin \
  46 | >>>         build-essential \
  47 | >>>         default-libmysqlclient-dev \
  48 | >>>         apt-utils \
  49 | >>>         curl \
  50 | >>>         rsync \
  51 | >>>         netcat \
  52 | >>>         locales \
  53 | >>>     && sed -i 's/^# en_US.UTF-8 UTF-8$/en_US.UTF-8 UTF-8/g' /etc/locale.gen \
  54 | >>>     && locale-gen \
  55 | >>>     && update-locale LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 \
  56 | >>>     && useradd -ms /bin/bash -d ${AIRFLOW_USER_HOME} airflow \
  57 | >>>     && pip install -U pip setuptools wheel \
  58 | >>>     && pip install pytz \
  59 | >>>     && pip install pyOpenSSL \
  60 | >>>     && pip install ndg-httpsclient \
  61 | >>>     && pip install pyasn1 \
  62 | >>>     && pip install apache-airflow[crypto,celery,postgres,hive,jdbc,mysql,ssh${AIRFLOW_DEPS:+,}${AIRFLOW_DEPS}]==${AIRFLOW_VERSION} \
  63 | >>>     && pip install 'redis==3.2' \
  64 | >>>     && if [ -n "${PYTHON_DEPS}" ]; then pip install ${PYTHON_DEPS}; fi \
  65 | >>>     && apt-get purge --auto-remove -yqq $buildDeps \
  66 | >>>     && apt-get autoremove -yqq --purge \
  67 | >>>     && apt-get clean \
  68 | >>>     && rm -rf \
  69 | >>>         /var/lib/apt/lists/* \
  70 | >>>         /tmp/* \
  71 | >>>         /var/tmp/* \
  72 | >>>         /usr/share/man \
  73 | >>>         /usr/share/doc \
  74 | >>>         /usr/share/doc-base
  75 |     
--------------------
ERROR: failed to solve: process "/bin/sh -c set -ex     && buildDeps='         freetds-dev         libkrb5-dev         libsasl2-dev         libssl-dev         libffi-dev         libpq-dev         git     '     && apt-get update -yqq     && apt-get upgrade -yqq     && apt-get install -yqq --no-install-recommends         $buildDeps         freetds-bin         build-essential         default-libmysqlclient-dev         apt-utils         curl         rsync         netcat         locales     && sed -i 's/^# en_US.UTF-8 UTF-8$/en_US.UTF-8 UTF-8/g' /etc/locale.gen     && locale-gen     && update-locale LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8     && useradd -ms /bin/bash -d ${AIRFLOW_USER_HOME} airflow     && pip install -U pip setuptools wheel     && pip install pytz     && pip install pyOpenSSL     && pip install ndg-httpsclient     && pip install pyasn1     && pip install apache-airflow[crypto,celery,postgres,hive,jdbc,mysql,ssh${AIRFLOW_DEPS:+,}${AIRFLOW_DEPS}]==${AIRFLOW_VERSION}     && pip install 'redis==3.2'     && if [ -n \"${PYTHON_DEPS}\" ]; then pip install ${PYTHON_DEPS}; fi     && apt-get purge --auto-remove -yqq $buildDeps     && apt-get autoremove -yqq --purge     && apt-get clean     && rm -rf         /var/lib/apt/lists/*         /tmp/*         /var/tmp/*         /usr/share/man         /usr/share/doc         /usr/share/doc-base" did not complete successfully: exit code: 1

View build details: docker-desktop://dashboard/build/desktop-linux/desktop-linux/r95e7ihz5gd509e271l7ez41q
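A hedged suggestion, not a confirmed fix: the log shows wheels being built for `linux_aarch64` under Python 3.7 (`cp37`), which indicates an ARM host such as Apple Silicon. pandas does not publish prebuilt aarch64 wheels for Python 3.7, so pip falls back to compiling it from source and fails. One workaround worth trying is forcing an amd64 build so pip can use the prebuilt x86_64 wheels instead, at the cost of running the image under emulation:

```shell
# Workaround sketch for ARM hosts (e.g. Apple Silicon): build the image
# for linux/amd64 so pip can install prebuilt x86_64 wheels for pandas
# instead of compiling from source. The image will run under emulation.
docker build --platform linux/amd64 --rm \
  --build-arg AIRFLOW_DEPS="datadog,dask" \
  --build-arg PYTHON_DEPS="flask_oauthlib>=0.9" \
  -t puckel/docker-airflow .
```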

network airflow-kafka declared as external, but could not be found

Hi, I recently stumbled upon your repo while learning about streaming data. I tried to follow the steps carefully, but I encountered an error when running:
docker-compose -f docker-compose-LocalExecutor.yml up -d

Are there any prerequisites in order to follow the steps?
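The error message itself points at the likely fix: when a Compose file declares a network with `external: true`, Docker Compose expects that network to already exist rather than creating it. Assuming the compose file in this repo declares `airflow-kafka` that way, creating the network once beforehand should resolve it:

```shell
# "network airflow-kafka declared as external, but could not be found"
# means Compose expects this network to exist already. Create it once:
docker network create airflow-kafka

# then bring the stack up as before
docker-compose -f docker-compose-LocalExecutor.yml up -d
```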

