
aws-samples / amazon-ecs-mythicalmysfits-workshop


A tutorial for developers who want to learn how to containerize applications on top of AWS using AWS Fargate. You will build a sample website that leverages infrastructure as code, containers, CI/CD, and more! If you're planning on running this, let us know @ [email protected]. At re:Invent 2018, these sessions were run as CON214/CON321/CON322.

License: Apache License 2.0

Dockerfile 2.78% Python 21.32% Shell 35.79% HTML 40.11%

amazon-ecs-mythicalmysfits-workshop's Issues

Allow the application to be deployed on EKS leveraging IRSA

IRSA (IAM Roles for Service Accounts) is the recommended way to configure and deploy applications that need programmatic access to underlying AWS resources.

In our case, it's the Mythical Mysfits application that needs permission to connect to DynamoDB.

Instead of adding the policy to the underlying EC2 instance, we should allow just the pod to connect to the DynamoDB table.
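
A minimal sketch of what this could look like with eksctl, assuming an existing EKS cluster; the cluster name, namespace, service account name, and policy ARN below are placeholders:

# create a Kubernetes service account bound to an IAM role that carries the DynamoDB policy
eksctl create iamserviceaccount \
  --cluster mythical-mysfits-cluster \
  --namespace default \
  --name mythical-mysfits \
  --attach-policy-arn arn:aws:iam::ACCOUNT_ID:policy/MythicalMysfitsDynamoDBAccess \
  --approve

The pod spec would then set serviceAccountName: mythical-mysfits, and the DynamoDB policy could be dropped from the node instance role.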

Error while running the setup of the course at script/setup

I'm getting this error when trying to complete the setup of the lab:

"Uploading static site to S3...
which: no gsed in (/home/ec2-user/.nvm/versions/node/v10.19.0/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3@global/bin:/home/ec2-user/.rvm/rubies/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3@global/bin:/home/ec2-user/.rvm/rubies/ruby-2.6.3/bin:/usr/local/bin:/bin:/usr/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin:/home/ec2-user/.rvm/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/aws/bin:/usr/local/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin:/home/ec2-user/.rvm/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin)
upload failed: ../../../../../tmp/tmp.gTzQ61PhLc/js/amazon-cognito-identity.min.js to s3://mysfits-fargate-mythicalbucket-6hnhj4c5wgip/js/amazon-cognito-identity.min.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.gTzQ61PhLc/index.html to s3://mysfits-fargate-mythicalbucket-6hnhj4c5wgip/index.html An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.gTzQ61PhLc/confirm.html to s3://mysfits-fargate-mythicalbucket-6hnhj4c5wgip/confirm.html An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.gTzQ61PhLc/js/aws-sdk-2.246.1.min.js to s3://mysfits-fargate-mythicalbucket-6hnhj4c5wgip/js/aws-sdk-2.246.1.min.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.gTzQ61PhLc/js/aws-cognito-sdk.min.js to s3://mysfits-fargate-mythicalbucket-6hnhj4c5wgip/js/aws-cognito-sdk.min.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.gTzQ61PhLc/register.html to s3://mysfits-fargate-mythicalbucket-6hnhj4c5wgip/register.html An error occurred (AccessDenied) when calling the PutObject operation: Access Denied"

Unable to release AWS resources for a month

Hi community, I've been through this online tutorial, and for a month now AWS has been charging me for resources I no longer use and can't delete! The load balancer is still active :-(
I've been searching for hours for a way to delete the load balancer, with no success!

Load balancer 'arn:aws:elasticloadbalancing:eu-west-1:784189488573:loadbalancer/net/mysfits-nlb/48178a2f4e28b5f2' cannot be deleted because it is currently associated with another service

Any ideas?
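
A hedged place to start: that error usually means some other AWS resource still references the NLB (commonly an API Gateway VPC Link or a VPC endpoint service); a still-running ECS service from the workshop may also need cleaning up first. The commands below are a sketch for finding the culprit; the cluster name and ARN are placeholders:

aws apigateway get-vpc-links                            # VPC Links that target NLBs
aws ec2 describe-vpc-endpoint-service-configurations    # PrivateLink endpoint services backed by NLBs
aws ecs list-services --cluster CLUSTER_NAME            # workshop ECS services that may still be running
# once nothing references the NLB (or its CloudFormation stack has been deleted):
aws elbv2 delete-load-balancer --load-balancer-arn NLB_ARN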

Docker run detached

Add the "-d" flag to the "docker run" command in Lab 1 to run the container in the background and return to the command line. This negates the need to be stuck on the waiting for (ctrl-c).

example:
docker run -d -p 8000:80 -e AWS_DEFAULT_REGION=eu-west-1 -e DDB_TABLE_NAME=Table-mythical-mysfits-fargate monolith-service
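
A possible follow-up once the container is detached (standard Docker commands; the container ID comes from docker ps):

docker ps                       # find the running container's ID
docker logs -f CONTAINER_ID     # tail its output
docker stop CONTAINER_ID        # stop it when finished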

CloudFormation outputs - script run error

After running script/setup

Installing dependencies...
Loaded plugins: priorities, update-motd, upgrade-helper
amzn-main | 2.1 kB 00:00:00
amzn-updates | 2.5 kB 00:00:00
1065 packages excluded due to repository priority protections
Resolving Dependencies
--> Running transaction check
---> Package jq.x86_64 0:1.5-1.2.amzn1 will be installed
--> Processing Dependency: jq-libs(x86-64) = 1.5-1.2.amzn1 for package: jq-1.5-1.2.amzn1.x86_64
--> Processing Dependency: oniguruma for package: jq-1.5-1.2.amzn1.x86_64
--> Processing Dependency: libonig.so.2()(64bit) for package: jq-1.5-1.2.amzn1.x86_64
--> Processing Dependency: libjq.so.1()(64bit) for package: jq-1.5-1.2.amzn1.x86_64
--> Running transaction check
---> Package jq-libs.x86_64 0:1.5-1.2.amzn1 will be installed
---> Package oniguruma.x86_64 0:5.9.1-3.1.2.amzn1 will be installed
--> Finished Dependency Resolution

Dependencies Resolved

====================================================================================================================================================================================================
Package Arch Version Repository Size

Installing:
jq x86_64 1.5-1.2.amzn1 amzn-main 44 k
Installing for dependencies:
jq-libs x86_64 1.5-1.2.amzn1 amzn-main 121 k
oniguruma x86_64 5.9.1-3.1.2.amzn1 amzn-main 149 k

Transaction Summary

Install 1 Package (+2 Dependent packages)

Total download size: 314 k
Installed size: 885 k
Downloading packages:
(1/3): jq-1.5-1.2.amzn1.x86_64.rpm | 44 kB 00:00:00
(2/3): jq-libs-1.5-1.2.amzn1.x86_64.rpm | 121 kB 00:00:01
(3/3): oniguruma-5.9.1-3.1.2.amzn1.x86_64.rpm | 149 kB 00:00:01

Total 202 kB/s | 314 kB 00:00:01
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
Installing : oniguruma-5.9.1-3.1.2.amzn1.x86_64 1/3
Installing : jq-libs-1.5-1.2.amzn1.x86_64 2/3
Installing : jq-1.5-1.2.amzn1.x86_64 3/3
Verifying : oniguruma-5.9.1-3.1.2.amzn1.x86_64 1/3
Verifying : jq-1.5-1.2.amzn1.x86_64 2/3
Verifying : jq-libs-1.5-1.2.amzn1.x86_64 3/3

Installed:
jq.x86_64 0:1.5-1.2.amzn1

Dependency Installed:
jq-libs.x86_64 0:1.5-1.2.amzn1 oniguruma.x86_64 0:5.9.1-3.1.2.amzn1

Complete!
Fetching CloudFormation outputs...
jq: error (at :31): Cannot iterate over null (null)


Workshop 3 CloudFormation template validation fails on valid email domain.

The CloudFormation template for workshop 3 has the following validation for CognitoAdminEmail:

CognitoAdminEmail:
  Type: String
  Default: [email protected]
  AllowedPattern: '^\w+([\.-]?\w+)*@\w+([\.-]?\w+)*(\.\w{2,3})+$'
  Description: Enter a valid email address to be used for Kibana Cognito authentication.

This will cause any valid TLD with a length > 3 (such as .limited) to fail.
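
A quick way to see the problem with GNU grep, using a made-up address, together with one possible relaxation (dropping the upper bound on the TLD length, i.e. \w{2,} instead of \w{2,3}):

echo 'admin@example.limited' | grep -E '^\w+([\.-]?\w+)*@\w+([\.-]?\w+)*(\.\w{2,3})+$'   # no match
echo 'admin@example.limited' | grep -E '^\w+([\.-]?\w+)*@\w+([\.-]?\w+)*(\.\w{2,})+$'    # prints the address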

Incompatible with Ubuntu 20.04

Hi there,

You reference the package python-pip in several Dockerfiles; however, since they use FROM ubuntu:latest, the base image is now Ubuntu 20.04, which no longer supports Python 2. There is a python3-pip, but the like service is very much Python 2.

Even if I pin the Docker image to FROM ubuntu:18.04, I get some exceptions, and so the health check keeps killing my containers.
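
For anyone hitting this, an untested sketch of the kind of change involved: either pin the base image to ubuntu:18.04 (where python-pip is still installable) or port the service to Python 3, in which case the Dockerfile's install step would run commands along these lines (the requirements path is a guess):

apt-get update -y
apt-get install -y python3-pip python3-dev build-essential
pip3 install -r ./service/requirements.txt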

CORS Error

When running the initial monolith deployment in workshop-1, after updating the Fargate service sitting behind the ALB to use the updated task definition, the Mysfits data fails to load on the S3-hosted site.

Chrome Developer Tools showed a Cross-Origin Request Blocked: Same Origin Policy disallows... error in the console with the following output: Reason: CORS header ‘Access-Control-Allow-Origin’ missing

I tried updating the python requirements file to use flask-cors 3.0.8 instead of 3.0.0, and that fixed the error. I will make a pull request to update all the pip requirements files accordingly.

module 2 create stack fails

aws cloudformation create-stack --stack-name MythicalMysfitsCoreStack --capabilities CAPABILITY_NAMED_IAM --template-body file://~/environment/aws-modern-application-workshop/module-2/cfn/core.yml

This fails, which causes the stack to roll back. I noticed in the events that PublicSubnetTwo, PrivateSubnetTwo, PublicRouteTable, PrivateRouteTableOne, and FargateContainerSecurityGroup all show "CREATE_FAILED". I am in region us-east-2.
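
A hedged way to pull the underlying failure reasons out of the stack events (the resources listed above often fail as a side effect of a single root-cause resource):

aws cloudformation describe-stack-events --stack-name MythicalMysfitsCoreStack \
  --query "StackEvents[?ResourceStatus=='CREATE_FAILED'].[LogicalResourceId,ResourceStatusReason]" \
  --output table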

CloudFormation script fails on S3 bucket creation

Maybe I missed something, but I wasn't able to just press the magic button in the setup section. I got a 'permission denied' error on the bucket creation. I think this may be due to the change in the way S3 buckets can be configured for website hosting (public access).

To get that script to work I did this:

  • pull the JSON from the designer
  • modify the output section, adding a condition to the S3WebsiteURL output:
 "S3WebsiteURL": {
            "Description": "This is the DNS name of your S3 site",
            "Value": {
                "Fn::GetAtt": [
                    "MythicalBucket",
                    "WebsiteURL"
                ]
            },
            "Condition": "MakeBucket"
        }
  • run the script with MakeBucket set to false
  • create a website-capable S3 bucket by hand (see the sketch after this list)
  • in the Cloud9 environment, run the setup script, passing in the unique S3 name of the website bucket
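
A minimal sketch of that manual bucket step, assuming the AWS CLI and a placeholder bucket name (public-access settings may also need to be relaxed for website hosting):

aws s3 mb s3://my-mysfits-website-bucket                                    # placeholder bucket name
aws s3 website s3://my-mysfits-website-bucket --index-document index.html   # enable static website hosting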

When performing cleanup, lots of things failed; this may help before deleting the stack:

  • remove all images from the repositories
  • stop all services on the clusters
  • wait a long time (about 1 hr); there will still be failures, mostly with the IPs and the VPC, but those can be deleted manually

Hope this helps somebody. This was a VERY useful tutorial.

Workshop setup fails

Running script/setup under workshop-1 fails

See the error below:

An error occurred (ValidationError) when calling the DescribeStacks operation: Stack with id mysfits does not exist
Populating DynamoDB table...

Parameter validation failed:
Invalid length for parameter RequestItems (key: ), value: 0, valid min length: 3
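
A hedged reading of this: the script is looking for a CloudFormation stack (apparently named mysfits here) that doesn't exist in the current region, so the step that seeds DynamoDB gets an empty item list. Checking which stacks actually exist, and which region the CLI is pointed at, may narrow it down:

aws cloudformation describe-stacks --query "Stacks[].StackName" --output table
echo $AWS_DEFAULT_REGION    # confirm the CLI region matches where the stack was created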

Parameter Store with Fargate

Please write a detailed example of how to handle secrets with Fargate using Parameter Store: how to modify the Dockerfile for that purpose, how to get the secrets into the application's config files, etc.
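
Not an official answer, but a minimal sketch of the usual pattern, with placeholder names: store the value as a SecureString parameter and let ECS inject it into the container as an environment variable via the task definition's secrets field, so the Dockerfile itself doesn't need to change. The task execution role also needs ssm:GetParameters on the parameter.

# store a secret (name and value are placeholders)
aws ssm put-parameter --name /mythicalmysfits/example-secret --type SecureString --value 'example-value'

# then, in the container definition of the task definition, reference it as a secret:
#   "secrets": [
#     { "name": "EXAMPLE_SECRET",
#       "valueFrom": "arn:aws:ssm:REGION:ACCOUNT_ID:parameter/mythicalmysfits/example-secret" }
#   ]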

ImportError: cannot import name 'json' from 'itsdangerous' (/usr/local/lib/python3.8/dist-packages/itsdangerous/__init__.py)

TeamRole:~/environment/amazon-ecs-mythicalmysfits-workshop/workshop-1/app/monolith-service (master) $ docker run -p 8000:80 -e AWS_DEFAULT_REGION=$AWS_REGION -e DDB_TABLE_NAME=$TABLE_NAME monolith-service
Traceback (most recent call last):
  File "mythicalMysfitsService.py", line 3, in <module>
    from flask import Flask, jsonify, json, Response, request
  File "/usr/local/lib/python3.8/dist-packages/flask/__init__.py", line 21, in <module>
    from .app import Flask, Request, Response
  File "/usr/local/lib/python3.8/dist-packages/flask/app.py", line 25, in <module>
    from . import cli, json
  File "/usr/local/lib/python3.8/dist-packages/flask/json/__init__.py", line 21, in <module>
    from itsdangerous import json as _json
ImportError: cannot import name 'json' from 'itsdangerous' (/usr/local/lib/python3.8/dist-packages/itsdangerous/__init__.py)

This is at the end of Lab 1:

TABLE_NAME=$(aws dynamodb list-tables | jq -r .TableNames[0])
docker run -p 8000:80 -e AWS_DEFAULT_REGION=$AWS_REGION -e DDB_TABLE_NAME=$TABLE_NAME monolith-service

Based on this post, the issue appears to stem from:
Flask 1.1.2 is set up to require itsdangerous >= 0.24. The latest released itsdangerous version (2.1.0) removed the json API. To continue using Flask 1.1.2, you need to require at most itsdangerous 2.0.1 (not 2.1.0).

From that page:
Found the same problem today. I used Flask version 1.1.2. The problem disappeared after updating to version 1.1.4.

This also impacted the AWS SAM CLI, with the temporary fix being:

Downgrading markupsafe to 2.0.1 fixes the issue on my side.
pip install markupsafe==2.0.1
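
Pulling the workarounds above together, a hedged fix is to pin compatible versions in the service's requirements (the exact file path depends on the service) and rebuild the image:

# pins suggested by the notes above, to add to the service's requirements file:
#   flask==1.1.4
#   itsdangerous==2.0.1
#   markupsafe==2.0.1
docker build -t monolith-service .   # rebuild so the pins take effect
docker run -p 8000:80 -e AWS_DEFAULT_REGION=$AWS_REGION -e DDB_TABLE_NAME=$TABLE_NAME monolith-service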

Unable to upload S3 artefacts.

After creating the CloudFormation stack, I ran the script/setup command. It fails with two errors: it is unable to locate gsed, and it is unable to upload to the S3 bucket. See below.

To install gsed, I first had to install brew, then brew install gsed, and then alias gsed=sed. But this does not work.

which: no gsed in (/home/ec2-user/.linuxbrew/bin:/home/ec2-user/.linuxbrew/sbin:/home/ec2-user/.nvm/versions/node/v10.23.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3@global/bin:/home/ec2-user/.rvm/rubies/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3@global/bin:/home/ec2-user/.rvm/rubies/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3/bin:/home/ec2-user/.rvm/gems/ruby-2.6.3@global/bin:/home/ec2-user/.rvm/rubies/ruby-2.6.3/bin:/usr/local/bin:/bin:/usr/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin:/usr/local/bin:/home/ec2-user/.rvm/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/aws/bin:/usr/local/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin:/usr/local/bin:/home/ec2-user/.rvm/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin:/opt/aws/bin:/usr/local/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin:/usr/local/bin:/home/ec2-user/.rvm/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin)
upload failed: ../../../../../tmp/tmp.JS3hWcAhhP/confirm.html to s3://mysfits-fargate-mythicalbucket-1u1bgbgeqd152/confirm.html An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.JS3hWcAhhP/js/aws-sdk-2.246.1.min.js to s3://mysfits-fargate-mythicalbucket-1u1bgbgeqd152/js/aws-sdk-2.246.1.min.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.JS3hWcAhhP/js/amazon-cognito-identity.min.js to s3://mysfits-fargate-mythicalbucket-1u1bgbgeqd152/js/amazon-cognito-identity.min.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.JS3hWcAhhP/index.html to s3://mysfits-fargate-mythicalbucket-1u1bgbgeqd152/index.html An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.JS3hWcAhhP/js/aws-cognito-sdk.min.js to s3://mysfits-fargate-mythicalbucket-1u1bgbgeqd152/js/aws-cognito-sdk.min.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
upload failed: ../../../../../tmp/tmp.JS3hWcAhhP/register.html to s3://mysfits-fargate-mythicalbucket-1u1bgbgeqd152/register.html An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
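For what it's worth, a hedged workaround on an Amazon Linux / Cloud9 host, where the default sed is already GNU sed, is simply to expose it under the gsed name the script looks for; the AccessDenied failures are a separate problem (the environment's credentials lack s3:PutObject on that bucket) and aren't caused by gsed:

sudo ln -s "$(command -v sed)" /usr/local/bin/gsed   # make GNU sed answer to gsed
hash -r                                              # clear the shell's cached command lookups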

docker run fails

Run the following command to reproduce the issue:

 $ docker run -p 8000:80 -e AWS_DEFAULT_REGION=REGION -e DDB_TABLE_NAME=TABLE_NAME monolith-service

Error:

Traceback (most recent call last):
  File "mythicalMysfitsService.py", line 7, in <module>
    import requests
ImportError: No module named requests

Expected output:

* Running on http://0.0.0.0:80/
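
A hedged guess at the cause: the requests package isn't installed in the image, likely because it's missing from the service's requirements file (the path below is an assumption). Adding it and rebuilding should produce the expected output:

echo 'requests' >> service/requirements.txt   # assumed path; use the service's actual requirements file
docker build -t monolith-service .
docker run -p 8000:80 -e AWS_DEFAULT_REGION=REGION -e DDB_TABLE_NAME=TABLE_NAME monolith-service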
