
@Mikodes
Forked from keeth/00-packages.config
Created November 14, 2018 14:25
Revisions

  1. @keeth keeth revised this gist Jul 21, 2017. 1 changed file with 3 additions and 1 deletion.
    4 changes: 3 additions & 1 deletion notes.md
    @@ -10,7 +10,9 @@ You may also need to do:

    `eb setenv PYCURL_SSL_LIBRARY=nss`

    Also ensure your instance profile (aws-elasticbeanstalk-ec2-role) has permission to talk to SQS, and possibly SES if you use django-ses.
    Note that you should not include AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in your settings. Boto will pick them up from the instance profile automatically.

    Ensure your instance profile (aws-elasticbeanstalk-ec2-role) has permission to talk to SQS, and possibly SES if you use django-ses.

```
{
```
  2. @keeth keeth revised this gist Jul 21, 2017. 3 changed files with 51 additions and 10 deletions.
    26 changes: 25 additions & 1 deletion notes.md
    @@ -10,7 +10,31 @@ You may also need to do:

    `eb setenv PYCURL_SSL_LIBRARY=nss`

    Also ensure your instance profile (aws-elasticbeanstalk-ec2-role) has permission to talk to SQS.
    Also ensure your instance profile (aws-elasticbeanstalk-ec2-role) has permission to talk to SQS, and possibly SES if you use django-ses.

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:*"
      ],
      "Resource": "arn:aws:sqs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "ses:SendRawEmail",
        "ses:SendEmail",
        "ses:GetSendQuota"
      ],
      "Resource": "*"
    }
  ]
}
```
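As a sketch, the same inline policy can be built as a plain Python dict and serialized, for example to pass as the `PolicyDocument` argument to boto3's `iam.put_role_policy` (the policy names and scoping here are illustrative; tighten the SES `Resource` in production if you can):

```python
import json

# The inline policy from above, expressed as a Python dict (sketch).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["sqs:*"],
            "Resource": "arn:aws:sqs:*:*:*",
        },
        {
            "Effect": "Allow",
            "Action": ["ses:SendRawEmail", "ses:SendEmail", "ses:GetSendQuota"],
            "Resource": "*",
        },
    ],
}

# Serialized form, suitable for an API call or pasting into the console.
policy_json = json.dumps(policy, indent=2)
print(policy_json)
```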

    Much of this was sourced from:

    26 changes: 26 additions & 0 deletions prod-settings.py
    @@ -0,0 +1,26 @@
import os

SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY')

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['RDS_DB_NAME'],
        'USER': os.environ['RDS_USERNAME'],
        'PASSWORD': os.environ['RDS_PASSWORD'],
        'HOST': os.environ['RDS_HOSTNAME'],
        'PORT': os.environ['RDS_PORT'],
    }
}

CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'sqs://'
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-west-2',
    'queue_name_prefix': 'myapp-something-',
    'visibility_timeout': 360,
    'polling_interval': 1,
}

# Use django-ses for email
EMAIL_BACKEND = 'django_ses.SESBackend'
AWS_SES_REGION_ENDPOINT = 'email.us-west-2.amazonaws.com'
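A note on the pattern above: `os.environ['X']` raises `KeyError` at import time if the variable is missing, so a misconfigured environment fails fast at deploy, while `os.environ.get('X')` silently yields `None` (as with `DJANGO_SECRET_KEY` here). A small illustration, using a variable name from the settings above:

```python
import os

# Ensure the variable is unset for this demo.
os.environ.pop('RDS_DB_NAME', None)

# .get() returns a default (None here) when the variable is absent.
missing = os.environ.get('RDS_DB_NAME')
print(missing)

# Bare indexing raises, surfacing misconfiguration immediately.
try:
    os.environ['RDS_DB_NAME']
except KeyError:
    print('RDS_DB_NAME is not set')
```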
    9 changes: 0 additions & 9 deletions settings.py
    @@ -1,9 +0,0 @@
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'sqs://'
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-west-2',
    'queue_name_prefix': 'myapp-something-',
    'visibility_timeout': 360,
    'polling_interval': 1,
}
  3. @keeth keeth revised this gist Jul 21, 2017. 2 changed files with 11 additions and 0 deletions.
    2 changes: 2 additions & 0 deletions notes.md
    @@ -10,6 +10,8 @@ You may also need to do:

    `eb setenv PYCURL_SSL_LIBRARY=nss`

    Also ensure your instance profile (aws-elasticbeanstalk-ec2-role) has permission to talk to SQS.

    Much of this was sourced from:

    https://stackoverflow.com/questions/41161691/how-to-run-a-celery-worker-with-django-app-scalable-by-aws-elastic-beanstalk
    9 changes: 9 additions & 0 deletions settings.py
    @@ -0,0 +1,9 @@
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'sqs://'
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-west-2',
    'queue_name_prefix': 'myapp-something-',
    'visibility_timeout': 360,
    'polling_interval': 1,
}
  4. @keeth keeth created this gist Jul 21, 2017.
    13 changes: 13 additions & 0 deletions 00-packages.config
    @@ -0,0 +1,13 @@
packages:
  yum:
    libjpeg-turbo-devel: []
    libpng-devel: []
    libcurl-devel: []

commands:
  01_install_rhel_pg:
    command: "(yum repolist | grep -q pgdg96) || sudo yum install https://download.postgresql.org/pub/repos/yum/9.6/redhat/rhel-6-x86_64/pgdg-ami201503-96-9.6-2.noarch.rpm -y"
  02_install_pg_devel:
    command: "sudo yum install postgresql96-devel -y"
  03_link_pg_config:
    command: "sudo ln -sf /usr/pgsql-9.6/bin/pg_config /usr/bin/"
    110 changes: 110 additions & 0 deletions 01-python.config
    @@ -0,0 +1,110 @@
files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash

      # Get django environment variables
      celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g' | sed 's/%/%%/g'`
      celeryenv=${celeryenv%?}

      # Create celery configuration script
      celeryconf="[program:celeryd-worker]
      ; Set full path to celery program if using virtualenv
      command=/opt/python/run/venv/bin/celery worker -A assembly --loglevel=INFO

      directory=/opt/python/current/app
      user=nobody
      numprocs=1
      stdout_logfile=/var/log/celery-worker.log
      stderr_logfile=/var/log/celery-worker.log
      autostart=true
      autorestart=true
      startsecs=10

      ; Need to wait for currently executing tasks to finish at shutdown.
      ; Increase this if you have very long running tasks.
      stopwaitsecs = 600

      ; When resorting to send SIGKILL to the program to terminate it
      ; send SIGKILL to its whole process group instead,
      ; taking care of its children as well.
      killasgroup=true

      ; if rabbitmq is supervised, set its priority higher
      ; so it starts first
      priority=998

      environment=$celeryenv

      [program:celeryd-beat]
      ; Set full path to celery program if using virtualenv
      command=/opt/python/run/venv/bin/celery beat -A assembly --loglevel=INFO --workdir=/tmp -S django

      directory=/opt/python/current/app
      user=nobody
      numprocs=1
      stdout_logfile=/var/log/celery-beat.log
      stderr_logfile=/var/log/celery-beat.log
      autostart=true
      autorestart=true
      startsecs=10

      ; Need to wait for currently executing tasks to finish at shutdown.
      ; Increase this if you have very long running tasks.
      stopwaitsecs = 600

      ; When resorting to send SIGKILL to the program to terminate it
      ; send SIGKILL to its whole process group instead,
      ; taking care of its children as well.
      killasgroup=true

      ; if rabbitmq is supervised, set its priority higher
      ; so it starts first
      priority=998

      environment=$celeryenv"

      # Create the celery supervisord conf script
      echo "$celeryconf" | tee /opt/python/etc/celery.conf

      # Add configuration script to supervisord conf (if not there already)
      if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
      then
        echo "[include]" | tee -a /opt/python/etc/supervisord.conf
        echo "files: celery.conf" | tee -a /opt/python/etc/supervisord.conf
      fi

      # Reread the supervisord config
      supervisorctl -c /opt/python/etc/supervisord.conf reread

      # Update supervisord in cache without restarting all services
      supervisorctl -c /opt/python/etc/supervisord.conf update

      # Start/Restart celeryd through supervisord
      supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-beat
      supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-worker

commands:
  01_upgrade_pip_global:
    command: "if test -e /usr/bin/pip; then sudo /usr/bin/pip install --upgrade pip; fi"
  02_upgrade_pip_global:
    command: "if test -e /usr/local/bin/pip; then sudo /usr/local/bin/pip install --upgrade pip; fi"
  03_upgrade_pip_for_venv:
    command: "if test -e /opt/python/run/venv/bin/pip; then sudo /opt/python/run/venv/bin/pip install --upgrade pip; fi"

container_commands:
  02_celery_tasks_run:
    command: "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true
  03_migrate:
    command: "./manage.py migrate"
    leader_only: true

option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: assembly/wsgi.py
  aws:elasticbeanstalk:application:environment:
    DJANGO_SETTINGS_MODULE: django_app.settings
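The trickiest line in the deploy hook is the `celeryenv` pipeline, which rewrites `/opt/python/current/env` into supervisor's comma-separated `environment=` format. A minimal Python replication of that `tr`/`sed` pipeline (the sample input is illustrative) shows what it produces:

```python
def supervisor_env(env_text: str) -> str:
    """Replicate the tr/sed pipeline from run_supervised_celeryd.sh."""
    s = env_text.replace('\n', ',')             # tr '\n' ','
    s = s.replace('export ', '')                # sed 's/export //g'
    s = s.replace('$PATH', '%(ENV_PATH)s')      # sed 's/$PATH/%(ENV_PATH)s/g'
    s = s.replace('$PYTHONPATH', '')            # sed 's/$PYTHONPATH//g'
    s = s.replace('$LD_LIBRARY_PATH', '')       # sed 's/$LD_LIBRARY_PATH//g'
    s = s.replace('%', '%%')                    # escape % for supervisor's ini parser
    return s[:-1]                               # ${celeryenv%?}: drop trailing ','

sample = 'export DJANGO_SECRET_KEY=abc\nexport PATH=/usr/bin:$PATH\n'
print(supervisor_env(sample))
# → DJANGO_SECRET_KEY=abc,PATH=/usr/bin:%%(ENV_PATH)s
```

Note that because the `%` escaping runs last, the `%(ENV_PATH)s` substitution itself ends up doubled, matching the shell pipeline's behavior exactly.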
    15 changes: 15 additions & 0 deletions notes.md
    @@ -0,0 +1,15 @@
    This configuration will run celery-worker and celery-beat on a single (leader) node, using supervisor.

It will also run `./manage.py migrate` on a single node after each deployment.

    Ensure that requirements.txt includes:

    `pycurl==7.43.0 --global-option="--with-nss"`

    You may also need to do:

    `eb setenv PYCURL_SSL_LIBRARY=nss`

    Much of this was sourced from:

    https://stackoverflow.com/questions/41161691/how-to-run-a-celery-worker-with-django-app-scalable-by-aws-elastic-beanstalk