Some notes on AI / ML tools that seem interesting/useful (largely focused on open-source tools)
```xml
<TaskerData sr="" dvi="1" tv="5.8.3">
  <Profile sr="prof2" ve="2">
    <cdate>1565066300570</cdate>
    <edate>1567394645768</edate>
    <id>2</id>
    <mid0>3</mid0>
    <nme>Bank SMS Forward</nme>
    <Event sr="con0" ve="2">
      <code>7</code>
      <pri>0</pri>
```
```sh
REGION=eu-west-1
VER=1.7.3
RUNTIME=python3.7
docker run -v $(pwd):/out -it lambci/lambda:build-$RUNTIME \
  pip install scrapy==$VER -t /out/build/scrapy/python
cd build/scrapy
zip -r ../../scrapy.zip python/
cd ../..
```
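The `python/` prefix in the zip matters: Lambda extracts layer contents under `/opt` at runtime, and for Python runtimes `/opt/python` is added to the module search path, so dependencies must sit inside a top-level `python/` directory. A minimal sketch of that layout (the `demo` names and `placeholder.py` are illustrative, not from the build above):

```sh
# Build a tiny zip with the layout Lambda layers expect for Python:
# everything under a top-level python/ directory.
mkdir -p build/demo/python
touch build/demo/python/placeholder.py
(cd build/demo && zip -qr ../../demo.zip python/)
# Listing should show entries prefixed with python/
unzip -l demo.zip
```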
UPDATE (March 2020, thanks @ic): I don't know the exact AMI version, but `yum install docker` now works on the latest Amazon Linux 2. The instructions below may still be relevant depending on the vintage of the AMI you are using.
Amazon changed the install process in Amazon Linux 2 — Docker is no longer installed with plain 'yum'; use amazon-linux-extras instead. See: https://aws.amazon.com/amazon-linux-2/release-notes/
```sh
sudo amazon-linux-extras install docker
sudo service docker start
```

Let's Encrypt - Synology NAS + sameersbn/docker-gitlab
Getting HTTPS working for a Gitlab container on a Synology NAS is a bit tricky. Using self-signed OpenSSL certificates works, but browsers will inevitably flag such certificates as untrusted, since the common name is not associated with a trusted certificate authority:
The downside is that every user who remotely accesses your NAS will be greeted with the above warning unless they manually add the certificate to their browser's trusted certificate list. Instead, here's a work-around that enables HTTPS for both your Synology NAS and a Gitlab container using a single Let's Encrypt certificate.
For more information regarding the docker-gitlab installation and set up: Synology Docker
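The core of the work-around is copying the Let's Encrypt certificate and key into the locations the Gitlab container reads. Per the sameersbn/docker-gitlab README, the image expects `certs/gitlab.crt` and `certs/gitlab.key` inside its data volume when HTTPS is enabled. A sketch of the copy step, using temp dirs as stand-ins for the real paths (the actual Synology certificate directory and Gitlab data volume will differ per setup):

```sh
# Stand-ins for the real paths — substitute your Synology certificate
# directory and the gitlab container's data volume.
CERT_DIR=$(mktemp -d)    # stand-in for your Let's Encrypt cert directory
GITLAB_DATA=$(mktemp -d) # stand-in for the gitlab data volume
printf 'demo fullchain\n' > "$CERT_DIR/fullchain.pem"
printf 'demo privkey\n'   > "$CERT_DIR/privkey.pem"

# Copy cert + key to the names the sameersbn/docker-gitlab image reads.
mkdir -p "$GITLAB_DATA/certs"
cp "$CERT_DIR/fullchain.pem" "$GITLAB_DATA/certs/gitlab.crt"
cp "$CERT_DIR/privkey.pem"   "$GITLAB_DATA/certs/gitlab.key"
ls "$GITLAB_DATA/certs"
```

After copying, the Gitlab container needs a restart to pick up the new certificate.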
```js
const Apify = require('apify');

Apify.main(async () => {
    // Get queue and enqueue first url.
    const requestQueue = await Apify.openRequestQueue();
    const enqueue = async url => requestQueue.addRequest(new Apify.Request({ url }));
    await enqueue('https://news.ycombinator.com/');

    // Create crawler.
    const crawler = new Apify.PuppeteerCrawler({
```
- Core Components
  /apps/core/wcm/components
  Latest Release: https://github.com/Adobe-Marketing-Cloud/aem-core-wcm-components
- JSP foundation components (mostly deprecated)
  foundation/components
- HTL Foundation components (mostly deprecated)
  wcm/foundation/components
```sh
#!/usr/bin/env bash
# install docker
# https://docs.docker.com/engine/installation/linux/ubuntulinux/
# install docker-compose
# https://docs.docker.com/compose/install/
# install letsencrypt
# https://www.digitalocean.com/community/tutorials/how-to-secure-nginx-with-let-s-encrypt-on-ubuntu-16-04
```
```php
<?php
/**
 * Assumes https://github.com/Spomky-Labs/jose library is installed and autoloading is set up
 * Decode and verify token guide: https://github.com/Spomky-Labs/jose/blob/master/doc/operation/Verify.md
 */
use Jose\Factory\JWKFactory;
use Jose\Loader;

// We load the key set from a URL
// JSON Key URL (JKU) - https://cognito-idp.{region}.amazonaws.com/{userPoolId}/.well-known/jwks.json
```
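The JWKS ("JKU") URL above is fully determined by the Cognito region and user pool id, so it can be constructed without any API call. A small sketch of that construction (the `region` and `user_pool_id` values below are placeholders, not real):

```sh
# Build the Cognito JWKS URL from region + user pool id (placeholder values).
region="eu-west-1"
user_pool_id="${region}_EXAMPLE"
jwks_url="https://cognito-idp.${region}.amazonaws.com/${user_pool_id}/.well-known/jwks.json"
echo "$jwks_url"
```

Fetching that URL (e.g. with `curl`) returns the JSON key set that the verification code loads.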
