#!/bin/bash

# Function to display usage information
usage() {
  echo "Usage: $0 /path/to/input.mp4 [ /path/to/output_directory ]"
  exit 1
}

# Check if at least one argument (input file) is provided
if [ $# -lt 1 ]; then
  usage
fi
# SETUP #
DOMAIN=example.com
PROJECT_REPO="git@github.com:example.com/app.git"
AMOUNT_KEEP_RELEASES=5

RELEASE_NAME=$(date +%s--%Y_%m_%d--%H_%M_%S)
RELEASES_DIRECTORY=~/$DOMAIN/releases
DEPLOYMENT_DIRECTORY=$RELEASES_DIRECTORY/$RELEASE_NAME

# stop script on error signal (-e) and undefined variables (-u)
set -eu
# A one-liner to leverage the GPU on a Mac to transcribe audio files
# Inspired by https://simonwillison.net/2024/Aug/13/mlx-whisper/
llm_transcribe_recording () {
  local file_path="$1"
  python3 -c "
import mlx_whisper
result = mlx_whisper.transcribe('$file_path', path_or_hf_repo='mlx-community/distil-whisper-large-v3')
print(result['text'])
"
}
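# Example usage (hypothetical file path; assumes mlx-whisper is installed via
# `pip install mlx-whisper` and you are on an Apple Silicon Mac so MLX can use the GPU):
llm_transcribe_recording ~/Downloads/interview.m4a > interview_transcript.txt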
I am investigating how to use Bend (a parallel language) to accelerate Symbolic AI; in particular, Discrete Program Search. Think of it as an alternative to LLMs, GPTs, and NNs that is also capable of generating code, but by entirely different means. This kind of approach was never scaled with mass compute before - it wasn't possible! - but Bend changes this. So, my idea was to do it and see where it goes.
Now, while I was implementing some candidate algorithms in Bend, I realized that, rather than mass parallelism, I could use an entirely different mechanism to speed things up: SUP nodes. This is a feature that Bend inherited from its underlying model ("Interaction Combinators") which, in simple terms, lets us combine multiple functions into a single superposed one and apply them all to an argument "at the same time". In short, it allows us to call N functions at a fraction of the expected cost. Or, put more simply: why parallelize when we can share?
import { Client } from "@upstash/qstash"
import { NextRequest } from "next/server"

const baseUrl = process.env.NEXT_PUBLIC_VERCEL_PROJECT_PRODUCTION_URL
  ? `https://${process.env.NEXT_PUBLIC_VERCEL_PROJECT_PRODUCTION_URL}`
  : "http://localhost:3000"

interface Step<I> {
  create: <O>(action: (prevResult: Awaited<I>) => O) => Step<O>
  finally: (action: (prevResult: Awaited<I>) => any) => any
}
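// Hypothetical usage sketch: `createWorkflow` and the `/api/items` route are
// assumptions, not part of the snippet above; a real implementation would
// publish each step through the QStash Client so it runs as its own request.
declare function createWorkflow<I>(initial: () => I): Step<I>

createWorkflow(() => fetch(`${baseUrl}/api/items`).then((res) => res.json()))
  .create((items) => items.length)
  .finally((count) => console.log(`processed ${count} items`))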
This uses llm.datasette.io and OpenAI.

I use `git commit --template` to provide the output from the LLM to Git. This way, if you do not like the results, you can quit your editor and no commit will be made.

# Shell function for generating a diff and editing it in your default editor:
gcllm() {
  GIT_DIR="$(git rev-parse --git-dir)"
  TEMPLATE="$GIT_DIR/COMMIT_EDITMSG_TEMPLATE"
  # The rest of this function is a sketch: the exact prompt is an assumption,
  # but `llm` reads the piped diff from stdin and -s sets the system prompt.
  git diff --cached | llm -s "Write a concise git commit message for this diff" > "$TEMPLATE"
  git commit --template "$TEMPLATE" "$@"
}
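# Example (hypothetical workflow): stage your changes with `git add`, then run `gcllm`;
# review the suggested message in your editor and save to commit, or quit to abort.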
1. # create new .py file with code found below
2. # install ollama
3. # install model you want: "ollama run mistral"
4. conda create -n autogen python=3.11
5. conda activate autogen
6. which python
7. python -m pip install pyautogen
8. ollama run mistral
9. ollama run codellama
10. # open new terminal
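A minimal sketch of what the .py file from step 1 might look like, assuming pyautogen is pointed at Ollama's OpenAI-compatible endpoint on its default port; the model name, prompt, and work_dir are placeholders to adjust.

# Minimal sketch (assumptions: `ollama serve` is running locally on its default
# port, and this pyautogen version accepts `base_url` in the config list).
import autogen

config_list = [
    {
        "model": "mistral",                       # model pulled in steps 3/8
        "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        "api_key": "ollama",                      # placeholder; Ollama ignores the key
    }
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

user_proxy.initiate_chat(
    assistant,
    message="Write a Python function that reverses a string.",
)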
# 2023-11-27 MIT LICENSE

Here's the open source version of my ChatGPT game MonkeyIslandAmsterdam.com.

It's an unofficial image+text-based adventure game edition of Monkey Island in Amsterdam, my home town.

Please use it however you want. It'd be nice to see more ChatGPT-based games appear from this. If you get inspired by it, please link back to my X https://x.com/levelsio or this Gist so more people can do the same!

Send me your ChatGPT text adventure game on X, I'd love to try it!
// npm i superagentai-js
import { SuperAgentClient } from "superagentai-js";

const GITHUB_REPO_URL = "https://github.com/homanp/nagato";
const PROMPT = `You are a helpful AI assistant that's an expert at answering questions about the following Github repository: ${GITHUB_REPO_URL}\n\nAlways use the functions provided to answer all questions by the user.`;

interface Agent {
  id: string;
  name: string;