Sachin Dharmapurikar dharmapurikar

@dharmapurikar
dharmapurikar / README_MINIMAL_PROMPT_CHAINABLE.md
Created December 11, 2024 09:57 — forked from disler/README_MINIMAL_PROMPT_CHAINABLE.md
Minimal Prompt Chainables - Zero LLM Library Sequential Prompt Chaining & Prompt Fusion

Minimal Prompt Chainables

Sequential prompt chaining in one method with context and output back-referencing.

Files

  • main.py - start here - full example using MinimalChainable from chain.py to build a sequential prompt chain
  • chain.py - contains zero library minimal prompt chain class
  • chain_test.py - tests for chain.py, you can ignore this
  • requirements.txt - python requirements
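The output back-referencing idea above can be sketched in a few lines. This is a minimal illustration, not the actual MinimalChainable class from chain.py: each prompt may reference earlier outputs with `{{output[i]}}` or the previous output with `{{output[-1]}}`, and `call_model` is any callable mapping a prompt string to a response string.

```python
def chain(prompts, call_model):
    """Run prompts sequentially, substituting back-references.

    {{output[0]}}, {{output[1]}}, ... reference earlier outputs by index;
    {{output[-1]}} references the immediately previous output.
    """
    outputs = []
    for prompt in prompts:
        for i, out in enumerate(outputs):
            prompt = prompt.replace("{{output[%d]}}" % i, out)
        if outputs:
            prompt = prompt.replace("{{output[-1]}}", outputs[-1])
        outputs.append(call_model(prompt))
    return outputs
```

With a stub model like `lambda p: p.upper()`, the second prompt in `["hello", "prev: {{output[-1]}}"]` expands to `"prev: HELLO"` before being sent.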

Setup

@dharmapurikar
dharmapurikar / README.md
Created December 11, 2024 09:57 — forked from disler/README.md
Four Level Framework for Prompt Engineering
@dharmapurikar
dharmapurikar / README.md
Created December 11, 2024 09:56 — forked from disler/README.md
Prompt Chaining with QwQ, Qwen, o1-mini, Ollama, and LLM

Prompt Chaining with QwQ, Qwen, o1-mini, Ollama, and LLM

Here we explore prompt chaining with local reasoning models in combination with base models. With shockingly powerful local models like QwQ and Qwen, we can build prompt chains that let us tap into their capabilities in an immediately useful, local, private, AND free way.

We explore building prompt chains where the first step is a powerful reasoning model that generates a response, and a base model then extracts the final answer from that response.

Play with the prompts and models to see what works best for your use cases. Compare against the o1 series to see how QwQ measures up.

Setup

  • Bun (to run bun run chain.ts ...)
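The reason-then-extract pattern described above can be sketched as follows. This is an illustrative Python sketch, not the chain.ts implementation; the model names (`qwq`, `qwen2.5`) and the extraction prompt wording are assumptions, and `runner` is a hypothetical injection point so the logic can be exercised without a live Ollama install.

```python
import subprocess

def run_model(model, prompt, runner=None):
    """Call a local Ollama model; `runner` can be injected for testing."""
    if runner is not None:
        return runner(model, prompt)
    result = subprocess.run(["ollama", "run", model, prompt],
                            capture_output=True, text=True)
    return result.stdout.strip()

def reason_then_extract(question, reasoner="qwq", extractor="qwen2.5", runner=None):
    # Step 1: the reasoning model thinks through the problem at length.
    reasoning = run_model(reasoner, question, runner)
    # Step 2: a base model distills the final answer from that reasoning.
    extract_prompt = (
        "Extract only the final answer from the reasoning below.\n\n" + reasoning
    )
    return run_model(extractor, extract_prompt, runner)
```

The split matters because reasoning models emit long chains of thought; delegating extraction to a cheap base model keeps the final output clean.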
@dharmapurikar
dharmapurikar / README.md
Created December 11, 2024 09:56 — forked from disler/README.md
Use Meta Prompting to rapidly generate results in the GenAI Age

Meta Prompting

In the Generative AI Age your ability to generate prompts is your ability to generate results.

Guide

Claude 3.5 Sonnet and o1 series models are recommended for meta prompting.

Replace {{user-input}} with your own input to generate prompts.

Use mp_*.txt as example user-inputs to see how to generate high quality prompts.
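The placeholder substitution described above amounts to a simple template fill. A minimal sketch, assuming the `{{user-input}}` placeholder convention from the guide; `load_examples` is a hypothetical helper for iterating over the `mp_*.txt` example inputs:

```python
from pathlib import Path

def fill_meta_prompt(template: str, user_input: str) -> str:
    """Replace the {{user-input}} placeholder with the user's own input."""
    return template.replace("{{user-input}}", user_input)

def load_examples(directory: str = ".") -> dict[str, str]:
    """Collect the mp_*.txt example user-inputs from a directory."""
    return {p.name: p.read_text() for p in Path(directory).glob("mp_*.txt")}
```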

@dharmapurikar
dharmapurikar / update.ps1
Created October 5, 2024 14:25
Ollama updater for PowerShell
ollama list | Select-Object -Skip 1 | ForEach-Object {
    $model = ($_ -split '\s+')[0]
    if ($model -and $model -ne "NAME") {
        Write-Host "Updating model: $model"
        ollama pull $model
    }
}
@dharmapurikar
dharmapurikar / update-models.py
Created October 5, 2024 13:11
Python script to update Ollama models via PowerShell
import asyncio
import time

async def run_powershell_command(command):
    """Run a PowerShell command and return its decoded stdout."""
    process = await asyncio.create_subprocess_exec(
        "powershell", "-Command", command,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    stdout, stderr = await process.communicate()
    return stdout.decode()
@dharmapurikar
dharmapurikar / starship_docker_container.sh
Created September 29, 2024 22:15
Show docker python version in the starship prompt
#!/bin/bash
export STARSHIP_DOCKER_CONTAINER="dharmapurikar/poetry:3.11" # Replace this with the base container you are using
if [ -n "$STARSHIP_DOCKER_CONTAINER" ]; then
    # --rm removes the throwaway container so repeated prompt renders don't accumulate
    docker run --rm "$STARSHIP_DOCKER_CONTAINER" python -c 'import sys; print(sys.version.split()[0])'
fi
@dharmapurikar
dharmapurikar / starship.toml
Last active September 29, 2024 22:15
A different color scheme for the Starship "gruvbox-rainbow" preset.
"$schema" = 'https://starship.rs/config-schema.json'
format = """
[](color_orange)\
$os\
$username\
[](bg:color_yellow fg:color_orange)\
$directory\
[](fg:color_yellow bg:color_aqua)\
$git_branch\
{
"keys": [
"start",
"end"
],
"values": [
[
"1363842000",
"1364446800"
],