{
  "mcpServers": {
    "thinker": {
      "command": "npx",
      "args": [
        "-y",
        "clarity-mcp-server"
      ]
    },
    "calculator": {

LLM Edge Inference Mobile App - Technical Specification

1. Project Overview

1.1 Purpose

Develop a cross-platform mobile application that enables users to download, manage, and interact with Large Language Models (LLMs) directly on their mobile devices using React Native and local inference capabilities.

1.2 Key Features

  • Local LLM inference without internet connectivity
  • Model download and management from Hugging Face Hub
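The model-download feature above can be sketched as a plain Node.js helper. The repo id and filename below are placeholders, and the only assumption is the public Hugging Face Hub `resolve` URL pattern (`https://huggingface.co/<repo>/resolve/<revision>/<file>`):

```javascript
// Build the public Hugging Face Hub download URL for a model file.
// Repo id and filename are illustrative placeholders.
function hfDownloadUrl(repoId, filename, revision = "main") {
  return `https://huggingface.co/${repoId}/resolve/${encodeURIComponent(revision)}/${filename}`;
}

// Example: a quantized GGUF model file (names are hypothetical)
console.log(
  hfDownloadUrl("TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF", "tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf")
);
```

In the React Native app, this URL would be handed to a background file downloader rather than fetched in one request, since model files run to gigabytes.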

WHITE PAPER: THE DIGITAL AUGMENTATION FRAMEWORK – TRANSFORMING DEVELOPERS INTO AUTOCRATIC AUGMENTS THROUGH NEURAL SYMBIOTIC INTEGRATION
20,000 WORDS | VERSION 1.0


TABLE OF CONTENTS

  1. INTRODUCTION: THE FUTURE OF HUMAN-AI COLLABORATION
    • The Dawn of Digital Symbiosis
    • Redefining Developer Roles in the Age of Autocratic Augments
  2. THEORETICAL FRAMEWORK: MIRROR NEURON SYMBIOTICS AND NEURAL PATHWAY INTEGRATION
# Build Your Brain: Developer Guide
Welcome to the Build Your Brain developer guide! This resource will help you integrate LLMs and implement cognitive enhancement features in your application. Here's everything you need to know to get started.
## 🧠 Core Concepts
### LLM Integration
Large Language Models (LLMs) form the foundation of modern cognitive applications. Here's how to leverage them effectively:
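One common integration pattern is building a chat-completion request. The sketch below assumes an OpenAI-compatible payload shape; the model name, prompts, and parameters are placeholders, not part of this guide's actual API:

```javascript
// Minimal sketch of an OpenAI-compatible chat request payload.
// Model name and temperature are illustrative; adapt to your provider.
function buildChatRequest(model, systemPrompt, userPrompt) {
  return {
    model,
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userPrompt },
    ],
    temperature: 0.7,
  };
}

const req = buildChatRequest(
  "gpt-4o-mini",
  "You are a study coach.",
  "Summarize spaced repetition in two sentences."
);
console.log(JSON.stringify(req, null, 2));
```

The payload would then be POSTed to your provider's chat endpoint; keeping request construction in a pure function like this makes it easy to unit-test without network access.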
@pronitdas
pronitdas / test.js
Created January 23, 2024 19:50
test mvt using node
// The API in question serves Mapbox Vector Tiles (MVT)
// Path: test.js
const SphericalMercator = require("@mapbox/sphericalmercator");
const merc = new SphericalMercator({ size: 256 });
const axios = require("axios");
const fs = require("fs").promises;
const { Console } = require("console");
const http = require('http');
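The core of testing an MVT endpoint is mapping XYZ tile coordinates to a geographic bounding box. As a dependency-free sketch, this is the same Web Mercator math that `@mapbox/sphericalmercator`'s `merc.bbox(x, y, z)` performs:

```javascript
// Convert XYZ tile coordinates to a WGS84 bounding box
// [west, south, east, north] using standard Web Mercator tiling math.
function tileToBBox(x, y, z) {
  const n = 2 ** z;
  const lon = (i) => (i / n) * 360 - 180;
  const lat = (j) => (Math.atan(Math.sinh(Math.PI * (1 - (2 * j) / n))) * 180) / Math.PI;
  // y grows southward, so y gives the north edge and y + 1 the south edge
  return [lon(x), lat(y + 1), lon(x + 1), lat(y)];
}

console.log(tileToBBox(0, 0, 1)); // north-west quadrant of the world
```

A bbox computed this way can be used to sanity-check that features decoded from a fetched tile actually fall inside the tile's extent.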
pronitdas / index.py
Created January 4, 2022 16:31
convert to cog
from rio_cogeo.cogeo import cog_translate
from rio_cogeo.profiles import cog_profiles


def translate(src_path, dst_path, profile="deflate", profile_options={}, **options):
    """Convert image to COG."""
    output_profile = cog_profiles.get(profile)
    output_profile.update(dict(BIGTIFF="IF_SAFER"))
    output_profile.update(profile_options)
    config = dict(
pronitdas / contact-to-vcf
Created October 7, 2021 19:19
remove duplicates vcf
const csv = require('csv-parser');
const fs = require('fs');
const vCardJS = require('vcards-js');

const outputStream = fs.createWriteStream('new_contacts.vcf', 'utf8');
const inputStream = fs.createReadStream('new_contacts.csv', 'utf8');
const allContacts = [];

inputStream
  .pipe(csv())
  .on('data', (row) => {
    // keep only the first occurrence of each row, de-duplicating the contacts
    if (!allContacts.some((c) => JSON.stringify(c) === JSON.stringify(row))) {
      allContacts.push(row);
    }
  });
#!/bin/bash
# Poll the upstream branch and rebuild via docker-compose whenever HEAD changes.
while true; do
    cd ~/<REPO_FOLDER> || exit 1
    git fetch
    LOCAL=$(git rev-parse HEAD)
    REMOTE=$(git rev-parse @{u})
    if [ "$LOCAL" != "$REMOTE" ]; then
        git pull origin <TARGET_BRANCH>
        docker-compose up --build -d
    fi
    sleep 60
done
Some thoughts on how you could evaluate the state of the systems your team owns.

One way to use this:

  • Put some of these criteria on the Y axis.
  • Put the names of the components you own on the X axis.
  • Give everything a score from 0 to 1.
  • Either average the scores or sum them to figure out which components need the most love.

If all of these score the same for everything you own, it might make sense to skip that section. For example, if you own 10 services but they all use a common build pipeline that you don't maintain, it might make sense to skip that criterion.
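The averaging step above can be sketched in a few lines; the component and criteria names here are purely illustrative:

```javascript
// Rank components by their average score across criteria (0–1 each),
// lowest first, so the components needing the most love surface at the top.
function rankComponents(scores) {
  return Object.entries(scores)
    .map(([name, marks]) => ({
      name,
      avg: marks.reduce((a, b) => a + b, 0) / marks.length,
    }))
    .sort((a, b) => a.avg - b.avg);
}

const ranked = rankComponents({
  // hypothetical scores, e.g. [docs, alerting, test coverage]
  "auth-service": [1, 0.5, 0],
  "billing-service": [1, 1, 1],
});
console.log(ranked[0].name); // the component needing the most attention
```

Summing instead of averaging only changes the ranking when components are scored against different numbers of criteria, which is worth keeping consistent across the matrix.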
pronitdas / The Technical Interview Cheat Sheet.md
Created April 9, 2021 08:52 — forked from tsiege/The Technical Interview Cheat Sheet.md
This is my technical interview cheat sheet. Feel free to fork it or do whatever you want with it. PLEASE let me know if there are any errors or if anything crucial is missing. I will add more links soon.

ANNOUNCEMENT

I have moved this over to the Tech Interview Cheat Sheet Repo, where it has been expanded and even has code challenges you can run and practice against!
