
@tikg
Last active April 15, 2026 08:18
Setting up LM Studio (AI) on my local machine (history)

The Setup

lms get


tikg commented Apr 15, 2026

Last login: Wed Apr 15 13:59:33 on ttys009
You have mail.

The default interactive shell is now zsh.
To update your account to use zsh, please run `chsh -s /bin/zsh`.
For more details, please visit https://support.apple.com/kb/HT208050.
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ clear
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ curl -fsSL https://lmstudio.ai/install.sh | bash
Downloading llmster 0.0.11-1 Darwin arm64
##################################################### 100.0%
Verifying checksum...
Installing llmster...
(node:33814) ExperimentalWarning: Single executable application is an experimental feature and might change at any time
(Use `llmster --trace-warnings ...` to show where the warning was created)
Install completed at /Users/my.user.name/.lmstudio/llmster/0.0.11-1.
Installation finished successfully! llmster is ready to launch.
To start the daemon, run:

    lms daemon up

To add lms to your PATH, either restart your shell or run:
    export PATH="/Users/my.user.name/.lmstudio/bin:$PATH"
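
The `export PATH=…` above only lasts for the current shell session. A minimal sketch for persisting it, under the assumption that the login shell reads `~/.bash_profile` (on zsh, `~/.zshrc` or `~/.zprofile` would be the target instead):

```shell
# Append the LM Studio bin dir to the shell profile, idempotently.
# PROFILE is an assumption -- adjust for your shell's startup file.
LMS_BIN="$HOME/.lmstudio/bin"
PROFILE="${PROFILE:-$HOME/.bash_profile}"
# -s suppresses the error if the profile doesn't exist yet;
# only append when the entry isn't already there.
grep -qsF "$LMS_BIN" "$PROFILE" || \
  printf 'export PATH="%s:$PATH"\n' "$LMS_BIN" >> "$PROFILE"
```

This avoids the `lms: command not found` seen below on every new terminal.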
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ lms daemon up
-bash: lms: command not found
MAC-LOCAL-PC:~ my.user.name$ export PATH="/Users/my.user.name/.lmstudio/bin:$PATH
>
MAC-LOCAL-PC:~ my.user.name$ export PATH="/Users/my.user.name/.lmstudio/bin:$PATH"
MAC-LOCAL-PC:~ my.user.name$ lms daemon up
Waking up LM Studio service...
llmster started (PID: 33890).
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ lms --help
Usage: lms [options] [command]

Local models
   chat               Start an interactive chat with a model
   get                Search and download local models or presets
   load               Load a model
   unload             Unload a model
   ls                 List the models available on disk
   ps                 List the models currently loaded in memory
   import             Import a model file into LM Studio

Serve
   server             Commands for managing the local server
   log                Log incoming and outgoing messages

Remote Instances
   link               Commands for managing LM Link

Runtime
   runtime            Manage and update the inference runtime

Develop & Publish (Beta)
   clone              Clone an artifact from LM Studio Hub to a local folder
   push               Uploads the artifact in the current folder to LM Studio Hub
   dev                Starts a plugin dev server in the current folder
   login              Authenticate with LM Studio
   logout             Log out of LM Studio
   whoami             Check the current authentication status

Learn more:           https://lmstudio.ai/docs/developer
Join our Discord:     https://discord.gg/lmstudio
MAC-LOCAL-PC:~ my.user.name$ lms get
No exact match found. Please choose a model from the list below.

✔ Select a model to download deepseek/deepseek-r1-0528-qwen3-8b

   ↓ To download: model deepseek/deepseek-r1-0528-qwen3-8b - 51.56 KB
   └─ ↓ To download: DeepSeek R1 0528 Qwen3 8B 4BIT [MLX] - 4.62 GB

About to download 4.62 GB.

✔ Start download? yes
Finalizing download...
Download completed.
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ lms server start
Success! Server is now running on port 1234
MAC-LOCAL-PC:~ my.user.name$ curl http://localhost:1234/v1/models
{
  "data": [
    {
      "id": "deepseek/deepseek-r1-0528-qwen3-8b",
      "object": "model",
      "owned_by": "organization_owner"
    },
    {
      "id": "text-embedding-nomic-embed-text-v1.5",
      "object": "model",
      "owned_by": "organization_owner"
    }
  ],
  "object": "list"
}MAC-LOCAL-PC:~ my.user.name$
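
The `/v1/models` response above is OpenAI-style JSON, so the model ids can be pulled out for scripting. A sketch using the response copied verbatim (in practice you would pipe `curl -s http://localhost:1234/v1/models` instead of echoing a literal):

```shell
# Response body copied from the curl output above.
RESP='{"data":[{"id":"deepseek/deepseek-r1-0528-qwen3-8b","object":"model","owned_by":"organization_owner"},{"id":"text-embedding-nomic-embed-text-v1.5","object":"model","owned_by":"organization_owner"}],"object":"list"}'
# Print one model id per line.
echo "$RESP" | python3 -c 'import json, sys
for m in json.load(sys.stdin)["data"]:
    print(m["id"])'
```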
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ lms status
Server: ON (port: 1234)

No Models Loaded
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$
MAC-LOCAL-PC:~ my.user.name$ lms ls

You have 2 models, taking up 4.70 GB of disk space.

LLM                                               PARAMS    ARCH     SIZE       DEVICE
deepseek/deepseek-r1-0528-qwen3-8b (1 variant)    8B        qwen3    4.62 GB    Local

EMBEDDING                               PARAMS    ARCH          SIZE        DEVICE
text-embedding-nomic-embed-text-v1.5              Nomic BERT    84.11 MB    Local

MAC-LOCAL-PC:~ my.user.name$ lms load deepseek/deepseek-r1-0528-qwen3-8b

Loading deepseek/deepseek-r1-0528-qwen3-8b ⠼
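
Once the load above finishes, the OpenAI-compatible chat endpoint can be smoke-tested with curl (port 1234 from `lms server start`, model id from `lms ls`; the `|| echo` fallback is only there so the snippet doesn't abort when the server isn't up):

```shell
# Single chat request against the local OpenAI-compatible endpoint.
BODY='{"model":"deepseek/deepseek-r1-0528-qwen3-8b","messages":[{"role":"user","content":"Say hello"}],"max_tokens":32}'
echo "$BODY" | python3 -m json.tool >/dev/null   # sanity-check the JSON
curl -s --max-time 10 http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" -d "$BODY" \
  || echo "server not reachable on :1234"
```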


tikg commented Apr 15, 2026

Adding and loading a model

Add a model with `lms get meta/llama-3.3-70b`, then load it with `lms load meta/llama-3.3-70b`.


tikg commented Apr 15, 2026

Removing models from llmster

JP-H41T6MX25R:.lmstudio john.tuliao$ ls
bin			credentials		hub			models			settings.json
config-presets		dev-logs		llmster			projects		user-files
conversations		extensions		mcp.json		server-logs		working-directories
JP-H41T6MX25R:.lmstudio john.tuliao$ cd models/
JP-H41T6MX25R:models john.tuliao$ ls
lmstudio-community
JP-H41T6MX25R:models john.tuliao$ cd lmstudio-community/

### Models
JP-H41T6MX25R:lmstudio-community john.tuliao$ ls
DeepSeek-R1-0528-Qwen3-8B-MLX-4bit	Llama-3.3-70B-Instruct-GGUF
JP-H41T6MX25R:lmstudio-community john.tuliao$ du -h
4.3G	./DeepSeek-R1-0528-Qwen3-8B-MLX-4bit
 35G	./Llama-3.3-70B-Instruct-GGUF
 39G	.

### Path
JP-H41T6MX25R:lmstudio-community john.tuliao$ pwd
/Users/john.tuliao/.lmstudio/models/lmstudio-community
JP-H41T6MX25R:lmstudio-community john.tuliao$
JP-H41T6MX25R:lmstudio-community john.tuliao$
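
Removal, then, is just deleting the model's directory. A sketch of that cleanup, using the layout from the `pwd`/`ls` output above, demonstrated on a scratch copy of the tree so nothing real is deleted (point the paths at `~/.lmstudio/models/lmstudio-community` to do it for real):

```shell
# Build a throwaway copy of the models tree, then delete one model dir.
SCRATCH="$(mktemp -d)"
mkdir -p "$SCRATCH/models/lmstudio-community/Llama-3.3-70B-Instruct-GGUF"
rm -rf "$SCRATCH/models/lmstudio-community/Llama-3.3-70B-Instruct-GGUF"
# The listing is now empty -- the ~35 GB would be reclaimed.
ls -A "$SCRATCH/models/lmstudio-community"
```

After deleting the real directory, `lms ls` should no longer list the model, since it lists what is on disk.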
