@YurkoHoshko
Last active September 27, 2024 19:04
Revisions

  1. YurkoHoshko revised this gist Sep 11, 2024. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion ollama_example.livemd
    @@ -12,7 +12,7 @@ Mix.install(

    <!-- livebook:{"break_markdown":true} -->

- This is a demo of tiny fraction of Livebook capabilities.
+ This is a demo for a tiny fraction of Livebook capabilities.

In this example, we create a simple two-input form that allows setting a system prompt and sending a message to an LLM hosted with [Ollama](https://ollama.com).

  2. YurkoHoshko revised this gist Sep 11, 2024. 1 changed file with 14 additions and 11 deletions.
    25 changes: 14 additions & 11 deletions ollama_example.livemd
    @@ -1,20 +1,23 @@
    <!-- livebook:{"app_settings":{"access_type":"public","output_type":"rich","slug":"one-shot-prompter"},"persist_outputs":true} -->

    [![Run in Livebook](https://livebook.dev/badge/v1/blue.svg)](https://livebook.dev/run?url=https%3A%2F%2Fgist.github.com%2FYurkoHoshko%2Fef30e20d153f208ed3d082b51c0e1808)

    # One-shot prompt with Ollama

    ```elixir
- Mix.install([
-   {:ollama, "~> 0.7.0"},
-   {:kino, "~> 0.13.2"}
- ])
-
- Kino.nothing()
+ Mix.install(
+   [:ollama, :kino]
+ )
    ```

    ## Simple one-shot chat :)

    [![Run in Livebook](https://livebook.dev/badge/v1/blue.svg)](https://livebook.dev/run?url=https%3A%2F%2Fgist.github.com%2FYurkoHoshko%2Fef30e20d153f208ed3d082b51c0e1808)

    <!-- livebook:{"break_markdown":true} -->

    This is a demo of tiny fraction of Livebook capabilities.

In this example, we create a simple two-input form that allows setting a system prompt and sending a message to an LLM hosted with [Ollama](https://ollama.com).

    You can see an example of me toying with this Livebook [in my twitter post](https://x.com/DJTechDebt/status/1818205498384036019).

    ```elixir
    client = Ollama.init()

    @@ -68,4 +71,4 @@ Kino.listen(form, fn %{
    end)

    Kino.nothing()
    ```
    ```
  3. YurkoHoshko revised this gist Sep 11, 2024. 1 changed file with 2 additions and 0 deletions.
    2 changes: 2 additions & 0 deletions ollama_example.livemd
    @@ -1,5 +1,7 @@
    <!-- livebook:{"app_settings":{"access_type":"public","output_type":"rich","slug":"one-shot-prompter"},"persist_outputs":true} -->

    [![Run in Livebook](https://livebook.dev/badge/v1/blue.svg)](https://livebook.dev/run?url=https%3A%2F%2Fgist.github.com%2FYurkoHoshko%2Fef30e20d153f208ed3d082b51c0e1808)

    # One-shot prompt with Ollama

    ```elixir
  4. YurkoHoshko revised this gist Sep 11, 2024. 1 changed file with 0 additions and 2 deletions.
    2 changes: 0 additions & 2 deletions ollama_example.livemd
    @@ -1,5 +1,3 @@
    [![Run in Livebook](https://livebook.dev/badge/v1/blue.svg)](https://livebook.dev/run?url=http%3A%2F%2Flocalhost%3A57011%2Fsessions%2F232rrgtf26ji2rld73hi7rn3fthfah4ndyf4k6idyrnz55on)

    <!-- livebook:{"app_settings":{"access_type":"public","output_type":"rich","slug":"one-shot-prompter"},"persist_outputs":true} -->

    # One-shot prompt with Ollama
  5. YurkoHoshko revised this gist Sep 11, 2024. 1 changed file with 2 additions and 0 deletions.
    2 changes: 2 additions & 0 deletions ollama_example.livemd
    @@ -1,3 +1,5 @@
    [![Run in Livebook](https://livebook.dev/badge/v1/blue.svg)](https://livebook.dev/run?url=http%3A%2F%2Flocalhost%3A57011%2Fsessions%2F232rrgtf26ji2rld73hi7rn3fthfah4ndyf4k6idyrnz55on)

    <!-- livebook:{"app_settings":{"access_type":"public","output_type":"rich","slug":"one-shot-prompter"},"persist_outputs":true} -->

    # One-shot prompt with Ollama
  6. YurkoHoshko created this gist Sep 11, 2024.
    69 changes: 69 additions & 0 deletions ollama_example.livemd
    @@ -0,0 +1,69 @@
    <!-- livebook:{"app_settings":{"access_type":"public","output_type":"rich","slug":"one-shot-prompter"},"persist_outputs":true} -->

    # One-shot prompt with Ollama

    ```elixir
Mix.install([
  {:ollama, "~> 0.7.0"},
  {:kino, "~> 0.13.2"}
])

Kino.nothing()
    ```

    ## Simple one-shot chat :)

    ```elixir
client = Ollama.init()

models =
  client
  |> Ollama.list_models()
  |> then(fn {:ok, %{"models" => models}} -> models end)
  |> Enum.map(fn model -> {Map.get(model, "name"), Map.get(model, "name")} end)

model_input = Kino.Input.select("Model", models)
system_prompt_input = Kino.Input.textarea("System prompt")
instruction_input = Kino.Input.textarea("Instruction")

form =
  Kino.Control.form(
    [model: model_input, system_prompt: system_prompt_input, instruction: instruction_input],
    submit: "Start"
  )
  |> Kino.render()

reply_frame =
  Kino.Frame.new()
  |> Kino.render()

Kino.listen(form, fn %{
                       data: %{
                         model: model,
                         system_prompt: system_prompt,
                         instruction: instruction
                       }
                     } ->
  messages = [
    %{role: "system", content: system_prompt},
    %{role: "user", content: instruction}
  ]

  Kino.Frame.clear(reply_frame)

  {:ok, streaming} =
    Ollama.chat(client,
      model: model,
      messages: messages,
      stream: true
    )

  streaming
  |> Stream.each(fn %{"message" => %{"content" => chunk}} ->
    Kino.Frame.append(reply_frame, Kino.Markdown.new(chunk, chunk: true))
  end)
  |> Stream.run()
end)

Kino.nothing()
    ```
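For readers who want the same one-shot chat without the Kino form and frame, here is a minimal non-streaming sketch. It assumes an Ollama server is running locally on its default port and that a model named `llama3` has already been pulled; both the server and the model name are assumptions, not part of the gist.

```elixir
# Minimal one-shot chat outside Livebook UI widgets.
# Assumes: local Ollama server on the default port, and a pulled
# model named "llama3" (swap in any model from `ollama list`).
Mix.install([{:ollama, "~> 0.7.0"}])

client = Ollama.init()

{:ok, response} =
  Ollama.chat(client,
    model: "llama3",
    messages: [
      %{role: "system", content: "You are a terse assistant."},
      %{role: "user", content: "Say hello in one word."}
    ]
  )

# Without `stream: true`, the whole reply arrives as a single map,
# with the text under "message" -> "content".
IO.puts(get_in(response, ["message", "content"]))
```

The streaming version in the notebook differs only in passing `stream: true`, which returns an enumerable of chunks instead of one complete map.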