
bug: offline ollama/llama3.2 doesn't get any files, only the prompt #998

@Rival

Description


Describe the bug

I have plugged llama3.2 into Ollama with the following setup, as described in the docs:

        ollama = {
          __inherited_from = "openai",
          api_key_name = "",
          endpoint = "http://127.0.0.1:11434/v1",
          model = "llama3.2",
        },

It seems the provider doesn't pick up the selected files in ask mode:

> explain me this code

I'd be happy to help explain the code. However, you haven't provided any code yet. Please go ahead and share the code with me, and I'll do my best to break it down and explain what each part does.

**Generation complete!** Please review the code suggestions above.

The files are listed under Selected files. I tried different things, e.g. adding additional files, with no effect.
Btw, all other providers (ChatGPT, Claude, and Perplexity) work great.
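For context on the symptom: an OpenAI-compatible endpoint like Ollama's `/v1/chat/completions` only "sees" files that the client inlines into the request body; if the plugin sends the bare prompt, the model correctly answers that no code was provided. A minimal sketch of what such a payload has to look like (the `build_chat_payload` helper and message layout are illustrative, not avante.nvim's actual request format):

```python
import json

def build_chat_payload(model, question, selected_files):
    """Build an OpenAI-style chat payload with the selected files
    inlined into the user message. selected_files is a list of
    (path, content) tuples."""
    context = "\n\n".join(
        f"File: {path}\n```\n{content}\n```" for path, content in selected_files
    )
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            # The file text must be part of the message content; the
            # model has no other way to see it.
            {"role": "user", "content": f"{context}\n\n{question}"},
        ],
    }

payload = build_chat_payload(
    "llama3.2",
    "explain me this code",
    [("main.lua", 'print("hello")')],
)
print(json.dumps(payload, indent=2))
```

If the file content is missing from the `messages` array that reaches `127.0.0.1:11434`, the model only ever receives the bare question, which matches the behavior reported here.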

Installation method

return {
  {
    "yetone/avante.nvim",
    event = "VeryLazy",
    lazy = false,
    version = false, -- set this if you want to always pull the latest change
    opts = {
      -- provider = "openai", -- Recommend using Claude
      provider = "ollama",
      openai = {
        endpoint = "https://api.openai.com/v1",
        model = "chatgpt-4o-latest",
        timeout = 30000, -- Timeout in milliseconds
        temperature = 0,
        max_tokens = 4096,
      },
      claude = {
        endpoint = "https://api.anthropic.com",
        model = "claude-3-5-sonnet-20241022",
        timeout = 30000, -- Timeout in milliseconds
        temperature = 0,
        max_tokens = 8000,
      },
      vendors = {
        perplexity = {
          __inherited_from = "openai",
          api_key_name = "PERPLEXITY_API_KEY",
          endpoint = "https://api.perplexity.ai",
          model = "llama-3.1-sonar-large-128k-online",
        },
        ollama = {
          __inherited_from = "openai",
          api_key_name = "",
          endpoint = "http://127.0.0.1:11434/v1",
          model = "llama3.2",
        },
      },
    },
    init = function(_, opts)
      local config_path = vim.fn.stdpath("config")

      local file = io.open(config_path .. "/tools/anthropickey.txt", "r")
      if file then
        local key = file:read("*all"):gsub("%s+", "") -- strip whitespace
        file:close()
        vim.env.ANTHROPIC_API_KEY = key -- export for the claude provider
        print("Anthropic API key loaded")
      end

      file = io.open(config_path .. "/tools/perplexitykey.txt", "r")
      if file then
        local key = file:read("*all"):gsub("%s+", "") -- strip whitespace
        file:close()
        vim.env.PERPLEXITY_API_KEY = key -- export for the perplexity provider
        print("Perplexity API key loaded")
      end

      local wk = require("which-key")
      wk.add({ "<leader>a", group = "AI" })
    end,
    build = "powershell -ExecutionPolicy Bypass -File Build.ps1 -BuildFromSource false", -- for windows
    dependencies = {
      "stevearc/dressing.nvim",
      "nvim-lua/plenary.nvim",
      "MunifTanjim/nui.nvim",
      --- The below dependencies are optional,
      "hrsh7th/nvim-cmp", -- autocompletion for avante commands and mentions
      "nvim-tree/nvim-web-devicons", -- or echasnovski/mini.icons
      -- "zbirenbaum/copilot.lua", -- for providers='copilot'
      {
        -- support for image pasting
        "HakonHarnes/img-clip.nvim",
        event = "VeryLazy",
        opts = {
          -- recommended settings
          default = {
            embed_image_as_base64 = false,
            prompt_for_file_name = false,
            drag_and_drop = {
              insert_mode = true,
            },
            -- required for Windows users
            use_absolute_path = true,
          },
        },
      },
      {
        -- Make sure to set this up properly if you have lazy=true
        "MeanderingProgrammer/render-markdown.nvim",
        opts = {
          file_types = { "markdown", "Avante" },
        },
        ft = { "markdown", "Avante" },
      },
    },
  },
}

Environment

NVIM v0.10.2
Build type: Release
LuaJIT 2.1.1713484068
Windows 11
