
I am trying to self host Khoj AI and I have followed all the steps in the documentation.

The only thing I changed in the docker-compose.yml file is:

- OPENAI_API_BASE=http://localhost:1234/v1/
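In context, that override sits under the service's environment block. A sketch of the relevant docker-compose.yml fragment (the service name `server` is an assumption; match it to your actual compose file):

```yaml
services:
  server:
    environment:
      # Points Khoj at the OpenAI-compatible endpoint served by LMStudio.
      # Note: inside a container, "localhost" resolves to the container
      # itself, not to the host machine running LMStudio.
      - OPENAI_API_BASE=http://localhost:1234/v1/
```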

That is where my LMStudio server is running on the host machine. When I make a call from the host machine like:

curl http://localhost:1234/v1/models/

I get a response, so the server is up and running. But when I send the same request from the Khoj AI GUI frontend, the server does not show any incoming requests, which means the app inside the Docker container cannot reach the server on the host machine. How can I make the LMStudio server on the host machine reachable from my Khoj AI Docker container?

I have checked this and this but I do not understand them.

1 Answer


By design, a guest (a container; in your case, Khoj AI) cannot reach services on the host (in your case, LMStudio). This is expected behavior, since containers are network-isolated by default.

You have a few options:

  1. Run LMStudio in a container as well and configure Khoj AI to point at it.
  2. Mount the LMStudio socket file into the Khoj AI container and talk to LMStudio over that socket. See the second example under "Connect to it from a second container".
  3. Use the special hostname host.docker.internal to refer to the host machine from inside the Khoj AI container.
  4. Run the Khoj AI container in a network mode that lets it reach the host directly, using the --network host flag.
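Options 3 and 4 can be expressed directly in the compose file. A hedged sketch, assuming the service is named `server` (the `host-gateway` mapping is only needed on Linux; Docker Desktop on macOS/Windows provides `host.docker.internal` automatically):

```yaml
services:
  server:
    # Option 3: make host.docker.internal resolve to the host machine.
    # On Linux this mapping must be added explicitly:
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      - OPENAI_API_BASE=http://host.docker.internal:1234/v1/

    # Option 4 (alternative, Linux only): share the host's network
    # namespace, after which "localhost" inside the container is the host
    # itself. Remove extra_hosts above and uncomment:
    # network_mode: host
```

With option 4 the container also bypasses port mappings, so any `ports:` entries for the service become no-ops; option 3 is usually the less invasive choice.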
