I have a chatbot UI built with React that invokes a Python script via FastAPI and displays the script's response. In the Python script I use GPT-3.5 to format the question into a query, send the query to an internal API, and return the API response to the chatbot. I now want to add conversation history to the chatbot. How do I handle this on the React and Python/LLM ends?

I tried using LangChain's conversation buffer window and expect similar results with the chatbot. I would like it to answer follow-up questions about the last 5 questions asked. How do I store each question and response and add them to the context the next time the Python script is invoked?

1 Answer

Here's an example using ConversationChain with ConversationBufferWindowMemory that achieves what you describe. If k is set to 5, the model will remember the last 5 interactions.

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory

# `llm` and `PROMPT` are the model and prompt template you already use
conversation = ConversationChain(
    llm=llm,
    prompt=PROMPT,
    # keep only the last 5 exchanges in the context window
    memory=ConversationBufferWindowMemory(k=5),
    verbose=True,
)
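The windowing behavior itself is just a fixed-size queue of exchanges. A minimal sketch of the same idea in plain Python (illustrative only, not LangChain internals):

```python
from collections import deque

# Rolling window of the last k question/answer pairs (k=5, as above).
history = deque(maxlen=5)

for i in range(1, 8):  # simulate 7 exchanges
    history.append((f"question {i}", f"answer {i}"))

# Only the 5 most recent exchanges remain; older ones are dropped.
print(len(history))    # 5
print(history[0][0])   # question 3
```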
2 Comments

Does the LLM remember the previous questions if I invoke the script for every question?
ConversationChain with ConversationBufferWindowMemory keeps the conversation in memory. As in the example, k=5 means the last 5 interactions are remembered.
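One caveat: ConversationBufferWindowMemory lives in process memory, so it is lost when the script exits. If you invoke the script once per question, you need to persist the last 5 exchanges yourself and rebuild the context on each run. A minimal sketch of that pattern (the file name and helper names are illustrative, not part of LangChain):

```python
import json
from pathlib import Path

HISTORY_FILE = Path("chat_history.json")  # illustrative location
K = 5  # number of past exchanges to keep

def load_history():
    """Return the stored (question, answer) pairs, or [] on first run."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_exchange(question, answer):
    """Append the new exchange and keep only the last K."""
    history = load_history()
    history.append([question, answer])
    HISTORY_FILE.write_text(json.dumps(history[-K:]))

def build_context(history):
    """Format past exchanges into a string to prepend to the prompt."""
    return "\n".join(f"Human: {q}\nAI: {a}" for q, a in history)

# On each invocation: load the history, build the context string,
# call the LLM with it, then save the new exchange for the next run.
```

Alternatively, LangChain's memory classes accept a persistent chat-message store (e.g. a file- or database-backed history) via the chat_memory argument, which achieves the same effect without hand-rolling the JSON handling.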
