I have a chatbot UI built with React that invokes a Python script and displays the script's response via FastAPI. In the Python script I use GPT-3.5 to reformat the user's question into a query, send that query to an internal API, and return the API's response to the chatbot. I now want to add conversation history to the chatbot. How do I handle this on the React side and the Python/LLM side?
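For context, my current request flow looks roughly like this (the function names are placeholders, and the GPT-3.5 and internal API calls are stubbed out):

```python
# Sketch of the current flow: React posts a question to a FastAPI
# endpoint, GPT-3.5 rewrites it into a query, and the query is sent
# to an internal API. The LLM and HTTP calls are stubbed placeholders.

def format_question_to_query(question: str) -> str:
    # In the real script this calls GPT-3.5 (chat completion) with a
    # prompt that rewrites the user's question into an API query.
    return f"QUERY({question})"  # placeholder for the LLM call


def call_internal_api(query: str) -> str:
    # In the real script this is an HTTP request to the internal API.
    return f"RESULT for {query}"  # placeholder response


def handle_chat(question: str) -> str:
    # Body of the FastAPI endpoint the React UI calls. There is no
    # memory here yet -- each invocation sees only the new question.
    query = format_question_to_query(question)
    return call_internal_api(query)
```

Right now every call is stateless, which is the part I want to change.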
I tried LangChain's ConversationBufferWindowMemory and am expecting similar behavior in my chatbot: I want it to answer follow-up questions about the last 5 questions asked. How do I store each question and response, and add them to the context the next time the Python script is invoked?
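To be concrete about the behavior I'm after, here is a library-free sketch of the rolling 5-turn window I have in mind (the same idea as ConversationBufferWindowMemory with k=5); the class name and prompt format are just illustrations:

```python
from collections import deque


class WindowMemory:
    """Keeps the last k (question, answer) pairs and renders them
    as conversation context for the next LLM call."""

    def __init__(self, k: int = 5):
        # deque with maxlen drops the oldest turn automatically
        self.turns = deque(maxlen=k)

    def save(self, question: str, answer: str) -> None:
        self.turns.append((question, answer))

    def as_context(self) -> str:
        # This string would be prepended to the GPT-3.5 prompt on the
        # next invocation of the script.
        return "\n".join(f"Human: {q}\nAI: {a}" for q, a in self.turns)


memory = WindowMemory(k=5)
for i in range(7):  # after 7 turns, only the last 5 should remain
    memory.save(f"question {i}", f"answer {i}")
```

My open question is where this memory should live, since the Python script is invoked fresh per request: in the React client (sent with each request), in a server-side session store, or somewhere else?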