Enhancement: record full prompt in the feedback record #374
@azaylamba Hi Ajay, would you like to take a stab at this since you submitted the PR on user feedback?
Sure @massi-ang, I will look into this in a couple of days.
I think this one can be closed with the merge last week.
This issue is stale because it has been open for 60 days with no activity.
This issue was closed because it has been inactive for 30 days since being marked as stale.
The user feedback feature introduced in #287 records the user feedback with the following format:
Note that the `prompt` field contains only the user question, not the full prompt sent to the LLM, which consists of a prompt template formatted with the chat history and the user question. By recording only the user question, the data will not be useful for fine-tuning the model.
We want to record the full prompt, and possibly also the prompt template and the template argument values.
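A minimal sketch of what this could look like, in Python. The field names (`full_prompt`, `prompt_template`, `template_args`) and the template text are hypothetical illustrations, not the project's actual schema; the idea is simply to capture the formatted prompt alongside the template and its argument values at the point where feedback is recorded:

```python
# Hypothetical sketch: assemble the full prompt from a template and
# record it alongside the template and the argument values used.

PROMPT_TEMPLATE = (
    "The following is a conversation between a human and an AI.\n"
    "{chat_history}\n"
    "Human: {question}\n"
    "AI:"
)

def build_feedback_record(question, chat_history, answer, feedback):
    """Build a feedback record that keeps the full prompt, not just the question."""
    template_args = {
        "chat_history": "\n".join(chat_history),
        "question": question,
    }
    full_prompt = PROMPT_TEMPLATE.format(**template_args)
    return {
        "prompt": question,                  # existing field: user question only
        "full_prompt": full_prompt,          # proposed: exact text sent to the LLM
        "prompt_template": PROMPT_TEMPLATE,  # proposed: template used
        "template_args": template_args,      # proposed: values substituted in
        "answer": answer,
        "feedback": feedback,
    }
```

With a record shaped like this, the `full_prompt`/`answer` pairs could be exported directly as fine-tuning examples, while `prompt_template` and `template_args` make it possible to re-render prompts if the template changes later.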