➡️ Sometimes it happens that instead of answering the question the AI will reply with something like this:
Would you like me to find and list the one clearly best-rated garage door installer in Tampere, or would you prefer a short top-3 list (company + rating + source and contact information)? I'll tell you the search criteria (for example, Google/Fonecta/Urakkamaailma ratings) before searching, if you'd like.
This is probably just a prompt issue; maybe the prompt should embed explicit instructions to answer directly and not ask follow-up questions?
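A minimal sketch of what that could look like, assuming the app builds a chat-style messages array before calling the model. The directive text, function name, and message structure here are assumptions for illustration, not the project's actual prompt:

```python
# Hypothetical sketch: append a directive to the existing system prompt so the
# model answers directly instead of asking clarifying follow-up questions.
# DIRECT_ANSWER_DIRECTIVE and build_messages are made-up names, not real code
# from this project.

DIRECT_ANSWER_DIRECTIVE = (
    "Answer the user's question directly. Do not ask follow-up or "
    "clarifying questions; if details are missing, state your assumptions "
    "and answer anyway."
)

def build_messages(system_prompt: str, user_query: str) -> list[dict]:
    """Combine the existing system prompt with the no-follow-up directive."""
    return [
        {"role": "system",
         "content": f"{system_prompt}\n\n{DIRECT_ANSWER_DIRECTIVE}"},
        {"role": "user", "content": user_query},
    ]

messages = build_messages(
    "You are a local-services assistant.",
    "Find the best-rated garage door installer in Tampere.",
)
```

Whether this fully suppresses the behavior would still need testing per provider, since Gemini and ChatGPT may weight such instructions differently.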
➡️ What causes this outcome? Shouldn't this be the raw output even for non-detections? It happens mostly with Gemini and ChatGPT, and also with Perplexity, though much more rarely.

➡️ There are still plenty of prompts that mix English and the profile language.
Or are the prompt templates in English, and we only see the translations in the UI?
In Review
🐛 Bug Reports
About 2 hours ago

Severi