I was wondering whether we should be more specific in the response to tell the
Same here
However, if I’m precise and ask using the exact name of my sensor, it’s fine
BUT if I open the window and ask the question again, I get the same answer
AND WORSE, if I ask whether the window is open
I renamed my sensors to "window" and "door" and got the same result (and I'm hot, so everything is open).
@pierre-gilles any idea how to solve this issue with ChatGPT?
It’s not a bug, it’s a missing feature
The feature was never developed, and the answers you get are just ChatGPT "hallucinations": it simply generates text to answer your question without any factual basis.
Can you add a "context prompt" before our requests, indicating that it’s possible not to have the answer?
I read that this allows getting responses in an "I don’t know" mode rather than forcing an answer at all costs, even if it’s off the mark.
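As a sketch, the context-prompt idea could look like the snippet below. The prompt wording and the `build_messages` helper are hypothetical illustrations, not actual Gladys code; the message format follows the standard chat-completion convention of a `system` message followed by a `user` message.

```python
# Hypothetical sketch: prepend a "context prompt" (system message) that
# explicitly allows the model to say it does not know, instead of
# hallucinating an answer about sensor state.
SYSTEM_PROMPT = (
    "You are a home-automation assistant. Answer only from the sensor "
    "data provided. If the data does not contain the answer, reply "
    '"I don\'t know" instead of guessing.'
)

def build_messages(user_question: str, sensor_data: str) -> list[dict]:
    """Build the chat messages, with the context prompt first."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {
            "role": "user",
            "content": f"Sensor data: {sensor_data}\n\nQuestion: {user_question}",
        },
    ]

messages = build_messages("Is the window open?", "window: no state reported")
print(messages[0]["role"])  # system
```

This list would then be passed as the `messages` parameter of a chat-completion request, so every question carries the "it is acceptable not to know" instruction.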
Indeed, I’ll try adding that, though it doesn’t always work 100%
@lmilcent Strange, I tried to reproduce what you told me with the version of ChatGPT currently in production, and I do get the expected behavior:
I’ve tried several times and I always get the same response…
That’s the magic of AI, after all