Sven 🛌<p>When I was sick recently, I noted down a question I wanted to research later and then realised it might be an interesting test case for LLM search.</p><p>And sure enough, the answer went from "drug might be harmful" to "drug might be helpful" after just a small addition to the question.</p><p>I can easily think of other examples, like "Is my country doing poorly?" vs. "Is my country doing poorly because of foreigners?", where the framing changes the whole trajectory of the response, which is then presented as *the* answer.</p><p>Obviously, classic search is prone to confirmation bias as well, but it kinda seems supercharged this way.</p><p><a href="https://mstdn.games/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://mstdn.games/tags/AISearch" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AISearch</span></a> <a href="https://mstdn.games/tags/Perplexity" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Perplexity</span></a> <a href="https://mstdn.games/tags/PerplexityAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PerplexityAI</span></a></p>