• 0 Posts
  • 2 Comments
Joined 2 years ago
Cake day: June 17th, 2023

  • The AI models can be hilariously bad even on their own terms.

    Yesterday I asked Gemini for a population figure (because I was too lazy to look it up myself). First I asked it:

    what is the combined population of place 1, place 2 and place 3

    It answered:

    Unfortunately, providing an absolutely precise, up-to-the-minute combined population figure is difficult. Population numbers are constantly changing.

    To find these figures, I would recommend checking the Office for National Statistics (ONS) website. They provide detailed population statistics for local authorities in England.

    On a whim, I asked it again as:

    combined population of place 1, place 2 and place 3

    And then it gave me the answer sweet as a nut.

    Apparently I was being too polite with it, I guess?


  • It’s a real issue. A strong use case for LLM search engines is producing summaries that combine lots of facts which would take some time to compile by searching the old-fashioned way. But if the result is only 90% accurate and 10% hallucinated bullshit, it becomes very difficult to pick the bullshit out from the truth.

    The other day I asked Copilot to provide an overview of a particular industrial sector in my area. It produced something that was 90% a concise, accurate, readable and informative briefing, and 10% complete nonsense. It hallucinated an industrial estate that doesn’t exist and a whole government programme that doesn’t exist, and it talked about a scheme that went defunct 20 years ago as if it were still current, etc. If I hadn’t already been very familiar with the subject, I might not have caught it. Anyone actually relying on that for useful work is in serious danger of making a complete tit of themselves.