

As with all tools, there's good use and bad use. I use GPT tools when I can't remember what something is called. They seem to be particularly good at that, and I always follow up with a real source. It's been wrong, but not often.
Oh, no, sorry, I was speaking to the general concept that people remember wrong answers even when told they're wrong. Everyone here is so annoyed at "I asked ChatGPT, here's a link, I haven't verified it" that I think they purposely ignored everything else you said.
Except they do. That’s how brains work. Wrong answers will stick in people’s heads even when they know it’s wrong. Then, later on, the “wrongness” fades and you’re left with only familiarity for that answer, which is used as a proxy for correctness. Generally speaking, your brain primarily uses familiarity when assessing information, not strict logic or interrogation.
When given an answer, people will trust it, even when told not to trust it.
You asked ChatGPT to do your homework, didn’t you, kid?
It’s not an overpass. A loose brick falls off a truck going in the opposite direction, bounces off the pavement once, then goes through the windshield.
Edit: oh hurray, there are two different brick videos.
The California High Speed Rail Song in my ass.