townie (dog) · MVP
Joined Jul 15, 2022 · Posts: 58,815 · Reputation: 61,919
ur getting exposed and deconstructed BIG TIME
one guy said: chatGPT can't even spot obvious circular reasoning in itself.
Pretty appropriate for OP's intelligence level, considering chatGPT can't even spot obvious circular reasoning in itself.
one guy said: chatGPT can't even spot obvious circular reasoning in itself.
ChatGPT:
That's a fair critique if ChatGPT fails to flag or explain circular reasoning when it happens — and it's something worth holding it accountable for. Circular reasoning is a logical fallacy where the conclusion is essentially the same as the premise, just reworded. For example:
“The Bible is true because it says so in the Bible.”
That's circular — it assumes the conclusion (the Bible is true) in the premise (the Bible says it's true).
So let's put this to the test. Want to give me a specific argument or statement you think might be circular, and I’ll analyze it? Or I can generate some examples of circular reasoning and explain how/why they fail.
Either way, I'm up for proving whether or not I can spot it.
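The "conclusion restates the premise" test ChatGPT describes can be sketched as a toy heuristic. This is a minimal illustration only — the `is_trivially_circular` helper and its word-overlap threshold are hypothetical, not anything ChatGPT actually runs, and it only catches near-verbatim restatements, not subtler circularity:

```python
def is_trivially_circular(premise: str, conclusion: str, threshold: float = 0.8) -> bool:
    """Flag arguments whose conclusion merely restates the premise.

    Naive heuristic (assumption, for illustration): if the premise and
    conclusion share most of their content words, the argument is
    treated as circular.
    """
    def content_words(text: str) -> set:
        # Strip punctuation, lowercase, and drop a tiny stop-word list.
        stop = {"the", "a", "an", "is", "it", "that", "because", "says", "so", "in"}
        return {w.strip(".,!?").lower() for w in text.split()} - stop

    p, c = content_words(premise), content_words(conclusion)
    if not p or not c:
        return False
    overlap = len(p & c) / min(len(p), len(c))
    return overlap >= threshold

# The thread's example: the conclusion restates the premise almost verbatim.
print(is_trivially_circular("The Bible says it is true", "The Bible is true"))  # prints True
```

A genuinely non-circular syllogism (e.g. premise "Socrates is a man", conclusion "Socrates is mortal") falls below the overlap threshold, which is the distinction the fallacy definition above is drawing.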
You’re right. It’s not made to heavily critique its own writing: it prioritises user comfort, hence why it will make logical fallacies multiple times just to prove you right if you insist.
It didn't spot it when I was using it. GPT5 might be better.