to all the wannabes writing word salads: i check ur posts for logical fallacies with GPT

_MVP_

ur getting exposed and deconstructed BIG TIME
 
Pretty appropriate for OP's intelligence level, considering ChatGPT can't even spot obvious circular reasoning in itself.
 
> I use this AI to check if you used logical fallacies

[reaction GIF]
 
> Pretty appropriate for OP's intelligence level, considering ChatGPT can't even spot obvious circular reasoning in itself.
one guy said: chatGPT can't even spot obvious circular reasoning in itself.

ChatGPT:

That's a fair critique if ChatGPT fails to flag or explain circular reasoning when it happens — and it's something worth holding it accountable for. Circular reasoning is a logical fallacy where the conclusion is essentially the same as the premise, just reworded. For example:
“The Bible is true because it says so in the Bible.”
That's circular — it assumes the conclusion (the Bible is true) in the premise (the Bible says it's true).
So let's put this to the test. Want to give me a specific argument or statement you think might be circular, and I’ll analyze it? Or I can generate some examples of circular reasoning and explain how/why they fail.
Either way, I'm up for proving whether or not I can spot it.
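
For anyone who actually wants to "check posts for fallacies with GPT", here is a minimal sketch of what that could look like with the OpenAI Python client. The model name (gpt-4o), the system prompt, and the check_fallacies helper are illustrative assumptions, not OP's actual setup:

```python
# Minimal sketch: ask an OpenAI model to flag logical fallacies in a post.
# Assumptions: the `openai` package (>=1.0) is installed, OPENAI_API_KEY is
# set in the environment, and "gpt-4o" stands in for whatever model OP used.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def check_fallacies(post: str) -> str:
    """Return the model's fallacy analysis of a forum post."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a strict logic checker. List every logical "
                    "fallacy in the user's text (e.g. circular reasoning, "
                    "ad hominem), quote the offending sentence, and explain "
                    "why it is fallacious. If there are none, say so."
                ),
            },
            {"role": "user", "content": post},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(check_fallacies("The Bible is true because it says so in the Bible."))
```

Run against the Bible example above, a model that can spot circular reasoning should flag that the premise and the conclusion are the same claim; as the replies below note, don't expect it to be anywhere near as strict about its own output.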
 
> Either way, I'm up for proving whether or not I can spot it.

It didn't spot it when I was using it. GPT-5 might be better.
 
> It didn't spot it when I was using it. GPT-5 might be better.
You're right. It's not made to heavily critique its own writing: it prioritises user comfort, which is why it will commit logical fallacies over and over just to prove you right if you insist.
 
> You're right. It's not made to heavily critique its own writing: it prioritises user comfort, which is why it will commit logical fallacies over and over just to prove you right if you insist.

There is no chance of me finding the chat now. It wasn't agreeing with me.
 

