Augustus
…
Yesterday I noticed for the first time a poster quoting ChatGPT to provide an explanation of a term, and several posters responding favourably to the answer.
When I've played around with ChatGPT I've found it to be very unreliable on anything remotely subjective.
The bias tends to be towards the mainstream view on any topic, presumably because it "learns" from what is popular within its datasets. On issues where the mainstream view is wrong or ideologically biased, its answers clearly reflect this and contain clear errors that you can get it to acknowledge with further prompts.
For example, I was asking it about the origins of the Sunni-Shia split, and its answer relied heavily on a sectarian Sunni perspective. With several follow-up questions you can get it to acknowledge that its original answer was anachronistic and that Sunnis didn't really exist for another few centuries.
If you ask it whether its answers are biased it will deny this, but with further prompts you can get it to accept that its answers may indeed be biased towards a mainstream Sunni perspective.
As its use will only increase in the future, how much do you trust ChatGPT and how much should we trust it?
How can it be used to become better informed without becoming increasingly misinformed too?
In the short term, do you think tools like this will help society to be better informed on average, or will they have a negative effect, or no effect?
What are your observations about ChatGPT?