FWIW, I found the way you relayed GPT's errors to be confusing in itself. Saying "GPT got these wrong" and then listing the statement with "True" and an explanation next to it could mean either that "True" was GPT's erroneous response or that it's the correct answer being revealed. Your lack of clarity contributes to the lack of clarity around such questions =)
Fixed! Thanks for the feedback.
Have you managed to give it something akin to an IQ test?
One author on Substack gave it the 13-question fluid intelligence test given to UK Biobank participants, and it scored 8, which is 2 points higher than the average for that group.