Getting The Questions Wrong

I’ve been reading A City on Mars by Kelly and Zach Weinersmith, and in part one’s section on “spacefarer psychology” there appears a story that regular readers will immediately know why I’m mentioning here.

That was before he told her she’d gotten a number of questions “wrong.” “Tell me about your hallucinations,” he said. When Collins expressed confusion, he told her that for the question “When I walk down the street, I see things other people don’t see,” she’d written “True.” Surprised, she tried to clarify: “When I am with my husband, for example, I’ll notice flowers, landscaping, and people’s clothing, while he has no clue they’re there. He looks at other things.” The psychiatrist replied, “Well, that question is designed to find out if you hallucinate.”

This is exactly what I mean when I say that psychological questionnaires need a top to bottom review.

People in psychological and medical research disciplines write these questionnaires from within a very blinkered perspective, and apparently either don’t realize or don’t care that some people will read them as asking something wildly different from what those in the discipline think is being asked. This renders the questions worse than useless: they actively skew the data.

The question didn’t ask if she saw things that weren’t there but if she saw things other people didn’t. For astronaut Eileen Collins, the question was about noticing.

At least in this case there was in-person follow-up to the psychological questionnaire. In most autism research studies, there is only the questionnaire. Barring post-survey outreach by the participant, a question such as this one would lead the researchers to mark the participant down as having had hallucinations, when the participant simply meant they noticed things that escaped other people’s attention.

Data: skewed.

That’s bad.

That’s really bad.

I’ve talked about this sort of thing a lot here over the years. My go-to example is a question meant to detect a lack of empathy, when the way I want to answer it is informed instead by my routinely apologizing to furniture when I bump into it, or imploring my excretory system to please hang on until I get home.

When they ask me to confirm or deny that “I tend to think about people in the same way that I do objects”, the correct answer is that yes, I do—but that’s because for some autistic reason I treat objects as people. What they will take from the confirmation, however, is that I lack empathy and “theory of mind”. If anything, my empathy cup runneth over into anthropomorphism.

Data: skewed.

None of this even touches the survey questions that are various versions of asking about the ways you behave that “others” find strange, questions which never explicitly state that by “others” they mean “normal people”. What if all your friends and acquaintances are as neurodivergent as you are, and don’t find you “strange” at all? You might answer “no” when what the researchers are actually trying to find out would require you to answer “yes”.

Data: skewed.

The entire discipline of writing psychological or medical research surveys needs dramatic and wholesale reform, but I can’t imagine where you’d even begin with such a drastic overhaul.

Until it happens, however, psychological and medical research will remain absolutely rife with just plain bad data about people. Which means various psychological and medical disciplines are rife with completely incorrect opinions about the people they are empowered to treat.

Collins was told she got the questions wrong, but the wrongness of the questions comes from the other side.