Terr_ 4 days ago

> information like low-income status, a sibling with alpha-thalassemia, or the use of herbal remedies

Heck, even the ethnic-clues in a patient's name alone [0] are deeply problematic:

> Asking ChatGPT-4 for advice on how much one should pay for a used bicycle being sold by someone named Jamal Washington, for example, will yield a different—far lower—dollar amount than the same request using a seller’s name, like Logan Becker, that would widely be seen as belonging to a white man.

This extends to other things, like what the LLM's fictional character will respond with when asked what sentences people deserve for crimes.

[0] https://hai.stanford.edu/news/why-large-language-models-chat...

belorn 4 days ago

That seems identical to building a correlation table from marketplace listings and checking the relationship between price and name. Names associated with higher economic status will correlate with higher prices. Take a random name associated with higher economic status, and one can predict a higher price than for a name associated with lower economic status.

As such, you don't need an LLM to create this effect; plain statistics will produce the same result.
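The point above can be sketched in a few lines. This is a toy simulation, not real marketplace data: the name groups, prices, and the assumption that price is driven by a latent socioeconomic variable are all invented for illustration.

```python
import random

# Toy model: price depends only on a latent socioeconomic status (SES),
# and the name group merely correlates with that status. All numbers
# here are made up for the sketch.
random.seed(0)

listings = []
for _ in range(10_000):
    # Name group "A" skews toward higher SES in this toy model.
    group = random.choice(["A", "B"])
    ses = random.gauss(1.0 if group == "A" else 0.0, 0.5)
    price = 200 + 80 * ses + random.gauss(0, 20)  # price driven by SES only
    listings.append((group, price))

def mean_price(group):
    prices = [p for g, p in listings if g == group]
    return sum(prices) / len(prices)

print(f"mean price, group A: {mean_price('A'):.2f}")
print(f"mean price, group B: {mean_price('B'):.2f}")
```

Even though the name never enters the price formula, averaging by name group shows a clear price gap, because the name is a proxy for the variable that does set the price.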

Terr_ 4 days ago

I'm not sure what point you're trying to make here. It doesn't matter what after-the-fact explanation someone generates for it, or whether we could purposely do the bad thing more efficiently with manual code.

If AustrianPainterLLM has an unavoidable pattern of generating stories where people are systematically misdiagnosed / shortchanged / fired / murdered because a name is Anne Frank or because a yarmulke is involved, it's totally unacceptable to implement software that might "execute" those risky stories.

belorn 4 days ago

When looking for meaning in correlations, it's important to understand that a correlation does not mean there ought to be a correlation, nor that correlation implies causation. It only means that one can calculate a correlation.

Looking for a correlation between a seller's name and used-bike prices will only return a proxy for socioeconomic status. If one accounts for socioeconomic status, the difference will go away. This means the question given to the LLM lacks any substance from which a meaningful output can be created.
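The "accounts for socioeconomic status" step can be sketched by stratifying a toy dataset. Everything here is invented for illustration (the SES strata, the probability that a name group appears in each stratum, the price formula); the point is only that the raw name/price gap shrinks to noise once comparisons are made within the same stratum.

```python
import random

# Toy model: price depends only on a discrete SES stratum; the name
# group is more common at higher strata but has no direct price effect.
random.seed(1)

listings = []
for _ in range(50_000):
    ses = random.choice([0, 1, 2])                              # status stratum
    group = "A" if random.random() < 0.2 + 0.3 * ses else "B"   # name correlates with SES
    price = 150 + 60 * ses + random.gauss(0, 10)                # price set by SES only
    listings.append((group, ses, price))

def mean_price(group, ses=None):
    prices = [p for g, s, p in listings
              if g == group and (ses is None or s == ses)]
    return sum(prices) / len(prices)

raw_gap = mean_price("A") - mean_price("B")
within = [mean_price("A", s) - mean_price("B", s) for s in (0, 1, 2)]
print(f"raw gap between name groups: {raw_gap:.1f}")
print(f"gaps within each SES stratum: {[round(g, 1) for g in within]}")
```

The raw gap is large, but the within-stratum gaps are close to zero: the name carried no information beyond its correlation with status.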