
Do Not Shred Your Fingers In An Actual Blender

I recently gave some bad advice in this essay:

Luckily, LLMs significantly reduce the effort/cost of therapy experiments. Consider trying the following prompt:

Please guide me through a round of ERP therapy. Start by listing universal sources of fear/discomfort/anxiety.

If you find this process useful, consider trying it with a licensed human professional.

I think this advice is dangerous if taken too seriously/literally, which is why I removed it.

This is how that passage sounded in my head:

That's how I felt, but that's not what I wrote.

To chat with Claude is to play Human Simulator 2000. It's a bag of words. It is neither friend, nor coworker, nor foe, nor therapist.

Yes, sometimes LLMs can simulate humans. Yes, sometimes those simulations can be useful. But be wary of a simulation if you can't verify its accuracy/efficacy. When you cannot yet distinguish fact from fiction, relying on a fiction pump seems unwise.

Do not shred your fingers in an actual blender.