Hear me out: A cool shop or company should donate some pittance of every purchase to a good trans focused non profit and then have the signage “Make every transaction a trans action”
LLMs are trained to do one thing: produce statistically likely sequences of tokens given a context. Feeding them obvious garbage won't do much to poison the well, because we already have models that can clean that up.
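To make "statistically likely sequences of tokens given a context" concrete, here's a minimal sketch using a hand-made bigram table as a stand-in for the learned statistics. Real models condition on thousands of tokens through a neural network; the table, the `generate` helper, and all the probabilities below are hypothetical, purely for illustration.

```python
import random

# Toy "model": P(next token | previous token), hand-written for illustration.
BIGRAM_PROBS = {
    "the": {"cat": 0.5, "dog": 0.3, "well": 0.2},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.4, "ran": 0.6},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(context, max_tokens=5, seed=0):
    """Extend `context` by repeatedly sampling a statistically likely next token."""
    rng = random.Random(seed)
    tokens = context.split()
    for _ in range(max_tokens):
        dist = BIGRAM_PROBS.get(tokens[-1])
        if dist is None:  # no known continuation: stop generating
            break
        words = list(dist)
        tokens.append(rng.choices(words, weights=[dist[w] for w in words])[0])
    return " ".join(tokens)

print(generate("the"))
```

The point of the sketch: there's no notion of truth anywhere in the loop, only of likelihood given context, which is why plausible-but-false continuations come so naturally.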
Far more damaging is the proliferation and repetition of false facts that look genuine on the surface.
Consider the kinds of mistakes AI makes: it hallucinates plausible-sounding nonsense. That's exactly the kind of mistake you can lure an LLM into making more of.