Here's the bug: the smaller the amount of input text, the better the sentences it generates. (Above a certain threshold, that is: with too little input, you just get the input text back.)
DadaDodo works rather differently than Dissociated Press; whereas Dissociated Press (which, incidentally, refers to itself as a "travesty generator") simply grabs segments of the body of text and shuffles them, DadaDodo tries to work on a larger scale: it scans bodies of text, and builds a probability tree expressing how frequently word B tends to occur after word A, and various other statistics; then it generates sentences based on those probabilities. The theory here is that, with a large enough corpus, the generated sentences will tend to be grammatically correct, but semantically random: exterminate all rational thought.
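That probability tree is essentially a first-order Markov chain over words. As a rough illustration of the idea (not DadaDodo's actual implementation, which tracks more statistics than this, such as sentence boundaries), a minimal word-bigram sampler might look like:

    import random
    from collections import defaultdict

    def build_chain(text):
        """For each word, collect the words that followed it in the corpus.
        Repetition in the lists preserves the observed frequencies."""
        words = text.split()
        chain = defaultdict(list)
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
        return chain

    def generate(chain, length=20):
        """Walk the chain, picking each next word in proportion to how
        often it followed the current word in the input text."""
        word = random.choice(list(chain.keys()))
        out = [word]
        for _ in range(length - 1):
            followers = chain.get(word)
            if not followers:   # dead end: word only appeared at the very end
                break
            word = random.choice(followers)
            out.append(word)
        return " ".join(out)

    corpus = "the cat sat on the mat and the dog sat on the rug"
    print(generate(build_chain(corpus)))

With a tiny corpus like the one above, most walks just replay stretches of the input verbatim, which is exactly the threshold effect described earlier; a larger corpus gives each word more possible successors and the output drifts further from the source.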