If you want to skip to seeing the results, the generator is running at

The Model

For synthetic text generation, there are only a few machine learning techniques which have shown significant promise. As a one-line summary, the GPT-2 transformer model was trained to simply predict the next word in a text, given all previous words. If you're unfamiliar with GPT-2, you might want to click through that link and read their explanation of it before proceeding. There are numerous examples where it performs uncannily well. Most applications I have seen, however, generate unstructured text of fairly short lengths, no more than about a paragraph. D&D monster stat blocks, in contrast, are well-structured and can be fairly lengthy, with the longest taking up full pages in the rather large-paged books they come in.
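To make the "predict the next word given all previous words" idea concrete, here is a minimal sketch of the autoregressive generation loop. This is not GPT-2 (which uses a transformer trained on web text); it substitutes a toy bigram model built from a few words, purely to show the loop structure that both share. All names and the sample corpus below are hypothetical.

```python
# Toy illustration of autoregressive generation: repeatedly predict the
# next word from what has been generated so far, append it, and repeat.
# GPT-2 does the same loop, but its "model" is a large transformer
# conditioned on the full preceding context, not a bigram table.
from collections import Counter, defaultdict

def build_bigram_model(text):
    """Count which word follows which in the training text."""
    words = text.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def generate(model, start, n_steps):
    """Greedy autoregressive loop: at each step, emit the most likely
    next word given the last word generated."""
    out = [start]
    for _ in range(n_steps):
        followers = model.get(out[-1])
        if not followers:
            break  # no known continuation
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

corpus = "the goblin attacks the knight and the knight attacks the goblin"
model = build_bigram_model(corpus)
print(generate(model, "the", 4))
```

The real model differs mainly in how the next-word distribution is computed and in sampling from it rather than always taking the argmax, but the outer loop is the same.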