CharRNN text generator
From Algolit
Type: Algolit exploration
Dataset(s): Complete Works by Shakespeare, Complete Works by Jules Verne
Technique: next-character prediction with a recurrent neural network (LSTM)
Developed by: Google TensorFlow
The CharRNN text generator produces text using the CharRNN model. This is a recurrent neural network that reads a text character by character. In the training phase the model analyzes which characters follow each other and learns the chances of the next character based on the characters it has seen before. The model has a memory that varies in size: because the network is constructed out of Long Short-Term Memory (LSTM) modules, it can forget certain information it has seen during the learning process.
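As an illustration of this training phase, here is a minimal sketch (not the Algolit code) of a character-level LSTM language model in TensorFlow/Keras. The file name, sequence length and layer sizes are illustrative assumptions.

# A minimal sketch of a character-level LSTM language model.
# The file name, sequence length and layer sizes are assumptions, not the Algolit setup.
import numpy as np
import tensorflow as tf

text = open("shakespeare.txt", encoding="utf-8").read()    # hypothetical training file
chars = sorted(set(text))                                   # the character vocabulary
char_to_id = {c: i for i, c in enumerate(chars)}
ids = np.array([char_to_id[c] for c in text])

seq_len = 100
# Each training example is a sequence of characters; the target is the same
# sequence shifted by one, so the model learns the chances of the next character.
starts = range(0, len(ids) - seq_len - 1, seq_len)
inputs = np.stack([ids[i:i + seq_len] for i in starts])
targets = np.stack([ids[i + 1:i + seq_len + 1] for i in starts])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 64),
    tf.keras.layers.LSTM(256, return_sequences=True),   # the LSTM "memory"
    tf.keras.layers.Dense(len(chars)),                   # one logit per possible next character
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(inputs, targets, epochs=10)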
One of the first things the model learns is that words are separated by spaces and that sentences are separated by a period, a space and an uppercase letter. Although it might seem the model has learned that a text is constructed out of multiple words and sentences, it has actually learned that after a certain number of characters the chances are high a space will occur, and that after a series of characters and spaces the chances grow there will be a period, a space and an uppercase character.
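To make these "chances" concrete, the sketch below shows how one could inspect the probabilities a trained model assigns to possible next characters after a seed string. It assumes the model and vocabulary from the training sketch above; the seed text is only an example.

# Assumes the `model`, `chars` and `char_to_id` from the training sketch above.
import numpy as np
import tensorflow as tf

seed = "To be or not to b"                         # illustrative seed text
seed_ids = np.array([[char_to_id[c] for c in seed]])
logits = model(seed_ids)[0, -1]                    # logits for the character after the seed
probs = tf.nn.softmax(logits).numpy()
print("P(' ') =", probs[char_to_id[" "]])          # chance of a space
print("P('.') =", probs[char_to_id["."]])          # chance of a period
print("P('e') =", probs[char_to_id["e"]])          # chance of the letter 'e'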
The generator interface is trained on various datasets and can be tried out. The model is based on a script by Karpathy: https://github.com/karpathy/char-rnn/blob/master/Readme.md
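To show how such a generator produces text character by character, here is a minimal sampling sketch. It is not Karpathy's script and reuses the model and vocabulary from the training sketch above; the seed and the temperature parameter are illustrative assumptions.

# A sampling loop: assumes the `model`, `chars` and `char_to_id` from the training sketch above.
import numpy as np
import tensorflow as tf

def generate(model, seed, length=400, temperature=0.8):
    ids = [char_to_id[c] for c in seed]
    out = list(seed)
    for _ in range(length):
        # Predict the distribution over the next character and sample from it.
        logits = model(np.array([ids]))[0, -1] / temperature
        next_id = int(tf.random.categorical(logits[None, :], num_samples=1)[0, 0])
        out.append(chars[next_id])
        ids.append(next_id)
    return "".join(out)

print(generate(model, seed="ROMEO: "))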