Algoliterary Encounters
From Algolit
Revision as of 16:05, 25 October 2017
Start of the Algoliterary Encounters catalog.
Introduction
Algoliterary works
- Oulipo recipes
- i-could-have-written-that
- Obama, model for a politician
- In the company of CluebotNG
Algoliterary explorations
What the Machine Writes: a closer look at the output
- CHARNN text generator
- You shall know a word by the company it keeps - Five word2vec graphs, each of them containing the words 'collective', 'being' and 'social'.
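The graphs described above belong to the exhibition; purely as an illustration of the underlying technique, the nearest neighbours of those three shared words could be listed with a few lines of Python. This is a minimal sketch, assuming the gensim library (version 4) and a placeholder corpus file corpus.txt, not the code behind the exhibited graphs:
 # Train a small word2vec model and print the nearest neighbours
 # of the three shared words. Illustrative sketch only; the exhibited
 # graphs were made with their own corpus and settings.
 from gensim.models import Word2Vec

 sentences = [line.split() for line in open('corpus.txt')]
 model = Word2Vec(sentences, vector_size=50, window=5, min_count=2)

 for word in ['collective', 'being', 'social']:
     print(word, model.wv.most_similar(word, topn=5))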
How the Machine Reads: Dissecting Neural Networks
Datasets
- Many many words - an introduction to the datasets, with a calculation exercise
- The data (e)speaks - an espeak installation
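As a minimal sketch of what such an installation can do, assuming the espeak command-line tool is installed and a placeholder file dataset.txt, each line of a dataset can be read aloud:
 # Pipe each line of a dataset to the espeak speech synthesizer.
 import subprocess

 with open('dataset.txt') as f:
     for line in f:
         subprocess.run(['espeak', line.strip()])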
From words to numbers
- A Bag of Words
- A One Hot Vector
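A minimal sketch, in plain Python, of the two representations listed above; the tiny vocabulary is invented for illustration only:
 # Turn words into numbers: a one-hot vector marks a single word,
 # a bag-of-words vector counts the words of a sentence.
 vocabulary = ['a', 'bag', 'of', 'words', 'one', 'hot', 'vector']

 def one_hot(word):
     # A vector with a single 1 at the word's position in the vocabulary.
     return [1 if w == word else 0 for w in vocabulary]

 def bag_of_words(sentence):
     # A vector of word counts, one position per vocabulary entry.
     return [sentence.split().count(w) for w in vocabulary]

 print(one_hot('bag'))                           # [0, 1, 0, 0, 0, 0, 0]
 print(bag_of_words('a bag of words of words'))  # [1, 1, 2, 2, 0, 0, 0]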
Special Focus: Word Embeddings
- word embeddings
- Crowd Embeddings - case studies, still in need of fine-tuning
Different portraits of word embeddings
- Word embedding Projector
- 5 dimensions 32 graphs
- The GloVe Reader
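A rough sketch of how such portraits can be drawn: load GloVe-style vectors from a text file and project them to two dimensions. This assumes numpy and scikit-learn and a placeholder file glove.txt; it is not the code behind the works listed above.
 # Each line of a GloVe file reads: word value value value ...
 import numpy as np
 from sklearn.decomposition import PCA

 words, vectors = [], []
 with open('glove.txt') as f:
     for line in f:
         parts = line.split()
         words.append(parts[0])
         vectors.append([float(x) for x in parts[1:]])

 # Reduce the high-dimensional vectors to 2D coordinates for plotting.
 coords = PCA(n_components=2).fit_transform(np.array(vectors))
 for word, (x, y) in zip(words[:10], coords[:10]):
     print(word, round(x, 3), round(y, 3))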
Inspecting the technique
- word2vec_basic.py - in piles of paper
- softmax annotated - a short numpy sketch follows this list
- Reverse Algebra
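A minimal annotated sketch of the softmax function, using numpy; this is an illustration, not the annotated version exhibited above.
 import numpy as np

 def softmax(scores):
     # Subtract the maximum score for numerical stability; softmax is
     # shift-invariant, so the result does not change.
     exps = np.exp(scores - np.max(scores))
     # Divide by the sum so the outputs are positive and add up to 1:
     # the raw scores become a probability distribution.
     return exps / np.sum(exps)

 print(softmax(np.array([2.0, 1.0, 0.1])))  # approx. [0.659, 0.242, 0.099]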
How a Machine Might Speak
Sources
Bibliography
- Algoliterary Bibliography - Reading Room texts