Algoliterary Encounters: Difference between revisions
From Algolit
(→What the Machine Writes: a closer look at the output)
Line 17:
=== What the Machine Writes: a closer look at the output ===
* [[CHARNN text generator]]
− * [[You shall know a word by the company it keeps]]
+ * [[You shall know a word by the company it keeps]]
=== How the Machine Reads: Dissecting Neural Networks ===
Revision as of 16:07, 25 October 2017
Start of the Algoliterary Encounters catalog.
Introduction
Algoliterary works
- Oulipo recipes
- i-could-have-written-that
- Obama, model for a politician
- In the company of CluebotNG
Algoliterary explorations
What the Machine Writes: a closer look at the output
How the Machine Reads: Dissecting Neural Networks
Datasets
- Many many words - introduction to the datasets, with a calculation exercise
- The data (e)speaks - espeak installation
From words to numbers
Special Focus: Word Embeddings
- word embeddings
- Crowd Embeddings - case studies, still needs fine-tuning
Different portraits of word embeddings
Inspecting the technique
- word2vec_basic.py - in piles of paper
- softmax annotated
- Reverse Algebra
How a Machine Might Speak
Sources
Code
Bibliography
- Algoliterary Bibliography - Reading Room texts