Algebra with Words

by Algolit

Word embeddings are language modelling techniques that, through multiple mathematical operations of counting and ordering, plot words into a multi-dimensional vector space. When words are embedded, they are transformed from distinct symbols into mathematical objects that can be multiplied, divided, added or subtracted.
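
To make this concrete, here is a minimal sketch in Python, assuming the numpy library; the two-dimensional vectors are invented for illustration, whereas real embeddings have hundreds of dimensions learned from a corpus.

  import numpy as np

  # Invented toy vectors: real word embeddings have hundreds of
  # dimensions and are learned from a corpus, not written by hand.
  king  = np.array([0.8, 0.3])
  man   = np.array([0.6, 0.1])
  woman = np.array([0.7, 0.9])

  # Once words are vectors, they behave like any other mathematical
  # objects: they can be added, subtracted, or scaled.
  result = king - man + woman
  print(result)  # [0.9 1.1] -- a new point in the vector space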

Once the words are distributed along the many diagonal lines of the multi-dimensional vector space, their new geometrical placements become impossible for humans to perceive. What is gained, however, are multiple, simultaneous ways of ordering. Algebraic operations make the relations between the vectors graspable again: the classic example is that the vector for 'king' minus the vector for 'man' plus the vector for 'woman' lands close to the vector for 'queen'.

This installation uses Gensim, an open-source vector space and topic-modelling toolkit implemented in the programming language Python. It makes it possible to manipulate the text using the mathematical relationships that emerge between the words once they have been plotted in a vector space.
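
Below is a minimal sketch of what such manipulation can look like with Gensim's Word2Vec implementation. The tiny corpus and the parameter values are placeholder assumptions, not the installation's actual setup; vector_size is the parameter name in Gensim 4.x, while older releases call it size.

  from gensim.models import Word2Vec

  # Placeholder corpus: a real model would be trained on a large
  # collection of texts, tokenised into lists of words.
  sentences = [
      ['the', 'king', 'rules', 'the', 'land'],
      ['the', 'queen', 'rules', 'the', 'land'],
      ['a', 'man', 'walks'],
      ['a', 'woman', 'walks'],
  ]

  # Train a word2vec model; the parameter values here are
  # illustrative, not tuned.
  model = Word2Vec(sentences, vector_size=50, window=2,
                   min_count=1, epochs=100)

  # Algebra with words: add and subtract word vectors, then ask
  # which words lie closest to the resulting point.
  print(model.wv.most_similar(positive=['king', 'woman'],
                              negative=['man'], topn=3))

On a corpus this small the nearest neighbours are essentially noise; the analogy patterns, such as 'king' relating to 'queen' as 'man' relates to 'woman', only surface once the model has been trained on a substantial amount of text.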


Concept & interface: Cristina Cochior

Technique: word embeddings, word2vec

Original model: Radim Rehurek and Petr Sojka