I’ve been spending the weekend experimenting with vector space modelling and poetic language. Vector space word embedding models use learning algorithms on very large corpora in order to map a unique location in n-dimensional space to each token (= word) in the corpus. “N-dimensional space” is just a mathy-sounding way of saying that multiple (or n) features […]
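To make the idea concrete, here is a minimal sketch of training a word2vec-style embedding and inspecting where tokens land in the learned space. The library choice (gensim), the toy corpus, and the parameter values are all my assumptions for illustration, not anything specified in the post.

```python
# Minimal sketch (assumptions, not the post's actual setup): train a small
# word2vec model and look at the vector assigned to a token.
from gensim.models import Word2Vec

# Hypothetical toy corpus; a real experiment would use a much larger one.
corpus = [
    ["the", "rose", "is", "red"],
    ["the", "violet", "is", "blue"],
    ["poetry", "maps", "words", "to", "feelings"],
]

# vector_size is the "n" in n-dimensional space: each token gets a point
# with this many coordinates.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, seed=1)

# Each token now has a unique location in 50-dimensional space.
print(model.wv["rose"].shape)            # (50,)

# Tokens that occur in similar contexts end up near each other.
print(model.wv.most_similar("rose", topn=3))
```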