Abstract

Recent years have seen rapid advances in word and other embeddings: ELMo, BERT and its many variants, Flair, LASER, and more. Each of these addresses shortcomings of classic word2vec embeddings, such as handling homonymy and polysemy, multilingual data, and out-of-vocabulary words. During the course, we will discuss the why and how of several of these embeddings, compare them with other embedding types, and look at practical use cases in which they are useful.

Speaker’s Profile

Aleksandra is an NLP engineer with a background as a researcher in Theoretical Linguistics. As the head of NLP at Faktion, she leads all NLP-related projects, from automated document processing to semantic search. She is an advocate of making science and technology accessible to a broad audience and enjoys sharing her passion for language and communication, both with humans and with machines.