tldr: embeddings are numeric representations of text in an n-dimensional space that can be used to measure semantic proximity between pieces of text.
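For instance, here's a minimal sketch of what "semantic proximity" means in practice, using made-up 4-dimensional vectors (real embeddings come from a trained model such as word2vec or a sentence-transformer and typically have hundreds of dimensions):

```python
import numpy as np

# Toy 4-dimensional embeddings -- in reality these would be produced
# by an embedding model, not written by hand.
embeddings = {
    "cat": np.array([0.9, 0.1, 0.3, 0.0]),
    "dog": np.array([0.8, 0.2, 0.4, 0.1]),
    "car": np.array([0.1, 0.9, 0.0, 0.7]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Proximity of two vectors: 1.0 = same direction, 0.0 = orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # high: related concepts
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # lower: unrelated concepts
```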
For more details, check:
- Embeddings are underrated – a concise theoretical and practical approach to embeddings for discovering connections between texts.
- The Illustrated Word2vec – has lots of diagrams and visualizations of word embeddings, language model training, sliding word windows (skip-grams), and text generation.