Word2Vec is a foundational technique in natural language processing (NLP) that transforms words into continuous vector representations (embeddings), capturing their semantic and syntactic relationships: words that appear in similar contexts end up with similar vectors. Developed by a team led by Tomas Mikolov at Google in 2013, Word2Vec has significantly advanced the field.
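The key property described above (semantically related words have similar vectors) is usually measured with cosine similarity. The sketch below illustrates this with small made-up embedding vectors; real Word2Vec embeddings are learned from a corpus and typically have 100 to 300 dimensions, so the specific numbers here are purely illustrative assumptions.

```python
import numpy as np

# Toy 4-dimensional "embeddings" (invented values for illustration,
# NOT real Word2Vec output)
vectors = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.75, 0.70, 0.15, 0.22]),
    "apple": np.array([0.10, 0.05, 0.90, 0.85]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors; close to 1.0 means
    # the vectors point in nearly the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words score higher than unrelated ones
print(cosine_similarity(vectors["king"], vectors["queen"]))  # high
print(cosine_similarity(vectors["king"], vectors["apple"]))  # low
```

With real trained embeddings the same comparison supports well-known analogies such as vector("king") - vector("man") + vector("woman") landing near vector("queen").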