
Georgian NLP & Word Embeddings

Author: Giorgi Papakerashvili
Co-author: Anzor Gozalishvili
Keywords: NLP, AI, Neural Networks, Word Embeddings, Word Semantic Analysis
Annotation:

Word semantic analysis, representation in vector space, and comparison are frequent tasks in natural language processing. Our main goal is to develop a Georgian semantic language model. Specifically, we want to turn words into vectors such that semantically similar words lie close to each other in vector space. For example, given the vector representations of two words, if the Euclidean distance between them is small or their cosine similarity is high, the two words are semantically similar. Such a model will enable us to automatically find different parts of speech in sentences. For example, starting from a few pronouns such as "me", "we", and "our", we can find the nearest words by cosine similarity over the word vectors, as sketched below. In this way we can filter out each part of speech separately in raw Georgian texts.
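To make the comparison concrete, here is a minimal sketch in Python using NumPy. The `cosine_similarity` and `nearest_words` helpers are illustrative names, and the toy `embeddings` dictionary of random 50-dimensional vectors is a hypothetical stand-in for a model actually trained on a Georgian corpus.

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two word vectors; close to 1.0 for similar words."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def nearest_words(query: str, embeddings: dict, k: int = 5) -> list:
    """Return the k words whose vectors are most similar to the query word's vector."""
    q = embeddings[query]
    scored = [(w, cosine_similarity(q, vec))
              for w, vec in embeddings.items() if w != query]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Hypothetical toy embeddings; a real model would be trained on Georgian raw text.
rng = np.random.default_rng(0)
embeddings = {w: rng.standard_normal(50) for w in ["me", "we", "our", "tree", "house"]}

print(nearest_words("me", embeddings, k=3))
```

In practice, the same `nearest_words` lookup, seeded with a handful of known pronouns, would return candidate pronouns from the trained model, which is the part-of-speech filtering idea described above.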


