Found in Translation: Natural Language Understanding with Cross-Lingual Grounding
"Natural language understanding is the ""holy grail"" of computational linguistics and a long-term goal in research on artificial intelligence. Understanding human communication is difficult due to the various ambiguities in natur...
ver más
¿Tienes un proyecto y buscas un partner? Gracias a nuestro motor inteligente podemos recomendarte los mejores socios y ponerte en contacto con ellos. Te lo explicamos en este video
FoTran project information
Project duration: 71 months
Start date: 2018-04-20
End date: 2024-03-31
Project leader: HELSINGIN YLIOPISTO
TRL: 4-5
Project budget: 2M€
Participation deadline: none specified.
Project description
"Natural language understanding is the ""holy grail"" of computational linguistics and a long-term goal in research on artificial intelligence. Understanding human communication is difficult due to the various ambiguities in natural languages and the wide range of contextual dependencies required to resolve them. Discovering the semantics behind language input is necessary for proper interpretation in interactive tools, which requires an abstraction from language-specific forms to language-independent meaning representations. With this project, I propose a line of research that will focus on the development of novel data-driven models that can learn such meaning representations from indirect supervision provided by human translations covering a substantial proportion of the linguistic diversity in the world. A guiding principle is cross-lingual grounding, the effect of resolving ambiguities through translation. The beauty of that idea is the use of naturally occurring data instead of artificially created resources and costly manual annotations. The framework is based on deep learning and neural machine translation and my hypothesis is that training on increasing amounts of linguistically diverse data improves the abstractions found by the model. Eventually, this will lead to universal sentence-level meaning representations and we will test our ideas with multilingual machine translation and tasks that require semantic reasoning and inference."