İbrahim Ethem Deveci, Transformer Models for Translating Natural Language Sentences into Formal Logical Expressions
Translating natural language sentences into logical expressions has long been a challenging task because of contextual information and the varying complexity of sentences. In recent years, a new deep learning architecture, the Transformer, has provided ways to handle what was previously hard or seemed impossible in natural language processing. The Transformer architecture and the language models built on it have revolutionized artificial intelligence research and changed how natural language processing tasks are approached. In this thesis, we conduct experiments to evaluate whether Transformer models can successfully translate natural language sentences into first-order logic expressions.
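To illustrate the task only (this is not the method or code of the thesis), the following is a minimal sketch of how a sequence-to-sequence Transformer from the Hugging Face transformers library could map an English sentence to a first-order logic string. The checkpoint name "example-org/fol-t5" is a hypothetical model assumed to be fine-tuned on sentence/formula pairs.

    # Minimal sketch: natural-language-to-FOL translation with a seq2seq Transformer.
    # Assumption: "example-org/fol-t5" is a hypothetical checkpoint fine-tuned on
    # (sentence, first-order-logic) pairs; it is not part of the announced thesis.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    MODEL_NAME = "example-org/fol-t5"  # hypothetical fine-tuned checkpoint

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

    def translate_to_fol(sentence: str) -> str:
        """Translate one English sentence into a first-order logic expression."""
        inputs = tokenizer(sentence, return_tensors="pt")
        output_ids = model.generate(**inputs, max_new_tokens=64)
        return tokenizer.decode(output_ids[0], skip_special_tokens=True)

    print(translate_to_fol("All humans are mortal."))
    # Expected style of output: forall x ( Human(x) -> Mortal(x) )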
Date: 23.01.2024 / 11:00 Place: B-116