İbrahim Ethem Deveci, Transformer Models for Translating Natural Language Sentences into Formal Logical Expressions

M.S. Candidate: İbrahim Ethem Deveci
Program: Cognitive Science
Date: 17.04.2024 / 09:30
Place: B-116

Abstract: Translating natural language sentences into logical expressions has been challenging due to the contextual information sentences carry and their varying structural complexity. The task is not straightforward to handle with rule-based and statistical methods. In recent years, a new deep learning architecture, namely the Transformer architecture, has provided new ways to handle natural language processing tasks that were previously hard or seemed impossible. The Transformer architecture and the language models built on it have revolutionized artificial intelligence research and changed how we approach natural language processing tasks. In this thesis, we conduct experiments to see whether Transformer models can successfully translate sentences into first-order logic expressions. We evaluate our model on whether it captures the formal aspects of the expressions, generates well-formed expressions, and generalizes to unseen sentences.
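
The following is a minimal sketch, not the thesis's actual model, data, or training setup, of how a sequence-to-sequence Transformer could be used to frame sentence-to-logic translation as text generation. The checkpoint name, prompt format, example sentence, and target notation are all assumptions for illustration; a pretrained model would first need fine-tuning on sentence / first-order-logic pairs before producing such output.

# Illustrative sketch only: maps a sentence to a first-order logic string
# with a generic seq2seq Transformer (T5 used here as a stand-in model).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

sentence = "Every student reads a book."
# Frame the translation as text-to-text generation, as T5-style models expect.
inputs = tokenizer("translate to first-order logic: " + sentence,
                   return_tensors="pt")

# Generate the target logical form token by token; beam search gives more
# stable decodings than greedy generation.
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

# After fine-tuning on sentence/FOL pairs, a well-formed target for the
# example sentence might look like:
#   forall x (student(x) -> exists y (book(y) & reads(x, y)))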