Knowledge Modeling and Reasoning

Theme leads

Jian Tang & Amal Zouaq

Researchers involved

Yacine Benahmed, Jackie Cheung, Samira Ebrahimi-Kahou, Richard Khoury, Philippe Langlais, Bang Liu, Jian-Yun Nie, Reihaneh Rabbany, Siva Reddy




  • Knowledge modeling

  • Semantic web

  • Neural and formal reasoning

  • Using knowledge in different tasks (text reasoning, QA, dialogue, etc.)


To be usable, knowledge must be formalized: it must not only be described in a standard format, but also given an appropriate representation. The project will investigate the common formalism of knowledge graphs (networks of interconnected entities) and push it further toward a better-organized semantic web. The elements of such a knowledge graph or semantic web will be encoded in a model; graph neural networks (GNNs) are commonly used for this task. In this project, we aim to create GNNs suited to different NLP tasks. In particular, we will incorporate GNNs into NLP tasks such as question answering (QA) to enable multi-hop inference over knowledge. Beyond knowledge graphs, large pre-trained language models such as BERT and GPT also encode some general knowledge in addition to language regularities. We will therefore study inference in NLP tasks that combines knowledge graphs with pre-trained language models.
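To make the GNN encoding step concrete, the following is a minimal sketch (not the project's actual model) of GCN-style message passing over a toy knowledge graph, using plain NumPy. The entity graph, embedding dimension, and weight matrix are all illustrative assumptions; stacking two layers shows how information propagates two hops, which is the basis of multi-hop inference.

```python
import numpy as np

# Toy knowledge graph: 4 entities linked by (head, tail) edges.
# All names and sizes here are illustrative, not from the project.
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
num_entities, dim = 4, 8

rng = np.random.default_rng(0)
H = rng.normal(size=(num_entities, dim))   # initial entity embeddings
W = rng.normal(size=(dim, dim))            # shared transformation weights

# Symmetric adjacency with self-loops, row-normalized (GCN-style).
A = np.eye(num_entities)
for h, t in edges:
    A[h, t] = A[t, h] = 1.0
A = A / A.sum(axis=1, keepdims=True)

def gnn_layer(H, A, W):
    """One message-passing step: average neighbor embeddings,
    apply a linear transform, then a ReLU nonlinearity."""
    return np.maximum(A @ H @ W, 0.0)

# Two layers = two hops: each entity's embedding now reflects
# entities up to two edges away in the graph.
H1 = gnn_layer(H, A, W)
H2 = gnn_layer(H1, A, W)
print(H2.shape)  # (4, 8)
```

In a QA setting, such entity embeddings would be combined with question representations from a pre-trained language model to score candidate answers.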