Currently, I am an AI Research Scientist at Intel Labs working on Graph ML and Geometric DL. You can find me on Twitter, Medium, and now on Mastodon! I help maintain a Telegram channel on Graph ML news.
Participated (online) in the panel discussion with Bryan Perozzi, Michael Bronstein, and Christopher Morris on graph foundation models held during the Graph Learning Tutorial at ICML 2024 - thanks to Ameya and Adrian for the invitation!
In our new Medium blogpost with Michael Bronstein, Jianan Zhao, Haitao Mao, and Zhaocheng Zhu we discuss foundation models in Graph & Geometric DL: from the core theoretical and data challenges to the most recent models that you can try already today!
A productive week in Singapore! First, gave a keynote at the workshop on Graph Foundation Models at The WebConf 2024 and participated in the panel discussion. Then, visited the group of professor Xavier Bresson at the National University of Singapore with the talk on graph foundation models - from KG reasoning to AI 4 Science. Thank you Professor Bresson for extending the invitation! Slides
It was a delightful experience to participate in the week-long workshop on GNNs and neuro-symbolic AI (FANeSy) organized by Pablo Barcelo, CENIA, and Universidad San Sebastian. Thanks Pablo for the invitation!
Together with Michael Bronstein, we wrote a huge blog post on the state of affairs in Graph and Geometric DL in 2023 with some predictions for 2024. Part I focuses on theory and GNN architectures (including graph transformers), Part II talks about cool and exciting applications in structured biology, materials science, ML potentials, algorithmic reasoning, and temporal graph learning. We interviewed many prominent researchers to provide several points of view on each subject - so this work wouldn’t be possible without the massive community engagement!
Happy to release ULTRA - the first foundation model for knowledge graph reasoning. A single pre-trained ULTRA model can perform zero-shot link prediction on any unseen KG, and it does so better than many supervised baselines on 50+ datasets! More details in the Medium blog post. We release the paper, several checkpoints (177k params), code, and data on GitHub and HuggingFace Spaces.
Our team got two papers accepted at the upcoming NeurIPS’23 in New Orleans!
A*Net: A Scalable Path-based Reasoning Approach for Knowledge Graphs. Zhaocheng Zhu, Xinyu Yuan, Mikhail Galkin, Sophie Xhonneux, Ming Zhang, Maxime Gazeau, Jian Tang. preprint
Improving Systematic Generalization using Iterated Learning and Simplicial Embeddings. Yi Ren, Samuel Lavoie, Mikhail Galkin, Danica J. Sutherland, Aaron Courville.
I was lucky to attend ICML’23 in Honolulu in person and see many friends and Graph ML folks. Check out my report on the latest and greatest graph learning research in the new blog post, together with some stunning photos from Hawaii!
A new paper Neural graph reasoning: Complex logical query answering meets graph databases and an accompanying Medium post where we introduce the concept of Neural Graph Databases (NGDB) and a whole new categorization of complex logical query answering tasks. NGDBs answer complex queries right in the latent space and are able to reason over missing links (“what’s missing?”) in addition to standard DB-like retrieval (“what’s there?”).
A new Medium post by Andy Huang, Emanuele Rossi, Michael Galkin, and Kellin Pelrine on the recent progress in temporal Graph ML! Featuring theoretical advancements in understanding expressive power of temporal GNNs, discussing evaluation protocols and trustworthiness concerns, looking at temporal KGs, disease modeling, and anomaly detection, as well as pointing to the software libraries and new datasets!
In a new Medium post, we provide an overview of what’s happened in Graph ML in 2022 and its subfields (and hypothesize about potential breakthroughs in 2023), including Generative Models, Physics, PDEs, Graph Transformers, Theory, KGs, Algorithmic Reasoning, Hardware, and more!
In the new Medium post, we delve into Denoising Diffusion generative models and their applications in molecular tasks such as conformer generation, protein-ligand docking, molecular linking, and more!
Pablo Barcelo, Mikhail Galkin, Christopher Morris, Miguel Romero Orth. Weisfeiler and Leman Go Relational: we introduce a Relational WL test to quantify expressiveness of relational GNNs (R-GCN and CompGCN).
A new winter holiday longread - what happened in the Graph ML field in 2021? What to expect in 2022? We reflect upon a dozen research trends - brought to you by yours truly, Anton Tsitsulin, Anvar Kurmukov, and Sergey Ivanov (with a surprise cameo appearance at the end).
Premium Punta Cana content: a blogpost on Knowledge Graph papers from EMNLP 2021: sunny LMs, blue lagoons of ConvAI, sunsets with Entity Linking, white sands of Question Answering!
Had a pleasure presenting a new research talk Beyond Shallow Embeddings of Knowledge Graphs at AI Journey 2021 - the biggest Russian-based venue for AI research. Slides, recording (RU).
A regular review of KG-related papers from ACL 2021, the major NLP research venue. This time: neural databases, retrieval, KG embeddings, entity linking, QA, and a bunch of new datasets.
A new blogpost on our recent research idea: if nodes in a graph are “words”, can we design a fixed-size vocab of “sub-word” units and go beyond shallow embedding? We propose NodePiece, a compositional tokenization approach for dramatic KG vocabulary size reduction, and find that in some tasks, you don’t even need trainable node embeddings! Furthermore, NodePiece is inductive by design and can encode unseen nodes using the same backbone vocabulary.
I compiled a short overview of cool KG-related papers on Neural Reasoning, Temporal Reasoning, Relational Learning, and a bit of Complex QA! Brew a hot beverage and have a nice weekend reading.
Concluding the year with the review of KG-related papers from NeurIPS 2020: Query Embedding, NAS, Meta-Learning, 📦 vs 🔮, big new benchmarks incl. OGB and GraphGYM, and ⚡️ KeOps!
My review of the most prominent KG-related papers from EMNLP 2020. This time we talk about KG-augmented language models, information extraction, entity linking, KG representation algorithms, and much more!
An anniversary post in the series of KG-related paper digests. It’s been one year since I started publishing such digests, and we’re back to the NLP roots with ACL 2020! This time I focus on question answering, KG embeddings, graph-to-text NLG, some ConvAI, and OpenIE.
Published a review of KG-related papers from ICLR 2020! Among other things, we review what’s happening in complex QA, KG embeddings, and entity matching with graph embeddings.
Here is a fresh digest of KG-related papers in the NLP context from AAAI 2020! We check out new developments in KG-augmented Language Models, GNN-based entity matching, news on link prediction (especially in the temporal dimension), and finally some new papers on ConvAI and Knowledge Graph Question Answering!
A new review on Medium of KG- and NLP-related papers from NeurIPS 2019. We have a look at novelties in hyperbolic graph embeddings, new KG embeddings with some logic support, new trends in Markov Logic Nets, new goodies from the ConvAI side, and some interesting GNN-related publications.
My review of knowledge graph-related papers from EMNLP 2019. Part I is about language models, extracting KGs from text, dialogue systems, and KG embeddings. Part II discusses question answering, natural language generation from KGs, commonsense reasoning, and NER & RL.
Hello, ACL 2019 has just finished, and I attended the whole week of conference talks, tutorials, and workshops in beautiful Florence! In this post I would like to recap how knowledge graphs are slowly but firmly integrating into the NLP community.
On July 6th in Berlin I attended spaCy IRL - a conference organized by explosion.ai, the company behind spaCy, which you probably know as one of the most popular, powerful, and fast NLP libraries. Here is a short overview of the event.
Recently returned from the 2nd Conversational Intelligence Summer School organized by the Text Machine Lab of UMass Lowell and the iPavlov lab from MIPT, Moscow. The School took place in Lowell, MA, USA, one of the first US industrial cities, which retains the spirit of the industrial revolution.