What is the essence of natural language? How do we use it? How do we process it? How do we construe it? How does it change over time?
These and many more questions have concerned me for more than 15 years. During this time I have studied a wide range of linguistic theory and examined a variety of complex natural language data to develop computational models that capture the essence of natural language.
Over the course of this journey, I have explored a wide range of modeling approaches: initially game theory, network theory, multi-agent systems (MAS), population-based models, and learning models such as reinforcement learning (RL) and fictitious play (FP); more recently, these have been complemented by AI-driven solutions, including classical machine learning (ML) and natural language processing (NLP) techniques, and in particular Large Language Models (LLMs).
I have developed a strong background in techniques for gathering and analyzing empirical data, including experimental design (Labvanced), statistical analysis (R, Python), and data presentation. I have presented my work at more than 50 conferences and workshops and published more than 40 articles. I have also had the honor of teaching more than 25 university courses, ranging from classical lectures in linguistics to modeling and programming courses.
Grounded in both theoretical linguistics and data-driven methodologies, I enjoy solving real-world problems through mathematical and computational modeling, data analysis, and AI-driven solutions, and I am passionate about understanding and modeling the essential mechanisms of natural language processing.
I’m always eager to explore new challenges, collaborate on innovative projects, and contribute to advancements in language technology. Let’s connect!
My Current Project
My current project is 'Modelling Meaning-Driven Register Variation'.