Jan—June 2024

I will be serving as Senior Area Chair for the Interpretability and Analysis of Models for NLP track at EMNLP 2024.

Our demo paper, LLMeBench: A Flexible Framework for Accelerating LLMs Benchmarking, has been accepted at EACL 2024. This framework drives the benchmarking of LLMs, and you can explore our comprehensive benchmarking results for Arabic.

Exciting start to 2024: Two long papers have been accepted at EACL 2024! 🚀 Check out our work on Scaling up the discovery of latent concepts in deep NLP models. Plus, don’t miss our comprehensive benchmarking of LLMs for Arabic.

Oct—Dec 2023

Our paper titled Discovering Salient Neurons in Deep NLP Models has been accepted for publication in the Journal of Machine Learning Research (JMLR) 🌟🎉

I will be serving as Senior Area Chair for the Interpretability and Analysis of Models for NLP track at NAACL 2024.

🎊🎉 Exciting news 🎊🎉 Our paper titled Can LLMs Facilitate Interpretation of Pre-trained Language Models? has been accepted at EMNLP 2023.

Absolutely elated to announce that the Arabic Language Technologies (ALT) @QCRI has been honored with the prestigious King Salman Global Academy Prize in the institution category! 🌟 This incredible achievement marks a decade of dedication, innovation, and groundbreaking work in the arena of Arabic NLP. 🎊💬

I will be serving as an Action Editor for the EACL, NAACL, and ACL conferences in ACL Rolling Review.

Want to benchmark LLMs for your language or specific downstream tasks? LLMeBench enables effortless benchmarking using a wide range of NLP tasks across multiple LLMs and is easily customizable. Please find more details here.

July—Sept 2023

Thrilled to share that our paper titled Evaluating Neuron Interpretation Methods of NLP Models has been accepted at NeurIPS 2023.

I will be serving as an Area Chair for Applications Involving LRs and Evaluation (including Applications in Specific Domains) at the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024).

Our paper titled What do End-to-End Speech Models Learn about Speaker, Language and Channel Information? A Layer-wise and Neuron-level Analysis has been accepted to appear in the next volume of the Computer Speech and Language (CSL) journal.

We will be presenting our TACL paper on surveying neuron analysis methods at the ACL.

I will be serving as an Area Chair (Senior Meta-Reviewer) for the Thirty-Eighth AAAI Conference on Artificial Intelligence (AAAI-24).

Jan—June 2023

We have released new functionality in the NeuroX Library. Please check out our corresponding paper titled NeuroX Library for Neuron Analysis of Deep NLP Models, accepted to appear at the ACL demo track.

I will be serving as Senior Area Chair for the Interpretability and Analysis of Models for NLP track at EMNLP 2023.

NxPlain is a framework that explains a model’s predictions using latent concepts. A demo paper titled NxPlain: A Web-based Tool for Discovery of Latent Concepts has been accepted to appear at EACL 2023.

I will be serving as an Area Chair for the Interpretability and Analysis of Models for NLP track at ACL 2023.

Oct—Dec 2022

We developed an analysis platform for annotating deep NLP models. A demo paper titled ConceptX: A Framework for Latent Concept Analysis has been accepted to appear at the AAAI 2023.

A paper titled Post-hoc analysis of Arabic transformer models has been accepted to appear at the BlackboxNLP workshop.

A demo paper related to NatiQ has also been accepted at WANLP. Please check out the pre-print version.

We have added new male and female voices to NatiQ. The female voice (Abeer) speaks neutral MSA, whereas the male voice (Hamza-Shami) speaks in the Levantine dialect. Please check out our demo.

A paper titled On the Transformation of Latent Space in Fine-Tuned NLP Models has been accepted to appear at EMNLP 2022.

May—Sept 2022

Our paper titled Effect of Post-processing on Contextualized Word Representations has been accepted to appear at COLING 2022.

Our paper titled Neuron-level Interpretation of Deep NLP Models: A Survey has been accepted at TACL.

Our paper titled On the Effect of Dropping Layers of Pre-trained Transformer Models has been accepted to appear in the next volume of the Computer Speech and Language (CSL) journal.

Jan—April 2022

Our paper titled Analyzing Encoded Concepts in Transformer Language Models is accepted to appear at NAACL 2022.

The BCN dataset linked to our ICLR 2022 paper is now released.

I will be serving as a Senior Program Committee (Meta-Reviewer) Member for the 37th AAAI Conference on Artificial Intelligence (AAAI-23).

Our paper titled Discovering Latent Concepts Learned in BERT is accepted to appear at ICLR 2022.

NeuroX Toolkit is now available via pip.

NatiQ has been licensed to Kanari AI.


May—Aug 2021

I have been granted a QNRF grant (with Darwish, Abdelali, and Mubarak): TDF 03-0706-210009, Natiq: Arabic Text to Speech System.

Our paper titled Fighting the COVID-19 Infodemic: Modeling the Perspective of Journalists, Fact-Checkers, Social Media Platforms, Policy Makers, and the Society is accepted to appear in the Findings of EMNLP 2021.

I will be serving as an Area Chair for the Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI-22).

We presented our tutorial on Fine-grained Interpretation and Causation Analysis in Deep NLP Models live at NAACL. Video, Slides.

Our paper titled How transfer learning impacts linguistic knowledge in deep NLP models? has been accepted to appear in the Findings of ACL 2021.

After 6 years at QCRI, I have been promoted to a Senior Scientist position. Thanks to Hassan Sajjad and Fahim Dalvi, my partners in this wonderful journey.


Jan—April 2021

I will be serving as an Area Chair for NLP Applications Track at EMNLP 2021.

NatiQ, ALT’s first text-to-speech system, is now online.

Shaheen, our MSA and dialectal Arabic-to-English translation system, was featured on the front page of Gulf Times.


Sept—Dec 2020

Our tutorial on Fine-grained Interpretation and Causation Analysis in Deep NLP Models has been accepted at NAACL 2021.

We deployed an API for Arabic-English transliteration.

I gave a talk at NYUAD on our work on analyzing representations and neuron-level probing.

I was involved in the shared task for Machine Translation Robustness @ WMT 20. Check out the findings of the task in this paper.

Check out our paper on Benchmarking Arabic dialects in Machine Translation, accepted at COLING.

Our paper on Analyzing Individual Neurons in Pre-trained Language Models has been accepted to appear at EMNLP.

Another paper on Redundancy analysis in Pre-trained Neural Network models has been accepted at EMNLP.

I will be serving as a Senior Program Committee (Meta-Reviewer) Member for the 35th AAAI Conference on Artificial Intelligence (AAAI-21).

Shaheen tech-transfer to Kanari AI.


May—Aug 2020

QCRI’s Shaheen achieves a milestone with over 1bn words translated and has been featured in Gulf Times and Qatar Tribune.

We organized the first shared task on simultaneous machine translation as part of IWSLT. Check out the details here.


Jan—April 2020

We deployed the first Arabic Dialect-to-English translation system based on Neural Machine Translation. It covers various dialects such as Levantine, Egyptian and Maghrebi.

I gave an invited talk about our work on neural analysis at CLT 2020.

I am going to serve as an Area Chair for Machine Translation and Multilinguality Track at ACL 20.


May—Aug 2019

Our journal paper compiling our work on analyzing NMT representations has been accepted at the Computational Linguistics journal and will appear in a forthcoming issue.

I was involved in the first shared task on Machine Translation Robustness @WMT 19. Check out the findings of the task in this paper.

Our work on comparing NMT representations of different granularities has been accepted at NAACL.

Our work on analyzing deep Neural Machine Translation models is in the news.


Sept—Dec 2018

Our paper titled Identifying and Controlling Important Neurons in Neural Machine Translation has been accepted to appear at ICLR 19.

Our paper titled What is one Grain of Sand in the Desert? Analyzing Individual Neurons in Deep NLP Models has been accepted to appear at AAAI 19.

A demo paper for NeuroX Toolkit has been accepted to appear at AAAI 19.


Jan—April 2018

Our paper titled Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation has been accepted to appear at NAACL.

Our Speech Translation System won the Best Innovation Award at the Annual Research Conference in Doha, Qatar.

Our Speech Translation System was deployed at ARC for onsite speech translation of Arabic talks.


Sept—Dec 2017

A paper titled Continuous Space Reordering Models for Phrase-based MT has been accepted to appear at IWSLT.

Another paper titled Neural Machine Translation Training in a Multi-Domain Scenario has been accepted to appear at IWSLT.

Our work on analyzing Neural Machine Translation was picked up by Science Blogs.

Our long paper titled Understanding and Improving Morphological Learning in Neural Machine Translation Decoder has been accepted to appear at IJCNLP.

Another long paper titled Evaluating Layers of Representation in Neural Machine Translation on Parts-of-Speech and Semantic Tagging Tasks has been accepted to appear at IJCNLP.


Jan—April 2017

Our long paper titled What do Neural Machine Translation Models Learn about Morphology? has been accepted to appear at ACL 17.

Another short paper on Language independent alternatives for Arabic word segmentation has been accepted to appear at ACL.

Our demo paper QCRI Live Speech Translation System has been accepted to appear at the EACL in Valencia.

Another demo paper The SUMMA Platform Prototype has been accepted to appear at EACL in Valencia.

I gave a talk summarizing our Domain Adaptation project at our weekly symposium.


Sept—Dec 2016

Our journal paper on Domain Adaptation using Neural Networks has been accepted to appear in the Computer Speech and Language journal, Special Issue on Deep Learning for Machine Translation.

Our first neural machine translation system was ranked first in the Arabic-English TED and QED translation tasks at IWSLT 2016. Check out our system description paper.

We demoed our Speech Translation System to the Ministry of Defense, Qatar.

Our paper on deep fusion NNJM for domain adaptation has been accepted to appear at COLING 2016.

Our internship project on Medical Translation was ranked 2nd among the QCRI summer internship projects. Congratulations to Ali and Jafar.


Jan—April 2016

Our demo paper on Farasa, the fastest Arabic segmenter, has been accepted to appear in NAACL 2016.

A paper on Eye-tracking for MT evaluation has been accepted to appear in NAACL 2016.

Also check out the related demo paper on this.

Our paper on Medical translation for low-resource languages has been accepted to appear in CICLing 2016.


Sept—Dec 2015

Our internship project on Medical Translation was ranked 3rd among the 23 projects done during the QCRI summer internship. Congratulations to Ahmed, Osama, Naila and Manisha.

The code for Operation Sequence Model Interpolation has been pushed to the Moses git repository. Please find instructions on how to invoke it here.

Our paper on Domain Adaptation using joint models has been accepted at the MT Summit.


May—Aug 2015

Our paper on Domain Adaptation using Neural Networks has been accepted at EMNLP.

We were ranked at the NIST Open MT workshop on the task of SMS and Chat translation.

Our system description paper was voted best at the NIST Open MT workshop. It was a joint effort led by QCRI and the CAMeL Lab at NYU Abu Dhabi.

We participated in the 2nd QALB Shared Task on Automatic Text Correction for Arabic. It was a joint effort mainly led by CMU-Q. Check out our system paper.


Jan—April 2015

I gave a presentation on the Moses architecture at QCRI.

Preliminary results for NIST 2015 are out. We did well and ranked second behind BBN on the Arabic-English track.

We participated in NIST 2015 (Arabic-English task).

I gave a talk on the Operation Sequence Model at QCRI.


Sept—Dec 2014

I gave two talks at IIT Bombay about Transliteration and the Operation Sequence Model.

I joined Qatar Computing Research Institute as a Scientist with the Arabic Language Technologies Group led by Stephan Vogel.

My journal paper titled The Operation Sequence Model - Combining N-Gram-based and Phrase-based Statistical Machine Translation has finally been accepted to appear in the next volume of Computational Linguistics.

My student Nadeem Khan defended his MS thesis on surveying SMT for Indic languages. Here’s the survey paper that came out of the effort.

I gave an introductory talk on Statistical Machine Translation at the University of Engineering and Technology (UET), Lahore.

I gave a talk on the Operation Sequence Model to a team at Systran, Paris.


May—Aug 2014

My PhD dissertation has been selected as the best Computational Linguistics dissertation for the years 2012-2014 by the German Society for Computational Linguistics. The GSCL Award for the best Doctoral Thesis in memory of Wolfgang Hoeppner was presented at KONVENS 2014, where I was invited to give a short presentation.

Our campaign at IWSLT’14 was very successful.

The Operation Sequence Model was acknowledged as one of the prominent approaches that have led to actual improvements in systems in the evaluation campaign. I was invited to briefly present the model at the Ninth Workshop on Statistical Machine Translation. Here are the slides, including the other works that were acknowledged.

We performed really well at WMT’14. Please check out our system description paper.

I visited SDL Language Weaver and gave a talk on the Operation Sequence Model.

My paper titled Investigating the usefulness of generalized word representation in SMT has been accepted to appear in COLING 2014.


Jan—April 2014

My paper titled Improving machine translation via triangulation and transliteration has been accepted to appear in EAMT 2014.

My paper titled Integrating an unsupervised transliteration model into statistical machine translation has been accepted to appear in EACL 2014. The work is now part of Moses as a feature.

My work on Improving Egyptian-to-English SMT by mapping Egyptian into MSA, which I carried out during an internship at IBM Watson, has been accepted to appear in CICLing 2014.


Sept—Dec 2013

My PhD supervisor Hinrich Schütze presented the Operation Sequence Model as his invited talk at the MT Summit 2013.

The Operation Sequence Model has been integrated as a feature in the Moses toolkit after tremendous success at WMT’13.

Our campaign at IWSLT’13 was very successful.


May—Aug 2013

We performed well across the board at WMT’13. Please check out our system description paper.

UEdin collaborated with Stuttgart, Munich, and QCRI to participate in WMT’13. Multiple submissions were made. Papers can be found here.


Jan—April 2013

I was able to integrate Markov-based translation models into phrase-based MT. The work has been accepted to appear at ACL’13.

My work on Integrating Phrase-based and N-gram-based models has been accepted at NAACL’13.

I joined the MT group led by Philipp Koehn at the University of Edinburgh as a Post-doctoral Research Associate.


Sept—Dec 2012

I have successfully defended my thesis. Thanks to my supervisors Alex Fraser, Helmut Schmid, and Hinrich Schütze for helping me make a dream come true.

I have been offered a post-doctoral position at the MT group led by Philipp Koehn at the University of Edinburgh.


Jan—April 2012

I will be attending the 2nd Lisbon Machine Learning School.

I am interning at the IBM Watson Center and will be working on Egyptian-to-English Machine Translation project with Yaser Al-Onaizan.

My dissertation has been awarded the IMS best thesis award.


Jan—Aug 2011

A paper titled Comparing Two Techniques for Learning Transliteration Models Using a Parallel Corpus has been accepted to appear at IJCNLP.

We proposed a novel translation model with integrated reordering. It has now been accepted at ACL.


Jan—April 2010

Excited to have my first ACL publication. My work on Hindi-to-Urdu Machine Translation Through Transliteration has been accepted to appear at the ACL.

I am super excited that my MS thesis work on Urdu Word Segmentation is going to appear at NAACL. This is my first tier-1 publication.


April 2008

I am starting my PhD with Prof. Hinrich Schütze at IMS-Stuttgart, where I will initially be working with Alex Fraser and Helmut Schmid on Hindi-Urdu statistical machine translation.


Jan—Aug 2007

I defended my masters thesis on Urdu Word Segmentation. Thanks to my supervisor Prof. Sarmad Hussain for his guidance.

I have been shortlisted for an HEC scholarship to study in Germany. I will be joining the NLP group at the University of Stuttgart with Prof. Hinrich Schütze.

I am travelling to Phnom Penh to attend a workshop on the open source localization project.

I am going to lead an effort on the Open Source Localization Project. We will create standard translations for OpenOffice, GNOME, and the SeaMonkey browser.


Jan—Dec 2006

My project work on studying Split Ergativity in Urdu has been accepted to appear at the 12th Himalayan Languages Symposium & 27th Annual Conference of the Linguistic Society of Nepal.

I am travelling to Bangkok for the ADD workshop.

I am working on the standardization of Urdu domain names and presented my work at the IEEE Multitopic Conference (INMIC).

I am compiling A Study on Collation of Languages from Developing Asia for the PAN Localization Project.


Aug—Dec 2005

I started my MS (CS) at the National University of Computer and Emerging Sciences. I will continue to work with Dr. Sarmad Hussain at the Center for Research in Urdu Language Processing on the PAN Localization Project.


Jan—July 2005

I am travelling to Cambodia to present the work we have done in the past 5 months at the PAN Localization Project workshop.

After 3 months of work on Lao pre-processing (syllabification, collation, and a notepad utility), we have made a splash in the local news (page 2).

I will be working at the Science, Technology and Environment Agency (STEA) in Laos as a project consultant for the PAN Localization Project.


Jun—Dec 2004

I am compiling a Survey on Language Computing in Asia for the PAN Localization Project.

Aug 2003—May 2004

I am going to work on a bootable Urdu Linux as my bachelor’s FYP with Dr. Sarmad Hussain.