machine learning

Two papers accepted at IJCNN 2020

The following papers have been accepted for presentation at the International Joint Conference on Neural Networks (IJCNN):

“On Ensemble Techniques for Data Stream Regression”
Authors: Heitor Murilo Gomes, Jacob Montiel, Saulo Martiello Mastelini, Bernhard Pfahringer and Albert Bifet

“Adaptive XGBoost for Evolving Data Streams”
Authors: Jacob Montiel, Rory Mitchell, Eibe Frank, Bernhard Pfahringer, Talel Abdessalem and Albert Bifet

Tutorial accepted at IJCAI 2020

Our tutorial “Machine learning for data streams with scikit-multiflow” has been accepted for presentation at IJCAI-PRICAI 2020!

Abstract: Data stream mining has gained a lot of attention in recent years as an exciting research topic. However, there is still a gap between pure research proposals and practical applications to real-world machine learning problems. The main goal of this tutorial is to introduce attendees to data stream mining theory and practice.
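For a flavour of the kind of workflow the tutorial covers, here is a minimal test-then-train (prequential) loop in scikit-multiflow; the generator and classifier are illustrative choices on my part, not material taken from the tutorial itself.

```python
# A minimal test-then-train (prequential) loop with scikit-multiflow.
# The stream generator and model below are illustrative choices.
from skmultiflow.data import SEAGenerator
from skmultiflow.trees import HoeffdingTreeClassifier

stream = SEAGenerator(random_state=42)   # synthetic binary-classification stream
model = HoeffdingTreeClassifier()        # incremental decision tree

n_samples, correct = 0, 0
while n_samples < 5000 and stream.has_more_samples():
    X, y = stream.next_sample()          # fetch one instance
    y_pred = model.predict(X)            # test on it first...
    correct += int(y_pred[0] == y[0])
    model.partial_fit(X, y)              # ...then train on it
    n_samples += 1

print(f"Prequential accuracy: {correct / n_samples:.3f}")
```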

Tutorial accepted at IJCNN 2020

Our tutorial on stream learning with scikit-multiflow has been accepted for presentation at the International Joint Conference on Neural Networks (IJCNN), taking place alongside the IEEE World Congress on Computational Intelligence (WCCI) 2020.

Interview on adaptive learning

My interview on the benefits of adaptive learning methods has been published in a white paper by Quantmetry. The white paper, No. 5 “AI en production”, is free to download here.

Transcript of the interview (translated from French):

What is a drift? Learning is often treated as a static task. However, in real-world conditions, data evolve constantly. This is what we call concept drift. For example, financial markets are unstable: credit risks are not the same from one year to the next.
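As a small illustration of how such drifts can be monitored in practice (this is not part of the interview), here is a sketch using the ADWIN change detector from scikit-multiflow on a synthetic stream with an abrupt shift.

```python
# Illustrative only: detecting a change in a data stream with ADWIN
# (scikit-multiflow). The synthetic data is made up for the example.
import numpy as np
from skmultiflow.drift_detection import ADWIN

adwin = ADWIN()
rng = np.random.default_rng(seed=1)

# First 1000 values are drawn around 0.2, the next 1000 around 0.8,
# simulating an abrupt concept drift halfway through the stream.
stream = np.concatenate([rng.normal(0.2, 0.05, 1000),
                         rng.normal(0.8, 0.05, 1000)])

for i, value in enumerate(stream):
    adwin.add_element(value)
    if adwin.detected_change():
        print(f"Drift detected at index {i}")
```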

Research fellow at the University of Waikato

I have started a new position as Research Fellow in the Machine Learning Group at the University of Waikato in New Zealand.

Postdoc at Télécom ParisTech

I have started a new position as Postdoc in the Data, Intelligence and Graphs (DIG) group at Télécom ParisTech.

I got my PhD

I have successfully defended my PhD thesis “Fast and Slow Machine Learning”.

Jury:
João Gama, University of Porto
Georges Hébrail, Électricité de France
Themis Palpanas, Université Paris Descartes
Ricard Gavaldà, Universitat Politècnica de Catalunya
Jesse Read, École Polytechnique
Albert Bifet, Télécom ParisTech (research supervisor)
Talel Abdessalem, Télécom ParisTech (thesis supervisor)

River

A Python library for online machine learning.
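
A minimal sketch of River's sample-by-sample workflow, assuming the public learn_one/predict_one API and the built-in Phishing dataset (model and metric choices are illustrative):

```python
# A minimal online-learning loop with River: learn from one sample at a time.
from river import datasets, linear_model, metrics, preprocessing

model = preprocessing.StandardScaler() | linear_model.LogisticRegression()
metric = metrics.Accuracy()

for x, y in datasets.Phishing():   # small built-in binary classification stream
    y_pred = model.predict_one(x)  # test on the incoming sample...
    metric.update(y, y_pred)
    model.learn_one(x, y)          # ...then learn from it

print(metric)
```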

scikit-multiflow

One of the ancestors of River. [Superseded by River]

IEEE BigData 2018

I am attending the 2018 IEEE International Conference on Big Data in Seattle, USA, to present our paper “Learning Fast and Slow: A Unified Batch/Stream Framework”.

Abstract: Data ubiquity highlights the need for efficient and adaptable data-driven solutions. In this paper, we present Fast and Slow Learning (FSL), a novel unified framework that sheds light on the symbiosis between batch and stream learning. FSL works by employing Fast (stream) and Slow (batch) Learners, emulating the mechanisms used by humans to make decisions.
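
The toy sketch below is not the FSL framework from the paper; it only illustrates the general idea of pairing a fast incremental learner, updated on every sample, with a slow batch learner retrained periodically on buffered data. All names and numbers are made up for the example.

```python
# Toy illustration only -- NOT the FSL framework from the paper. It sketches the
# general idea: a fast learner is updated on every incoming sample, while a slow
# batch learner is retrained periodically on a buffer of recent data.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)       # synthetic labels for the sketch

fast = SGDClassifier()                        # "fast": incremental, updated per sample
slow = None                                   # "slow": batch model, retrained periodically
buffer_X, buffer_y = [], []
batch_size = 500

for xi, yi in zip(X, y):
    xi = xi.reshape(1, -1)
    fast.partial_fit(xi, [yi], classes=[0, 1])        # fast path: online update
    buffer_X.append(xi[0]); buffer_y.append(yi)
    if len(buffer_X) == batch_size:                   # slow path: periodic retraining
        slow = RandomForestClassifier(n_estimators=50).fit(buffer_X, buffer_y)
        buffer_X, buffer_y = [], []

# At prediction time one could, for instance, prefer the slow learner once it
# exists and fall back to the fast learner otherwise.
query = X[-1].reshape(1, -1)
model = slow if slow is not None else fast
print(model.predict(query))
```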