The following papers have been accepted for presentation at the International Joint Conference on Neural Networks (IJCNN):
“On Ensemble Techniques for Data Stream Regression”\
Authors: Heitor Murilo Gomes, Jacob Montiel, Saulo Martiello Mastelini, Bernhard Pfahringer and Albert Bifet

“Adaptive XGBoost for Evolving Data Streams”\
Authors: Jacob Montiel, Rory Mitchell, Eibe Frank, Bernhard Pfahringer, Talel Abdessalem and Albert Bifet
Our tutorial “Machine learning for data streams with scikit-multiflow” has been accepted for presentation at IJCAI-PRICAI 2020!
Abstract: Data stream mining has gained considerable attention in recent years as an exciting research topic. However, there is still a gap between pure research proposals and practical applications to real-world machine learning problems. The main goal of this tutorial is to introduce attendees to data stream mining theory and practice.
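As a taste of the kind of practice the tutorial covers, here is a minimal sketch of a test-then-train (prequential) loop with scikit-multiflow. Class names and stream setup can differ slightly between library versions, so treat it as illustrative rather than the tutorial's exact material.

```python
# Minimal test-then-train (prequential) loop with scikit-multiflow (illustrative).
from skmultiflow.data import SEAGenerator
from skmultiflow.trees import HoeffdingTreeClassifier

stream = SEAGenerator(random_state=1)   # synthetic data stream
model = HoeffdingTreeClassifier()       # incremental (streaming) learner

correct, seen = 0, 0
while seen < 5000 and stream.has_more_samples():
    X, y = stream.next_sample()                             # one sample at a time
    if seen > 0:                                            # test first ...
        correct += int(model.predict(X)[0] == y[0])
    model.partial_fit(X, y, classes=stream.target_values)   # ... then train
    seen += 1

print(f"Prequential accuracy: {correct / (seen - 1):.3f}")
```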
Our tutorial on stream learning with scikit-multiflow has been accepted for presentation at the International Joint Conference on Neural Networks (IJCNN), held as part of the IEEE World Congress on Computational Intelligence (WCCI) 2020.
My interview on the benefits of adaptive learning methods has been published in a white paper by Quantmetry.
The white paper, No. 5 “AI en production”, is free to download here.
Transcript of the interview (translated from French)
What is a drift? Learning is often treated as a static task. However, in real-world conditions, data evolve constantly. This is what we call concept drift. For example, financial markets are unstable: credit risks are not the same from one year to the next.
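To make the idea concrete, below is a small, hedged sketch using the ADWIN change detector from scikit-multiflow on a synthetic signal whose mean shifts halfway through, a stand-in for the kind of year-to-year change in credit risk mentioned above; the data and parameters are illustrative only.

```python
# Illustrative only: detecting a distribution change with ADWIN.
import numpy as np
from skmultiflow.drift_detection import ADWIN

rng = np.random.default_rng(42)
signal = np.concatenate([rng.normal(0.2, 0.1, 1000),   # "old" regime
                         rng.normal(0.8, 0.1, 1000)])  # "new" regime

adwin = ADWIN()
for i, value in enumerate(signal):
    adwin.add_element(value)        # feed the stream one value at a time
    if adwin.detected_change():
        print(f"Drift detected around sample {i}")
```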
I have successfully defended my PhD thesis “Fast and Slow Machine Learning”.
Jury:\
M João Gama, University of Porto\
M Georges Hébrail, Électricité de France\
M Themis Palpanas, Université Paris Descartes\
M Ricard Gavaldà, Universitat Politècnica de Catalunya\
M Jesse Read, École Polytechnique\
M Albert Bifet, Télécom ParisTech (Directeur de recherche)\
M Talel Abdessalem, Télécom ParisTech (Directeur de thèse)
I am attending the 2018 IEEE International Conference on Big Data in Seattle, USA, to present our paper
“Learning Fast and Slow: A Unified Batch/Stream Framework”.
Abstract: Data ubiquity highlights the need for efficient and adaptable data-driven solutions. In this paper, we present Fast and Slow Learning (FSL), a novel unified framework that sheds light on the symbiosis between batch and stream learning. FSL works by employing Fast (stream) and Slow (batch) learners, emulating the mechanisms humans use to make decisions.
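The abstract does not spell out how the two learners are combined, so the sketch below only illustrates the fast/slow intuition, not the FSL framework itself: a Fast incremental learner updated on every sample, a Slow batch learner retrained periodically on a buffer, and a simple count of which one has been more accurate. All class choices and parameters are assumptions made for the example.

```python
# Illustration of the fast/slow intuition only; not the actual FSL framework.
import numpy as np
from sklearn.ensemble import RandomForestClassifier    # Slow (batch) learner
from skmultiflow.data import SEAGenerator
from skmultiflow.trees import HoeffdingTreeClassifier  # Fast (stream) learner

stream = SEAGenerator(random_state=7)
fast, slow = HoeffdingTreeClassifier(), RandomForestClassifier(n_estimators=50)
buffer_X, buffer_y = [], []
fast_hits = slow_hits = 0
RETRAIN_EVERY = 500

for i in range(5000):
    X, y = stream.next_sample()
    if i >= RETRAIN_EVERY:                    # both learners are ready: test them
        fast_hits += int(fast.predict(X)[0] == y[0])
        slow_hits += int(slow.predict(X)[0] == y[0])
    fast.partial_fit(X, y, classes=stream.target_values)  # Fast: update every sample
    buffer_X.append(X[0]); buffer_y.append(y[0])
    if (i + 1) % RETRAIN_EVERY == 0:                       # Slow: periodic batch retrain
        slow.fit(np.array(buffer_X), np.array(buffer_y))

print("Fast hits:", fast_hits, "Slow hits:", slow_hits)
```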