
Malik Tiomoko

GIPSA-Lab, Université Grenoble Alpes

Human in the Loop Adaptive Optimization for Improved Time Series Forecasting

May 21, 2025

High-Dimensional Analysis of Bootstrap Ensemble Classifiers

May 20, 2025

Mantis: Lightweight Calibrated Foundation Model for User-Friendly Time Series Classification

Feb 21, 2025

Analysing Multi-Task Regression via Random Matrix Theory with Application to Time Series Forecasting

Jun 14, 2024

Random matrix theory improved Fréchet mean of symmetric positive definite matrices

May 10, 2024

Random Matrix Analysis to Balance between Supervised and Unsupervised Learning under the Low Density Separation Assumption

Oct 20, 2023

PCA-based Multi Task Learning: a Random Matrix Approach

Nov 01, 2021

Multi-task learning on the edge: cost-efficiency and theoretical optimality

Oct 09, 2021

Large Dimensional Analysis and Improvement of Multi Task Learning

Sep 03, 2020

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

Mar 08, 2019