Malik Tiomoko

GIPSA-Lab, Grenoble-Alps University

A Random Matrix Perspective of Echo State Networks: From Precise Bias--Variance Characterization to Optimal Regularization

Sep 26, 2025

Incorporating priors in learning: a random matrix study under a teacher-student framework

Sep 26, 2025

Human in the Loop Adaptive Optimization for Improved Time Series Forecasting

May 21, 2025

High-Dimensional Analysis of Bootstrap Ensemble Classifiers

May 20, 2025

Mantis: Lightweight Calibrated Foundation Model for User-Friendly Time Series Classification

Feb 21, 2025

Analysing Multi-Task Regression via Random Matrix Theory with Application to Time Series Forecasting

Jun 14, 2024

Random matrix theory improved Fréchet mean of symmetric positive definite matrices

May 10, 2024

Random Matrix Analysis to Balance between Supervised and Unsupervised Learning under the Low Density Separation Assumption

Oct 20, 2023

PCA-based Multi Task Learning: a Random Matrix Approach

Nov 01, 2021

Multi-task learning on the edge: cost-efficiency and theoretical optimality

Oct 09, 2021