
Projects.

OpTrade: A Comprehensive Toolkit for Quantitative Options Trading Research and Strategy Development
Xavier Mootoo
OpTrade is an open-source toolkit for quantitative options trading research. It includes a comprehensive data pipeline built on ThetaData's API, providing minute-level data with intelligent contract selection based on moneyness and volatility-scaled strikes. Market environments can be precisely tuned via fundamental parameters (e.g., P/E ratio) and factor models (e.g., Fama-French exposures). OpTrade also provides a full experimentation framework with PyTorch and scikit-learn models, featuring specialized option featurization (e.g., order book imbalance, quote spreads), time-to-expiration features for theta decay, and datetime features for seasonality patterns.
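As a rough illustration of two of the microstructure features named above (order book imbalance and quote spread), here is a minimal sketch; the function names and signatures are ours, not OpTrade's actual API:

```python
# Illustrative option microstructure features (hypothetical helpers,
# not OpTrade's real interface).

def order_book_imbalance(bid_size: float, ask_size: float) -> float:
    """Imbalance in [-1, 1]: positive when bid depth dominates ask depth."""
    total = bid_size + ask_size
    return 0.0 if total == 0 else (bid_size - ask_size) / total

def quote_spread(bid: float, ask: float) -> float:
    """Bid-ask spread normalized by the midpoint price."""
    mid = (bid + ask) / 2
    return (ask - bid) / mid

print(order_book_imbalance(300, 100))        # 0.5 (bids dominate)
print(round(quote_spread(1.90, 2.10), 3))    # 0.1 (10% relative spread)
```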
EMForecaster: A Deep Learning Framework for Time Series Forecasting in Wireless Networks with Distribution-Free Uncertainty Quantification
Xavier Mootoo, Hina Tabassum, Luca Chiaraviglio
IEEE Transactions on Network Science and Engineering (under review)
EMForecaster is a deep learning architecture for electromagnetic field exposure forecasting that processes multi-scale temporal patterns through patching, reversible instance normalization, and dimension mixing. It incorporates conformal prediction for uncertainty quantification with a guaranteed coverage rate of 1−α and introduces a "Trade-off Score" metric to balance forecast trustworthiness against prediction interval width. Performance significantly exceeds current solutions: 53.97% better than the Transformer architecture and 38.44% better than the average baseline in point forecasting, while achieving the best balance between prediction interval width and coverage in conformal forecasting, with a 24.73% improvement over the average baseline and 49.17% over the Transformer.
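The coverage guarantee of 1−α comes from the standard split conformal recipe. Below is a minimal sketch of that general technique, not EMForecaster's exact implementation: calibrate a symmetric interval half-width from held-out absolute residuals.

```python
import numpy as np

# Split conformal prediction sketch (generic technique, assumed details):
# pick the finite-sample-corrected quantile of calibration residuals so a
# new interval [y_hat - w, y_hat + w] covers the truth with rate >= 1 - alpha.

def conformal_halfwidth(residuals: np.ndarray, alpha: float) -> float:
    n = len(residuals)
    # Finite-sample correction: quantile level ceil((n+1)(1-alpha)) / n
    q = np.ceil((n + 1) * (1 - alpha)) / n
    return float(np.quantile(np.abs(residuals), min(q, 1.0)))

rng = np.random.default_rng(0)
cal_residuals = rng.normal(0, 1, 1000)   # toy calibration residuals
w = conformal_halfwidth(cal_residuals, alpha=0.1)
# Interval for a new forecast y_hat: [y_hat - w, y_hat + w]
```

For standard-normal residuals at α = 0.1, the half-width lands near the 90th percentile of |N(0, 1)|, about 1.65.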
Stochastic Sparse Sampling: A Framework for Variable-Length Medical Time Series Classification
Xavier Mootoo, Alan A. Díaz-Montiel, Milad Lankarany, Hina Tabassum
Full Version: IEEE Transactions on Neural Networks and Learning Systems (under review)
Workshop: NeurIPS 2024 Workshop on Time Series in the Age of Large Models
Stochastic Sparse Sampling (SSS) is a novel framework for variable-length time series classification, designed for medical applications where sequence length varies across patients and events. SSS handles variable-length sequences by sparsely sampling fixed-length windows to generate local predictions, which are then aggregated and calibrated into a global prediction. We benchmark SSS on seizure onset zone localization, identifying seizure-inducing brain regions from variable-length electrophysiological data. Our experiments on the Epilepsy iEEG Multicenter Dataset show that SSS outperforms state-of-the-art baselines across most centers and on all out-of-distribution (unseen) centers, while providing valuable post-hoc insights through visualization of temporally averaged local predictions.
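The sample-then-aggregate idea can be sketched in a few lines. This is our simplification with invented names, not the paper's actual pipeline (which also calibrates the aggregate):

```python
import random

# Toy sketch of sparse window sampling + aggregation (hypothetical API):
# draw fixed-length windows from a variable-length series, score each with
# a local model, and average into one global prediction.

def sss_predict(series, window_len, n_windows, local_model, seed=0):
    rng = random.Random(seed)
    max_start = len(series) - window_len
    starts = [rng.randint(0, max_start) for _ in range(n_windows)]
    local_preds = [local_model(series[s:s + window_len]) for s in starts]
    return sum(local_preds) / len(local_preds)  # mean aggregation

# Toy "local model": score is just the window's mean amplitude.
series = [0.1] * 50 + [0.9] * 50   # variable-length input; second half active
p = sss_predict(series, window_len=10, n_windows=20,
                local_model=lambda w: sum(w) / len(w))
```

Because windows are sampled uniformly, the aggregate here falls between the scores of the quiet and active halves, regardless of the input's length.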
T-VICReg: Self-Supervised Learning for Time Series with Partial Temporal Invariance, Noncontrastive and Augmentation-Free
Xavier Mootoo, Alan A. Díaz-Montiel, Milad Lankarany
T-CAIREM Machine Learning Internship 2023, University of Toronto
We propose Temporal Variance-Invariance-Covariance Regularization (T-VICReg), a self-supervised learning method for time series that enables partial invariance to temporal translations. By integrating past and future representations into current-time representations, T-VICReg improves state transition detection across various tasks. On iEEG signals from epilepsy patients in OpenNeuro dataset ds003029, our method achieved 92.92% and 89.26% Top-1 accuracy in binary and multiclass seizure detection, outperforming supervised baselines (89.23% and 84.07%, respectively). T-VICReg operates without contrastive pairs or augmentations and works with both continuous and discrete time series.
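For intuition, here are the three VICReg-style terms the method builds on, in a back-of-the-envelope NumPy form (our simplification, not the paper's exact formulation); z_a and z_b are embedding batches from two temporally shifted views:

```python
import numpy as np

# Sketch of variance / invariance / covariance regularization terms
# (simplified from VICReg; assumed constants and weighting omitted).

def vicreg_terms(z_a, z_b, eps=1e-4):
    # Invariance: the two views' embeddings should agree.
    inv = np.mean((z_a - z_b) ** 2)
    # Variance: push each embedding dimension's std toward at least 1
    # (hinge), preventing collapse to a constant.
    std_a = np.sqrt(z_a.var(axis=0) + eps)
    std_b = np.sqrt(z_b.var(axis=0) + eps)
    var = np.mean(np.maximum(0, 1 - std_a)) + np.mean(np.maximum(0, 1 - std_b))
    # Covariance: penalize off-diagonal covariance to decorrelate dimensions.
    def off_diag_cov(z):
        z = z - z.mean(axis=0)
        c = (z.T @ z) / (len(z) - 1)
        return (c ** 2).sum() - (np.diag(c) ** 2).sum()
    cov = off_diag_cov(z_a) / z_a.shape[1] + off_diag_cov(z_b) / z_b.shape[1]
    return inv, var, cov

rng = np.random.default_rng(1)
z = rng.normal(size=(64, 8))
inv, var, cov = vicreg_terms(z, z)   # identical views: invariance term is 0
```

The "partial" temporal invariance in T-VICReg comes from which views are paired (past/future versus current-time representations), not from changing these terms.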
QoS-Aware Deep Unsupervised Learning for STAR-RIS Assisted Networks: A Novel Differentiable Projection Framework
Mehrazin Alizadeh, Xavier Mootoo, Hina Tabassum
IEEE Wireless Communication Letters 2024
We present a novel unsupervised learning framework for solving constrained optimization problems with non-convex constraints, featuring a differentiable projection function that maps network outputs onto the feasible solution space with zero constraint violation. Our approach combines a custom neural network with this projection function and unsupervised training to optimize the primary objective. Evaluated on wireless network optimization, our method outperforms genetic algorithms and existing projection-based approaches in solution quality, computational efficiency, and convergence speed while maintaining perfect constraint satisfaction. This generalizable framework provides a promising direction for complex optimization problems where traditional methods struggle with constraint enforcement and solution quality.
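To make the differentiable-projection idea concrete, here is a hypothetical sketch for one simple feasible set (a total power budget with nonnegative allocations), built only from operations with well-defined gradients; this is not the paper's actual projection operator:

```python
import numpy as np

# Illustrative differentiable projection (assumed example, not the paper's):
# map raw network outputs onto {p : p >= 0, sum(p) <= P_max} so the
# constraint is satisfied by construction, with zero violation.

def project_power(raw: np.ndarray, p_max: float) -> np.ndarray:
    p = np.logaddexp(0.0, raw)                 # softplus: enforce p >= 0
    total = p.sum()
    scale = p_max / np.maximum(total, p_max)   # rescale only if over budget
    return p * scale

p = project_power(np.array([2.0, -1.0, 0.5]), p_max=1.0)
```

Since the projection is applied inside the forward pass, the loss can be the (negated) primary objective alone, and gradients flow through the projection during unsupervised training, so no penalty term or post-hoc repair step is needed.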