Statistical decision theory deals with the problem of constructing an optimal decision for a given statistical experiment. The theory of statistical experiments deals with the convergence of statistical experiments and the construction of asymptotically optimal decisions. Since convergence (in a reasonable sense) of the initial sequence of statistical experiments cannot be expected, localized statistical experiments are constructed (mimicking the centering and renormalization in the central limit theorem). The convergence of the sequence of localized statistical experiments to a "simple" canonical experiment, for which the optimal decision can be defined, makes it possible to define the optimal decision in the localized statistical experiments for a sufficiently large sample size. Moreover, a "global" optimal decision can then be built in the corresponding initial statistical experiment.

The project aims to improve knowledge of asymptotic efficiency and to provide new and innovative efficient estimators and testing procedures for the large and high-frequency datasets encountered in real applications.
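The one-step procedure that recurs throughout the project's publications can be illustrated on a toy example. The sketch below (an illustration of the general Le Cam one-step idea, not code from the project) estimates the location of a Cauchy sample: it starts from the sample median, a sqrt(n)-consistent but inefficient initial estimator, and applies a single Fisher-scoring step, which suffices to attain asymptotic efficiency without iterating the full maximum-likelihood optimization.

```python
import numpy as np

def one_step_cauchy_location(x):
    """Le Cam one-step estimator for the location of a Cauchy sample.

    Starts from the sample median (a sqrt(n)-consistent initial
    estimator) and performs a single Fisher-scoring step.
    """
    theta0 = np.median(x)                    # initial sqrt(n)-consistent guess
    r = x - theta0
    score = np.mean(2.0 * r / (1.0 + r**2))  # mean score of the Cauchy log-likelihood at theta0
    fisher = 0.5                             # Fisher information of the Cauchy location model
    return theta0 + score / fisher           # one Newton / Fisher-scoring step

rng = np.random.default_rng(0)
sample = rng.standard_cauchy(10_000) + 3.0   # true location parameter = 3
print(one_step_cauchy_location(sample))      # close to the true value 3
```

The computational appeal, which the project exploits in far richer models, is that the single correction step costs one pass over the data, while matching the asymptotic variance of the maximum likelihood estimator.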
2022
Seminar: ANR EFFI France-Japan seminar, April 5, 2022
Le Mans Université / University of Tokyo / Online
2023
Seminar: ANR EFFI Japan-France seminar, June 6, 2023
Le Mans Université / University of Tokyo / Online
2024
2025
Brouste, A., Dutang, C., Hovsepyan, L. and Rohmer, T.
Fast inference in copula models with categorical explanatory variables using the one-step procedure, Computational Statistics, 41 (1) (2026).
Bayraktar, E. and Clément, E.
Volatility and jump activity estimation in a stable Cox-Ingersoll-Ross model, Bernoulli, 32(2) (2026).
Brouste, A., Esstafa, Y. and Popier, A.
Esstafa, Y., Kokonendji, C. and Ngo, T.
Asymptotic properties of continuous associated-kernel density estimators, Communications in Statistics – Theory and Methods, (2025).
Brouste, A., Esstafa, Y. and Malique, C.
Fast and asymptotically efficient estimation for t and log(t) distributions, Communications in Statistics – Theory and Methods, (2025).
Brouste, A., Denis, L. and Ngo, T.
LAMN property for stable-Lévy SDEs with constant scale coefficient, Bernoulli, 31 (3) (2025).
Brouste, A., Hovsepyan, L. and Votsi, E.
One-step statistical estimation method for generalized linear models, Statistical Papers, 66 (2025).
Bayraktar, E. and Clément, E.
Estimation of a pure-jump Cox-Ingersoll-Ross process. Bernoulli, 31 (1), 484-508 (2025).
Meskini, W., Brouste, A. and Dugué, N.
Speeding up the Training of Neural Networks with the One-Step Procedure, Neural Processing Letters, 56 (3) (2024).
Kutoyants, Y.
Volatility estimation of hidden Markov processes and adaptive filtration, Stochastic Processes and their Applications, 173 (2024).
Dombry, C. and Esstafa, Y.
The vanishing learning rate asymptotic for linear $L^2$-boosting, ESAIM PS, 28, 227-257 (2024).
Ben-Hariz, S., Brouste, A., Cai, C. and Soltane, M.
Fast and asymptotically efficient estimation in an autoregressive process with fractional type noise, Journal of Statistical Planning and Inference, 232 (2024).
Badreau, M. and Proia, F.
Consistency and asymptotic normality in a class of nearly unstable processes, Statistical Inference for Stochastic Processes, 26, 619–641 (2023).
Chigansky, P. and Kleptsyna, M.
Estimation of the Hurst parameter from continuous noisy data, Electronic Journal of Statistics, 17(2), 2343-2385 (2023).
Ben-Hariz, S., Brouste, A., Esstafa, Y. and Soltane M.
Fast calibration of weak FARIMA models, ESAIM PS, 27, 156-173 (2023).
Brouste, A. and Farinetto, C.
Fast and efficient estimation in the Hawkes processes, Japanese Journal of Statistics and Data Science (2023).
Clément, E.
Hellinger distance in approximating Lévy driven SDEs and application to asymptotic equivalence of statistical experiment, Annals of Applied Probability, 33 (3), 2176-2209 (2023).
Afterman, D., Chigansky, P., Kleptsyna, M. and Marushkevych, D.
Linear filtering with fractional noises: large time and small noise asymptotics, SIAM Journal on Control and Optimization, 60(3), 1463-1487 (2022).
Masuda, H., Mercuri, L. and Uehara, Y.
Noise inference for ergodic Lévy driven SDE, Electronic Journal of Statistics, 16(1), 2432–2474 (2022).
Chernoyarov, O., Dabye, A., Diop, F. and Kutoyants, Y.
Non asymptotic expansions of the MME in the case of Poisson observations,
Metrika, 85, 927-950 (2022).
Bayer, C., Fukasawa, M. and Nakahara, S.
On the weak convergence rate in the discretization of rough volatility models,
SIAM Journal on Financial Mathematics, 13, 66-73 (2022).
Contact: Alexandre Brouste, Scientific Coordinator – alexandre.brouste@univ-lemans.fr