Efficient inference for large and high-frequency data

Statistical decision theory deals with the problem of constructing an optimal decision for a given statistical experiment.
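
In the standard formulation (the notation below is introduced here purely for illustration), an experiment generated by observations $X^{(n)}$ is the triple

$$\mathcal{E}_n = \bigl(\mathcal{X}^{(n)}, \mathcal{A}^{(n)}, \{P_\theta^{(n)},\ \theta \in \Theta\}\bigr),$$

and a decision rule $\delta_n$ (an estimator, a test, a confidence region) is evaluated through its risk

$$R_n(\theta, \delta_n) = \mathbf{E}_\theta^{(n)}\, \ell\bigl(\delta_n(X^{(n)}), \theta\bigr)$$

for a chosen loss function $\ell$; an optimal decision minimizes this risk, for instance in the Bayesian or minimax sense.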

The theory of statistical experiments deals with the convergence of statistical experiments and with the construction of asymptotically optimal decisions.
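
A standard notion in this theory (recalled here for orientation, not as the project's specific definition) is weak convergence of experiments: $\mathcal{E}_n \to \mathcal{E}$ when, for every finite set of parameters $\theta_0, \theta_1, \dots, \theta_k$, the law under $P_{\theta_0}^{(n)}$ of the likelihood ratios

$$\left(\frac{dP_{\theta_1}^{(n)}}{dP_{\theta_0}^{(n)}}, \dots, \frac{dP_{\theta_k}^{(n)}}{dP_{\theta_0}^{(n)}}\right)$$

converges to the law of the corresponding likelihood ratios in the limit experiment $\mathcal{E}$.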

Since convergence (in a reasonable sense) cannot be expected for the initial sequence of statistical experiments, localized statistical experiments are built, mimicking the centering and renormalization used in the central limit theorem.
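
Concretely (with notation introduced here for illustration), the parameter is rewritten around a fixed value $\theta_0$ as

$$\theta = \theta_0 + \varphi_n(\theta_0)\, u, \qquad u \in \varphi_n(\theta_0)^{-1}\,(\Theta - \theta_0),$$

where the local rate $\varphi_n(\theta_0) \to 0$ plays the role of the $1/\sqrt{n}$ normalization of the central limit theorem; for regular i.i.d. models one may take $\varphi_n(\theta_0) = (n\, I(\theta_0))^{-1/2}$ with $I(\theta_0)$ the Fisher information.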

The convergence of this sequence of localized statistical experiments to a “simple” canonical experiment, for which the optimal decision can be defined, makes it possible to determine the optimal decision in the localized experiments once the sample size is sufficiently large.
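
The prototypical case is local asymptotic normality, recalled here in dimension one only as an illustration:

$$\log \frac{dP_{\theta_0 + \varphi_n(\theta_0)\, u}^{(n)}}{dP_{\theta_0}^{(n)}}\bigl(X^{(n)}\bigr) = u\, \Delta_n(\theta_0) - \frac{u^2}{2} + r_n(\theta_0, u),$$

with $\Delta_n(\theta_0) \Rightarrow \mathcal{N}(0,1)$ and $r_n(\theta_0, u) \to 0$ in probability. The localized experiments then converge to the Gaussian shift experiment $\{\mathcal{N}(u, 1),\ u \in \mathbb{R}\}$, where the optimal decision is explicit and Hájek-Le Cam type lower bounds on the risk become available.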

Moreover, a “global” optimal decision can then be built in the corresponding initial statistical experiment.
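
One classical route back from the local to the global level, sketched here in a generic i.i.d. form (it underlies Le Cam's one-step procedure and the OneStep package listed below), corrects a preliminary rate-consistent but possibly inefficient estimator $\bar{\theta}_n$ by a single scoring step:

$$\widehat{\theta}_n = \bar{\theta}_n + \frac{1}{n\, I(\bar{\theta}_n)} \sum_{i=1}^{n} \frac{\partial}{\partial \theta} \log f\bigl(\bar{\theta}_n, X_i\bigr),$$

where $f(\theta, \cdot)$ denotes the density of the observations and $I(\theta)$ the Fisher information; under regularity conditions $\widehat{\theta}_n$ is asymptotically efficient while remaining explicit and cheap to compute on large samples.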

The project aims to improve the knowledge of asymptotic efficiency and to provide new and innovative efficient estimators and testing procedures for the large and high-frequency datasets encountered in real applications.

Conferences, seminars and summer schools

2022

Seminar: ANR EFFI France-Japan seminar, April 5, 2022

Le Mans Université / University of Tokyo / Online

Conference: Advances in time series, November 15, 2022

Le Mans Université

Seminar: ANR EFFI France-Japan seminar, November 21, 2022

Le Mans Université / Online

2023

Seminar: ANR EFFI Japan-France seminar, June 6, 2023

Le Mans Université / University of Tokyo / Online

Minisymposium: Efficient inference for large and high-frequency data, August 20-25, 2023

ICIAM 2023, 10th International Congress on Industrial and Applied Mathematics, Waseda University

Waseda University

Seminar: ANR EFFI Japan-France seminar, December 5, 2023

Le Mans Université / Online

2024

Spring school: EFFI Spring School, May 27-31, 2024

Le Mans Université

R software packages

  • OneStep (Le Cam's one-step estimation procedure)
  • Yuima (simulation and inference for stochastic differential equations)
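
A minimal usage sketch for OneStep, assuming its CRAN interface onestep(data, distr) and that the Weibull family is among the supported distributions (the sample below is purely illustrative):

library(OneStep)   # Le Cam one-step estimation

set.seed(1)
x <- rweibull(1e6, shape = 2, scale = 1.5)   # large synthetic i.i.d. sample

# Fast explicit initial guess corrected by a single Fisher-scoring step;
# asymptotically as efficient as the MLE but much cheaper on large samples.
fit <- onestep(x, "weibull")
fit$estimate   # fitted parameters (the returned object mimics fitdistrplus::fitdist)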

Books

Kutoyants, Yu. A., Introduction to the Statistics of Poisson Processes and Applications (2023).

Articles

Kutoyants, Y.

Volatility estimation of hidden Markov processes and adaptive filtration, Stochastic Processes and their Applications, 173

Dombry, C. and Esstafa, Y.

The vanishing learning rate asymptotic for linear $L^2$-boosting, ESAIM PS, forthcoming.

Ben-Hariz, S., Brouste, A., Cai, C. and Soltane, M.

Fast and asymptotically efficient estimation in an autoregressive process with fractional type noise, Journal of Statistical Planning and Inference, forthcoming.

Brouste, A., Dutang, C., Hovsepyan, L. and Rohmer, T.

One-step closed form estimator for generalized linear model with categorical explanatory variables, Statistics and Computing, 33(138) (2023)

Badreau, M. and Proia, F.

Consistency and asymptotic normality in a class of nearly unstable processes, Statistical Inference for Stochastic Processes, 26, 619–641 (2023).

Chigansky, P. and Kleptsyna, M.

Estimation of the Hurst parameter from continuous noisy data, Electronic Journal of Statistics, 17(2), 2343-2385 (2023)

Masuda, H., Mercuri, L. and Uehara, Y.

Noise inference for ergodic Lévy driven SDE, Electronic Journal of Statistics, 16(1), 2432–2474 (2022).

Afterman, D., Chigansky, P., Kleptsyna, M. and Marushkevych, D.

Linear filtering with fractional noises: large time and small noise asymptotics, SIAM Journal on Control and Optimization, 60(3), 1463-1487 (2022).

Brouste, A., Dutang, C. and Rohmer, T.

A closed-form alternative estimator for GLM with categorical explanatory variables, Communications in Statistics – Simulation and Computation (2022)

Bayer, C., Fukasawa, M. and Nakahara, S.

On the weak convergence rate in the discretization of rough volatility models,
SIAM J. Finan. Math., 13, 66-73 (2022)

Chernoyarov, O., Dabye, A., Diop, F., Kutoyants, Y.

Non asymptotic expansions of the MME in the case of Poisson observations,
Metrika, 85, 927-950 (2022).

Ben-Hariz, S., Brouste, A., Esstafa, Y. and Soltane, M.

Fast calibration of weak FARIMA models, ESAIM PS, 27, 156-173 (2023)

Brouste, A. and Farinetto, C.

Fast and efficient estimation in the Hawkes processes, Japanese Journal of Statistics and Data Science (2023).

Clément, E.

Hellinger and total variation distance in approximating Lévy driven SDEs, Annals of Applied Probability, forthcoming (2023)

Offered positions

Contact: Alexandre Brouste, Scientific Coordinator, alexandre.brouste@univ-lemans.fr