Dimitri A. Vilensky

(Pasechnyuk)

 

Biography

dmivilensky1@gmail.com

A Baltic native, city dweller of Saint Petersburg, student in France, expatriate in the Arab world.

I do research in mathematical programming, study topology, analysis, and type theory, and work as a software engineer.

Those who would like to know the details of my personal biography can refer to the photo album.


Employment

MBZUAI (United Arab Emirates)
March 2022 — present

MIPT (Russia)
December 2020 — December 2023

Supervisors

A.V. Gasnikov, A.Yu. Gornov,
Martin Takáč, Roland Hildebrand

Languages

• Russian (fluent), English (C1), French (A2)
• SystemVerilog, Assembly (RISC), C++ (20), Java (8 & Android SDK 28), Python, MATLAB, Haskell, Coq

Research topics

Second-order optimization and Bregman methods

Co-author: Dr. Dmitry Kamzolov

Supervisors: Prof. Martin Takáč, Prof. Fedor Stonyakin

Articles: [30], [27], [16]

Zeroth-order optimization

Supervisor: Prof. Alexander Gasnikov

Articles: [28]

Deep learning and topology

Co-author: Dr. Fedor Pavutnitskiy

Articles: [26]

Curriculum Vitæ

Internships

Euler IMI (Russia)
October 2021 — December 2022

JetBrains Research
March 2021 — March 2022

Affiliations

ISP RAS (Russia)
November 2021 — present

IITP RAS (Russia)
August 2021 — June 2024

Joint projects

Cerera Fund (Serbia)
July 2023 — present
Quantitative R&D

VTB (Russia)
August 2021 — present
article in TASS

Huawei
December 2020 — December 2023
led by Yanitsky and Vorobyov

University of Transport (Russia)
May 2022 — January 2023

Education

Université Grenoble Alpes
2023 — present, Bachelor's degree

MIPT (Russia)
2020 — 2021, Bachelor's degree

CSC (Russia)
2018 — 2022, supplementary training

PhML №239 (Russia)
2017 — 2020, high school

Service

Teaching

MIPT (autumn 2021). Master's course in Optimization Methods. Teaching assistant to Prof. Stonyakin

Sirius (summer 2021). Project camp for school students (about the project, slides)

Sirius (spring 2020). Popular-science lectures (video No. 1, video No. 2)

Reviewing

NeurIPS (2022 — 2024). Reviewer

ICML (2023 — 2024). Reviewer

ICLR (2024 — 2025). Reviewer

AISTATS (2025). Reviewer

OPTIMA (2023 — 2024). Program committee member

Journal of Inequalities and Applications (2023). Invited reviewer

Mathematical Programming (2024). Invited reviewer

Articles

2024

[31] Savchuk, O.S., Alkousa, M.S., Shushko, A.S., Vyguzov, A.A., Stonyakin, F.S., Pasechnyuk, D.A., Gasnikov, A.V. Accelerated Bregman gradient methods for relatively smooth and relatively Lipschitz continuous minimization problems. (2024) URL https://arxiv.org/abs/2411.16743

[30] Kamzolov, D.I., Pasechnyuk, D.A., Agafonov, A., Gasnikov, A.V., Takáč, M. OPTAMI: Global Superlinear Convergence of High-order Methods. (2024) URL https://arxiv.org/abs/2410.04083

[29] Pasechnyuk, D.A., Dvurechensky, P., Uribe, C.A., Gasnikov, A.V. Decentralized convex optimisation with probability-proportional-to-size quantization. (2024) URL TBA

[28] Pasechnyuk, D.A., Lobanov, A., Gasnikov, A. Upper bounds on the maximum admissible level of noise in zeroth-order optimisation. (2023) URL https://arxiv.org/abs/2306.16371

[27] Pasechnyuk, D.A., Gasnikov, A., Takáč, M. Convergence analysis of stochastic gradient descent with adaptive preconditioning for non-convex and convex functions. (2023) URL https://arxiv.org/abs/2308.14192

ICML A*

[26] Brilliantov, K., Pavutnitskiy, F.Yu., Pasechnyuk, D.A., Magai, G. Applying language models to algebraic topology: generating simplicial cycles using multi-labeling in Wu's formula. Proceedings of the 41st International Conference on Machine Learning, PMLR 235, 4542–4560 (2024) URL https://proceedings.mlr.press/v235/brilliantov24a.html

[25] Kubentayeva, M., Yarmoshik, D., Persiianov, M., Kroshnin, A., Kotliarova, E., Tupitsa, N., Pasechnyuk, D., Gasnikov, A., Shvetsov, V., Baryshev, L., Shurupov, A. Primal-Dual Gradient Methods for Searching Network Equilibria in Combined Models with Nested Choice Structure and Capacity Constraints. Computational Management Science: Network Analysis and Applications 21(1), 15 (2023) DOI 10.1007/s10287-023-00494-8

2023

[24] Ablaev, S.S., Stonyakin, F.S., Alkousa, M.S., Pasechnyuk, D.A. Adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Programming and Computer Software 49(6), 485–492 (2023) DOI 10.1134/S0361768823060026

[23] Aivazian, G.V., Stonyakin, F.S., Pasechnyuk, D.A., Alkousa, M.S., Raigorodskii, A.M., Baran I.V. Adaptive variant of the Frank–Wolfe algorithm for convex optimization problems. Programming and Computer Software 49(6), 493–504 (2023) DOI 10.1134/S0361768823060038

[22] Pasechnyuk, D.A., Persiianov, M., Dvurechensky, P., Gasnikov, A. Algorithms for Euclidean-regularised Optimal Transport. International Conference on Optimization and Applications. Lecture Notes in Computer Science 14395, 84–98. Springer, Cham (2023). DOI 10.1007/978-3-031-47859-8_7

[21] Pasechnyuk, D.A., Gornov, A.Yu. A randomised non-descent method for global optimisation. International Conference on Optimization and Applications. Communications in Computer and Information Science 1913, 3–14. Springer, Cham (2023). DOI 10.1007/978-3-031-48751-4_1

[20] Gladin, E.L., Kuruzov, I.A., Pasechnyuk, D.A., Stonyakin, F.S., Alkousa, M.S., Gasnikov, A.V. Solving strongly convex-concave composite saddle point problems with a small dimension of one of the variables. Matematicheskii Sbornik 214(3), 3–53 (2023) DOI 10.4213/sm9700

ICSE A*

[19] Pasechnyuk, D., Prazdnichnykh, A., Evtikhiev, M., Bryksin, T. Judging Adam: Studying Optimizer Performance On ML4SE Tasks. IEEE/ACM 45th International Conference on Software Engineering: New Ideas and Emerging Results, 117–122 (2023) DOI 10.1109/ICSE-NIER58687.2023.00027

[18] Beznosikov, A.N., Gasnikov, A.V., Zainullina, K.E., Maslovskiy, A.Yu., Pasechnyuk, D.A. A Unified Analysis of Variational Inequality Methods: Variance Reduction, Sampling, Quantization and Coordinate Descent. Computational Mathematics and Mathematical Physics 63(2), 189–217 (2023) DOI 10.1134/S0965542523020045

2022

[17] Pasechnyuk, D.A., Gasnikov, A., Takáč, M. Effects of momentum scaling for SGD. NeurIPS Workshop HOO (2022) URL https://order-up-ml.github.io/papers/17.pdf

NeurIPS A*

[16] Hanzely, S., Kamzolov, D., Pasechnyuk, D., Gasnikov, A., Richtárik, P., Takáč, M. A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate. Advances in Neural Information Processing Systems 35, 25320–25334 (2022) URL https://proceedings.neurips.cc/paper_files/paper/2022/hash/a1f0c0cd6caaa4863af5f12608edf63e-Abstract-Conference.html

JOTA Q1

[15] Ivanova, A., Dvurechensky, P., Vorontsova, E., Pasechnyuk, D., Gasnikov, A., Dvinskikh, D., Tyurin, A. Oracle Complexity Separation in Convex Optimization. Journal of Optimization Theory and Applications (2022). DOI 10.1007/s10957-022-02038-7

[14] Alpatov, A.V., Peters, E.A., Pasechnyuk, D.A., Raigorodskii, A.M. Stochastic optimization in digital pre-distortion of the signal. Computer Research and Modeling 14(2), 399–416 (2022) DOI 10.20537/2076-7633-2022-14-2-399-416

[13] Pasechnyuk, D.A., Anikin, A.S., Matyukhin, V.V. Accelerated Proximal Envelopes: Application to the Coordinate Descent Method. Computational Mathematics and Mathematical Physics 62(2), 342–352 (2022) DOI 10.1134/S0965542522020038

2021

[12] Pasechnyuk, D., Raigorodskii, A. Network utility maximization by updating individual transmission rates. International Conference on Optimization and Applications. Communications in Computer and Information Science 1514, 184–198. Springer, Cham (2021). DOI 10.1007/978-3-030-92711-0_13

[11] Pasechnyuk, D., Dvurechensky, P., Omelchenko, S., Gasnikov, A. Stochastic optimization for dynamic pricing. International Conference on Optimization and Applications. Communications in Computer and Information Science 1514, 82–94. Springer, Cham (2021). DOI 10.1007/978-3-030-92711-0_6

[10] Pasechnyuk, D., Matyukhin, V. On the Computational Efficiency of Catalyst Accelerated Coordinate Descent. International Conference on Mathematical Optimization Theory and Operations Research. Lecture Notes in Computer Science 12755, 176–191. Springer, Cham (2021). DOI 10.1007/978-3-030-77876-7_12

[9] Maslovskiy, A., Pasechnyuk, D., Gasnikov, A., Anikin, A., Rogozin, A., Gornov, A., Vorobyev, A., Antonov, L., Vlasov, R., Nikolaeva, A., Yanitskiy, E., Begicheva, M. Non-convex optimization in digital pre-distortion of the signal. International Conference on Mathematical Optimization Theory and Operations Research. Communications in Computer and Information Science 1476, 54–70. Springer, Cham (2021). DOI 10.1007/978-3-030-86433-0_4

[8] Gasnikov, A.V., Dvinskikh, D.M., Dvurechensky, P.E., Kamzolov, D.I., Matyukhin, V.V., Pasechnyuk, D.A., Tupitsa, N.K., Chernov, A.V. Accelerated meta-algorithm for convex optimization. Computational Mathematics and Mathematical Physics 61(1), 17–28 (2021) DOI 10.1134/S096554252101005X

[7] Ivanova, A., Pasechnyuk, D., Grishchenko, D., Shulgin, E., Gasnikov, A., Matyukhin, V. Adaptive catalyst for smooth convex optimization. International Conference on Optimization and Applications. Lecture Notes in Computer Science 13078, 20–37. Springer, Cham (2021). DOI 10.1007/978-3-030-91059-4_2

[6] Ivanova, A.S., Pasechnyuk, D.A., Dvurechensky, P.E., Gasnikov, A.V., Vorontsova, E.A. Numerical methods for the resource allocation problem in networks. Computational Mathematics and Mathematical Physics 61(2), 312–345 (2021) DOI 10.1134/S0965542521020135

OMS Q1

[5] Stonyakin, F., Tyurin, A., Gasnikov, A., Dvurechensky, P., Agafonov, A., Dvinskikh, D., Alkousa, M., Pasechnyuk, D., Artamonov S., Piskunova, V. Inexact model: A framework for optimization and variational inequalities. Optimization Methods and Software, 1–47 (2021). DOI 10.1080/10556788.2021.1924714

2020

[4] Ivanova, A., Stonyakin, F., Pasechnyuk, D., Vorontsova, E., Gasnikov, A. Adaptive Mirror Descent for the Network Utility Maximization Problem. IFAC-PapersOnLine 53(2), 7851–7856 (2020). DOI 10.1016/j.ifacol.2020.12.1958

2019

[3] Pasechnyuk, D.A., Stonyakin, F.S. One method for minimization a convex Lipschitz-continuous function of two variables on a fixed square. Computer Research and Modeling 11(3), 379–395 (2019) DOI 10.20537/2076-7633-2019-11-3-379-395

[2] Stonyakin, F., Dvinskikh, D., Dvurechensky, P., Kroshnin, A., Kuznetsova, O., Agafonov, A., Gasnikov, A., Tyurin, A., Uribe, C., Pasechnyuk, D., Artamonov, S. Gradient methods for problems with inexact model of the objective. International Conference on Mathematical Optimization Theory and Operations Research. Lecture Notes in Computer Science 11548, 97–114. Springer, Cham (2019). DOI 10.1007/978-3-030-22629-9_8

[1] Pasechnyuk D.A. Scheduling strategies for resource allocation in a cellular base station. Proceedings of MIPT 11(2), 38–48 (2019) URL https://mipt.ru/upload/medialibrary/f16/4_trudy-mfti-_2_42_38_48.pdf

Conference talks

2024

Talk: Applying language models to algebraic topology: generating simplicial cycles using multi-labeling in Wu's formula. International Conference on Machine Learning, Vienna (2024). URL https://icml.cc/virtual/2024/poster/33428

2023

Talks: A randomised non-descent method for global optimisation and Algorithms for Euclidean-regularised Optimal Transport. International Conference Optimization and Applications (2023). URL http://agora.guru.ru/display.php

2022

Talk: Effects of momentum scaling for SGD. NeurIPS Workshop "Order up! The Benefits of Higher-Order Optimization in Machine Learning" (2022). URL https://order-up-ml.github.io/papers/

Talk: A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate. Ivannikov ISP RAS Open Conference (2022). URL https://www.isprasopen.ru/#Agenda

2021

Talk: "Solar" method: a two-level algorithm in a one-level optimization problem. Lyapunov Readings, ISDCT SB RAS (2021). URL https://ris.icc.ru/publications/6158

Lecture: Optimization in SE. JetBrains Research, ML4SE lab (2021). URL https://youtu.be/bzGWnhb8_-8

Talks: Stochastic optimization for dynamic pricing, Network utility maximization by updating individual transmission rates, and Adaptive catalyst for smooth convex optimization. International Conference Optimization and Applications (2021). URL http://agora.guru.ru/display.php

Talks: Network utility maximization: optimization & protocols and Full-gradient based methods for modeling nonlinear systems in digital signal processing. Optimization without borders (2021). URL http://dmivilensky.ru/opt_without_borders/

Talk: On the Computational Efficiency of Catalyst Accelerated Coordinate Descent. International Conference Mathematical Optimization Theory and Operations Research (2021). URL https://easychair.org/smart-program/MOTOR2021/

2020

Talk: Universal Accelerated Proximal Envelopes. 63rd MIPT Scientific Conference (2020). URL https://conf.mipt.ru//folders/attachment/2882/download

Talk: Accelerated Proximal Envelopes: Application to the Coordinate Descent Method. Summer school Modern Methods of Information Theory, Optimization and Control, Sirius University (2020). URL https://sochisirius.ru/obuchenie/graduates/smena673/3259

2019

Talk: On Inexactness for Yu.E. Nesterov Method for a Convex Minimization on a Fixed Square. Traditional Youth School Control, Information, Optimization (2019). URL https://ssopt.org/2019

Talk: On Inexactness for Yu.E. Nesterov Method for a Convex Minimization on a Fixed Square. Conference on graphs, networks, and their applications, MIPT (2019). URL http://ru.discrete-mathematics.org/conferences/201905/workshop_graphs/schedule_workshop_network_optimization.pdf

2018

Talk: Throughput maximization for flows with fine input structure. Workshop Optimization at Work, MIPT (2018). URL http://www.mathnet.ru/php/conference.phtml?eventID=1&confid=1259

2017

Talk: Index Strategies for Routing Base Station Traffic. 60th MIPT Scientific Conference (2017). URL https://abitu.net/public/admin/mipt-conference/FPMI.pdf