Dmitrii A. Vilensky

(Pasechnyuk)

 

Bio

dmivilensky1@gmail.com

I was born in 2004 in the Baltic region, grew up in St. Petersburg, and now study in Grenoble.

I do research in the field of mathematical programming, study simplicial homotopy theory and homological algebra, and also work as a software engineer.

Besides work, I am fond of poetry, ballet, music, and collecting.

Those who want to know the details of my personal biography can refer to the photo album.


Positions

MBZUAI (UAE)
March 2022 — now

MIPT (Russia)
December 2020 — December 2023

Supervisors

A.V. Gasnikov, A.Yu. Gornov,
Martin Takáč, Roland Hildebrand

Languages

Russian (fluent), English (C1), French (A2)
Programming: SystemVerilog, Assembler, C++, Java, Python, MATLAB, Haskell, Coq

Research topics

Second-order optimisation methods and preconditioning

Co-author: Dr. Dmitry Kamzolov

Supervisor: Prof. Martin Takáč

Papers: [16], [17], [28]

Zeroth-order optimisation

Supervisor: Prof. Alexander Gasnikov

Papers: [26]

Application of ML to algebra and topology

Co-author: Dr. Fedor Pavutnitskiy

Papers: [27]

Curriculum Vitæ

Internships

Euler IMI (Russia)
October 2021 — December 2022

JetBrains Research
March 2021 — March 2022

Affiliations

ISP RAS (Russia)
November 2021 — now

IITP RAS (Russia)
August 2021 — now

Joint projects

Cerera Fund (Serbia)
July 2023 — now
quantitative R&D

Huawei
December 2020 — now
led by E. Yanitskiy and A. Vorobyev

VTB (Russia)
August 2021 — now
article in TASS

RUT MIIT (Russia)
May 2022 — January 2023

Education

Grenoble-Alpes (France)
2023 — now, BSc student

MIPT (Russia)
2020 — 2021, BSc student

CSC (Russia)
2018 — 2022, CS and SE student

PhML №239 (Russia)
2017 — 2020, school student

Social activity

Teaching

MIPT, autumn 2021. Master's course in Optimisation Methods. Teaching assistant for Prof. Stonyakin.

Sirius, summer 2021. School project camp. Mentor of a project with schoolchildren: about the project, slides.

Sirius, spring 2020. Popular science lectures: lecture no. 1, lecture no. 2.

Reviewing

• Conference NeurIPS '22, '23. Reviewer.

• Conference ICML '23, '24. Reviewer.

• Conference ICLR '24. Reviewer.

• Conference OPTIMA '23. Program committee member.

• Journal of Inequalities and Applications. Reviewer since 2023.

Papers

2024

[28] Pasechnyuk, D.A., Gasnikov, A., Takáč, M. Convergence analysis of stochastic gradient descent with adaptive preconditioning for non-convex and convex functions. (2023) URL https://arxiv.org/abs/2308.14192

[27] Brilliantov, K., Pavutnitskiy, F.Yu., Pasechnyuk, D.A., Magai, G. Applying language models to algebraic topology: generating simplicial cycles using multi-labeling in Wu's formula. (2023) URL http://arxiv.org/abs/2306.16951

[26] Pasechnyuk, D.A., Lobanov, A., Gasnikov, A. Upper bounds on the maximum admissible level of noise in zeroth-order optimisation. (2023) URL https://arxiv.org/abs/2306.16371

[25] Kubentayeva, M., Yarmoshik, D., Persiianov, M., Kroshnin, A., Kotliarova, E., Tupitsa, N., Pasechnyuk, D., Gasnikov, A., Shvetsov, V., Baryshev, L., Shurupov, A. Primal-Dual Gradient Methods for Searching Network Equilibria in Combined Models with Nested Choice Structure and Capacity Constraints. Computational Management Science: Network Analysis and Applications 21(1), 15 (2024) DOI 10.1007/s10287-023-00494-8

2023

[24] Ablaev, S.S., Stonyakin, F.S., Alkousa, M.S., Pasechnyuk, D.A. Adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Programming and Computer Software 49(6), 485–492 (2023) DOI 10.1134/S0361768823060026

[23] Aivazian, G.V., Stonyakin, F.S., Pasechnyuk, D.A., Alkousa, M.S., Raigorodskii, A.M., Baran, I.V. Adaptive variant of the Frank–Wolfe algorithm for convex optimization problems. Programming and Computer Software 49(6), 493–504 (2023) DOI 10.1134/S0361768823060038

[22] Pasechnyuk, D.A., Persiianov, M., Dvurechensky, P., Gasnikov, A. Algorithms for Euclidean-regularised Optimal Transport. International Conference on Optimization and Applications. Lecture Notes in Computer Science 14395, 84–98. Springer, Cham (2023). DOI 10.1007/978-3-031-47859-8_7

[21] Pasechnyuk, D.A., Gornov, A.Yu. A randomised non-descent method for global optimisation. International Conference on Optimization and Applications. Communications in Computer and Information Science 1913, 3–14. Springer, Cham (2023). DOI 10.1007/978-3-031-48751-4_1

[20] Gladin, E.L., Kuruzov, I.A., Pasechnyuk, D.A., Stonyakin, F.S., Alkousa, M.S., Gasnikov, A.V. Solving strongly convex-concave composite saddle point problems with a small dimension of one of the variables. Matematicheskii Sbornik 214(3), 3–53 (2023) DOI 10.4213/sm9700

ICSE A*

[19] Pasechnyuk, D., Prazdnichnykh, A., Evtikhiev, M., Bryksin, T. Judging Adam: Studying Optimizer Performance On ML4SE Tasks. IEEE/ACM 45th International Conference on Software Engineering: New Ideas and Emerging Results, 117–122 (2023) DOI 10.1109/ICSE-NIER58687.2023.00027

[18] Beznosikov, A.N., Gasnikov, A.V., Zainullina, K.E., Maslovskiy, A.Yu., Pasechnyuk, D.A. A Unified Analysis of Variational Inequality Methods: Variance Reduction, Sampling, Quantization and Coordinate Descent. Computational Mathematics and Mathematical Physics 63(2), 189–217 (2023) DOI 10.1134/S0965542523020045

2022

[17] Pasechnyuk, D.A., Gasnikov, A., Takáč, M. Effects of momentum scaling for SGD. NeurIPS Workshop HOO (2022) URL https://order-up-ml.github.io/papers/17.pdf

NeurIPS A*

[16] Hanzely, S., Kamzolov, D., Pasechnyuk, D., Gasnikov, A., Richtárik, P., Takáč, M. A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate. Advances in Neural Information Processing Systems 35, 25320–25334 (2022) URL https://proceedings.neurips.cc/paper_files/paper/2022/hash/a1f0c0cd6caaa4863af5f12608edf63e-Abstract-Conference.html

JOTA Q1

[15] Ivanova, A., Dvurechensky, P., Vorontsova, E., Pasechnyuk, D., Gasnikov, A., Dvinskikh, D., Tyurin, A. Oracle Complexity Separation in Convex Optimization. Journal of Optimization Theory and Applications (2022). DOI 10.1007/s10957-022-02038-7

[14] Alpatov, A.V., Peters, E.A., Pasechnyuk, D.A., Raigorodskii, A.M. Stochastic optimization in digital pre-distortion of the signal. Computer Research and Modeling 14(2), 399–416 (2022) DOI 10.20537/2076-7633-2022-14-2-399-416

[13] Pasechnyuk, D.A., Anikin, A.S., Matyukhin, V.V. Accelerated Proximal Envelopes: Application to the Coordinate Descent Method. Computational Mathematics and Mathematical Physics 62(2), 342–352 (2022) DOI 10.1134/S0965542522020038

2021

[12] Pasechnyuk, D., Raigorodskii, A. Network utility maximization by updating individual transmission rates. International Conference on Optimization and Applications. Communications in Computer and Information Science 1514, 184–198. Springer, Cham (2021). DOI 10.1007/978-3-030-92711-0_13

[11] Pasechnyuk, D., Dvurechensky, P., Omelchenko, S., Gasnikov, A. Stochastic optimization for dynamic pricing. International Conference on Optimization and Applications. Communications in Computer and Information Science 1514, 82–94. Springer, Cham (2021). DOI 10.1007/978-3-030-92711-0_6

[10] Pasechnyuk, D., Matyukhin, V. On the Computational Efficiency of Catalyst Accelerated Coordinate Descent. International Conference on Mathematical Optimization Theory and Operations Research. Lecture Notes in Computer Science 12755, 176–191. Springer, Cham (2021). DOI 10.1007/978-3-030-77876-7_12

[9] Maslovskiy, A., Pasechnyuk, D., Gasnikov, A., Anikin, A., Rogozin, A., Gornov, A., Vorobyev, A., Antonov, L., Vlasov, R., Nikolaeva, A., Yanitskiy, E., Begicheva, M. Non-convex optimization in digital pre-distortion of the signal. International Conference on Mathematical Optimization Theory and Operations Research. Communications in Computer and Information Science 1476, 54–70. Springer, Cham (2021). DOI 10.1007/978-3-030-86433-0_4

[8] Gasnikov, A.V., Dvinskikh, D.M., Dvurechensky, P.E., Kamzolov, D.I., Matyukhin, V.V., Pasechnyuk, D.A., Tupitsa, N.K., Chernov, A.V. Accelerated meta-algorithm for convex optimization. Computational Mathematics and Mathematical Physics 61(1), 17–28 (2021) DOI 10.1134/S096554252101005X

[7] Ivanova, A., Pasechnyuk, D., Grishchenko, D., Shulgin, E., Gasnikov, A., Matyukhin, V. Adaptive catalyst for smooth convex optimization. International Conference on Optimization and Applications. Lecture Notes in Computer Science 13078, 20–37. Springer, Cham (2021). DOI 10.1007/978-3-030-91059-4_2

[6] Ivanova, A.S., Pasechnyuk, D.A., Dvurechensky, P.E., Gasnikov, A.V., Vorontsova, E.A. Numerical methods for the resource allocation problem in networks. Computational Mathematics and Mathematical Physics 61(2), 312–345 (2021) DOI 10.1134/S0965542521020135

OMS Q1

[5] Stonyakin, F., Tyurin, A., Gasnikov, A., Dvurechensky, P., Agafonov, A., Dvinskikh, D., Alkousa, M., Pasechnyuk, D., Artamonov, S., Piskunova, V. Inexact model: A framework for optimization and variational inequalities. Optimization Methods and Software, 1–47 (2021). DOI 10.1080/10556788.2021.1924714

2020

[4] Ivanova, A., Stonyakin, F., Pasechnyuk, D., Vorontsova, E., Gasnikov, A. Adaptive Mirror Descent for the Network Utility Maximization Problem. IFAC-PapersOnLine 53(2), 7851–7856 (2020). DOI 10.1016/j.ifacol.2020.12.1958

2019

[3] Pasechnyuk, D.A., Stonyakin, F.S. One method for minimization of a convex Lipschitz-continuous function of two variables on a fixed square. Computer Research and Modeling 11(3), 379–395 (2019) DOI 10.20537/2076-7633-2019-11-3-379-395

[2] Stonyakin, F., Dvinskikh, D., Dvurechensky, P., Kroshnin, A., Kuznetsova, O., Agafonov, A., Gasnikov, A., Tyurin, A., Uribe, C., Pasechnyuk, D., Artamonov, S. Gradient methods for problems with inexact model of the objective. International Conference on Mathematical Optimization Theory and Operations Research. Lecture Notes in Computer Science 11548, 97–114. Springer, Cham (2019). DOI 10.1007/978-3-030-22629-9_8

[1] Pasechnyuk, D.A. Scheduling strategies for resource allocation in a cellular base station. Proceedings of MIPT 11(2), 38–48 (2019) URL https://mipt.ru/upload/medialibrary/f16/4_trudy-mfti-_2_42_38_48.pdf

Talks

2023

Reports A randomised non-descent method for global optimisation and Algorithms for Euclidean-regularised Optimal Transport. International Conference Optimization and Applications (2023). URL http://agora.guru.ru/display.php

2022

Report Effects of momentum scaling for SGD. NeurIPS Workshop "Order up! The Benefits of Higher-Order Optimization in Machine Learning" (2022). URL https://order-up-ml.github.io/papers/

Report A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate. Ivannikov ISP RAS Open Conference (2022). URL https://www.isprasopen.ru/#Agenda

2021

Report "Solar" method: a two-level algorithm in a one-level optimization problem Lyapunov reading, ISDCT SB RAS (2021). URL https://ris.icc.ru/publications/6158

Lecture Optimization in SE. JetBrains Research, ML4SE lab (2021). URL https://youtu.be/bzGWnhb8_-8

Reports Stochastic optimization for dynamic pricing, Network utility maximization by updating individual transmission rates, and Adaptive catalyst for smooth convex optimization. International Conference Optimization and Applications (2021). URL http://agora.guru.ru/display.php

Reports Network utility maximization: optimization & protocols and Full-gradient based methods for modeling nonlinear systems in digital signal processing. Optimization without borders (2021). URL http://dmivilensky.ru/opt_without_borders/

Report On the Computational Efficiency of Catalyst Accelerated Coordinate Descent. International Conference Mathematical Optimization Theory and Operations Research (2021). URL https://easychair.org/smart-program/MOTOR2021/

2020

Report Universal Accelerated Proximal Envelopes. The 63rd MIPT Scientific Conference (2020). URL https://conf.mipt.ru//folders/attachment/2882/download

Report Accelerated Proximal Envelopes: Application to the Coordinate Descent Method. School Modern Methods of Information Theory, Optimization and Control, Sirius University (2020). URL https://sochisirius.ru/obuchenie/graduates/smena673/3259

2019

Poster On Inexactness for Yu.E. Nesterov Method for a Convex Minimization on a Fixed Square. Traditional School for Young Researchers Control, Information, Optimization (2019). URL https://ssopt.org/2019

Report On Inexactness for Yu.E. Nesterov Method for a Convex Minimization on a Fixed Square. Conference on graphs, networks, and their applications, MIPT (2019). URL http://ru.discrete-mathematics.org/conferences/201905/workshop_graphs/schedule_workshop_network_optimization.pdf

2018

Report Throughput maximization for flows with fine input structure. Workshop Optimization at Work, MIPT (2018). URL http://www.mathnet.ru/php/conference.phtml?eventID=1&confid=1259

2017

Report Index Strategies for Routing Base Station Traffic. The 60th MIPT Scientific Conference (2017). URL https://abitu.net/public/admin/mipt-conference/FPMI.pdf