Speaker: Jian Wang (Shandong University)
Description
We address the next-to-leading power (NLP) resummation of large double logarithms of 1 − x to all orders in the strong coupling, which are present even in the off-diagonal DGLAP splitting kernels. The appearance of divergent convolutions prevents the application of factorization methods known from leading-power resummation. Employing d-dimensional consistency relations, which follow from requiring the cancellation of 1/ε poles between momentum regions in dimensional regularization, we show that the resummation of the off-diagonal parton-scattering channels at leading logarithmic order can be bootstrapped from the recently conjectured exponentiation of NLP soft-quark Sudakov logarithms. In particular, we derive, directly from algebraic all-order expressions, a result for the DGLAP kernel in terms of the series of Bernoulli numbers found previously by Vogt. We identify the off-diagonal DGLAP splitting functions and soft-quark Sudakov logarithms as inherently two-scale quantities in the large-x limit. We use a refactorization of these scales together with renormalization-group methods inspired by soft-collinear effective theory to derive the conjectured soft-quark Sudakov exponentiation formula.
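For orientation, the Bernoulli-number series mentioned above is the standard generating function of the Bernoulli numbers. The LaTeX sketch below records that identity together with a purely schematic form of the leading-logarithmic off-diagonal kernel in Mellin space; the overall normalization, the sign of the colour-factor combination C_F − C_A, and the coupling convention a_s = α_s/(4π) are assumptions here, not the precise result presented in the talk.

\[
  % generating function of the Bernoulli numbers B_n (exact identity)
  \mathcal{B}_0(x) \;\equiv\; \sum_{n=0}^{\infty} \frac{B_n}{n!}\, x^n \;=\; \frac{x}{e^x - 1},
  \qquad
  % schematic leading-logarithmic off-diagonal kernel in Mellin space
  P^{\rm LL}_{\text{off-diag}}(N) \;\sim\; \frac{a_s}{N}\,
  \mathcal{B}_0\!\left(a_s\,(C_F - C_A)\ln^2 N\right).
\]

The first relation is exact; the second only indicates that the all-order leading-logarithmic behaviour is captured by \(\mathcal{B}_0\) evaluated at an argument of order \(a_s \ln^2 N\), which is how the Bernoulli series enters the resummed kernel.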
Primary author: Jian Wang (Shandong University)