
[Function] Markov brothers' inequality

hbghlyj Posted 2023-5-28 20:53
Wikipedia
Let $P$ be a polynomial of degree $\le n$. Then for every nonnegative integer $k$,
$$ \max _{-1\leq x\leq 1}|P^{(k)}(x)|\leq {\frac {n^{2}(n^{2}-1^{2} )(n^{2}-2^{2})\cdots (n^{2}-(k-1)^{2})}{1\cdot 3\cdot 5\cdots (2k-1)} }\max _{-1\leq x\leq 1}|P(x)|$$
Equality is attained by the Chebyshev polynomials of the first kind.
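For a quick numerical sanity check (not a proof), one can compare both sides of the bound with NumPy's Chebyshev class; by the remark above, for $P = T_n$ the two sides should coincide. A minimal sketch:

# Numerical check of the Markov brothers' inequality for P = T_n
# (the equality case); a sketch, not a proof.
import numpy as np
from numpy.polynomial import chebyshev as C

def markov_factor(n, k):
    # n^2 (n^2 - 1^2) ... (n^2 - (k-1)^2) / (1 * 3 * 5 * ... * (2k-1))
    num = np.prod([n**2 - j**2 for j in range(k)])
    den = np.prod([2*j + 1 for j in range(k)])
    return num / den

n, k = 7, 3
Tn = C.Chebyshev.basis(n)           # T_n, with max |T_n| = 1 on [-1, 1]
x = np.linspace(-1.0, 1.0, 20001)   # dense grid including the endpoints
lhs = np.max(np.abs(Tn.deriv(k)(x)))
rhs = markov_factor(n, k) * np.max(np.abs(Tn(x)))
print(lhs, rhs)                     # both are ≈ 7056 for n = 7, k = 3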

 Author| hbghlyj Posted 2023-5-28 20:54
When $k=1$,
$$\max _{-1\leq x\leq 1}|P'(x)|\leq n^2\max _{-1\leq x\leq 1}|P(x)|$$
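For example, equality here is witnessed by the Chebyshev polynomial $T_n$: differentiating $T_n(\cos\theta)=\cos(n\theta)$ gives
$$T_n'(\cos\theta)\,\sin\theta=n\sin(n\theta),\qquad\text{so}\qquad T_n'(1)=\lim_{\theta\to0}\frac{n\sin(n\theta)}{\sin\theta}=n^{2},$$
and since $\max_{-1\leq x\leq 1}|T_n(x)|=1$, the factor $n^2$ cannot be improved.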

 Author| hbghlyj Posted 2023-5-28 21:40
Twelve Proofs of the Markov Inequality
damtp.cam.ac.uk › Alexei › papers (PDF)
This is the story of the classical Markov inequality for the $k$-th derivative of an algebraic polynomial, and of attempts to find a simpler and better proof of it.

 Author| hbghlyj Posted 2023-5-28 21:47
Chebyshev approximation
One can obtain polynomials very close to the optimal one by expanding the given function in terms of Chebyshev polynomials and then cutting off the expansion at the desired degree. This is similar to the Fourier analysis of the function, using the Chebyshev polynomials instead of the usual trigonometric functions.
If one calculates the coefficients in the Chebyshev expansion for a function:$$ f(x)\sim \sum _{i=0}^{\infty }c_{i}T_{i}(x) $$
and then cuts off the series after the $ T_{N} $ term, one gets an $N$th-degree polynomial approximating $f(x)$.
The reason this polynomial is nearly optimal is that, for functions with rapidly converging power series, if the series is cut off after some term, the total error arising from the cutoff is close to the first term after the cutoff; that is, the first term after the cutoff dominates all later terms. The same is true if the expansion is in terms of Chebyshev polynomials: if a Chebyshev expansion is cut off after $T_N$, the error takes a form close to a multiple of $T_{N+1}$. The Chebyshev polynomials have the property that they are level – they oscillate between $+1$ and $-1$ on the interval $[-1, 1]$ – and $T_{N+1}$ has $N+2$ extrema there. So the error is close to a level function with $N+2$ extrema, and the truncated expansion is close to the optimal $N$th-degree polynomial.
In error plots of the two approximations, the error of the truncated Chebyshev expansion is sometimes smaller than (inside of) the error of the optimal polynomial, but sometimes larger, meaning that it is not quite the optimal polynomial. The discrepancy is less serious for the exp function, which has an extremely rapidly converging power series, than for the log function.
Chebyshev approximation is the basis for Clenshaw–Curtis quadrature, a numerical integration technique.
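A small NumPy sketch of this procedure (interpolation at Chebyshev points is used here as a stand-in for the exactly truncated expansion; for smooth functions the two are nearly the same polynomial):

# Approximate f(x) = exp(x) on [-1, 1] by a degree-N Chebyshev polynomial.
import numpy as np
from numpy.polynomial import chebyshev as C

f, N = np.exp, 5
p = C.Chebyshev.interpolate(f, N)   # degree-N polynomial in the Chebyshev basis

x = np.linspace(-1.0, 1.0, 2001)
err = f(x) - p(x)
print("max |f - p| on [-1, 1]:", np.max(np.abs(err)))
# The error curve oscillates with nearly equal ripples of size close to the
# first dropped coefficient, which is why this polynomial is nearly minimax.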
