$g(x)\xrightarrow{x\to\infty}\infty$ Implies $g'(x)\leq g^{1+\varepsilon}(x)$
Recently in my ordinary differential equations class we were given the following problem:
Suppose $g:(0,\infty)\to\mathbb{R}$ is an increasing function of class $C^{1}$ such that $g(x)\xrightarrow{x\to\infty}\infty$. Show that for every $\varepsilon>0$ the inequality $g^{\prime}(x)\leq g^{1+\varepsilon}(x)$ holds outside a set of finite length.
I thought about using Grönwall's inequality, but I did not get any useful result.
3 answers
We may restrict attention to an interval $I = (A,+\infty)$ on which $g(x) > 1$. Let $\varepsilon > 0$ be given.
The set $S$ of points $x$ in $I$ at which $g'(x) > (g(x))^{1 + \varepsilon}$ is open, hence is a disjoint union of countably many open intervals of the form $(a,b)$.
Given such an interval $(a,b)$, with $b$ finite, we have $g'(x) > (g(x))^{1+\varepsilon} > 1$ on $(a,b)$, hence $g'(x) \geq 1$ on $[a,b]$ by continuity. Thus the function $y = g(x)$ has a $C^1$ inverse on $[a,b]$. Moreover the inequality $dy/dx > y^{1 + \varepsilon}$ implies $dx/dy < y^{-1-\varepsilon}$. Integrating with respect to $y$, we find $b - a \leq \int_{g(a)}^{g(b)} y^{-1-\varepsilon}\,dy = \frac{1}{\varepsilon}\left[(g(a))^{-\varepsilon} - (g(b))^{-\varepsilon}\right]$. (If $b = +\infty$, the same integral bound shows that $b - a$ is finite, a contradiction.)
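The integral step above is easy to sanity-check numerically. The sketch below compares a composite trapezoid approximation against the closed form $\frac{1}{\varepsilon}\left[(g(a))^{-\varepsilon} - (g(b))^{-\varepsilon}\right]$; the values of $\varepsilon$, $g(a)$, $g(b)$ are my own arbitrary illustrative choices, not from the post.

```python
eps = 0.5
g_a, g_b = 2.0, 10.0  # illustrative sample values with 1 < g(a) < g(b)

def trapezoid(f, a, b, n=20_000):
    # composite trapezoid rule on [a, b]
    dx = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * dx) for i in range(1, n))
    return s * dx

numeric = trapezoid(lambda y: y ** (-1 - eps), g_a, g_b)
closed = (g_a ** -eps - g_b ** -eps) / eps
assert abs(numeric - closed) < 1e-6
# consistent with the final bound: the bracket lies in (0, 1),
# so each interval's length is below 1/eps
assert closed < 1 / eps
```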
Therefore, the length of the set $S$ is bounded above by $1/\varepsilon$ times the length of its image under the decreasing function $h(x) = (g(x))^{-\varepsilon}.$ But this image is contained in $(0,1)$.
Consider $g^{-\epsilon}$ (for large $x$, where $g > 0$). Its derivative is $D_x \, g^{-\epsilon} = -\epsilon\, g^{-1-\epsilon} g'$. Since $g$ is increasing and tends to infinity, $g^{-\epsilon} > 0$ and tends to $0$, while $D_x \, g^{-\epsilon} \leq 0$.
If $D_x \, g^{-\epsilon} < -\epsilon$ on a set of infinite measure, then $\int_{x_0}^{x} D_t \, g^{-\epsilon}\, dt$ (which equals $g^{-\epsilon}(x) - g^{-\epsilon}(x_0)$) would diverge to negative infinity as $x \to \infty$, because the integrand is never positive. That contradicts $g^{-\epsilon} > 0$. So $D_x \, g^{-\epsilon} \ge -\epsilon$ outside a set of finite measure, and rearranging gives the desired inequality $g' \leq g^{1+\epsilon}$.
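Since the whole argument rides on the minus sign in $D_x \, g^{-\epsilon} = -\epsilon\, g^{-1-\epsilon} g'$, here is a quick finite-difference check of that identity. The sample function $g(x) = x^2 + 1$ is my own illustrative choice; any increasing $C^1$ function works.

```python
eps = 0.3
g  = lambda x: x * x + 1.0   # illustrative increasing C^1 function, g -> infinity
gp = lambda x: 2.0 * x       # its derivative

def num_deriv(f, x, h=1e-6):
    # central finite difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 2.0
lhs = num_deriv(lambda t: g(t) ** -eps, x)
rhs = -eps * g(x) ** (-1.0 - eps) * gp(x)  # note the minus sign
assert abs(lhs - rhs) < 1e-6
assert rhs < 0  # g^(-eps) is decreasing, as the answer uses
```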
My questions
- Can you cite or scan the question from the source? I am leery, because the claim appears false.
- Is $g(x)$ supposed to be convex?
Game plan
I shall construct, on top of $f(x)=x$, a function which occasionally jumps up by a constant amount over smaller and smaller intervals. This way I can create a function with the growth rate of $x$, but with derivative spikes as large as I want.
Counterexample
Let $h(x)$ be any nonnegative continuous function with the properties that $h(0) = 1$, $h \le 1$ everywhere, and $h = 0$ outside the interval $[-1/4, 1/4]$. Let $H(x)= \sum_{n \ge 0} 2^n h(2^n(x-n)) = h(x) + 2h(2(x-1)) + 4h(4(x-2)) + 8h(8(x-3)) + \cdots$
Then $H(x)$ converges to a continuous function, because the supports of the summands are pairwise disjoint (the $n$-th bump is supported in $[n - 2^{-n-2},\, n + 2^{-n-2}]$). Observe that $H(n) = 2^n$.
Let $g(x) = x + \int_0^x H(t)\, dt$. Then certainly $g$ is $C^1$, increasing, and tends to infinity.
I constructed $h \le 1$ with $h = 0$ outside $[-1/4, 1/4]$. Thus each rescaled bump has the same integral: substituting $u = a(t-n)$ gives $\int a\, h(a(t-n))\, dt = \int h(u)\, du \le C$ for some constant $C$. Since at most $x+1$ of the rescaled bumps meet $[0, x]$, we get $g(x) \le x + C(x+1)$.
So $g$ grows linearly, and $g^{1+\epsilon}$ grows at most polynomially. But we just constructed $g$ so that $g'(n) = 1 + H(n) = 1 + 2^n$ grows exponentially along the subsequence $x = n$!
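For concreteness, here is a small numerical sketch of this construction; the tent function for $h$ is my own choice (any bump with the stated properties works). Because the supports are disjoint, $H(x)$ can be evaluated from the single bump at the integer nearest $x$.

```python
def h(x):
    # tent bump: h(0) = 1, 0 <= h <= 1, support [-1/4, 1/4]
    return max(0.0, 1.0 - 4.0 * abs(x))

def H(x):
    # H(x) = sum_n 2^n h(2^n (x - n)); supports are pairwise disjoint,
    # so only the bump at n = round(x) can be nonzero
    n = round(x)
    return 2.0 ** n * h(2.0 ** n * (x - n)) if n >= 0 else 0.0

def g(x, steps=100_000):
    # g(x) = x + integral_0^x H(t) dt, via composite trapezoid rule
    dx = x / steps
    s = 0.5 * (H(0.0) + H(x)) + sum(H(i * dx) for i in range(1, steps))
    return x + s * dx

assert 1.0 + H(10) == 1.0 + 2.0 ** 10  # g'(n) = 1 + 2^n: exponential spikes
assert abs(g(10) - 12.5) < 0.1         # yet g(10) is near 10 + 2.5: linear growth
```

Each full tent bump contributes area $\int h = 1/4$ to $g$, so $\int_0^{10} H \approx 9 \cdot \tfrac14 + 2 \cdot \tfrac18 = 2.5$ (the bumps at $0$ and $10$ are cut in half), while the derivative spike $g'(10) = 1 + 2^{10}$ dwarfs the linearly growing $g$.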