
$g(x)\xrightarrow{x\to\infty}\infty$ Implies $g'(x)\leq g^{1+\varepsilon}(x)$

+6
−0

Recently in my ordinary differential equations class we were given the following problem:

Suppose $g:(0,\infty)\to\mathbb{R}$ is an increasing function of class $C^{1}$ such that $g(x)\xrightarrow{x\to\infty}\infty$. Show that for every $\varepsilon>0$ the inequality $g^{\prime}(x)\leq g^{1+\varepsilon}(x)$ holds outside a set of finite length.

I thought about using Grönwall's inequality, but I did not get any useful result.
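
For a quick sanity check of the statement (not a proof), here is a small numerical sketch with the concrete choice $g(x)=e^{e^x}$, which is my own example rather than anything from the problem. Taking logarithms, $g'(x)\leq g(x)^{1+\varepsilon}$ is equivalent to $x\leq\varepsilon e^x$, so the failure set should be a bounded interval (empty once $\varepsilon$ is large enough):

```python
import numpy as np

# Quick numerical sanity check (not a proof), with the concrete choice
# g(x) = exp(exp(x)).  Taking logarithms, g'(x) <= g(x)^(1+eps) is equivalent to
# x + e^x <= (1 + eps) e^x, i.e. x <= eps * e^x, so the failure set is bounded.
x = np.linspace(0.01, 20.0, 2_000_001)
dx = x[1] - x[0]

for eps in (0.5, 0.1, 0.01):
    fails = x > eps * np.exp(x)  # where g'(x) > g(x)^(1 + eps)
    print(f"eps = {eps:5.2f}   estimated length of failure set: {fails.sum() * dx:.3f}")
```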


1 comment thread

If you don't get an answer here, ask this on https://old.reddit.com/r/math/ or https://old.reddit.com... (1 comment)

3 answers


+4
−0

We may restrict attention to an interval $I = (A,+\infty)$ on which $g(x) > 1$. Let $\varepsilon > 0$ be given.

The set $S$ of points $x$ in $I$ at which $g'(x) > (g(x))^{1 + \varepsilon}$ is open, hence is a disjoint union of countably many open intervals of the form $(a,b)$.

Given such an interval $(a,b)$, with $b$ finite, we have $g'(x) \geq 1$ on $[a,b]$, because $g' > g^{1 + \varepsilon} > 1$ on $(a,b)$ (recall $g > 1$ on $I$) and $g'$ is continuous. Thus the function $y = g(x)$ has a $C^1$ inverse on $[a,b]$. Moreover the inequality $dy/dx > y^{1 + \varepsilon}$ implies $dx/dy < y^{-1-\varepsilon}$. Integrating with respect to $y$, we find $b - a \leq \int_{g(a)}^{g(b)} y^{-1-\varepsilon}dy = \frac{1}{\varepsilon}[(g(a))^{-\varepsilon} - (g(b))^{-\varepsilon}]$. (If $b = +\infty$, an obvious modification of this argument shows that $b - a$ is finite, a contradiction.)

Therefore, the length of the set $S$ is bounded above by $1/\varepsilon$ times the length of its image under the decreasing function $h(x) = (g(x))^{-\varepsilon}.$ But this image is contained in $(0,1)$.
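
As a numerical illustration of the key estimate (my own example, not part of the answer): on an interval where the differential inequality is extremal, say $g' = 2g^{1+\varepsilon}$, the solution is explicit and the bound $b - a \leq \frac{1}{\varepsilon}[(g(a))^{-\varepsilon} - (g(b))^{-\varepsilon}]$ can be checked directly:

```python
import numpy as np

# Check of the length bound on a model interval (my own example, not part of the
# proof).  If g' = 2 * g^(1+eps) on (a, b) (so g' > g^(1+eps) wherever g > 1), the
# explicit solution is g(x) = (g(a)^(-eps) - 2*eps*(x - a))^(-1/eps).
eps, a, g_a = 0.2, 1.0, 2.0

blow_up = a + g_a ** (-eps) / (2 * eps)  # x at which this model g reaches +infinity
for b in np.linspace(a + 0.01, blow_up - 1e-6, 5):
    g_b = (g_a ** (-eps) - 2 * eps * (b - a)) ** (-1 / eps)
    bound = (g_a ** (-eps) - g_b ** (-eps)) / eps
    print(f"b - a = {b - a:.4f}   bound = {bound:.4f}   holds: {b - a <= bound}")
```

In this extremal case $b - a$ is exactly half the bound; the finite blow-up time of the comparison ODE is the same mechanism behind the parenthetical remark about $b = +\infty$.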



+1
−4

Consider $g^{-\epsilon}$ on an interval $(x_0,\infty)$ on which $g > 0$ (such an interval exists since $g \to \infty$). Its first derivative is $D_x \; g^{-\epsilon} = -\epsilon g^{-1-\epsilon} g'$, which is $\leq 0$ because $g$ is increasing; moreover $g^{-\epsilon} > 0$ and tends to $0$.

If $D_x \; g^{-\epsilon} < -\epsilon$ on a set of infinite measure, then $\int_{x_0}^{x} D_t \, g^{-\epsilon}(t) \, dt$ (which differs from $g^{-\epsilon}(x)$ by a constant) would diverge to negative infinity as $x \to \infty$, contradicting $g^{-\epsilon} > 0$. So $D_x \; g^{-\epsilon} \ge -\epsilon$ outside a set of finite measure, from which your desired inequality follows immediately.
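
To make the final step quantitative (my phrasing, not the answerer's): since $-D_x \, g^{-\epsilon} \geq 0$, Chebyshev's inequality gives

$$\epsilon \, \bigl|\{x > x_0 : D_x \, g^{-\epsilon}(x) < -\epsilon\}\bigr| \;\leq\; \int_{x_0}^{\infty} \bigl(-D_x \, g^{-\epsilon}(x)\bigr)\,dx \;=\; g(x_0)^{-\epsilon} - \lim_{x\to\infty} g(x)^{-\epsilon} \;=\; g(x_0)^{-\epsilon},$$

so the exceptional set has measure at most $g(x_0)^{-\epsilon}/\epsilon$, and outside it $g'(x) \leq g(x)^{1+\epsilon}$.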


1 comment thread

Why is the derivative negative? (2 comments)
+0
−3

My questions

  1. Can you cite or scan the question from the source? I am leery, because the claim appears false.

  2. Is g(x) supposed to be convex?

Game plan

I shall construct, on top of $f(x)=x$, a function which occasionally jumps up by a bounded amount over smaller and smaller intervals. This means I can create a function with essentially the growth rate of $x$, but with as much derivative growth as I want.

Counterexample

Let $h(x)$ be any nonnegative continuous function with the property that $h(0) = 1$ and $h \le 1$ everywhere, and $h = 0$ outside the interval $[-0.25,0.25]$. Let $H(x) = h(x) + 2h(2(x-1)) + 4h(4(x-2)) + 8h(8(x-3)) + \cdots = \sum_{n \ge 0} 2^n\,h(2^n(x-n)).$

Then the series defining $H(x)$ converges to a continuous function, because the supports of the summands are disjoint. Observe that $H(n) = 2^n$.

Let $g(x) = x + \int_0^x H(t) \, dt$. Then certainly $g$ is $C^1$, increasing, and tends to infinity.

I constructed $h \le 1$ with $h = 0$ outside $[-0.25,0.25]$. Thus $\int_0^x a\,h(at)\,dt = \int_0^{ax} h(u)\,du \le C$ for some constant $C$, and the same bound holds for each translated and rescaled bump. Then $g(x) \le x + C(x+1)$, since at most $x+1$ of the rescaled bumps are nonzero on $[0,x]$.

So $g^{1+\epsilon}$ grows at most polynomially. But we just constructed $g$ so that $g' \ge H$, which has an exponentially growing subsequence: $g'(n) \ge H(n) = 2^n$!
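
For what it's worth, here is a rough numerical sketch of this construction (my own concrete choices: a triangular bump for $h$ and a trapezoid rule for the integral), showing $g$ growing roughly linearly while $g'(n) = 1 + H(n)$ grows exponentially:

```python
import numpy as np

# Rough numerical sketch of the construction.  The bump h(t) = max(0, 1 - 4|t|) is
# one concrete choice: continuous, h(0) = 1, 0 <= h <= 1, supported in [-0.25, 0.25].
def h(t):
    return np.maximum(0.0, 1.0 - 4.0 * np.abs(t))

def H(x, n_terms=20):
    # H(x) = sum_n 2^n * h(2^n * (x - n)); the supports of the summands are disjoint.
    return sum((2.0 ** n) * h((2.0 ** n) * (x - n)) for n in range(n_terms))

x = np.linspace(0.0, 10.0, 200_001)
# g(x) = x + integral_0^x H(t) dt, approximated with the trapezoid rule.
g = x + np.concatenate(([0.0], np.cumsum((H(x[1:]) + H(x[:-1])) / 2 * np.diff(x))))

for n in range(1, 10):
    i = np.searchsorted(x, n)
    print(f"n = {n}:  g(n) ≈ {g[i]:7.3f}   g'(n) = 1 + H(n) = {1 + H(float(n)):6.1f}")
```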


1 comment thread

While it is a nice example, I don't believe it is answering my question, as the set where $g'(x)> g^{... (1 comment)
