Am I taking the antiderivative of |x| correctly?
Edit: forgot +C, sorry
So I remember seeing a video by Blackpenredpen on the YouTube homepage around 3 months ago where he was showing his failed attempts at figuring out how to get$$\int\sqrt{\sin^2(x)} \,dx=\int\left|\sin(x)\right| \, dx=-\cot(x)\left|\sin(x)\right|+c$$without piecewise integration.
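As a quick numerical sanity check of that formula (this is just a finite-difference check; the sample points and tolerance are arbitrary choices of mine, not part of any derivation), one can verify in Python that $-\cot(x)\left|\sin(x)\right|$ differentiates back to $\left|\sin(x)\right|$ away from multiples of $\pi$:

```python
import math

def F(x):
    # Claimed antiderivative: -cot(x) * |sin(x)|
    return -(math.cos(x) / math.sin(x)) * abs(math.sin(x))

def deriv(f, x, h=1e-6):
    # Central finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# Check at a few points away from multiples of pi (where cot blows up)
for x in [0.5, 1.2, 2.0, 4.0, 5.5]:
    assert abs(deriv(F, x) - abs(math.sin(x))) < 1e-5
```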
While I do have my own attempts at this that I've done over the course of 2 months (by attempting IBP or using $u$-substitution), I think these attempts have failed because I don't really understand how to even evaluate$$\int|x|\,dx$$So I think that if I can integrate this, I can integrate$$\int\left|\sin(x)\right| \, dx$$ using a similar method. Here is my attempt at doing so:
We have that$$\int|x|\,dx=\int1\cdot|x| \,dx$$Now, we can rewrite this as$$\int\left(\frac d{dx}x\right)|x|\,dx$$and now we can use IBP.
Here's a brief overview of IBP for anyone who doesn't remember it:
IBP, which stands for **Integration By Parts**, is a method of integration stating that for functions $f(x)$ and $g(x)$,$$\int f(x)\,dg(x)=f(x)g(x)-\int g(x)\,df(x)$$(here $dg(x)$ and $df(x)$ mean $\frac d{dx}g(x)\,dx$ and $\frac d{dx}f(x)\,dx$ respectively; I understand this notation is widely disliked).
So what I think it shows in this context is that if we can rewrite our integrand as a product of two functions, and we know the antiderivative of one of them, we can use that to find the antiderivative of the product.
So now we can use IBP on our integral$$I=\int\left(\frac d{dx}x\right) |x| \, dx$$using$$f(x)=|x|,\qquad g(x)=x$$to get$$I=\int\left(\frac d{dx}x\right)|x|\,dx=x|x|-\int x \cdot \left(\frac{|x|}x\right)\,dx=x|x|-\int|x|\,dx$$$$I=x|x|-\int|x| \, dx\implies I=x|x|-I\implies 2I=x|x|\implies I=\int|x|\,dx=\frac{x|x|}2$$So we have that the antiderivative of $|x|$ is $\frac{x|x|}2+C$.
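As a quick hedged check of this result (the sample points and tolerance below are arbitrary choices of mine), a central finite difference of $F(x)=\frac{x|x|}2$ should reproduce $|x|$:

```python
def F(x):
    # Candidate antiderivative from the IBP argument: x|x|/2
    return x * abs(x) / 2

def deriv(f, x, h=1e-6):
    # Central finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# The check also passes at x = 0, where the difference quotient is O(h)
for x in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    assert abs(deriv(F, x) - abs(x)) < 1e-5
```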
However, my question is: Am I taking the antiderivative of $|x|$ correctly, or how would I take it?
Am I taking the antiderivative of |x| correctly? That depends on how rigorous you want to be. From a technical point of view, no.
How would I take it? The main problem with your argument is that you differentiate $\left|\cdot\right|$, which isn't differentiable at $0$ ([S06], pages 142 and 143). Instead, you can do the following. Let $f:=\left|\cdot\right|$ and let $id$ be the identity function. Then:

if $x\ge0$, $F_0(x)=\int _0^xf=\int _0^xid=x^2/2$;

if $x<0$, $F_0(x)=\int _0^xf=\int _0^x-id=-x^2/2$.

This function is an antiderivative of the absolute value function by the First Fundamental Theorem of Calculus ([S06], page 268). If you want to condense its formula, you can write $F_0(x)=\frac{x\left|x\right|}2$ for every $x\in\mathbb R$. To obtain the indefinite integral (which seems to be what you want), you can write $F(x)=\frac{x\left|x\right|}2+c$, where $c\in\mathbb R$ (which is really notation for the family $F=\{F_c:c\in\mathbb R\}$, where $F_c(x)=\frac{x\left|x\right|}2+c$).
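A minimal numerical cross-check of this construction (the trapezoidal rule, step count, and test points are assumptions of this sketch, not part of the argument): approximate $\int_0^x f$ directly and compare it against $F_0(x)=\frac{x\left|x\right|}2$:

```python
def F0(x):
    # Closed form from the answer: F_0(x) = x|x|/2
    return x * abs(x) / 2

def integral_abs(x, n=10_000):
    # Trapezoidal approximation of the integral of |t| from 0 to x.
    # Works for negative x too, since the step h is then negative.
    h = x / n
    total = (abs(0.0) + abs(x)) / 2 + sum(abs(i * h) for i in range(1, n))
    return total * h

for x in [-2.0, -0.7, 1.3, 2.0]:
    assert abs(integral_abs(x) - F0(x)) < 1e-9
```

Since $|t|$ is linear on each of $[0,x]$ for $x>0$ and $x<0$ separately, the trapezoidal rule is exact here up to floating-point rounding.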
[S06] Michael D. Spivak, Calculus, 3rd edition (2006).