The definite integral ∫f(x) dx (from a to x) is considered 'sloppy' notation, so we replace the x in the integrand with a dummy variable, say t, to get ∫f(t) dt (from a to x), which supposedly gives the same answer provided f is continuous.
So why is ∫f(x) dx (from a to x) supposedly incorrect notation, even though it seems to work regardless of which variable we use in the integrand?
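One way to see why the dummy variable matters (my own sketch, not from the original post): the upper limit x is a *free* variable, while the variable of integration is *bound*, and reusing the same letter for both roles makes substitution ambiguous.

```latex
% Accumulation function: x is free (the upper limit),
% t is bound (the dummy variable of integration).
F(x) = \int_a^x f(t)\,\mathrm{d}t

% By the Fundamental Theorem of Calculus, for f continuous:
F'(x) = f(x)

% In the sloppy form \int_a^x f(x)\,\mathrm{d}x, the letter x
% must be simultaneously free and bound. Substituting a value
% for the free x, say x = 2, then produces the nonsensical
% \int_a^2 f(2)\,\mathrm{d}2,
% whereas the clean form substitutes only into the limit:
% F(2) = \int_a^2 f(t)\,\mathrm{d}t.
```

So the answer "comes out the same" only because readers silently reinterpret one of the two x's; the notation itself does not say which one is meant.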