The result you claim, that the integral of e^f(x) is e^f(x) / f'(x), is true ONLY if f(x) is a linear function of x, i.e. if
f(x) = Ax + B for constants A and B. Fitzpatrick (p. 409) writes that the integral of e^(kx) is (1/k) * e^(kx) + c, which is of the same form as your result but is not the same thing: there the exponent is linear, so f'(x) = k is a constant.
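To see why linearity is forced, differentiate the claimed antiderivative and check when you actually get e^f(x) back. Here is a quick sketch, assuming f is twice differentiable and f'(x) is never zero:

```latex
% Differentiating the claimed antiderivative e^{f(x)}/f'(x) with the
% quotient rule (u = e^{f(x)}, v = f'(x)):
\begin{align*}
\frac{d}{dx}\left(\frac{e^{f(x)}}{f'(x)}\right)
  &= \frac{f'(x)\,e^{f(x)}\cdot f'(x) - e^{f(x)}\,f''(x)}{[f'(x)]^2} \\
  &= e^{f(x)}\left(1 - \frac{f''(x)}{[f'(x)]^2}\right).
\end{align*}
% This equals e^{f(x)} exactly when f''(x) = 0, i.e. when f(x) = Ax + B.
```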
Your result is mostly used as a shortcut for questions like: integrate xe^(x^2). That integrand is not of the form e^f(x), but people apply your result anyway, write the answer as xe^(x^2) / 2x + c, and then cancel the x's to get e^(x^2) / 2 + c. The final answer is correct, but the working is fundamentally flawed, and whether it gets penalised in an exam depends on the marker.
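To see that the cancellation is luck rather than method, try the same shortcut on a near-identical made-up question of my own: integrate x^2 * e^(x^2). Differentiating the answer the shortcut produces shows it is wrong:

```latex
% Applying the shortcut blindly: x^2 e^{x^2} / (2x) = (x/2) e^{x^2}.
% Differentiating this candidate answer exposes the error:
\begin{align*}
\frac{d}{dx}\left(\frac{x}{2}\,e^{x^2}\right)
  = \frac{1}{2}\,e^{x^2} + x^2 e^{x^2}
  \neq x^2 e^{x^2}.
\end{align*}
% This time the leftover x does not cancel cleanly, so the shortcut fails.
```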
The correct way to do this type of question is to use the third result I stated above, namely that the integral of
f'(x) * e^f(x) is e^f(x). This is done as follows: first, recognise that e^(x^2) is of the form e^f(x) with f(x) = x^2, so this result is probably needed. We need an f'(x) factor, which is 2x, so start by rewriting the question:
int xe^(x^2) dx = (1/2) * int 2x * e^(x^2) dx [Notice that we now have an f'(x) * e^f(x)]
= (1/2) * e^(x^2) + C, for some constant C [I integrated f'(x) * e^f(x) to e^f(x)]
= e^(x^2) / 2 + C, as expected.
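As a quick sanity check, differentiating the answer with the chain rule recovers the original integrand:

```latex
% Chain rule check: differentiating the answer recovers the integrand.
\begin{align*}
\frac{d}{dx}\left(\frac{e^{x^2}}{2} + C\right)
  = \frac{1}{2}\cdot 2x\,e^{x^2}
  = x\,e^{x^2}.
\end{align*}
```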
Now, why is your result so dangerous? Well, it leads people to think things like int e^(x^2) dx = e^(x^2) / 2x, which is totally wrong - and that's just for starters.
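Differentiating that claimed antiderivative shows exactly how wrong it is:

```latex
% Quotient rule on the bogus answer e^{x^2}/(2x):
\begin{align*}
\frac{d}{dx}\left(\frac{e^{x^2}}{2x}\right)
  = \frac{2x\,e^{x^2}\cdot 2x - 2\,e^{x^2}}{4x^2}
  = e^{x^2}\left(1 - \frac{1}{2x^2}\right)
  \neq e^{x^2}.
\end{align*}
% The derivative misses e^{x^2} by the factor (1 - 1/(2x^2)), and no fix-up
% is possible: e^{x^2} famously has no elementary antiderivative at all.
```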