Ran across a fun quote this afternoon:
There are in this world optimists who feel that any symbol that starts off with an integral sign must necessarily denote something that will have every property that they should like an integral to possess. This of course is quite annoying to us rigorous mathematicians; what is even more annoying is that by doing so they often come up with the right answer.
E. J. McShane, Bulletin of the American Mathematical Society, v. 69, p. 611, 1963.
As much as mathematicians like to joke about the gap between pure and applied math, as in the quote above, the gap is not as wide as is often supposed. Delta “functions,” derivatives of non-differentiable functions, and sums of divergent series all have a rigorous theory behind them.
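For example, here is a short sketch of how distribution theory handles both the delta function and the derivative of a non-differentiable function at once. A distribution acts on smooth, compactly supported test functions φ, and its derivative is defined by moving the derivative onto the test function. With that definition, the derivative of the Heaviside step function H (zero for negative x, one for positive x) is exactly the delta function, because for every test function φ

```latex
\langle H', \varphi \rangle
  = -\langle H, \varphi' \rangle      % definition of the distributional derivative
  = -\int_0^\infty \varphi'(x)\,dx    % H is 0 for x < 0 and 1 for x > 0
  = \varphi(0)                        % compact support: phi vanishes at infinity
  = \langle \delta, \varphi \rangle   % definition of the delta distribution
```

So the formal rule “the derivative of a step is a spike” is a theorem, not hand-waving.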
It is true that these things were used in practice before the theory caught up, but the theory caught up a long time ago. There are still instances where hand-waving calculation is ahead of theory (or just wrong), and there always will be. But distribution theory and asymptotic analysis put a rigorous foundation under a lot of seemingly non-rigorous math decades ago. This is well known to some mathematicians and unknown to others.
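To make the divergent series side concrete, here is a minimal Python sketch of Cesàro summation applied to Grandi’s series 1 − 1 + 1 − 1 + ⋯. The partial sums oscillate between 1 and 0 and never converge, but their running averages do, and the limit agrees with the answer the formal trick of writing S = 1 − S suggests.

```python
# Cesàro summation of Grandi's series 1 - 1 + 1 - 1 + ...
# The partial sums alternate 1, 0, 1, 0, ... and never converge,
# but the average of the first N partial sums tends to 1/2.
N = 100_000
partial_sum = 0
running_total = 0  # sum of the first n partial sums
for n in range(N):
    partial_sum += (-1) ** n   # add the n-th term of the series
    running_total += partial_sum
print(running_total / N)       # prints 0.5
```

Abel summation gives the same value, 1/2, which is part of why the formal manipulation so often lands on the right answer.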
That said, you still have to be careful. Some formal calculations lead to false results. They don’t just lack rigorous theory; they’re simply wrong.
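A classic instance: regroup the terms of Grandi’s series and you can “prove” that zero equals one.

```latex
0 = (1 - 1) + (1 - 1) + (1 - 1) + \cdots
  = 1 + (-1 + 1) + (-1 + 1) + \cdots   % regroup the same terms
  = 1
```

The regrouping step is only justified for convergent series. No consistent summation method can assign this series both 0 and 1; the rigorous methods above assign it 1/2.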
The people who abuse notation are either really bad or really really good at mathematics. As someone who is solidly middling, I’m irritated by the former and in awe of the latter.