The word “classical” has a standard meaning in the humanities, but not in science.
Ward Cheney and Will Light give a candid definition of “classical” in the scientific sense in the introduction to their book on approximation theory:
… the “classical” portion of approximation theory — understood to be the parts of the subject that were already in place when the authors were students.
There you have it: whatever was known when you were in school is classical. Yes, this definition is entirely relative. And it describes common usage pretty well.
The first sentence strikes me as very odd.
I don’t know about “standard,” but the word “classical” does have an unambiguous definition in physics, viz., anything that is pre-quantum theory. Perhaps this usage is not known outside physics. Historically, it refers to those physical models or theories that existed before the mid-1920s, when a consistent quantum mechanical formalism was finally developed. In the 20 years prior, there was a great deal of approximation and guessing, as people struggled to incorporate atomic effects into a Newtonian framework without ever reaching mathematical consistency. For this reason, the term “semi-classical” also appears in the physics literature when referring to those theories.
An example of a semi-classical theory is the Bohr (planetary) model of the atom, c. 1913, the one most people have in their heads today. Such an atom would collapse under purely classical laws of physics, so certain ad hoc rules were imposed (e.g., quantization of classical angular momentum) to prevent it. Hence, semi-classical. In the later quantum mechanics, these ad hoc rules are unnecessary because the fundamental physical variables are defined very differently from their classical Newtonian counterparts.
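For concreteness, Bohr’s ad hoc rule can be stated as restricting the electron’s orbital angular momentum to integer multiples of the reduced Planck constant:

\[
  L = m_e v r = n\hbar, \qquad n = 1, 2, 3, \ldots
\]

Grafted onto otherwise Newtonian orbits under the Coulomb force, this single condition yields the discrete hydrogen energy levels \(E_n \approx -13.6\ \mathrm{eV}/n^2\), something no purely classical orbit can produce.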
From the modern standpoint, we say that a classical regime is one where the quantum fluctuations (which are always present) are small enough to be ignored relative to the rest of the system, which is how things looked to Newton.
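A rough quantitative version of this statement compares the characteristic action \(S\) of the system to Planck’s constant:

\[
  S \gg \hbar ,
\]

in which case quantum fluctuations are negligible and Newtonian mechanics is an excellent approximation; the formal limit \(\hbar \to 0\) recovers classical mechanics exactly.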