Agreed, in many cases you can do this; however, with point 2 the removal of a function (or family of functions) is literally the aim. The accretion of functions benefits legacy systems, but the tradeoff is potential harm to users who are new to and unfamiliar with the library. Accretion creates cognitive overhead (even if only minor) for both maintainers and new users: for maintainers when they return to the code to update or modify behaviour, and for new users when they try to understand the library through its documentation, examples, and usage. I don't think it's a coincidence that a number of languages and libraries acknowledge this by having "one correct way to do X".
Using a concrete example relating to security: LibreSSL maintained much of the API surface that OpenSSL provides; in essence, they aimed to be a "drop-in" replacement. However, there were whole families of algorithms and functions that they deemed unsafe or unfit for purpose (e.g. the FIPS algorithms) and removed outright. I think that's a perfectly valid exception to the rule: the removal acts as a canary in the coal mine, and you have options — fix your code or defer upgrading.
I would advocate for thoughtful deprecation cycles over ossification of poor APIs and algorithms.