Unfortunately the article is kind of idiotic; the actual paper [1] is not about a newly discovered flaw. As the paper's author himself says, many students find the same "flaw" on their own.
I do understand where the confusion stems from, but taking the time to properly learn the difference between Leibniz and Lagrange notation is very important; it makes a lot of sense to use one versus the other in specific cases. You might protest that there are certain cases where the differential operator is treated like a fraction, but I'd call that not only a rarity but also a cheat relative to what is really happening (the chain rule comes to mind; see the sketch below). I do think it's cool to think of analogies, including notation changes, for students in early calculus classes though.
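For anyone who hasn't run into that case: the first-derivative chain rule is where Leibniz notation most looks like fraction cancellation, and the second derivative is where that intuition breaks. A minimal sketch of the standard formulas (my own illustration, not from the paper):

    % First derivative: the Leibniz symbols behave as if they cancel like a fraction.
    \frac{dy}{dt} = \frac{dy}{dx} \cdot \frac{dx}{dt}

    % Second derivative: naive "cancellation" would suggest
    %   \frac{d^2 y}{dt^2} = \frac{d^2 y}{dx^2} \left(\frac{dx}{dt}\right)^2
    % but the actual chain rule carries an extra term, so the symbol is not a true ratio:
    \frac{d^2 y}{dt^2} = \frac{d^2 y}{dx^2} \left(\frac{dx}{dt}\right)^2 + \frac{dy}{dx} \cdot \frac{d^2 x}{dt^2}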
Fixing the notation would be great, saving numerous future generations from confusion.
Getting agreement to do this is difficult. Short of laying it down in law as part of a treaty (throw it in with the WTO or the Berne copyright convention?), there doesn't seem to be a way to make the change happen. Nobody wants to buy a calculus book with weird, non-standard notation. It would be like a trigonometry book using the symbols Feynman invented in high school.
We're more likely to get serious mathematicians calling equations "math sentences".
I am in favor of notational changes, but calling this a 'flaw' is simply wrong, and the teacher profiled should push the publication to correct it rather than allow his idea to be misrepresented.
The article is dumb, but the paper is actually pretty interesting.
Intuitively, I never treated differentials as "algebraic units" -- even though I can see why it might be tempting to do so. In other words, I never thought that d^2x/dy^2 = (dx/dy)^2. In my experience, most teachers and professors do hint that "something funky" is happening when you write "dBLAH."
Further (and more to the root of the problem), as far as higher-order derivatives are concerned, I always held a firm belief that something like "d^n y/dx^n" was not really a manipulable expression: it's just shorthand for "the nth derivative," and certainly not "the first derivative raised to the nth power."
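A concrete worked case (my own toy example, not from the paper) that makes the distinction plain:

    % Take y = x^2. The first derivative, squared:
    \frac{dy}{dx} = 2x, \qquad \left(\frac{dy}{dx}\right)^2 = 4x^2
    % whereas the second derivative -- the thing d^2y/dx^2 actually abbreviates -- is
    \frac{d^2 y}{dx^2} = \frac{d}{dx}\!\left(\frac{dy}{dx}\right) = 2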
When writing a ratio that works like a ratio is too cumbersome, we prefer simply avoiding the ratio notation altogether, to prevent making unwarranted leaps based on notation that may mislead the intuition.
Yeah, would definitely be a shame if somebody misunderstood some notation and leapt into writing a paper about it.
Realizing that mathematical notation is inconsistent and contextual, and potentially confusing if inappropriately generalized, is a common and frustrating learning experience, and it's nice to consider alternatives.
But it is not a "calculus flaw".
Also, the sidebars beside this article are filled with unrelated low-quality political clickbait, so perhaps we shouldn't give this site more traffic.
I was definitely hoping for a lot more based on the title.
I'm also not sure I agree with the phrasing that he "discovered" an alternative notation. Invented, maybe, but not discovered.
It may not be a literal calculus flaw, but at the very least it's a part of calculus that's held together with duct tape and chewing gum that people are warned to stay away from.
Sorry, but the rule against trollish usernames applies even to moderators: https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme.... This sort of thing is always better to nip in the bud rather than let fester, so I've banned this account. If you don't want it to be banned, you're welcome to email hn@ycombinator.com with an alternate username and we'll be happy to swap it for you.
[1] http://online.watsci.org/abstract_pdf/2019v26/v26n3a-pdf/4.p...