But in this future, why will “the most compelling motivations, the clearest explanations, and the most useful maps between intuitions, theorems, and applications” be necessary? Catering to hobbyists?
Most mathematicians don't understand the fields outside of their specialization (at a research level). Your assumption that intuition and applications are limited to hobbyists ignores the possibility of enabling mathematicians to work and collaborate more effectively at the cutting edge of multiple fields.
Very far in the future when AI runs everything, of course math will be a hobby (and it will be great! As a professional programmer I'm happy that I now have a research-level tutor/mentor for my math/physics hobby). In the nearer term, it seems apparent to me that people with stronger mental models of the world are able (without even trying!) to formulate better prompts and get better output from models. i.e. as long as people are asking the questions, they'll do better to have some idea of the nuance within the problem/solution spaces. Math can provide vocabulary to express such nuance.
In addition to the already-stated causes (governments issuing currency to meet war spending, and war spending producing destruction of economic capability rather than development), wars tend to introduce trade barriers and divert resources away from productive tasks. Whether the barriers are legal (tariffs, embargoes) or simply higher premiums due to increased risk, less trade happens, which raises prices (inflation). Economists also really hate number-go-down even when down is good, so policy is oriented towards making sure deflation never gets a chance during the interwar development periods.
Only half of the incidents listed were actually full-scale wars (WWI and WWII). The other two incidents are an oil shock and a pandemic.
The commonality between all four of these incidents is that they correspond to severe supply shocks:
- During WWI and WWII, industrial supply was rerouted by force to the war effort, leaving normal consumer demand unfulfilled.
- During the oil crisis of the 70s, a critical energy input to the American economy massively increased in price due to an embargo placed on America.
- During the COVID-19 pandemic, a significant chunk of workers were paid not to work, as a form of deliberate supply destruction to avoid the spread of a novel coronavirus.
In a "normal" economy, supply is flexible enough that you can print money and nobody even notices. The supply curve is smooth and gradual, so prices only rise a little. When supply is constrained, however, prices rise to whatever value is necessary to curtail demand, because they have to. The supply curve is a brick wall.
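The "smooth curve vs. brick wall" point can be made concrete with a toy market-clearing calculation (my own illustrative numbers, not from the thread): apply the same demand shift to an elastic supply curve and to a fixed (vertical) one, and compare the resulting price jumps.

```python
# Toy model: linear demand Q = a - b*P.
# Elastic supply: Q = d*P (quantity responds to price).
# Inelastic supply: Q fixed at q_fixed (the "brick wall").

def clearing_price_elastic(a, b, d):
    # Solve a - b*P = d*P  ->  P = a / (b + d)
    return a / (b + d)

def clearing_price_fixed(a, b, q_fixed):
    # Solve a - b*P = q_fixed  ->  P = (a - q_fixed) / b
    return (a - q_fixed) / b

a0, a1 = 100, 120        # demand shifts out (e.g. new money chasing goods)
b, d, q_fixed = 2, 2, 50  # arbitrary illustrative parameters

rise_elastic = clearing_price_elastic(a1, b, d) / clearing_price_elastic(a0, b, d) - 1
rise_fixed = clearing_price_fixed(a1, b, q_fixed) / clearing_price_fixed(a0, b, q_fixed) - 1

print(f"elastic supply: price +{rise_elastic:.0%}")  # +20%
print(f"fixed supply:   price +{rise_fixed:.0%}")    # +40%
```

With these particular numbers the identical demand shock moves prices twice as much when supply can't expand; the more vertical the supply curve, the more of the shock shows up as pure inflation.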
Weapons cost A LOT and do very little to increase the future economy.
It's the issue Russia is facing right now in Ukraine. Even if Putin wanted to stop, his economy has turned entirely wartime; when the war ends, the country crashes in on itself.
Crashes? How is that possible? You could take the money spent on bombs and do anything with it, including helicopter money or digging holes, and it would be better spent.
The problem is that you're likening fundamentally unlike things. AI isn't like a microwave or an automatic car or a power tool. It does not augment you. As I said elsewhere: AI is not a bicycle for the mind, it's an easy chair. You will lose more than you ever gain.
This is purely a matter of perception. Cooking a meal is a deeply intellectual process. If I buy a meal from a restaurant, yes I am losing a skill. But if making a hollandaise is not a skill I ever need in my life, it's not really a practical loss.
AI is taking problems and putting them in a drawer so we never have to think about them again. Matches de-intellectualized making a fire. A washing machine de-intellectualized doing laundry. These are now solved problems.
Our brainpower spent on them is effectively worth nothing. The only reason we need to learn to make a fire from scratch is for the intellectual satisfaction or for emergency situations. The same reason we would choose to work on the problems that AI can now solve.
It's only a loss if you think the skill and ability you are losing is intrinsically valuable, and the only thing you are going to replace it with is leisure.
>It's only a loss if you think the skill and ability you are losing is intrinsically valuable
What about the skill of learning itself? I would suggest that's one of the most important skills humans have evolved. The more integrated AI becomes in our societies, the more it will automate away potential opportunities for learning. I can foresee a world tightly integrated with AI where people are not only physically sedentary, but mentally as well.
As we progress further into the future, we need more educated people than ever to tackle the exponentially increasing complexities of our society. But AI presents an obstacle that many will never cross due to how convenient it is to skip the messy work of understanding.
Also, this problem is not unique to AI. It existed before the GPTs and Claudes of the world. But it's a problem of scale, and every company on Earth right now is trying to scale AI up as fast as possible.
Here's a practical example: I am using AI to help me with my garden. It's been amazing - it helps me identify plants, identify soil issues, what fertilizer to use and what days to apply it, etc.
What exactly did AI take from me? Spending hours of research on Google and Youtube to glean little incomplete bits and pieces? Calling a yard service?
It's also clearly obvious when AI gives bad or incorrect advice - I am still trying different things and watching for the results.
Coding is an outlier example where AI can just do the work semi-competently without anyone checking it. But I think that speaks more to the nature of coding itself: coding is a means to an end and, for most people, not an actual pursuit in itself.
>What exactly did AI take from me? Spending hours of research on Google and Youtube to glean little incomplete bits and pieces? Calling a yard service?
An opportunity for a deeper understanding of gardening? If you spend hours researching on gardening and come away with an incomplete understanding of what you were attempting to do, I'm not sure that's immediately the fault of the research available. It could be that you just didn't do a good job searching for the necessary information.
In this way, AI can be a boon. It helps you figure out what you actually want to know in the moment. But I think it would be a step too far to say that a smattering of specific questions can replace the sturdy foundation provided by a typical education--e.g. through apprenticeship, books, etc.
>It's also clearly obvious when AI gives bad or incorrect advice
Is it? Isn't this a __core__ problem that researchers around the world are trying to solve? Also, __how__ could you make such a statement unless you already possessed the knowledge ahead of time to make such a judgment? I think it's hard to know if something is bad advice by looking at just cause and effect. It could be that you just lack the understanding to put the advice into practice.
> It could be that you just didn't do a good job searching for the necessary information.
How can you? The existing resources are terrible.
> But I think it would be a step too far to say that a smattering of specific questions can replace the sturdy foundation provided by a typical education--e.g. through apprenticeship, books, etc.
I am not going to go through a college program for my own garden. And I have books! But unless you can read a ton and perform a small research project, you are not going to know how all of the plants in your specific garden, in your specific region, in your specific weather are going to behave.
The best I could do is hire an expert - but again I am learning less by hiring it out.
> Also, __how__ could you make such a statement unless you already possessed the knowledge ahead of time to make such a judgment?
"Use X to kill the moss". It didn't kill the moss. I will now use AI to find a list of alternative things to try to kill the moss, and learn what works in my garden.
The idea that AI is going to make people stop learning is not, I think, borne out in practice. It might make some people stop researching as an activity, though.
> "It's only a loss if you think the skill and ability you are losing is intrinsically valuable..."
I'm fascinated by the AI bros putting hollandaise sauce and making fires on the same level as creating production software. One hopes that it is because they create only very simple software, making the analogy less invalid than it would be for more complex software. If not, the implication is that loss of the reasoning and cognitive ability needed to build foundational software like libraries and frameworks is not important to them.
The only thing that separates homo sapiens from other species is the sapience. Diminishing or atrophying one's own cognitive abilities is the same as climbing down the evolutionary ladder.
I mean, doesn't the fact that people rely on these libraries and frameworks without thinking itself prove the value of intentionally compartmentalizing off skills?
No one is arguing that everyone needs to build programs ground up from assembly. So what's the magic difference between using a framework and asking a computer to write out the for-loops for me?
> making a hollandaise is not a skill I ever need in my life
I know you just wanted to poke at the analogy, but if you like hollandaise, it's one of the easiest and most rewarding sauces to make at home! Restaurant hollandaise is usually terrible.
(Though it's not as easy as a béchamel, and yet I still see people buy jarred alfredo sauces. You can literally make an amazing alfredo sauce with pantry ingredients in less time than it takes to boil the noodles! Why would anyone buy an alfredo sauce!?)
Although this more or less is my point. If people are willing to give up these incredibly high reward, low effort skills - how much more uphill is the battle to make people code and process data?
Now you're getting it! The modern way of life which prioritizes convenience and production destroys human connection. Making sauce is pointless; let's go one step further and make every other thing you might do equally pointless. Welcome to the hellscape! It's surprisingly comfortable.
The other extreme is also a hellscape. Work and suffering are the only things of value. Let's make pyramids to bring people together and show off our collective wealth.
Again, writing replacing memorization is not a good 1:1 comparison to AI replacing technical understanding. Someone still needs to understand what is written and act upon that knowledge. That requires skill and experience in the domain they're working within.
However, a person using an AI does not need to understand the underlying problem to get results. A person can ask Claude Code to write them a web app dashboard without having ever learned JS/CSS/HTML. It does not require them to have skills within a domain.
Also, we need to be honest with ourselves. Human brains did not evolve for the instant gratification of modern technology. We've already seen what technology has done to our attention spans. I am concerned over what further reliance on technology, particularly AI, will do to our brains.
> However, a person using an AI does not need to understand the underlying problem to get results. A person can ask Claude Code to write them a web app dashboard without having ever learned JS/CSS/HTML. It does not require them to have skills within a domain.
This perspective is funny to me because of how much the modern web is already built around web developers refusing to use CSS and PHP. The giving up of the skills happened before the automation.
Dubious. AI psychosis is the opposite. It's about being empowered to explore ideas much further, but with a maladaptive tool designed by reinforcement learning to be an appeaser.
Other than the 14 examples included, not yet; this is brand new, and I am reaching out to the community to create some stuff with it, extend it with angled brushes, new entity types, etc.