Almost everything you can do on your own is a "solved problem". Why go into woodworking if you can buy an Ikea stool? The point of hobbies isn't to solve problems - that's called a job - but to learn and have fun.
Find a niche where you can resist the temptation to constantly compare yourself to eight billion other people on the internet. Something where success isn't measured in Github stars, Youtube likes, or Reddit upvotes. Once you get in that mindset, almost anything goes. I know people who collect RPN calculators and are having a blast. All kinds of hands-on crafts are great too. I like making electronic music and I'm pretty bad at it.
“When I was 15, I spent a month working on an archeological dig. I was talking to one of the archeologists one day during our lunch break and he asked those kinds of “getting to know you” questions you ask young people: Do you play sports? What’s your favorite subject? And I told him, no I don’t play any sports. I do theater, I’m in choir, I play the violin and piano, I used to take art classes.
And he went WOW. That’s amazing! And I said, “Oh no, but I’m not any good at ANY of them.”
And he said something then that I will never forget and which absolutely blew my mind because no one had ever said anything like it to me before: “I don’t think being good at things is the point of doing them. I think you’ve got all these wonderful experiences with different skills, and that all teaches you things and makes you an interesting person, no matter how well you do them.”
And that honestly changed my life. Because I went from a failure, someone who hadn’t been talented enough at anything to excel, to someone who did things because I enjoyed them. I had been raised in such an achievement-oriented environment, so inundated with the myth of Talent, that I thought it was only worth doing things if you could “Win” at them.”
I can attest to this, in my own way. I had a head start since nobody ever pushed me; my parents weren't competitive fuckups, and so neither was their child.
I do climbing, hiking, skiing, ski touring, diving, and weightlifting; I used to do paragliding and am now trying to pick up windsurfing. I don't do any of these sports at a high level, I just keep them roughly the same.
Climbing is a good example since every climbing route is graded precisely for overall difficulty. Over the years I've even dropped a bit from the levels I used to do. I just go to the gym or the rock and enjoy even the simplest routes that some consider beginner-level, despite having done the sport for 15 years. Many folks around me keep pushing themselves hard to progress; I just don't have that bug.
Exactly the same could be said for weightlifting: I found my set of free-weight numbers and just keep doing them. If I feel extra strong, I may do a few extra reps, but that's it. I was even mocked here for saying this, but who cares, I do it for myself and not to impress or play stupid, empty status games with others. It's very sustainable that way too.
That sounds awesome! I basically do all the same sports, but I always need to win or to get better at them. I am trying to move in your direction though.
That’s a dour outlook. I don’t cook to be the best at it, I cook to feed my friends. I don’t sing to be the best at it, I do it so I can have fun at karaoke. I don’t boulder to climb mountains, I do it because it makes me feel healthy. I don’t bike to win races, I bike because it’s the fastest way to get around my temperate city. I’m a worse cross-stitcher than my sister but I like that she teaches me. I go to music festivals because it’s the only place you can easily dance for 8 hours straight. I’ve never taken a dance lesson but regularly am asked to join in because I’m having fun!
I totally agree with what you've written - "comparison is the thief of joy" is one of my favorite mantras.
That said, hobbies can be remarkably useful because they allow you to create or engage in something that is uniquely tailored to your own personal interests, and the modern economy often doesn't provide that level of personalization, or if it does it's extremely expensive. E.g. the other commenter that decided to design his own clothes because mass produced clothing is really just tailored to "average".
What defines hobbies is not what or why you do it, but when you do it. The phenomenon of hobbies has been mulled over quite a bit - activities that engage many disciplines of an actual job, activities that for other people actually are jobs. The key difference in the nature of a hobby is that we do the activities on our own schedule - we are in no way compelled to do them but have complete ownership over the execution.
> I understand the sentiment, but.. do you realize how much more expensive that would make all these services?
It wouldn't. For example, before Gmail, email was often free or nearly free (bundled with your internet service), but in most cases, you could talk to a human if you had issues with the service.
What we couldn't do is turn these business models into planetary-scale behemoths that rake in hundreds of billions of dollars in revenue. In essence, you couldn't have Google or Facebook with good customer support. I'm not here to argue that Google or Facebook are a net negative, but the trade-offs here are different from what you describe.
I find the production and consumption of AI music to be uniquely... anti-human. You can make utilitarian arguments for most other uses of AI. For example, the code you're generating didn't exist before, and it would take serious time or money to write it. So, I get it, the economic argument is compelling enough.
But music? There's basically an inexhaustible supply of human-created tracks that can be accessed for next to nothing. Millions upon millions of them, in every conceivable style, for every conceivable mood. There's nothing you gain by listening to AI music day-to-day, so what's the argument for it - other than utmost indifference to human creativity?
Or the uno reverse - that's what the anti-AI crowd is experiencing in their inability to adjust to the coming reality.
You're just going to have to make peace. I don't know how y'all can cope with being angry at progress all the time. It's not going to stop for you. It's also really awesome that we live to see this come to pass.
> As far as I'm concerned we're content scarce and I don't care what makes the music - humans, robots, netherworld demons - I just want good music.
Presumably you've already listened to every piece of music ever recorded? Otherwise it seems it would be more efficient to do that first than wait for AI to generate it and you chancing upon it.
I think humans are machines, they are just vastly more advanced than any machine invented by humans. This is something I thought long before the current AI hype cycle.
What do you think are some important differences between machines and humans?
At an abstract enough level, not really. I treat them with care and try to give them whatever they need to do their thing. I want them to last as long as possible.
But in asking this question you must have some differences in mind. Could you speak to some of those?
What is the non machine part? What do you believe exists other than chemical and electrical systems?
Edit: If you mean machine in a more colloquial sense that's fine. Let us first get clear if we mean machine in that sense or in sense of any physical mechanism.
If the question is what is there about us that's not covered by the body, we can mention things like: feelings, intentions, perceptions, acts of consciousness.
Or however else you want to divide up things that have to do with the mind.
Eliminativists/illusionists may completely deny such things. The rest can fall into many camps, some of them religious.
It's not like there are any surprising new parts. It's about how one chooses to interpret/conceive those we are familiar with.
Another question to help me understand what you think of this. I think you agree with me, but just to clarify: a human being is independent of the process of creation, right? If we created a molecule by molecule synthesis of a human being, you'd agree it is conscious and the same thing as a human created via typical reproduction, right?
And what part remains in that space after we have mapped all the brain signals and configurations corresponding to these feelings, intentions, and perceptions? I don't feel the need to bring up absurd, unproven concepts before waiting for more data. It'd be like me saying there is something aphysical behind Mercury's orbital perturbations if I were born before SR and GR were discovered (as an example). There's no point jumping to such an argument without first exhausting more believable causes. History is very strongly against any kind of bet on the aphysical.
My question to you would be, what do you think remains that's not a simple natural system if/after something like Neuralink is successfully established?
Forgive me if I ramble for too long. I've been seeing a lot of comments in this vein and the thoughts have accumulated.
Tacit in your question is the notion that the inquiries that are important are those that can result in predictive models of phenomena encountered in the world — hence feelings, intentions & perceptions turn into a shorthand for reported accounts of the same — and that given enough reports (data), we could build a dictionary that maps a bundle of reports to a(n equivalence class) of physical system(s).
But when we speak of having feelings, or acting on intentions, most often we are not using these as stand-ins for our failure to pin down the current state of our physical system to another. If I am exposed to fire, I want to get away — I am unconcerned with how well I could translate my report of the pain to a pattern of neural activations. The reality of pain for me is unaffected by the fidelity of my "experience report dictionary". And it is there whether it's a brush fire or a Neuralink streaming fire bits to my cortex.
If you decide that primacy ought to always be given to things as they can be modeled, you can choose to elevate the "experience report dictionary" and make the reality of experience a second-class citizen. Then you end up with an eliminativist ontology where indeed, we can rightly be called a mechanism.
But that is a "world-making" decision, a value judgement: "this is how things should be seen". It might be sponsored by our recent history, where we got high on the fruits of applied scientific modelling, nursed by the education which taught us that being a good engineer can have us continue in line with that, and pushed on us by impoverished modern eschatologies promising eternal youth, experience machines and what-not at this point. And it might seem preferable or more dependable than whatever equally impoverished, inhumane eschatologies we may have been presented with before.
It doesn't mean there isn't a whole world of places where we can go instead. But in general, we don't change our value judgements until the current one seems inadequate for some reason.
> If we created a molecule by molecule synthesis of a human being, you'd agree it is conscious and the same thing as a human created via typical reproduction, right?
Eternal youth and experience machines don't seem like problems with any conceptual difficulties. We already know electrical and chemical signals change what the brain perceives, and eternal youth is no more difficult a concept than making any other form of long-lasting machine. Obviously there is a long sequence of research problems to solve along the way, but none of it is conceptually impossible or blocked.
Yes, so that was my point: if we can agree that a molecular synthesis of a human being, being a purely naturalistic physical process, is as good as any other human, then if we assume some aphysical element to consciousness, we have a purely physical process for achieving a system with aphysicality in it. Which means either it's not in fact aphysical, or we are left with the question of at what point during this assembly process this new special aspect arises.
It's my feeling that we are still getting ahead of ourselves in judging there to be some supernatural element; it's much like the atomism question in ancient Greece. An honest thinker back then could have no really firm reason to support one side over the other, and they tended toward these kinds of endless circular metaphysical discussions. That is, until we had further data and observation tools that settled the question experimentally. Just like certain aspects of consciousness, atomism felt like an unsolvable question in some ways back then. I feel the problems we will have with consciousness will eventually meet a similar fate. This bet has succeeded for millennia so far.
Oh that's what you're banging on about. You think AI is like a demon, or you think LLMs are people too, something like that, hence "I don't care what makes the music". That would otherwise be a spooky and implausible phrase that says something strange about what gives music quality, as if quality in music is something ethereal and mathematical and objective and detached from the human condition, and detached from artists. But if you think the AI counts as a person too then it seems less cold and abstract.
(Belatedly) yes. Kind of a big argument to grapple with, but let's start by considering everything. I mean, all the stuff, the abstract stuff, that's out there objectively in the universe and in the future, waiting to be discovered. I believe there's quite a considerable amount of it. It's all potentially of interest to us eventually, and only a teeny tiny part of it is comprehensible to us now. That part is at the leading edge, the cutting edge of our enquiries, and in order for us to see and comprehend and even care about that part, it has to relate to us. It has to be oriented to us and our thoughts and things we can use.
You see what I'm getting at? Humans don't really like abstract things. Mathematicians seem to, but I doubt that even mathematics truly has an objective abstract quality that's distant from human concerns. I reckon humans do human mathematics, and it probably has fashions, too, it's probably modern and current, that is, of its time and place.
So you could accept that, but still claim that music relates strongly to mathematics as we know it. Of course there's such a thing as the mathematics of music. I could dispute the value of that to the quality of the music, as being too abstract and niche compared to the evocative qualities of music, where it evokes things in our physical world: the sounds of hitting things with sticks, heartbeats, tones of voice, meaningful instruments such as bugles evoking battles, mazy noodling around evoking contemplative thoughts (is that abstract?) ... but either way, the point is that we live in a sort of parochial Bag End, if Middle Earth represents everything abstractly possible, and so we only understand hobbit things and only appreciate hobbit art. So to speak.
Sometimes you can't even tell. I was on an Uber ride where the driver had this incredible playlist of Brazilian bossa nova. It was sublime, with some of the best tracks I've ever heard. He even said he loved the singer but couldn't find their name anywhere. It turns out it was a YouTube playlist that was fully AI-generated, and genuinely some of the best bossa nova you can imagine. I still listen to that playlist daily, tbh. Moreover, imagine you're an independent musician with a good voice who knows how to play instruments... you could ask AI to generate hit tracks for you, then play them at concerts or shows and claim them as your own.
There isn't that much good electro-swing made by humans, and not much new coming out. One can easily consume it all and want to hear some new tunes in that genre, and maybe AI can help with that.
I guess we've had different experiences then, because youtube has had no problem showing me huge amounts of electro-swing in the past (before AI-generated music was a thing). I've somewhat moved on from that genre though
I think that's probably the crux of the conflict here. There was a time in my life when I was definitely much more emotionally invested in the music I listened to. I thought I'd definitely kill myself if I ever went deaf. But these days, I really just have it on as background noise while I'm working, exercising, or doing chores. And it's all just electronic stuff – I don't like vocals (unless they're sufficiently unintelligible that they don't become a distraction to my thinking). At the end of the day, it's just some beats to me. AI or not.
I can recommend spending some free time really listening to music again: Beethoven, Hendrix, Gorillaz, Slayer, Sub Focus, whatever floats your boat. Your brain is wired to remember and sing along to music around a campfire, and it will pump you full of exquisite drugs if you really give in to it, ideally together with other people. It alleviates stress and makes you happy.
Music demoted to just background noise is unrelated to the social concept of music, which is so ingrained in our nature that we all can’t escape it. And that to me is also why I agree with OP—AI-generated music is fundamentally treason to our species.
Is formulaic pop music produced by a corporate label that's designed to push all the right buttons more "human" than the average track you find published on Suno? I wouldn't say so. Pop music was already to some extent a commodity.
Actually, it is more human, because there are humans involved at each level. Doesn't matter if you think the music sucks, it's definitionally more human than AI music.
I feel like the more important distinction might be whether the creator(s) are expressing themselves or are solving an optimization problem of maximizing audience approval. The latter seems true for both some human and AI pop songs.
One is a form of communication that requires (at least to some extent) both sides to meet in the middle; the other is unidirectional broadcasting.
It is sort of a blend now. Beats and rhythm tracks are often generated. Vocals are auto-tuned. There's still some humanity in it, but it's not what it used to be.
I mean, maybe in the sense that any other corporate activity is technically “human activity” because humans happened to be the ones doing the formula-dictated tasks, but it's ultimately the formula at the helm, not the human.
AI music is generated from the result of training on far more human-made music than any human could ever consume in their lifetime, so there are even more humans involved in its creation.
> Pop music was already to some extent a commodity.
The commodification of humanity predates human history. It may be a negative trend that alienates us from each other and from the products of our labor, but it is truly ancient.
Electronic music exists but has limited commercial scope because most people don't see the point of music if they can't form an emotional connection with the artist through the music. Popular music has an intense focus on the artist.
AI "music" has the same issues as electronic music but worse: because it's trying to imitate humans rather than be its own thing like electronic music, it's not only emotionally unavailable but also creepy. Can you imagine listening to an AI "musician" laughing, for instance? It makes my skin crawl even thinking about it.
That's a dangerous game to play, though—the only value record companies have is their intellectual property, especially if they are no longer financing recording new material. Convincing people to listen to slop is a great way to completely obsolete yourself.
Not only that, but music generated by AI is not copyrightable. If it's truly 100% AI generated, you can redistribute it to your heart's content without infringement. (IANAL)
Someone will surely attempt some kind of end-run around this, perhaps through ToS alterations at the service you obtain the music from, but it's undoubtedly a problem for the labels. In the meantime they have a strong incentive to keep human creativity in the loop.
To me the anti-AI crowd is looking at this through the wrong lens, it's now possible to generate an infinite library of music that isn't copyrighted, and can be freely shared, some of which is quite good. There is a pathway all the way from conception to mass distribution that doesn't require the major labels. Whatever else happens that seems like a silver lining at least.
If you consider, say, elevator music - music that's just there to fill space, rather than to be listened to - then I don't think there's much difference between using AI to produce it and using AI to produce clip art or boilerplate code.
Music as wallpaper vs music as artistic paintings.
We are fine with mass-producing wallpaper with machines. People buy this every day, no problem.
We are not fine with mass-producing framed paintings that are "art".
Both hang on the wall as decoration. Essentially the same purpose. But we have very different feelings about them and hold them to very different standards.
Music is the same. We have muzak - background music that isn't supposed to be listened to, it's just wallpaper. I don't think many people object to this being machine-made in bulk. And then we have music that is art and is supposed to be listened to explicitly. We hold this to a higher standard and expect it to be the product of human creative urges.
It depends entirely upon who the "we" is in question. There has long been an aristocratic tantrum against affordable decoration in the art and architecture world, dating back to men's formal wear going mostly monochromatic as soon as colors became widely affordable instead of reserved for the gentry. There were similar ones against ornamentation with Brutalism (mixed with dadaist 'the world doesn't deserve art!' post WWI despair memes).
The cynical would dismiss the whole distinction between mass produced and unique art as arbitrary. Or worse, just as a racket to create artificial scarcity, a social kabuki show to create the pretension of high culture, or for the purpose of some sort of criminal scheme like money laundering.
Well, code and visual art are more differentiated, so the thing you need probably doesn't exist, and it would take effort and money to procure it. Not always, but often enough to make rational people default to AI.
With music... if there's a style you like, no matter how eclectic, there are probably thousands of matching human-recorded tracks you can listen to today.
I guess using AI is just the logical continuation of what mainstream pop already did before that: reduce music to the lowest common denominator so it can appeal to as many listeners as possible. AI only speeds up that process.
I remember when Apple got ridiculed for running a commercial where they crushed a bunch of musical instruments and artist supplies into the shape of an iPad.
I guess if AI companies did the same, they would be crushing people into the shape of an input prompt.
> There's basically an inexhaustible supply of human-created tracks that can be accessed for next to nothing
You train an AI on that, in order to create something that combines all of the best parts that you want. If anything, I think AI music is the natural progression of innate human desire to leverage and "stand on the shoulders of giants" to create something bigger from smaller pieces.
Which is of course nonsense, because an LLM is by definition unable to bring in anything new. It's not standing on the shoulders of giants, it's just making endless copies of them.
That is trivially untrue, even if we ignore the misnomer of using a language model for non-linguistic audio output. I can assure you there was no reference material of, say, Sam Altman getting arrested after being caught stealing GPUs from a shelf at a Best Buy. (One of the uses of Sora.)
I used Suno to reimagine a handful of my old demos late last year, and honestly the results floored me. I could never release those tracks, though, purely out of shame. But it seems pretty practical to study the AI remixes to understand what I like about them, and to use them as a practice tool for music production.
It's not that people want to listen to AI music, per se. According to the article, this artist charting was part of an April fools gag. It's about ego, or maybe hubris. People think their idea for a record is good, but don't want to learn musical composition. Instead, they put blind faith in AI generation. Gen AI is more for the idea men unwilling to put in the effort than the consumers.
Because human singers will usually sing about what they like. They will use their own life experience and imagination to write and sing songs. Other people may or may not like them.
AI will only sing songs that other people like, so AI singers will naturally attract more listeners.
You’ve hit upon a bit of a paradox inherent in music - the average listener gives next to no shits about human creativity or the artistry and hard work that goes into being a musician capable of releasing music. They can’t even comprehend it, so they don’t. Music is something that comes out of a speaker the same as water is something that comes out of a tap.
It isn’t indifference, it’s obliviousness. My mother keeps listening to AI music, and I’ll be like “why are you listening to this slop,” and she’ll argue back that it isn’t AI, it’s actually really very good and I’m just jealous, as the synthetic voice continues to warble nonsense like a fucking arcade machine full of snakes in the background.
It’s an even more uncomfortable truth: your average Joe cannot tell the difference between human made music or AI generations, just as they also really think that that 8 year old African boy with a huge beard and no hands built a helicopter out of old bottles, or that that cat walked into a hairdresser wearing a suit and had its whiskers curled.
So there’s no argument for it apart from “people will buy the product because they can’t tell that it isn’t real”.
While I don't like AI music, "Millions upon millions of them, in every conceivable style, for every conceivable mood" is simply not true. There is very often a gap that forces me to open up Ableton and make edits.
> I find the production and consumption of AI music to be uniquely... anti-human.
I mean, I'm a professional musician - not sure if that gives me more credibility or less - but I don't feel slighted by folks listening to music made by others (whether those others are other humans, or birds, or whales, or AI).
As you point out, music has an infinite edge; one can spend a lifetime exploring either its niches or its closures and still have an infinity of each to continue discovering.
As moat identification goes, I do feel slightly secure in the sense that AI music (and the information age generally) seems to stoke a hunger for dirty traditionals played well on thick steel strings, and it's going to be a minute before robots can pick 'em like we can.
This sounds a tad misanthropic, if I had the choice to opt out of working full time making music is one of the primary things I'd spend my time doing. I like software but at the end of the day to me it's the most creative job I can do while still putting bread on my table reliably.
The reasons I don't do music full time are purely economic ones, far from wanting to 'free up' my time to do other things with AI music I'd rather have more of my time occupied by working on music. I want AI to automate the things I don't want to do, I want it to automate the mindless drudgery that is required to exist in a society. Automating art so that I have more time to work is a philistine position in my view, and one which reveals a somewhat dystopian vision of humanity's relationship with both art and work.
What an insane take. You don't have any songs you like that have few others like them, and that you could generate an endless supply of with AI? Come on.
I sure do love the death throes of human-creativity chauvinists. AI art, AI video, and AI music will eclipse most humans, and there is absolutely nothing that will stop it. And you will use it and appreciate it more too. Once you open your eyes, that is.
Have you heard of dubstep? It sounds like robots falling down stairs, and humans made it and love it. If AI can make music less crappy, I'm all for it.
Eh. It doesn't start or stop with people like Altman, Zuckerberg, or Nadella. I think it's a symptom of a broader problem in tech. Half the people on this site made a decision to work at companies that do shady things, and they did that to maximize personal wealth.
The difference isn't that the average techie doesn't dream of making a billion by any means necessary; it's that most of us don't think we have a shot, so we stick to enabling lesser evils to retire with mere millions in the bank.
I don't think it's all that hard to avoid working on anything shady. It's not as easy to avoid being associated with anything shady due to widespread cynicism and a tendency to treat tech companies with thousands of projects as a monolith.
> The difference isn't that the average techie doesn't dream of making a billion by any means necessary
I hope that's not true. If it is, we live in a bleak world indeed.
I can confidently say I've never once dreamed of having billions. I've never wanted billions, not even in a fanciful manner. What would I do with that money? Buy mansions and megayachts? That's loser stuff.
Most of what I want out of life cannot be bought. The pieces that do come with a price tag, like a comfortable home, do not require billions.
I think only sociopaths want billions because they don't understand spending your life seeking things that actually matter, like family and human connection
There are three possible paths that sort of substantiate current valuations:
1) Business: LLMs become essential to every company, and you become rich by selling the best enterprise tools to everyone.
2) Consumer: LLMs cannibalize search and a good chunk of the internet, so people end up interacting with your AI assistant instead of opening any websites. You start serving ads and take Google's lunch.
3) Superhuman AGI: you beat everyone else to the punch to build a life form superior to humans, this doesn't end up in a disaster, and you then steal underpants, ???, profit.
Anthropic is clearly betting on #1. Google decided to beat everyone else to #2, and they can probably do it better and more cheaply than others because of their existing infra and the way they're plugged into people's digital lives. And OpenAI... I guess banked on #3 and this is perhaps looking less certain now?
I'm increasingly unsure if this is something to aspire to. I make an effort to only follow people I know, and I turn off algorithmic feeds on social media, but it doesn't matter because the people I know routinely reshare made-up political bait and AI slop coming from the broader ecosystem.
This sucks and there's no way to push back on it. First, if you do it too much, you're just a "reply guy" - you become part of the same suckiness of social media that you're trying to push back against. Second, the near-universal reaction you get is "maybe these specific immigrants were not eating pets, but you gotta agree with my broader concern about immigration". This is just an example; the reaction has its equivalent on all sides of the political spectrum. We just like to read stuff that aligns with our political identity and beliefs. The pursuit of truth is a distant second.
I think that for social networks or forums to be at least somewhat healthy, they need to be small, specifically to limit the interactions you have with complete strangers and content that doesn't interest you at all. If you open up the ecosystem too much, it devolves into some flavor of Facebook.
I have rarely been in a group chat that suffers the same problems I see on all the other social networks (Google+ and its "Circles" seemed promising while it lasted...), but it could be because leaving a single chat is easier than leaving an entire network, and the group is well-defined. Federation is good, but it's not enough on its own. If I think back far enough, I do remember email chain letters, with people forwarding everything to everyone in their Eudora address book:
> Now for 180. Forward stupid chain letters to as many people as you can.
> Remember: Be annoying whenever possible
It seems to me that if a given social media network is not an effective way for you to connect with someone, then try something else. Expecting one platform to handle all our social connections is unreasonable. Some people live on Discord, others prefer a phone call, etc. A world with everyone on IRC would be convenient, but probably also a nightmare once someone figured out how to make money off it.
Yeah, I agree that small social networks are better. But some people are just bad at using social media - even if they're great people in other ways - so they share AI slop and made-up political bait posts. You may have to curate your feed a bit.
> Everyone who depends on the good graces of a cloud provider for something (not just Google, but Amazon, Microsoft, Apple, whatever) needs to at the very least, take a moment, and figure out what their plan is when they are suddenly banned and locked out permanently, without any way to contact the company.
This is one of the most common sentiments I hear expressed on HN, next only to "if you're not building your software business around Claude Code, you're gonna get left behind".
It probably has to do with the fact that we condition children and adolescents to consider white-collar jobs as more noble than blue-collar jobs, then we tell them that to get a good white-collar job, they need a degree... and then we make STEM degrees hard by subjecting students to more math than most people realistically need. So we have a lot of frontend developers who know calculus and an oversupply of people with humanities degrees.
With that degree, you're generally pushed toward jobs in journalism, publishing, graphic design, teaching, administrative functions, and so on. Most of these pay relatively little.
Calculus is required for English degrees in other countries. Heck, a lot of countries require some amount of calculus just to graduate high school.
Same goes for the basics of statistics. A basic understanding of statistics is a requirement for any college degree in many countries, and for good reasons. Stats comes up all the damn time. From proper A/B testing, to marketing, to understanding public health emergencies, to making informed medical decisions.
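As a sketch of how basic that statistics can be: a two-proportion z-test, the workhorse behind a simple A/B test, fits in a few lines of Python. (The conversion numbers below are made up purely for illustration.)

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: both variants convert equally."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 120/1000 vs. A's 100/1000: looks better,
# but is not significant at the usual 5% level.
z, p = two_proportion_z(100, 1000, 120, 1000)
```

Understanding why that 2% lift doesn't clear the bar is exactly the kind of statistical literacy that carries over to reading public health numbers too.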
6 semesters seems like... a lot? IIRC getting a math undergrad at my Uni didn't require that many classes of calc.
I think calc 1 and 2 are extremely valuable. The concept of rate of change is fundamental to so many things in life, and understanding "area under the curve" is essential to understanding how many ideas are communicated, including lots of graphs in physics, chemistry, and economics.
Beyond that I feel calculus starts getting into specific applications and is less generally applicable to the populace at large.
I don't think it's a matter of more 'noble', simply a more comfortable option if it's available to you. It has historically paid better and taken a lower toll on your body. The former is now less true, but the latter is still a big issue.
It's a shame that calculus isn't required by every college degree. Just because I'm not integrating functions during my normal work, doesn't mean I don't benefit from understanding the fundamental principles.
Yes, totally. I was about to undergo surgery but found out the doctor didn't even know about Laplace transforms. He small-mindedly spent his formative years learning anatomy, never benefitting from the knowledge of frequency-domain derivatives. I dodged that bullet by storming out.
Would you say the same about studying Christianity? It may not be directly useful for your job, but it is rather foundational to much of English society.
My current hypothesis: as AI forces software development down less and less deterministic pathways, the value of a basic CS degree will diminish relative to humanities training. Comfort with ambiguity, an ability to construct a workable "theory of mind", and the skill of writing unambiguous natural-language prompts will become more relevant than grokking standard algorithms.
The reverse most certainly is not true, and even if it were it wouldn't matter.
Humanities advocates have been hoping for the demise of valuable STEM degrees for at least the last 30 years. It's not happening, for many reasons, among them: all the skills you listed are also taught in an engineering or rigorous CS curriculum, plus those degrees provide validation that the individual is intelligent and determined enough to complete coursework that most people cannot.
I dunno, man. The difficulty (and resentment of having to even take them) most STEM majors had in my college-level writing classes causes me to doubt that, as does the general reaction on this board to any kind of problem / domain with irreducible ambiguity. But look, I'm not talking about the top ~10%, or whatever: the really smart kids can adapt to whatever gets thrown at them[0]. I'm doubtful that a 50th-percentile or below CS degree / student will retain the value they've recently had - and given what I read on here about the present job market for new grads, that's maybe already happening.
Anyway, if I had to pick one, my money'd be on philosophy degrees rising in value: they're already sought out by financial firms. Have you seen the sort of analytical / symbolic reasoning they do?
[0] In fact, in case you didn't know, rigorous humanities programs and research involve an awful lot of statistics and coding, even though the dinosaurs that run the MLA and most English departments aren't able to handle it.
> I dunno, man. The difficulty (and resentment of having to even take them) most STEM majors had in my college-level writing classes causes me to doubt that, as does the general reaction on this board to any kind of problem / domain with irreducible ambiguity.
I don't think most STEM majors would be outstanding English Literature (or whatever humanities program you prefer) majors, but I do think they could manage to obtain a degree. Very, very few humanities majors could get an engineering degree.
And yes, the writing classes they force engineers to take are largely pointless and not enjoyable. Everyone with a degree got through them though, and I have to imagine the percentage of STEM students who washed out on that and not organic chemistry, compiler design, differential equations, etc. is extremely small (it was 0 out of the hundreds of people I knew at my school).
> But look, I'm not talking about the top ~10%, or whatever: the really smart kids can adapt to whatever gets thrown at them[0].
Sure. Very few of these kids are going into publishing, because they'll have more lucrative options and will pursue them.
> I'm doubtful that a 50th-percentile or below CS degree / student will retain the value that they've recently had - and given what I read on here about the present job market for new grads on here, that's maybe already happening.
That may be, but they're still in better shape than a 50th-percentile humanities degree holder, who is also having the value of their skillset eroded by AI.
> Anyway, I had to pick one, my money'd be on philosophy degrees rising in value: they're already sought out by financial firms. Have you seen the sort of analytical / symbolic reasoning they do?
Lol, they are not "sought out" in any sense of the word. Philosophy majors at top tier schools are sought out because everyone at the school is sought out, not because they majored in philosophy.
And yes, I took a number of philosophy classes in college as an undergrad because they were easy (have you seen the analytical/symbolic reasoning required of EE or CS majors? It's a lot more difficult than what is required of philosophy majors).
> [50th percentile CS grads] are still in better shape than a 50% percentile humanities degree holder, who also is having the value of their skillset eroded by AI.
That's the crux of it, and right now it appears to me that the ability to write unambiguous natural language prompts - in a variety of contexts, not specifically heavy-duty dev work - is going to be increasingly valuable. The 50th percentile english / philosophy grad is better at that than the 50th percentile CS major - while, at the same time, the bottom rungs of the developer ladder appear to have been kicked out.
I'm trying very hard not to make this into a "who's smarter?" question. That's a well-trodden and pointless argument, particularly if money is going to be the measuring stick. Besides, if that's where we're going, the finance bros and C-suite win, and do either of us think they're the geniuses in the room?
But, we'll see. We're living in Interesting Times.
> That's the crux of it, and right now it appears to me that the ability to write unambiguous natural language prompts - in a variety of contexts, not specifically heavy-duty dev work - is going to be increasingly valuable. The 50th percentile english / philosophy grad is better at that than the 50th percentile CS major - while, at the same time, the bottom rungs of the developer ladder appear to have been kicked out.
We don't agree here. I see no evidence that the average humanities major is better at writing unambiguous natural language, nor that it will be a particularly valuable skill. Most people are incapable of understanding and describing a complex series of steps, including their side effects and tradeoffs, regardless of the language used to describe them.
> I'm trying very hard not to make this into a "who's smarter?" question. That's a well-trodden and pointless argument, particularly if money is going to be the measuring stick. Besides, if that's where we're going, the finance bros and C-suite win, and do either of us think they're the geniuses in the room?
That's my point, there's no avoiding this. Standardized test scores used as part of college admissions are intelligence tests and income is highly correlated with intelligence. We have all these proxies that are providing the answer to this question.
And the hedge fund managers and CEOs of large companies are very intelligent on average (I'm sure some aren't but they are the outliers, not the other way around). Just like there are some very intelligent social workers, artists, and unemployed people, but the averages are what they are for various fields for a reason.
> I see no evidence that the average humanities major is better at writing unambiguous natural language
If you'd marked enough undergrad papers you would have. :-)
> Most people are incapable of understanding and describing a complex series of steps, including their side effects and tradeoffs regardless of the language used to describe them.
That's true!
But... The AI promise is that users won't have to do all of that part. They'll describe an end-state, and the machine will work out the steps needed to get there, asking clarifying questions along the way. If that's true, then skills like writing and interface design and "taste" and all the other "non-engineering" parts of making things rise in importance relative to the engineering skills that have been handed over to the machines.
That's a big "if", of course, and the machines aren't there yet, but that's what's promised. If it comes to pass, then I like my prediction (for, at least, the 50th percentile of both groups). If not, not.
It's a meaningless, feel-good rule. Every country has countless carve-outs. To give you a trivial example: in the US, you can't get a passport if you owe more than $2,500 in child support.
Whilst I agree, to be fair, a passport is usually only needed when entering a country, not leaving one, right? Under the cited rule, the US needs to allow you to leave, not help you in entering some other country.
That's mostly because transport companies have to pay to ship you back if you get turned away at the border, so they will want to see your permission to enter your destination country before you leave. I've traveled internationally a fair bit and I've never had to show my passport to government officials when leaving the US.
Would they do that for an international departure? They know where you’re flying, and I’d think they’d just tell you to stop being an idiot and show them the passport you obviously must have. But policies can be weird, so maybe not.
Yes, that's what I said above. The US government doesn't give a toss, but the airline has to fly you back if you're refused entry at your destination, so they will do their best to ensure you have the documents you need.
Ok, fair enough, but if I were German, I don't really think I would seek asylum anywhere on the basis of Germany maybe intending to conscript me in the future.
You generally do present your passport when leaving. Most places give you an exit stamp (which matches your entry stamp), and they usually confirm things such as you not overstaying a visa. For example:
- overstaying in Thailand results in an on-the-spot fine
- China lately has exit checks when traveling to SEA (they try to intercept people traveling to scam centers)