Hacker News

There may be financial value in being early (if you're lucky), but there are other values in waiting.

My goal in life is not to maximize financial return, it's to maximize my impact on things I care about. I try to stay comfortable enough financially to have the luxury to make the decisions that allow me to keep doing things I care about when the opportunities come along.

Deciding whether something new is the right path for me usually takes a little time to assess where it's headed and what the impacts may be.




> My goal in life is not to maximize financial return, it's to maximize my impact on things I care about.

In the vast majority of cases, financial returns help maximize your impact on the things you care about. Arguably, in most cases it's more effective for you to provide the financing and direction but not be directly involved. That's why the EA guys are off being quants.

The only real exceptions are things that specifically require you personally, like investing time with your family, or developing yourself in some way.


I knew this canned rebuttal was coming and almost addressed it in my previous comment.

I've not found this to be true at all, for a variety of reasons. One of my moral principles is that extreme wealth accumulation by any individual is ultimately harmful to society, even for those who start with altruistic values. Money is power, and power corrupts.

Also, the further from my immediate circle I focus my impact on, the less certainty I have that my impact is achieving what I want it to. I've worked on global projects, and looking back at them those are the projects I'm least certain moved the needle in the direction I wanted them to. Not because they didn't achieve their goals, but because I'm not sure the goals at the outset actually had the long term impact I wanted them to. In fact, it's often due to precisely what we're talking about in this thread: sometimes new things come along and change everything.

The butterfly effect is just as real with altruism as it is with anything else.


But you're not supposed to accumulate the wealth, you're supposed to forward it to your elected causes.

Being a quant is inherently accumulating and growing someone's wealth for them, even if it's not your own.

If there were a way to be a true Robin Hood and only extract wealth from the wealthy and redistribute it to the poor, I'd call that a noble cause, although finance is not my field (nor is crime, for that matter) so it's not for me.

My chosen wealth multiplier is working at a community-owned cooperative, building the wealth for others directly.


Not sure about this because many charities are designed to spend their income, rather than hoard it. A big part of choosing which charity to donate to is, or should be, how effective they are in spending what you give them.

I mean, I'm not arguing that if you can find a way to make a large amount of money ethically without enriching yourself or the wealthy further, and then find a way to accurately evaluate charities to maximize impact, you shouldn't do that. But there are several very difficult problems embedded in that path, and I could easily see just solving all of those problems becoming a full-time job by itself.

I also, candidly, haven't ever seen anyone successfully do that.


I didn't realize maximizing money is the way to achieve moral excellence. It's interesting how Puritanical the EA folks are

There is no moral excellence but which you invent for yourself. But given the first principle or goal of 'having the most impact', maximizing money is often quite useful.

Or, utilitarianism

> The only real exceptions are things that specifically require you personally, like investing time with your family, or developing yourself in some way.

So, the things that matter the most for most people?

Studies pretty consistently show that happiness plateaus at relatively modest wealth.


That's not their stated goal. Their stated goal is to maximize impact, not their own happiness.

Impact is nebulous. For example, Zuckerberg has had impact but it’s been almost entirely negative. The world is a worse place for him having existed.

Impact having a sign (positive or negative) doesn't make it nebulous.

> That's why the EA guys are off being quants.

Or in prison for fraud.


I want to cure lung cancer, therefore as an Effective Altruist™ I maximize my income by selling cigarettes to children outside playgrounds. The money will go towards research in my will, and in the meantime the incidence of lung cancer in teenagers will incentivize the free market to find a cure!

People don't become quants because they are EAs, they become EAs to justify to themselves why they became quants.


Being a quant is not that interesting and if you're not redirecting the money you're not really an EA, are you?

Your first paragraph is just a standard response to utilitarianism, although a poor one because it doesn't consider EV.

Nonetheless I'm not quite sure why merely mentioning EA draws out all these irrelevant replies about it. It was incidental, not an endorsement of EA.


> Arguably in most cases it's more effective for you to provide the financing and direction but not be directly involved. That's why the EA guys are off being quants.

The EA guys aren't the final word on ethics or a fulfilling life.

Ursula K. Le Guin wrote that one might, rather than seeking to always better one's life, instead seek to share the burden others are holding.

Making a bunch of money to turn around and spend on mosquito nets might seem to be making the world better, but on the other hand it also normalizes and enshrines the systems of oppression and injustice that created a world where someone can make $300,000 a year typing "that didn't work, try again" into Claude while someone else watches another family member die of malaria because they couldn't afford meds.


Nobody is asking about ethics or a fulfilling life. We are talking about maximum _impact_.

Impact only has meaning per a chosen framework to measure within. For example, if I apply my ethical system to measure the impact of an EA, they have essentially no impact, since all they do is perpetuate a system that is the root of the problems they're trying to solve.


