AI is a simulation (artificial). Even if you did the acrobatics of saying there could be II (inorganic intelligence), there's nothing to suggest self-awareness could transfer from organic to inorganic, and there's no way to distinguish between a simulation that is self-aware and one that merely has the "appearance" of being self-aware, so I don't see how anyone could claim to have made a self-aware machine.
If you believe that people have spirits, then the question is also moot. But if you don't believe that people have spirits, there's no way to prove that your memories transferred into anything other than a simulation of yourself. If you consider yourself "you", then something else would not be "you", even if it had the appearance of being "you".
Even if you believe that consciousness arises from the synapses, and is therefore a simulation or illusion to begin with, there's still no way to prove that you would "transfer".
You can't say "there is a singularity" - you can only hope, if that's what you hope for. Others hope there is a god. It's a matter of faith both ways, and a reflection of wanting to escape death.
I don't think there's any harm in believing in the singularity for entertainment value, but as for providing a door to immortality, I think the main danger is simply distraction. While all the effort and discussion and research is spent on something there is no way of proving -- in the meantime, 20,000 children die each day from preventable diseases.
Even if you are a rational egoist, I invite you to consider what you'd do if a child was dying right in front of you. Would you help them? The fact is children are at arm's length, courtesy of our mobile devices, with which we could place more attention on relieving suffering than in trying to achieve immortality by flipping a coin on an unproven allegation.
So the fact is, because we could choose to act, rational egoists included, we are genocidal in our indifference. The first step in facing this is to admit it, and then to try and take steps to do something about it.
I invite anyone who seriously considers a singularity to set it down, and put some effort into the relief of suffering. When we have preventable diseases and poverty figured out, then let's revisit the singularity. I'd enjoy working on it then.
> I don't think there's any harm in believing in the singularity for entertainment value, but as for providing a door to immortality, I think the main danger is simply distraction. While all the effort and discussion and research is spent on something there is no way of proving -- in the meantime, 20,000 children die each day from preventable diseases.
> I invite anyone who seriously considers a singularity to set it down, and put some effort into the relief of suffering. When we have preventable diseases and poverty figured out, then let's revisit the singularity. I'd enjoy working on it then.
The amount of money currently being spent on singularitarian pursuits is negligible compared to what is spent on development aid, or healthcare.
Furthermore, one of the necessary enablers of the singularity is faster computers. Regardless of the sustainability of Moore's law, it is economic competition between chip-makers that has driven the continuous exponential increase in computing power over the last decades. As long as it's physically possible, and as long as the chip market is not a monopoly, that increase is going to happen anyway; the singularity would be nothing but a byproduct of these market forces, and would not come at the cost of disinvestment from charitable pursuits.
As a species, it's nice to have a portfolio of possible futures - and it's nice to know that the singularity road is being probed by some people.
I think the most precious thing we have is time, not money, because our time is finite. When you consider the potential of the human race and of computing power to help spare daily genocide, there is clearly an opportunity to do better. If a person is driven to spend ever more time pursuing the innovations that could theoretically lead to a singularity, what would that person say to a stadium full of children who are going to die the next day? I wouldn't know what to say. I'm not sure they'd be comforted by assurances that a byproduct of the pursuit of the singularity is economic momentum that employs people, as good as that may be.
Perhaps, though, the attendant momentum of deep-mind projects might result in AI or II being put to the task of humanitarian relief. But my guess is that any such system would probably ask, "what should the priority be for preserving life?" -- and if the prioritization were done by consensus, I'm guessing that most people would prioritize preserving life over pursuing the singularity. For example, I think most people would want a self-driving AI government to prioritize their own survival over the singularity. And if that were the case, and AI were tasked with helping to "load balance" priorities of ethics, efforts, solutions and systems, I wonder if it might not admonish us to make changes in our lives.
I'm not self-righteous, I'm self-unrighteous. I'm just beginning to question my general acceptance of wherever technology goes, wherever I spend my time. I don't presume to tell people what to do, but I do propose that the ethic of choosing whether to relieve a suffering child right in front of you may be self-evident to some people -- and if so, that realizing children are at arm's length, that we can do something about it, may convince some people, myself included, to think about "re-balancing" priorities.
I read fiction regularly. I would be a hypocrite to judge anyone for "not spending enough time relieving suffering". Yet overall I question how content and comfortable I am with life. I guess in a portfolio of effort, I hope that the allocation of assets in my own life and others would place a primary emphasis on sustainability that includes the relief of suffering. I think that's what 20k kids dying each day says to me. I think that's what they'd post to Hacker News, if they could.