from your comment i am pretty confident that you are around 19 years old. it doesnt seem like you actually read my comment. i guess i will respond to you but there is no comment in the world that could set you straight. you need more experience and personal development before you could begin to understand this topic.
someone once said "do not see things as they are, see them as they might be." this quote is really about discoveries and the human tendency to only see things in terms of what already exists. the implication is that humans have a big blind spot for whats next. thats why we need a motivational quote to help us see things as they might be rather than simply as they are. it is true that there has never been a global regulation or ban like the one we are talking about, except maybe the restrictions on ozone-depleting emissions. but by this same metric, AGI can never exist because it has never existed before. its a silly response and a complete waste of time. even if such regulations had already existed for other things in the past, you would still be here saying it was impossible for some other reason. the key here is to make up your mind last, not first.
regulation can stop anything as long as it doesnt break the laws of physics. and, if you had read my comment, i explain why china wouldnt pursue AGI. even if china did pursue AGI, they probably wouldnt be able to crack it. none of the major breakthroughs have come out of china.
"i hope we get AGI as soon as possible. [it will lead to many incredible things]." you have no idea what AGI will lead to. you just cherry pick all the cool stuff that would be possible but totally ignore all of the other implications. there would be an immediate and total power vacuum caused by the advancements. these advancements would be so huge that it would change the geopolitical equation beyond recognition. the concept of a country would probably be economically and geopolitically untenable. there would have to be a transition to an entirely new order where the dominant meta-organisms arent countries but some bizarre AGI conglomerate that looks like an expressionist painting in comparison to what we have now. the transition to this new world, whatever it looks like, would involve war. probably the biggest war that has ever happened. this is intrinsic and unavoidable. it cannot be disproved or denied. the fundamental economic and geopolitical equation that underlies the current equilibrium would change suddenly and violently.
the current world order will disappear, you will probably lose everything you own and everyone you love along with your country. a global war will break out where there is a high chance that all established rules of engagement are ignored. weapons or methods that render the environment unlivable to humans will more than likely be used because the dominant organisms and meta-organisms wont need humans in any practical sense. and after the dust settles and a new equilibrium is reached, the existence of humans will end very quickly (if it hadnt already) because we will offer nothing of value anymore and if our existence presents the slightest inconvenience to the machines, they will allow us to die. and that is just the scenario where they are apathetic towards us. i have not even begun to discuss the repulsive, grotesque nature of our suffering if we ever are the subject of AGI malice. those possibilities are always brushed aside as fear mongering so i dont even bring them up. but they should play into our decision to move forward or not.
at the very best, we will somehow manage to attach ourselves as parasites to the new machine meta-organisms and experience an existence with no agency or purpose other than to ogle at the machines. but that wont happen because the machines will immediately embark on doing things that humans could never, ever understand.
"what if we had 12 fingers [...]." what if indeed. perhaps i was too hasty... no cost is too high in pursuing the deeper mysteries of the universe.
Neither do you? None of us do; in fact, I’d imagine the people trying for AGI right now would have a better guess than you or I.
> there would be an immediate and total power vacuum caused by the advancements. these advancements would be so huge that it would change the geopolitical equation beyond recognition.
This sounds like you’re assuming someone will flip a switch one day and the most powerful mind in history will be let loose. I’m not sure AGI will advance that fast. We might have a lot of incredibly “stupid” iterations of AGIs first, for many years before a clever one rolls around.
> this is intrinsic and unavoidable. it cannot be disproved or denied.
We’re all just making assumptions here, I don’t think yours get to be called “intrinsic and unavoidable”.
I understand the concerns here, but if you’re willing to claim the end of the world, I would suggest basing your claims on something, or at least making your assumptions explicit. E.g. “assuming we achieve AGI, and it’s equipped to rapidly become more powerful/intelligent than the whole of the human population…”
you can predict the behavior of complex systems axiomatically. my predictions are very general because they are axiomatic. the most important axiom in play is that natural selection will guide the development and behaviour of the creatures of the singularity. there may be points of friction that cause small deviations from this path, such as the total effort of all humans post-pandoras-box, but the ultimate shape of things is inevitable. these are assumptions in name only.
there are many possibilities so the idea that we get an outcome that is good for us is unlikely. its just basic probability. i think people get hung up on this because there isnt an example of it to reference in history.
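To make the "basic probability" point above concrete, here is a minimal sketch (my framing, not a rigorous model of the claim): if you can enumerate N roughly equally plausible outcomes and only k of them are good for humans, an indifference prior gives P(good) = k / N, which shrinks as the outcome space grows. The numbers are purely illustrative.

```python
def p_good(num_good_outcomes: int, num_total_outcomes: int) -> float:
    """Probability of a good outcome under a uniform (indifference) prior.

    Assumes every outcome is equally plausible, which is itself a strong
    and debatable assumption -- this is a sketch of the argument, not proof.
    """
    return num_good_outcomes / num_total_outcomes

# As the space of imaginable post-AGI outcomes grows, the share that
# happens to be good for humans shrinks under this prior:
print(p_good(1, 10))    # -> 0.1
print(p_good(1, 1000))  # -> 0.001
```

Of course, the whole argument turns on whether the outcomes really are equally plausible; a critic would attack the uniform prior, not the arithmetic.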
of course AGI will immediately rocket upward. the only way it wouldnt is if it were created in total secrecy and held in perfect captivity forever. laughable. all that is needed is for word to get out that AGI has been created and it would be re-created the next day somewhere else. and one iteration of it would rocket upward. AGI, once created, is intrinsically unstable.
the burden of evidence and proof is on you, not me. we know what things will be like without AGI. it is only right for the people who advocate for the creation of sentient machines to produce evidence that they will not open the doors to a living nightmare. the same thing should have been done with nuclear weapons. it really makes me scratch my head when people demand evidence from me as if i were the one encroaching. you are right, people are only making assumptions when they talk about the singularity. and the idea that we will not bitterly regret the singularity is the most tenuous assumption of all. until they show up with something more substantive i will be firmly against the creation of AGI.
> from your comment i am pretty confident that you are around 19 years old... you need more experience and personal development before you could begin to understand this topic
And from the first paragraph of your comment, I didn't read the rest of it. Have a nice day (or don't).
lets say there was a global coalition of countries that considered the creation or advancement of AGI a material threat to the safety of all humans. countries that pursued advanced AI research would then qualify for retaliation from NATO or other international bodies that might be created. it is clear that AI is only able to move forward in an environment of cheap compute and international academic collaboration. progress in AI would slow to a snails pace if feature size were regulated, total compute were regulated like carbon emissions, and explicit research on AI were banned. it would be an environment of very expensive compute and no mainstream research or collaboration. this would at the very least buy us massive amounts of time. can you say anything substantive to show otherwise?
i have no idea why people go straight to china every time regulating AI is brought up. its something to do with a rudimentary understanding of geopolitics i think. china. the answer to your question is the same answer to the same question regarding any country: a healthy majority of world powers forming a military-backed coalition would definitely stop outliers from carrying out the type of research that is at the leading edge of the current AI explosion. the chinese government is already worried about AGI, so its ironic that everyone imagines them to be the outlier when in fact they would probably be one of the first and most enthusiastic members of such a coalition. any country tempted to resist, and that is highly unlikely given that the need for regulation will become blindingly clear with every surge forward, would much rather cooperate than pursue far-fetched geopolitical strategies that involve AGI. most countries dont even output enough research to qualify for sanctions.
nobody has ever offered a lucid and axiomatic argument that shows regulation cannot work. there are two options, TRY to regulate or face a living nightmare where neither the best nor the worst outcome is even close to acceptable. it is so blindingly obvious that it boggles my mind: the only reasonable, rational response is to try to regulate, slow or stop AGI.
edit: i have a guilty confession. i was looking through your comments. i saw that you said that humans must eat meat to be healthy. i was surprised to see that and i want to tell you that i completely agree with you and i have often tried to explain to people why this is true and its like talking to a brick wall. there arent very many people who seem to get this even though its blindingly obvious. just wanted to give you a little encouragement to keep the fight going on the meat thing.
Haha, yeah that guy had a thing against “carnists”. Deranged.
I don’t think as the tech curve goes up there will be a long enough time period, even with a globally enforced military pact, to stop the rise of the machine. My reasoning is that there are more than enough clandestine organizations and families with a vested interest in pursuing the “power” it brings, a lot of them with a whole lot of control over these countries in our pact.