Music Banter - The Lounge - Your Day (https://www.musicbanter.com/lounge/8425-your-day.html)

SGR 03-18-2021 10:13 PM

Quote:

Originally Posted by adidasss (Post 2166703)
Yes! And I'm not convinced it will ever come to some kind of conflict, it hasn't so far so no reason to believe it will. We create machines to do a specific job, and that's their limitation. So, I don't see a dystopia but a utopia. :)

Who sets the limitations? "It hasn't so far so no reason to believe it will" won't cut it. We haven't reached mass AGI yet.

adidasss 03-18-2021 10:24 PM

Quote:

Originally Posted by SoundgardenRocks (Post 2166704)
Who sets the limitations? "It hasn't so far so no reason to believe it will" won't cut it. We haven't reached mass AGI yet.

Well humans do, obviously. If you program a machine to do something specific, why do we think it will go beyond its programming? I think people have been trying hard to create a true AI for a long time now without success. What makes you think this is a certainty?

The Batlord 03-18-2021 10:25 PM

Quote:

Originally Posted by SoundgardenRocks (Post 2166700)
How will no one own the means of production? You think there's a future in which all these automated machines/factories take care of themselves? You think they'll become self-sufficient and self-serving?

If so, we are no longer in a discussion about political philosophy, but rather in a discussion about survival between man and machine. An automated dystopia.

Alright, let's be perfectly clear and non-stupid here. You are literally talking about the fallout of Terminator. Terminator is, say it with me, a movie. There is literally no rational reason to fear the future you're imagining other than your imagination of... A movie. Stop being stupid.

SGR 03-18-2021 11:03 PM

Quote:

Originally Posted by adidasss (Post 2166705)
Well humans do, obviously. If you program a machine to do something specific, why do we think it will go beyond its programming? I think people have been trying hard to create a true AI for a long time now without success. What makes you think this is a certainty?

I don't think it's a certainty. But if we achieve AGI, then it will go beyond its programming - that's the point. Self-sufficiency - the ability to learn and grow based on inputs external to the original programming.

Quote:

Originally Posted by The Batlord (Post 2166706)
Alright, let's be perfectly clear and non-stupid here. You are literally talking about the fallout of Terminator. Terminator is, say it with me, a movie. There is literally no rational reason to fear the future you're imagining other than your imagination of... A movie. Stop being stupid.

If you want to continue to tie the idea to cheesy '80s sci-fi movies, that's your prerogative, but it's becoming less of a fantasy the farther we move through time and human progression. You like Doom? Well, John Carmack is hard at work on AGI, and he's one of many. The first person to achieve it will be enshrined in the history books, for better or for worse. If you don't want to take it seriously, that's fine. But personally, I believe AI will become more and more of a typical subject for human beings to discuss when it comes to politics and their position relative to society. It is not going away any time soon, nor is it a fantasy. I am not saying there will be a doomsday scenario in which it wipes us off the face of the earth, but a scenario in which it becomes so efficient that it replaces a large segment of our current labor force is not unimaginable.

Lucem Ferre 03-19-2021 01:07 AM

Alienating mostly for the lower classes, who are denied any sort of dignity in a culture that teaches itself that workers are lazy, unmotivated, and undeserving of sympathy. Take the assumption that if we lost our jobs we'd be too lazy to start a socialist revolution - an incredibly ridiculous assumption, not only because it supposes we'd somehow get the funds to live leisurely enough to just binge Netflix and consume junk all day, but also after witnessing how restless people got during a pandemic. The only thing that would ever prevent a revolution is how well our country has gotten its citizens dumbed down and wired to be obedient little workers. Teaching the working class to hate the working class.

jwb 03-19-2021 02:19 AM

Quote:

Originally Posted by The Batlord (Post 2166706)
Alright, let's be perfectly clear and non-stupid here. You are literally talking about the fallout of Terminator. Terminator is, say it with me, a movie. There is literally no rational reason to fear the future you're imagining other than your imagination of... A movie. Stop being stupid.

This take really annoys me tbh

why is it more rational not to fear the robots than to fear them???

yeah terminator is a movie. you know what else is a movie? Armageddon. Doesn't mean a giant meteor can't strike the earth. 2012 is a movie. Doesn't mean climate catastrophe can't wipe us out. Dr. Strangelove is a movie. Doesn't mean nuclear holocaust can't take us out.

What possible leg do you have to stand on in assuming that if we continue the progression towards AI it can't end up backfiring and producing something that represents an existential threat? It's like through sheer hubris humans assume we're just gonna keep using robots as slaves forever and that no matter how advanced we make them they're never gonna turn around and **** us right in the ass with their artificially huge robot dick.

Frownland 03-19-2021 06:25 AM

It's stupid because it's less realistic and shows a general misunderstanding of how AI functions. It's just as much hubris to think we can create something so powerful.

It's more likely that our overreliance on technology would turn fatal and destructive in a Carrington event scenario. Or maybe we'll just die in good ole fashioned heat waves and famines that we're taking care of without some dorky ass computers.

The Batlord 03-19-2021 07:12 AM

Quote:

Originally Posted by jwb (Post 2166735)
This take really annoys me tbh

why is it more rational not to fear the robots than to fear them???

yeah terminator is a movie. you know what else is a movie? Armageddon. Doesn't mean a giant meteor can't strike the earth. 2012 is a movie. Doesn't mean climate catastrophe can't wipe us out. Dr. Strangelove is a movie. Doesn't mean nuclear holocaust can't take us out.

What possible leg do you have to stand on in assuming that if we continue the progression towards AI it can't end up backfiring and producing something that represents an existential threat? It's like through sheer hubris humans assume we're just gonna keep using robots as slaves forever and that no matter how advanced we make them they're never gonna turn around and **** us right in the ass with their artificially huge robot dick.

If you give the nuclear arsenal to a super AI with an emo haircut then maybe. If not then BOOM safe.

SGR 03-19-2021 07:24 AM

Quote:

Originally Posted by Frownland (Post 2166744)
It's stupid because it's less realistic and shows a general misunderstanding of how AI functions. It's just as much hubris to think we can create something so powerful.

It's more likely that our overreliance on technology would turn fatal and destructive in a Carrington event scenario. Or maybe we'll just die in good ole fashioned heat waves and famines that we're taking care of without some dorky ass computers.


jwb 03-19-2021 10:52 AM

Quote:

Originally Posted by Frownland (Post 2166744)
It's stupid because it's less realistic and shows a general misunderstanding of how AI functions. It's just as much hubris to think we can create something so powerful.

It's more likely that our overreliance on technology would turn fatal and destructive in a Carrington event scenario. Or maybe we'll just die in good ole fashioned heat waves and famines that we're taking care of without some dorky ass computers.

Less realistic than what exactly? If it is a distinct possibility and people are actively pursuing the goal of creating AGI, then I don't think it's stupid to look at that and have some level of apprehension at the potential for it to backfire. Especially when, in the same vein, people are pontificating on some sort of futuristic post-scarcity utopia where the technology we built does everything for us. That's no more grounded in demonstrable reality than the Terminator scenario.

In terms of risk assessment, we routinely consider potential threats that are far less likely to happen than not: things like biological warfare, terrorism, nuclear annihilation, etc. Saying something is less likely is not a good argument for not worrying about it at all.

As for understanding of how AI functions, let's be perfectly clear here. There are experts who know far better than you or I how current AI functions, and they fall on both sides of this debate. They disagree with one another on the potential dangers at hand. I would recommend the lecture "Intelligence Stairway" by Jaan Tallinn for a sober case for why this could be a potential existential threat.

The current AI is nowhere near the level of AGI, and we might indeed never get to AGI, in which case, yeah, it's not a threat. It's like if we never got nukes, then nukes wouldn't be a threat. But it does give me pause that there are so many scientists and engineers who are optimistically dedicated to the cause of bringing it into existence.

