03-18-2021, 11:24 PM | #74462
Slavic gay sauce
Join Date: Apr 2005
Location: Abu Dhabi
Posts: 7,993
Well, humans do, obviously. If you program a machine to do something specific, why do we think it will go beyond its programming? People have been trying hard to create a true AI for a long time now without success. What makes you think this is a certainty?
__________________
“Think of what a paradise this world would be if men were kind and wise.” - Kurt Vonnegut, Cat's Cradle. Last.fm
03-18-2021, 11:25 PM | #74463
Zum Henker Defätist!!
Join Date: Jan 2011
Location: Beating GNR at DDR and keying Axl's new car
Posts: 48,199
Quote:
__________________
Quote:
03-19-2021, 12:03 AM | #74464
No Ice In My Bourbon
Join Date: Mar 2010
Location: /dev/null
Posts: 4,325
Quote:
Quote:
Last edited by SGR; 03-19-2021 at 12:11 AM.
03-19-2021, 02:07 AM | #74465
Cuter Than Post Malone.
Join Date: Sep 2015
Posts: 4,978
Alienating mostly for the lower classes, who are denied any sort of dignity in a culture that teaches itself that workers are lazy, unmotivated, and undeserving of sympathy. Take the assumption that if we lost our jobs we'd be too lazy to start a socialist revolution, which is an incredibly ridiculous assumption, not only because of the idea that we'd somehow get the funds to live leisurely enough to just binge Netflix and consume junk all day, but also after witnessing how restless people really got during a pandemic. The only thing that would ever prevent a revolution is how well our country has gotten its citizens dumbed down and wired to be obedient little workers. Teaching the working class to hate the working class.
__________________
Quote:
Art Is Dead. Buy My ****.
03-19-2021, 03:19 AM | #74466
Account Disabled
Join Date: Jul 2019
Posts: 4,403
Quote:
Why is it more rational not to fear the robots than to fear them? Yeah, Terminator is a movie. You know what else is a movie? Armageddon. Doesn't mean a giant meteor can't strike the earth. 2012 is a movie. Doesn't mean climate catastrophe can't wipe us out. Dr. Strangelove is a movie. Doesn't mean nuclear holocaust can't take us out. What possible leg do you have to stand on in assuming that if we continue the progression towards AI, it can't end up backfiring and producing something that represents an existential threat? It's like through sheer hubris humans assume we're just gonna keep using robots as slaves forever, and that no matter how advanced we make them, they're never gonna turn around and **** us right in the ass with their artificially huge robot dick.
03-19-2021, 07:25 AM | #74467
SOPHIE FOREVER
Join Date: Aug 2011
Location: East of the Southern North American West
Posts: 35,541
It's stupid because it's less realistic and shows a general misunderstanding of how AI functions. It's just as much hubris to think we can create something that powerful in the first place.
It's more likely that our overreliance on technology would turn fatal and destructive in a Carrington event scenario. Or maybe we'll just die in good ole fashioned heat waves and famines that we're managing all on our own, without some dorky ass computers.
__________________
Studies show that when a given norm is changed in the face of the unchanging, the remaining contradictions will parallel the truth.
03-19-2021, 08:12 AM | #74468
Zum Henker Defätist!!
Join Date: Jan 2011
Location: Beating GNR at DDR and keying Axl's new car
Posts: 48,199
Quote:
__________________
Quote:
03-19-2021, 08:24 AM | #74469
No Ice In My Bourbon
Join Date: Mar 2010
Location: /dev/null
Posts: 4,325
Quote:
03-19-2021, 11:52 AM | #74470
Account Disabled
Join Date: Jul 2019
Posts: 4,403
Quote:
In terms of risk assessment, we consider potential threats that are far less likely to happen than not to happen all the time: things like biological warfare, terrorism, nuclear annihilation, etc. Saying something is less likely is not a good argument for not worrying about it at all.
As for understanding how AI functions, let's be perfectly clear here. There are experts who know far better than you or I do how current AI functions, and they fall on both sides of this debate; they disagree with one another on the potential dangers at hand. I would recommend the lecture The Intelligence Stairway by Jaan Tallinn for a sober case for why this could be a potential existential threat.
Current AI is nowhere near the level of AGI, and we might indeed never get to AGI, in which case, yeah, it's not a threat. It's like if we never got nukes, then nukes wouldn't be a threat. But it does give me pause that there are so many scientists and engineers who are optimistically dedicated to the cause of bringing it into existence.