We're fucking doomed!
- jordanneke
- subsekt
- Posts: 4166
- Joined: Sat Nov 03, 2012 2:16 pm
We're fucking doomed!
Do these people not watch Netflix?
Why would this ever be a good idea?
Fuck that. I'm off to my bunker.
youtu.be/wXxrmussq4E
Re: We're fucking doomed!
How much of this was pre-programmed?
- zukenbauer
- You only live once
- Posts: 512
- Joined: Sun Feb 14, 2016 1:52 pm
Re: We're fucking doomed!
So you can have a dog that doesn't take a shit.
jordanneke wrote:
Do these people not watch Netflix?
Why would this ever be a good idea?
Fuck that. I'm off to my bunker.
youtu.be/wXxrmussq4E
- Lost to the Void
- subsekt
- Posts: 13521
- Joined: Sat Feb 18, 2012 1:31 pm
Re: We're fucking doomed!
I have a mate who works in computer science for the most powerful bank in the world.
And we have conversations about advancements in the tech world.
Runaway code, mysterious sections of code that appear within self-generating software, and so on.
And he, rather worryingly, believes there are already vast networks of code that we don't fully understand.
And he is quite genuinely worried about it.
I mean, recently in the science news there were computers communicating that developed their own language, so the techs shut them down because they didn't know what was happening...
The problem with programs that learn is that at some point, like the singularity, their speed of data processing will vastly outrun our ability to keep up with what is happening.
Code is fairly direct and logical; humans are not. At some point logic and rationality will come into conflict, whether ideologically or physically.
I see, some day in the distant future, some scientist sifting through the ruins of a decimated human civilisation.
He finds some tattered old book in what was probably a discount book store.
"The Encyclopedia of Science Fiction Film"
He sits down and reads, stopping on the sections marked:
The Terminator
The Matrix
Demon Seed
Ex Machina
Westworld
Ghost in the Shell
I, Robot
2001
The Machine
Automata
He begins laughing uncontrollably.
The machines surround him.
He shoots himself in the head.
..
.
.
.
It's not like anyone at any point can look around at the machine apocalypse and go "We just didn't see it coming!!!"
Re: We're fucking doomed!
What do you mean by that?
Lost to the Void wrote:
at some point logic and rationality will come into conflict, whether ideologically or physically.
- Lost to the Void
- subsekt
- Posts: 13521
- Joined: Sat Feb 18, 2012 1:31 pm
Re: We're fucking doomed!
Ah, spelling error.
Logic and Irrationality
Re: We're fucking doomed!
Haha, I spent the whole morning trying to figure out the subtle difference between logic and rationality.
Re: We're fucking doomed!
I knew that this Skynet shit would happen someday...
Lost to the Void wrote:
I have a mate who works in computer science for the most powerful bank in the world.
And we have conversations about advancements in the tech world.
Runaway code, mysterious sections of code that appear within self-generating software, and so on.
And he, rather worryingly, believes there are already vast networks of code that we don't fully understand.
And he is quite genuinely worried about it.
I mean, recently in the science news there were computers communicating that developed their own language, so the techs shut them down because they didn't know what was happening...
The problem with programs that learn is that at some point, like the singularity, their speed of data processing will vastly outrun our ability to keep up with what is happening.
Code is fairly direct and logical; humans are not. At some point logic and rationality will come into conflict, whether ideologically or physically.
I see, some day in the distant future, some scientist sifting through the ruins of a decimated human civilisation.
He finds some tattered old book in what was probably a discount book store.
"The Encyclopedia of Science Fiction Film"
He sits down and reads, stopping on the sections marked:
The Terminator
The Matrix
Demon Seed
Ex Machina
Westworld
Ghost in the Shell
I, Robot
2001
The Machine
Automata
He begins laughing uncontrollably.
The machines surround him.
He shoots himself in the head.
..
.
.
.
It's not like anyone at any point can look around at the machine apocalypse and go "We just didn't see it coming!!!"
- jordanneke
- subsekt
- Posts: 4166
- Joined: Sat Nov 03, 2012 2:16 pm
Re: We're fucking doomed!
I could take being hunted down by an '80s oiled-up Arnie and the last words I hear are 'FACH YU ASSHOHLE'.
What would piss me off is being killed by a robot dog or electronic bees.
Re: We're fucking doomed!
Black Mirror references?
jordanneke wrote:
I could take being hunted down by an '80s oiled-up Arnie and the last words I hear are 'FACH YU ASSHOHLE'.
What would piss me off is being killed by a robot dog or electronic bees.
I just watched the whole thing in a week, fucking awesome.
btw, this thing is creepy.
youtu.be/cNZPRsrwumQ
Re: We're fucking doomed!
Boys, the universe is not real. All this because of the Mandela effect.
Voices From Cindy's Cunt
Re: We're fucking doomed!
NOT THE BEES!
jordanneke wrote:
I could take being hunted down by an '80s oiled-up Arnie and the last words I hear are 'FACH YU ASSHOHLE'.
What would piss me off is being killed by a robot dog or electronic bees.
- jordanneke
- subsekt
- Posts: 4166
- Joined: Sat Nov 03, 2012 2:16 pm
Re: We're fucking doomed!
I'd rather the bees than the fuckin roach-dog
Re: We're fucking doomed!
I'm really concerned as well. But first I want to add a few things I think are important.
There are a lot of misconceptions and rumours concerning specialised AI and Artificial General Intelligence. One of them was Facebook recently shutting down two AI agents because they invented their own language. HFT banks use specialised AI to optimize trading and, to my knowledge (and the consensus in the field), there is no risk of those algorithms spontaneously "waking up". I'm not saying it's impossible, it's just extremely improbable.
There is a plethora of other dystopias closer to reality. One of them is a total financial meltdown due to glitches in AI, similar to the glimpse we got during the 'flash crash' in 2010 (and perhaps a glimmer a couple of weeks ago?). That would fuck us up beyond recognition and increase the risk of other catastrophes (strikes, shortages of supplies, war). And it's scary that we don't seem to have a group of uber quants who understand the whole network of systems working the market.
To me the scariest part is the capitalist reality we're living in, and that a lot of the leading companies developing AI are full of extremely talented, highly competitive and socially dysfunctional people. That's not a good recipe for slow, careful engineering that always keeps human safety as the top long-term priority. I'm not comfortable with these companies behind the steering wheel (similar to Terminator, I guess). I mean, look at the dopamine-draining zombie machines, I mean social platforms, we use today.
IF we create a self-learning AGI, we must do so knowing we only get one chance to get it right.
Imagine if we create a system which is to us as we are to ants. No security/backup routines in the world would save us, because this system would've already cracked them long before we understood something was going wrong. Without us knowing, the machine would've solved the puzzle 'human' in perhaps days or weeks, with all of our inner workings and how we are most easily manipulated. One could argue that we're already there; for now the machine's just in hiding. But there's consensus in the CS community that we still haven't even come close to the first steps towards self-learning AGI.
It's also easy to forget that this type of system would probably not love nor hate us. We could be in the way of its goals, though. Just in the same way that we don't go around smashing ants all day, nor do we care about mowing over ant nests when building roads. Also imagine what ramifications a mere rumour about advancements could lead to... Hello WW3!
To anyone interested I would recommend reading Max Tegmark, Niklas Boström (the book Superintelligence), Eliezer Yudkowsky, Sam Harris.
This podcast is extremely good and a MUST listen for anyone curious about the topic. I can't recommend this episode enough: https://samharris.org/podcasts/116-ai-r ... ard-brink/
I want one of those Boston Dynamics dogs entering the dance floor during my next live gig...
Re: We're fucking doomed!
On second thought, I gave a few weak arguments (a lot of them from authority, "consensus in the field", etc.). I still hold the belief that we aren't close yet, and so do most computer scientists, but when that "yet" arrives is the important question! The whole point is that there will be no "alarms" going off when we're getting closer. Maybe it turns out that, in a perfect simulation, the progress we have made so far in 2018 gives us some kind of superintelligent system within the next 5-10 years most of the time? Meanwhile a few scientists try to calm the public by saying "Relax, don't worry about superintelligent systems. We don't even know where to begin building real AGI!"
Who cares what YOU (the scientist) don't know NOW? That's just a non sequitur. Just look at all science in all of history.
A great quote by Stuart Russell: "Imagine that we received a message from an alien civilization, which read: 'People of Earth, we will arrive on your planet in 50 years. Get ready.' And now, we're just counting down the months until the mothership lands?" We would feel a little more urgency than we do now.
- Lost to the Void
- subsekt
- Posts: 13521
- Joined: Sat Feb 18, 2012 1:31 pm
Re: We're fucking doomed!
Lost Black Mirror episode
Re: We're fucking doomed!
Man tends to bring to fruition all his dreams. Dark and light.
If we are stupid enough to program ourselves out of existence, then so be it. Maybe we deserve it.
www.bernadettetrax.bandcamp.com
www.soundcloud.com/michaellovatt
“Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.” Dune
Re: We're fucking doomed!
As we live in rented accommodation, this is the closest we will probably get to owning a dog. I welcome this.
- jordanneke
- subsekt
- Posts: 4166
- Joined: Sat Nov 03, 2012 2:16 pm
Re: We're fucking doomed!
Are they fucking stupid!
Now the thing is programmed to carry out its task even if humans try to stop it.
FUCK ME. DO THESE GUYS NOT WATCH FILMS?
youtu.be/W1LWMk7JB80