Ignore Tesla CEO Elon Musk’s Artificial-Intelligence-Driven Apocalypse: AI CEO

Out of curiosity, what are your general thoughts on the control problem for an AGI/ASI?

I generally agree with people like Stuart Russell and Nick Bostrom that ASI would pose an existential threat to humanity by default, and that we currently don’t know how to make it safe. We don’t know how long it will take to reach AGI, nor how long it will take to get from there to ASI (although I think a hard takeoff is fairly likely). But we also don’t know how long it will take to solve the control problem, so I don’t think we should postpone working on it. Eliezer Yudkowsky asks an excellent question: if it’s too early to start safety work now, when will you know it’s the right time to start?

The uncertainty about the future makes it hard to say exactly how many resources we should invest in A(G)I safety, but I think the least we can do is acknowledge that the problem is real (even if it’s potentially far away).

Seems reasonable for an AGI but impossible for an ASI to me.

Maybe. It’s not that hard to come up with categories of solutions that would work on entities below a certain level of intelligence, but wouldn’t work on smarter ones. (Intelligence isn’t single-dimensional, but it’s easier to talk like this.) Obviously it’s possible to imprison some human-level intelligences (we do it all the time), but if intelligence is the cognitive ability to solve problems, there should (at least theoretically) be a level of intelligence above which the problem of escaping the prison can be solved (unless escape is literally impossible).
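To make that threshold argument concrete, here is a minimal toy sketch (my own illustration, not from the exchange; the names can_escape and escape_difficulty are hypothetical, and intelligence is collapsed to a single number purely for illustration, as noted above). It models escape from a containment scheme as a problem with a fixed difficulty: any scheme that isn’t literally impossible to break has some capability threshold above which an agent solves it.

```python
# Toy model (illustrative assumption): treat "intelligence" as a single
# scalar capability, and escaping a given containment scheme as a problem
# with a fixed difficulty on the same scale.

def can_escape(agent_capability: float, escape_difficulty: float) -> bool:
    """An agent escapes iff its capability meets the escape difficulty.

    escape_difficulty = float("inf") models a scheme that is literally
    impossible to break: the only case with no threshold at all.
    """
    return agent_capability >= escape_difficulty

prison_difficulty = 100.0  # hypothetical difficulty of this containment scheme

for capability in (10.0, 99.0, 100.0, 250.0):
    print(capability, can_escape(capability, prison_difficulty))
# Only agents at or above the (unknown) threshold escape, so a scheme that
# contains human-level agents tells us nothing about sufficiently smarter ones.
```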

I don’t know if it’s possible to solve the control problem for arbitrary levels of (super)intelligence, but I certainly hope so. Even if we can successfully control the first AGI we build, there will be incentives to make it smarter, up to the (unknown) level at which it becomes too smart to control, so this doesn’t seem like a sustainable situation to me. Luckily, value alignment doesn’t seem obviously impossible to me.

