
When we build AI, why not just keep it in sealed hardware that can't affect the outside world in any way except through one communications channel with the original programmers? That way it couldn't get out until we were convinced it was safe. Right?

