I think consciousness isn't a requirement for high intelligence, and the AI agents of the future won't be conscious by default. Conversely, creatures with relatively low intelligence may possess some degree of consciousness.
I see consciousness as a behavioral guidance mechanism similar to emotions. Emotions, like consciousness, aren't a requirement for intelligence, and strong AI won't have feelings by default. But both consciousness & emotions are quite useful for embodied creatures like ourselves.
I also see consciousness as both a qualitative and quantitative attribute: there is a threshold that a system has to cross in order to be conscious at all (0 to 1 -- a thermostat isn't conscious), and past that, systems may be more or less conscious.
You can follow @fchollet.