I think consciousness isn't a requirement for high intelligence, and the AI agents of the future won't be conscious by default. Conversely, creatures with relatively low intelligence may possess some degree of consciousness.
I see consciousness as a behavioral guidance mechanism, similar to emotions. Emotions, like consciousness, aren't a requirement for intelligence, and strong AI won't have feelings by default. But both consciousness and emotions are quite useful for embodied creatures like ourselves.
I also see consciousness as both a qualitative and a quantitative attribute: there is a threshold a system has to cross in order to be conscious at all (0 to 1 -- a thermostat isn't conscious), and past that threshold, systems may be more or less conscious.
You can follow @fchollet.