I find this concept quite interesting: "the seriousness with which you attend to a matter isn’t always best evaluated in terms of the amount of time you spend on it, or the extent to which it exhausts your attention."
I’d never heard that lightbulb joke! So good.
It makes me laugh every time!
Love these five-things pieces.
Two brief thoughts on Nagel:
1) If you look at sensory substitution and sensory expansion work in recent years, I think we're getting closer on the purely experiential side. (The ontological side not so much, but that may be a categories thing.)
2) Nagel's essay has been very popular for the past fifty years, and I fully expect his all-timer essay streak to keep going. Questions of AI consciousness are doubtless going to run into his arguments. How can we tell, folks? What does it feel like to compute hard enough to be conscious, anyhow? Could we ever possibly know?
thank you!
I have no idea if Nagel read that poem but it probably doesn’t matter. I’m pretty sure the pioneering work of Donald Griffin was a more immediate influence on choosing bats as an example.