This book of artwork by Anselm Kiefer reminded me of some things. Whenever people ask the question "can _____ have consciousness", they never actually ask that question. What they are really asking is "can _____ have *human* consciousness", which has nothing to do with the former question. It is an absurd generalization to assume (or rather, to stipulate) that all forms of consciousness must exhibit behaviors or experiences such as "thought", "emotions", and "senses". These are things that humans perceive through human consciousness. Just because something "thinks" differently from what we are used to calling (or experiencing as) "thought" does not (ABSOLUTELY not) mean that _____ does not "think".
The whole AI debate is no more than a grossly exaggerated argument over DEFINITION, nothing else. I hate it when this happens, and the debaters never realize that they are getting nowhere beyond disagreeing on their definitions of "consciousness". No matter how specific, how elaborate, how clever you think your definition is, it is still only a definition. It has nothing to do, nothing at all, with the thing itself. Absurd; just stop. I don't mean to undermine anyone's definition of anything. What good would that do anyone, and what understanding would I ever gain by merely piling up more and more definitions that contradict each other? Why did philosophy have to evolve into this kind of "stuff"?