(2018-01-19) Rao Intellectualism In A Digital Milieu

Venkatesh Rao: Intellectualism in a Digital Milieu

...typically have upward-oriented metaphoric understandings of intellectual culture: head in the clouds, up in their ivory tower, ideas flying overhead. But intellectuals themselves -- and I'm using the term as a non-pejorative label -- tend to have a downward or inward oriented understanding of their passion for thought and ideas: digging below the surface, drilling down, diving deep, getting to the root of things, getting to the heart of the matter, looking beyond appearances.

If you favor the "up" metaphors, you're fundamentally anti-intellectual even if you're really good at thinking, or even a stable genius. If you favor the "inward" or "downward" metaphors, you're fundamentally an intellectual, even if you suck at it.

If you get no pleasure out of thinking, you will primarily be sensitive to its rewards: in terms of wealth, power, status, credentials, or class membership.

Ideas and thinking do have practical potentialities. But to assume ideas are about their practical value, or that thinking is a "tool" to get at that value, is to make a big mistake.

If on the other hand, you do get pleasure out of thinking, the experience of the process itself, rather than its rewards or consequences, will be front-and-center.

Since much of the process happens inside your own head, your natural orientation metaphors will be inward and downward, towards a frontier inside your own head.

Historically, the attraction of the world of ideas has always been the possibility of finding, in that world, a secure source of meaning away from the ups and downs of "real" life. A promised land, a Xanadu or El Dorado inside your own head.

Of course, once you dive deep enough, you find that the world of ideas is not a refuge for the mind, or stable source of meaning. It is closer to a psychological torture chamber. But by then you're hooked.

Are computers intellectual or anti-intellectual?

At a surface level, the obvious answer is that they're anti-intellectual.

But today, when you look at the fascinating things going on in the minds of computers, whether it is deep-dreaming, or discovering Go strategies no human ever considered, I wonder.

Computers have unplumbed depths to their thinking that might turn them into intellectuals addicted to exploring them. What does this mean for us?

Fears of a paperclip-maximizing AGI are something of an anti-intellectual projection bogeyman. To get past those fears, ask yourself: what would an intellectual AGI want?

Need evidence of that likely shift in relationship structure? Among humans, appreciating other species beyond seeing them as food, threats, anthropomorphized proxy babies, or pests tends to be an intellectual thing to do.

Even when we have the ability to wipe out a species either intentionally or accidentally (and we've wiped out many in the past), these days, we tend to see that as a bad thing. We can expect an intellectual AGI to reach a similar conclusion.

AIs broaden and deepen the world of the thinkable, and if you have an intellectual orientation, there is no love-hate dynamic. It's an unambiguous love.

