Over the last twenty years, intellectuals have received increasingly rough treatment from the public and the media. The title of intellectual has, over time, become associated with negative stereotypes: elitism, disengagement from reality, immorality and more. It's worth asking whether intellectualism has been driven to extinction, or whether it can experience another renaissance like the one it enjoyed in the 19th century, when philosophers such as Schopenhauer and Nietzsche were beginning their careers.
There was a time, not too long ago, when thinkers were admired. By thinkers, I don't mean people who read scholarly papers or spent countless hours in seminars discussing theories of physics and mathematics, though some no doubt did. What I'm referring to is people who put real effort into educating themselves and thinking critically about issues.
Intellectualism aims to improve society through informed, rational thought: to understand how and why things work as they do, and eventually to make them better. That's what it means to be a thinker, to contemplate ideas and concepts deeply enough to come up with something new, something unique and valuable, whether through critical analysis or original thought (or both).
Unfortunately, nowadays, it seems as if intellectualism has become more of a forgotten relic than anything else—especially amongst young people (who are supposedly more educated than ever before).
In a society that is increasingly atomized and individualistic, it's easy to dismiss utopian ideas as antiquated and irrelevant. But let us not forget that there was a time when intellectual thought drove societal movements toward idealism and the betterment of humanity as a whole. Movements like socialism, feminism and environmentalism have inspired change in societies all over the world.
As people seek meaning in their lives amid political turmoil, social unrest and financial uncertainty, it is easy to become disheartened, and easier still if we give up on using intellect to push back against repression and oppression. We must remember that even though we may be fighting uphill battles with ideals that now seem outdated, those ideals were once revolutionary.
To say that intellectualism is dead would be to ignore those who came before us and paved the way for many of our freedoms. We can use intellect to inspire change in ourselves and others by creating conversation about what matters most to us at any given moment. Perhaps then we can raise a new generation of intellectuals who will keep pushing for societal progress through meaningful dialogue.
When it comes to education, our priorities may need some readjusting. To further education and cultivate intellectual thought in today's society, we should focus more on developing rationality than on rote memorization. The ability to reason has been at the forefront of human development for thousands of years; when ancient Greek thinkers challenged the received ideas of their day, logic, philosophy and science were born.
Today, a good grasp of logic is still important—and crucial to success in many fields—yet, we don’t place enough emphasis on teaching children how to think critically or rationally. The current educational system focuses too much on facts, figures and dates—not enough on developing critical thinking skills that will be useful throughout life. We could stand to spend less time teaching students what happened and more time helping them understand why things happened, or why they didn’t happen. The point isn’t to know everything there is to know, but rather to learn how to find out.
But isn't memorization important? Not necessarily. Though rote learning (memorizing information by repetition) was once considered an effective way to learn, educators now largely agree that it isn't, and that it can actually hinder long-term retention. Research on the testing effect, for example, suggests that retrieval practice improves retention over time even when it doesn't immediately raise test scores. Moreover, test scores do not accurately predict adult accomplishment or success.
In a world that increasingly values technology, efficiency and progress over empathy, community and spirituality, religious fundamentalism will only grow in popularity. Some scholars say religion is humanity’s attempt to make sense of a chaotic and uncertain world; when life becomes too stressful or chaotic, many people turn to religion for answers about how to live better lives.
The danger with fundamentalism is that it can give rise to an us versus them mentality where members believe they are superior because of their faith—meaning they’re inclined to dismiss or denigrate those who don’t believe what they do. For example, some fundamentalist Christians see themselves as part of God’s chosen group while others (Muslims, Jews, atheists) are not. But no matter which side you fall on politically or religiously, there’s no denying that fundamentalism will continue to be a driving force in modern society. This doesn’t mean we have to agree with all aspects of these groups or even like them—but we should at least understand why so many people are drawn to such extreme beliefs.
The main reason print media has been declining in influence is its inability to adapt to changing technology and social norms. The world has become increasingly digital and technology is growing at an exponential rate, yet we are still told that if a story isn't on TV or in print, it must not be worth reading or watching. We now live in a world where news stories can travel around the globe almost instantaneously, yet major news corporations choose to release their stories late in the evening, when most people are asleep (consider major events such as elections).
While these corporations have remained fairly passive about allowing themselves to become obsolete with time, there have been small trends that suggest they will eventually fade away from relevance altogether. It’s likely that many of them will be replaced by more flexible internet-based organizations. There is also a significant generational gap between millennials and baby boomers, which may contribute to why millennials seem to place less value on print media than older generations do.
Millennials are often accused of being lazy and unproductive because they spend so much time using their phones instead of doing real work like researching information in books or newspapers. However, it could also be argued that millennials are merely responding rationally to how information is distributed today: why read something when you can just look up what you need online? Why pay for something when you can get it for free online? If millennials don’t value printed material anymore, then maybe there really isn’t any point in continuing to produce printed material anymore.
Of course, some things likely will never change no matter how far technology advances. People will always want to read novels and nonfiction books, although I believe many would agree that magazines and newspapers are outdated. They haven't changed much since they were first introduced hundreds of years ago, and there has been no indication that anyone cares enough to keep them alive in a rapidly changing society. In fact, if magazines went extinct tomorrow, I think most people wouldn't notice until weeks later, when they went looking for one while grocery shopping or waiting at the doctor's office, only to find they weren't sold anywhere nearby anymore.
If ever there was a form of intellectual tyranny in today's world, it is when governments and religious leaders dictate what people must think and how they must act if they want to belong to the group, or to avoid being penalized by the authorities. Yet we need only look at history to see that authoritarian regimes are not new; total control over individuals, requiring citizens to give up their rights in exchange for protection, is as old as civilization itself. But how does such a requirement make sense in our 21st-century democratic societies?
Isn't freedom an essential part of any democracy? Not necessarily! Democracy, too, is ancient: in Greek, demos meant "the people" and kratos meant "power." In other words, democracy means power to the people. It's about giving every citizen an equal voice and influence over political decisions. The two most common forms of government are democracies (where power lies with all eligible voters) and autocracies (where power lies with one person). The latter has existed for thousands of years but has rarely been popular with those it rules.
Democracies have existed since 500 BC but were rare until after WWII (and even then autocratic systems were more common). Nowadays, however, more than half of all countries in the world are democracies—although many still have significant problems with corruption and inequality within society.
"Of the twenty-two civilizations that have appeared in history, nineteen of them collapsed when they reached the moral state the United States is in now."
(Arnold J. Toynbee, April 14, 1889 – October 22, 1975)
What about intellectuals, then? Should they be free to say whatever they like? Shouldn’t everyone have a right to express their opinions? After all, isn’t that what democracy is all about—free speech for everyone? Yet, what if those opinions are dangerous, offensive, or just plain wrong?
Shouldn't public figures who hold positions of responsibility (teachers, scientists, politicians and so on) set good examples and show respect for other members of society by avoiding hurtful comments and promoting tolerance instead? Or should each individual be allowed to express themselves freely, without fear of sanction or censorship from those who disagree with them or feel offended by them? These are complex questions without easy answers.
At least one thing seems clear: freedom of speech doesn’t mean you can say whatever you like whenever you like without consequences!
In 2022, it can be argued that technology and its associated buzzwords are largely responsible for a shift in how society understands and interacts with itself. The internet, smartphones, cloud storage and social media, arguably the most important communication tools in human history, have evolved into a collection of interrelated services that allow people to connect with one another around activities, interests and even news stories. Those services, in turn, require resources that run on servers across multiple geographical regions, aggregating data from all of them and adding untold value to the businesses and research projects that rely on these networks in their day-to-day operations.
As you might imagine, such an environment is ripe for exploitation by nefarious individuals who wish to gain access to sensitive information or disrupt critical operations. While security professionals have long been aware of the threats such an environment poses, many organizations still may not fully understand what they're up against now that worldwide reach puts so much power in the hands of individuals.
We live in a society where more and more people are becoming influencers: persons or organizations with a wide reach, whose opinions and choices set trends, shape opinion and move markets. The rise of social media has only accelerated this phenomenon; Twitter influencers can promote brands far removed from their original audience.
Today, as individual thought leaders play larger roles in public opinion than ever before, intellectualism must adapt to function within 21st century marketing tactics. Therefore, it is important to understand how modern consumers interact with influential figures and determine if intellectual trends within society have evolved alongside them. How can we still talk about critical thinking in an age when so many voices speak louder than one? How does being an intelligent consumer differ from being an intelligent thinker?
It’s an ever-changing world, with pressures to remain relevant and engaged at every turn. It may be that we have to alter how we view what is and isn’t intellectual in order to keep challenging ourselves to stay on top of our game as individuals and as a society. What it is not, however, is easy; doing so requires us not only to look hard at who we are now but also who we want (and need) to be in the future. There are no right or wrong answers here, just possibilities worth exploring. Whether you ultimately decide that intellectualism is dead or alive, we can all agree on one thing: learning about and discussing these issues will help make us better people.
This article is a follow-up piece to The Cult Of Ignorance.
The image used in this article depicts a phrenology chart. Phrenology is a pseudoscience, influential in the 19th century (circa 1810–1840), that claimed to predict mental traits by measuring the bumps on a person's skull.