Why there will be no more en masse mourning of celebrities in the future


Photograph: 'Walk of Fame' / Davide D'Amico

2016 has seen a massacre of celebrity fixtures. The sad reality is that age and sickness come to us all in the end, but this year there has been an unusually sincere outpouring of public grief. Perhaps it’s the scatter-shot demise of so many household names, or perhaps there’s a more prosaic explanation for the phenomenon.

Most of this year’s casualties were in the final quarter of their lives: post-war successes who found fame in an age when television and radio were the only means by which anyone, from entertainers to politicians, could become a celebrity.

For people in their fifties and older, 2016 reflects the circumstances in which the most beloved of entertainers became so well known in the first place. For decades, a natural editorial Darwinism restricted what the population had access to. From fashion to film to books there was the illusion of consumer choice, yet the cost and limitations of production ensured that what was on offer was determined by a relative few. In Hollywood this was called the ‘studio system’, in music ‘the industry’ and in publishing ‘the world’, but all amounted to the profit-driven selection and promotion of whatever was likely to sell.

Today, in our interconnected, globalised and culturally internationalist world, it’s a macabre but easy temptation to look around and imagine which artists will generate the same shockwaves when they die. Who will be, for the twenty-somethings of today, the ‘legends’ who receive posthumous awards and extensive media coverage, whether lavishing praise or skewering with retrospectives?

While there remains a handful of truly iconic celebrity names, such as Stallone, Schwarzenegger, Spielberg and Hanks, their generation will likely be the last to enjoy universal recognition and the consensus that, like them or loathe them, they were something big. And it’s all to do with technology.

Gone is the brutality of artistic natural selection. While association with a big-name studio or company might mean more money for a larger marketing campaign, it is no surefire way to guarantee the best art.

In the 21st century, if you want to act, sing or dance – get a YouTube channel. If you want to write – get a blog. If you want to publish a book, make a film or create your seminal magnum opus in any field, then crowdfund it. Technology has made it possible for anyone to create anything if they are IT-savvy and have the time.

‘Avant-garde auteurism’ is not a tautology, but a reflection of what’s now possible from your bedroom with a computer. ‘Independent’ is no longer a niche, art-house word for ambitious but non-commercial; it is now a statement of the totally obvious about the blurred sea of choice available to consumer and critic alike in today’s market.

The results are undeniably good. Netflix and Amazon Prime, for example, have produced shows that might otherwise have remained ideas in someone’s mind; they’ve resurrected dead ones, and their programmes cater to every taste. YouTube and all manner of streaming services have made the distillation and spread of information, music, humour and news instantaneous and plethoric.

It’s now impossible not to be at least aware of blogs, albums, films and productions that would once have been too small-fry and too niche to make it into conventional media. The art forms, programmes, films and books which previously sat in the shadow of big business have attacked its domination and now rival it. Irrespective of who produced it and how, there’s a piece of the pie for any artist who can make something that is liked.

Has proliferation made it harder to judge quality? The philosopher Roger Scruton argues that the difference between “high culture” and “popular culture” is an important one. The former, he contends, is founded in tradition and in expertise accumulated through extensive practice and professional application, “and by a broad endorsement of the surrounding social norms.” He adds that:

“If we look at the true apostles of beauty in our time…we are immediately struck by the immense hard work, the studious isolation, and the attention to detail which have characterised their craft. In art, beauty has to be won and the work is harder as the surrounding idiocy grows. But the task is worth it.”

Classical music and art are the typical models of high culture. Scruton contends that his differentiation is not elitist, but practical: “The high culture of our civilisation contains knowledge which is far more significant than anything that can be absorbed from the channels of popular communication.”

High culture, for Scruton, is a work or body of works in which truth is expressed through original and talented application. It forms the backbone of a culture and should be preserved at all costs because it is a “precarious achievement, and endures only if it is underpinned by a sense of tradition.”

Nevertheless, Scruton’s distinction is unsuited to what might be called the post-modern digital age. The traditional line between high art and popular culture has been blurred, or certainly expanded, because everything is available at the push of a button. As 2016 has demonstrated, a critical eye is still necessary to distinguish the truly original from the mundane. Beyond the fact that many people mourn celebrities when they die, the absence of criticism is not evidence of universal praise. As Scruton himself says:

“The cult of genius, therefore, led to an emphasis on originality as the test of artistic genuineness – the thing that distinguishes true art from fake.”

If that’s true, will it not be impossible to find diamonds in a time already overpopulated by choice and with little creative control?

On the contrary: compared with the past fifty years, it will be easier. There will be less deference and passing of the buck to senior executives who alone determined which artists would be backed and who (or so the logic goes) must therefore be good on some level. Choice necessitates a better capacity for criticism.

In today’s world, proliferation makes originality harder to accomplish. That someone can do something is no guarantee that they can do it well; a point Scruton acknowledges:

“Originality is hard: it cannot be snatched from the air, even if natural prodigies…Originality requires learning, hard work, the mastery of a medium, but most of all the refined sensibility and openness to experience that has suffering and solitude as its normal cost.”

Ultimately, in such a fast-paced world of free-flowing information, there is no difference between high culture and popular culture beyond what is likely to last. Skill, at whatever level and in whatever field, is not appreciated by popularity alone, although that plays a part, but by artistic merit. Money poured into something does not automatically guarantee success, nor does the absence of grand funding presume failure.

Popular culture, rooted in the zeitgeist of the day, is defined by Scruton as “pre-eminently youth culture….a culture largely indifferent to national borders.” There is an element of truth to this, but the dialectic is complicated because popular culture can evolve into high culture over time – as was the case with the infant film industry in the US, which is now a staple of American culture.

As the decades continue, there is not an absence of difficulty but a renewed challenge in finding those diamonds in the rough, and creative originality is the best benchmark for determining what is worthy of universal praise in the cultural lexicon.

As Scruton argues:

“Utter trash accumulates…largely because it has a price tag. You cannot own a symphony or a novel in the way you can own a Damien Hirst. As a result there are far fewer fake symphonies or fake novels than there are fake works of visual art.”

Copying, both literally and artistically, is easy in the technological age, but it does not bypass the cornerstone of originality. The entertainers, musicians, actors and politicians claimed by the grim reaper this year represented a generational system: some were truly original, others dominated their field with personality but little talent.

The near-universal kitsch and sentimentality seen in 2016 will likely never be repeated in the decades ahead, because the proliferation of mediums and choice has removed the artistic domination of the few.

Surely, a welcome thing.

About Alastair Stewart
Alastair Stewart is a freelance writer and journalist. He was previously a press officer in the Scottish Parliament and worked in public affairs. He graduated from Edinburgh University with an MA in International Relations and writes regularly on politics and the arts in the Spanish and British press.
