A day in the life of… A script writer
A sequel packed with character and heart
I find it hard to get excited about Marvel these days. Their ever-rolling conveyor belt of movies is rapidly approaching its tenth year, and the much-discussed ‘superhero fatigue’ is setting in. The studio’s track record with sequels is also undeniably shaky—none of Iron Man, Thor, or Avengers Assemble received the follow-up they deserved. It was therefore with some trepidation that I approached Guardians of the Galaxy Vol.2.
Its predecessor, Guardians of the Galaxy, was the surprise hit of 2014—arriving after a mixed batch of Marvel sequels, Guardians was the colourful, brash, foot-tapping shot in the arm the franchise needed. It was different. I was concerned that the spirit of a film so proudly unique would not survive a sequel, and that Guardians Vol.2 was destined to be remembered as just another part of Marvel’s ever-spreading homogeneous blob.
Guardians Vol.2 is a film that consistently defied my expectations. I was prepared for a conventional three-act CGI-fest, with some good tunes, kooky characters, and the smattering of one-liners that kept the trailer interesting. Instead, director James Gunn has crafted one of the funniest and most genuinely moving Marvel films I can remember.
This second outing sees Star-Lord, Gamora, Drax, Rocket, and Groot capitalising on the first film’s victory by jetting around the galaxy as heroes for hire. After a job for the white-and-gold racial purists, the Sovereign, goes sideways, the Guardians find themselves with a price on their heads. Arriving to complicate matters are Star-Lord’s estranged father Ego, Gamora’s vengeful sister Nebula, and the gang’s erstwhile adversary, Ravager captain Yondu. These familial bonds keep the action grounded in an otherwise bombastic cosmic ride—this balance between small and big lends an emotional gravity previous Marvel instalments have lacked.
Gunn knows the strength of his characters, and plays to it. The Guardians are a team only two films in the making, yet they continue to crackle with chemistry far in excess of the established poster-boys of the Marvel Cinematic Universe. Dave Bautista’s Drax the Destroyer does more to endear himself to me in one brief conversation than Chris Evans’ worthy but bland Captain America has in his last five feature films, while Bradley Cooper’s consistently excellent Rocket Raccoon pushes the snarky engineer role to hilarious and destructive limits Tony Stark could only dream of. The greatest surprise comes in the form of Michael Rooker’s world-weary Yondu, a character afforded a level of depth and gravity belying his supporting role in the first movie.
Like its predecessor, Guardians Vol.2 is comically on-song. The novelty of a wisecracking raccoon may have worn off, but the laughs are now more evenly spread among the team, with Bautista stealing many scenes as the cluelessly blunt Drax. The visual comedy is equally well done, with a surprise Pac-Man cameo and interdimensional warp travel closer to Douglas Adams than George Lucas. Admittedly several jokes do fall flat, including an overly long Baby Groot prison break sequence.
In Ego, the Guardians face the most convincing Marvel villain since Avengers Assemble’s Loki, far outstripping the blue-faced stock character they squared off against in their first outing. Without spoiling the film’s excellent ending, I would also note how refreshing it is to see a Marvel film with actual stakes for our heroes, as opposed to the somewhat damp zero-casualty showdown conclusion of Captain America: Civil War.
In the end, my fears were unfounded. Guardians Vol.2 is not a perfect film—its second act meanders and the Sovereign feel extraneous—however it succeeds where so many Marvel sequels have failed. Gunn strikes an excellent balance between emotion and comedy, once again proving that the Guardians are the heart of the Marvel Cinematic Universe. Oh, and the soundtrack’s pretty neat too.
C+: “Free speech is the lifeblood of a university” says Oxford—but is it under threat amongst today’s “snowflake” students?
In a recent article for The Spectator, James Delingpole discussed what he views as Oxbridge’s ‘snowflake generation’. He portrays Oxbridge students as ‘snowflakes’ who are uncomfortable exercising their freedom of speech, “creating a sterile, conformist, PC monoculture of earnest state-indoctrinated Stakhanovites”. Yet Cherwell’s second investigation of this term shows evidence to the contrary. A poll conducted by C+ made it evident that freedom of speech matters to Oxford students—79.9 per cent felt that freedom of speech is an important part of their university experience. The University states that: “Free speech is the lifeblood of a university. It enables the pursuit of knowledge. It helps us approach truth.”
However, the poll revealed mixed opinions amongst Oxford students about their experience and the future of freedom of speech. Whilst the majority of students believe that freedom of speech is important to their experience at university, 27.9 per cent of respondents felt that their freedom of speech had been restricted by the University or their College. This shows that, whilst freedom of speech is highly valued by Oxford students, it is not the reality for all.
Our results also reveal a divide between students who identify as left-wing and those who identify as right-wing in how restricted they feel their freedom of speech to be.
65.8 per cent of students who responded to the survey identified as left-wing or centre-left, but only 15.3 per cent of these respondents felt that their freedom of speech had been restricted. This marks a stark contrast with the right-wing responses: 20.8 per cent of respondents identified as right-wing or centre-right, and 57.1 per cent of those felt that their freedom of speech had been restricted by the University or their College, a view echoed throughout the comments we received.
Many students addressed the political issues that surround freedom of speech, viewing it as a passive issue among students rather than an institutionalised ‘ban’. Many of the comments received in the survey discuss the fear of right-wing students in voicing opinions or addressing issues contrary to the “mainstream left-wing viewpoint”.
The survey also found that 65 per cent of students think that freedom of speech in Oxford is under threat. Here too, responses split along political lines: 49.7 per cent of left-wing respondents thought freedom of speech was under threat, compared with 78.6 per cent of right-wing respondents. These trends tie in to long-standing questions about freedom of speech in political thought—a more left-wing approach supports restricting the speech of some groups to promote the freedom of speech of those who may be marginalised in society, whilst right-wing thinkers tend to believe in an equal platform for all voices.
This debate between platforming and freedom of speech was evident in our responses, with many students commenting on the distinction between the two.
Although only 7.1 per cent of students who responded had protested against a speaker at the Oxford Union, many weighed in on the merits of providing a platform for controversial speakers. One anonymous student said: “I think the difference between freedom of speech and being given a platform needs to be borne in mind constantly when considering this issue—one is a right, the other is a privilege.” Protestors at the talk by Corey Lewandowski (Donald Trump’s former campaign manager) in late 2016 made the same distinction, chanting: “This is free speech, that is a platform.”
Another response to the survey advocated this no-platforming approach to speakers at Oxford venues: “In my view, refusing to host a speaker whose views are harmful and directly affect marginalised members of society is a perfectly legitimate action. There’s a distinct line between free speech and hate speech, and the rejection of the latter does not even come close to an abolishment of the former.”
Another student raised the point that students who protest against speakers at the Union are simply exercising their own freedom of speech, as protest is “not threatening the free speech of anyone. Protestors on the street do not have the position of power over a speaker at the union to silence them in any way.” Contrary to this, some students noted the “irony” of protesting against platforming, suggesting that speakers at the Union should also have the right to exercise their freedom of speech. Whilst some view platforming as a right of free speech, others view it as a way of giving voice to hate speech and discrimination.
Among the concerns expressed in the responses to our survey, students also offered solutions to the perceived threat to freedom of speech at Oxford. Some stressed the importance of discussion and debate as the way forward: “The whole purpose of free speech is to allow ideas to be critically analysed. We are entitled to speak freely, but others are entitled to call us out on what we say.”
Other students highlighted the need for a safe environment, whether within College or in JCR meetings, in which every voice can be heard. As many students felt that freedom of speech is especially restricted in environments deemed ‘safe’ and ‘open’, such as JCR meetings or around college, a common verdict was that opinions should be discussed openly rather than condemned.
Snapshot: Salvador Dali and the legacy of surrealism
Surrealism is based on the exchange and juxtaposition between images grounded in reality and those drawn from the unconscious or the irrational. Among the many great surrealist artists of the 1920s and beyond, one who defined the movement as powerfully in his persona as in his work was Salvador Dali. He is famously quoted as stating: “The only difference between myself and a madman is that I am not mad!”—a phrase that perhaps touches most deeply on the fine balance his work strikes between the realms of reality and surreality.
Salvador Dali was more than an artist—he was an icon and muse himself for other artists, film directors, and many more. Yet at the heart of his artistic inspiration was the disintegration of sanity itself.
He was the pioneer of the Paranoiac Critical Transformation Method, a way of perceiving reality in which irrational knowledge stems from a state of paranoia, creating a “delirium of interpretation”. Ranging in intensity from merely imagining other shapes within natural ones to inducing states of paranoia in order to envision the surreal scenes Dali is known for, this method was the creative source of Dali’s surrealist works.
Dali’s “hand painted dream photographs” (a term he used to describe much of his work) reflect this fluid exchange between the reality of his landscapes and recognisable features, and the displaced unfamiliarity of his famous melting clocks and hordes of ants, both of which symbolise the passing of time.
‘The Persistence of Memory’ (1931) incorporates these features alongside the realism of the Catalan cliffs gleaming golden in the background of the landscape, a nod to Dali’s own homeland. One of the most fascinating features of this piece is the fleshy mass in the centre of the painting. In this, it is possible to discern facial features: a nose, eyelashes, and what could be taken as a tongue. The deformed, melting way in which it has been painted adds to the general atmosphere of decay. Twisting philosophical and unconscious threads of thought, Dali himself stated that the intention of the work, using the Paranoiac Critical Transformation Method, was to “systematize confusion and thus help to discredit completely the world of reality”. The menacing undertones are clearly derived from the negative energy of the paranoiac state: the twisted facial features, insect-like eyelashes and swarming ants.
Whilst the contradictory and dense philosophical theories Dali proclaimed may be difficult to interpret, it is clear that the exchanges between the real and the unconscious incite a gross fascination in the viewer. Genuine hallucination, rather than mere imagination, thus serves as the stimulus for surrealism, drawing new lines in the way reality may be exchanged in art for the unconscious imagination, however subversively the paranoiac mind twists familiar imagery.
The ICC’s neglect of Irish cricket
March 2, 2011 should have been the turning point for Irish cricket.
Seen previously by one and all as a minnow punching above their weight with an occasional upset and regular participation at world tournaments, Ireland’s stunning three-wicket win over England in Bangalore in a World Cup group match was a seminal moment.
For despite years of neglect from cricket’s governing body, the ICC, Ireland didn’t just win, but they won professionally.
Even when under the cosh, Ireland’s fielding was athletic, and of a high standard. And whilst larger-than-life all-rounder Kevin O’Brien played the innings of his life, he was supported by sensible, cool-headed knocks by Alex Cusack and John Mooney, who manoeuvred the world’s best spinner at the time—Graeme Swann—into gaps and turned ones into twos. The Irish were no longer a team of plucky amateurs, but a professional outfit.
Indeed, since their St. Patrick’s Day win against Pakistan just over a decade ago, Ireland have grown their cricketing infrastructure from that of a minor county to an impressive, full-time set-up. They have 30 full-time staff, 19 central playing contracts, and an academy run with the support of a ten-year deal with an Indian business conglomerate. Participation figures have quadrupled since 2013, with the number of active players moving from 13,000 to around 52,000, and the domestic provincial tournaments played between Leinster, Munster and Ulster have achieved first-class and List A status this year.
Irish cricket is growing with the long-term future of the game in mind: chief executive Warren Deutrom’s goal is “to make cricket a major sport in Ireland.”
But on the pitch, things have not gone quite so well since that famous night in Bangalore. Since the 2015 World Cup, where victories over West Indies and Zimbabwe showed their credentials as a major cricketing nation, Ireland’s results have fallen off dramatically. Indeed, even at the Associate level—the competitions between the nations that the ICC considers ‘second-rate’—Ireland’s dominance has stopped, as Afghanistan assert their standing as the best side without Test status.
This has been the main obstacle that Ireland’s growth has faced.
Despite their results at world tournaments often seeming to suggest they are superior to Zimbabwe and Bangladesh, Ireland do not have the Test Match status afforded to full members of the ICC. As such, they are unable to play red-ball cricket—seen by most as the pinnacle of the game—against the biggest sides, or even any other full members.
Furthermore, the lack of regular fixtures outside of the major tournaments means that Ireland rely on other sides for exposure to top-level cricket. They have often been granted one-off fixtures against teams touring England as a warm-up for one-day series, but it is rare for a major nation to afford them a stand-alone series.
Therefore, last week’s two-match ODI contest against England should have been something of a ground-breaker. Fixtures at Bristol and Lord’s in early-season conditions gave Ireland the opportunity to perform in front of big crowds and a large global audience hoping to see England slip up in their Champions Trophy preparation.
However, without disgracing themselves, Ireland showed the extent to which they had stalled over the past six years. The ‘golden generation’ of Will Porterfield, the O’Brien brothers, John Mooney and Ed Joyce is on the way out, and the replacements, most of whom are slightly too old to have benefited from the current player pathway system, are lacking in skill and nous. Their defeats—one crushing, the other comfortable—served as a reminder about the ICC’s neglect of smaller nations over the past decade.
It is impossible to imagine FIFA, for example, actively trying to avoid growing football, giving only the best-developed nations in the world the opportunity to play each other. It is harder still to imagine a country receiving 196 times as much prize money for a first-round exit as another receives for a last-eight finish, by virtue of being a bigger nation, but that is exactly what happened in 2007: Ireland’s Super Eights finish earned them some $56,000 in comparison to the eleven million dollars afforded to Zimbabwe.
Indeed, by taking as long as they have to recognise Ireland’s achievements and progress—it is expected that Test status will finally be granted next month—the ICC have stalled development, and made mismatches in Ireland’s first few Tests much more likely.
Worse still, this means the incentive to give more Associates a chance at the top level will be diminished, as a poor early run of results will seem to justify the ICC’s reluctance to give Ireland an opportunity.
The mismanagement of the ICC has been well-documented, but the administrators’ heads should be hung in shame regarding their management of Irish cricket: for the good of the game, Ireland’s chance should have come by now.
Analysing men, makeup, and masculinity
In recent years, there seems to have been a noticeable shift in attitude towards male grooming. Where once a simple bar of soap would suffice, now many men boast skincare routines to rival the average woman’s. Encompassing nails, skincare, and hair removal, the male grooming market is growing rapidly, with sales worldwide predicted to top £15 billion this year. While nowhere near as mainstream, even cosmetics are slowly being incorporated into some men’s grooming routines, especially in emerging markets as disposable incomes rise. Celebrities such as Johnny Depp and Bradley Cooper have been photographed at premieres wearing definite traces of makeup. Nor is this trend reserved for stars: Tom Ford and Marc Jacobs have both launched new male cosmetics ranges in recent years.
The now commonly used term ‘metrosexual male’, describing a man who devotes time and money to his appearance, was first coined in 1994 by the journalist Mark Simpson. In an Independent article, Simpson drew attention to how the taboo around men caring about how they looked was finally changing. Over 20 years later, being well presented is certainly considered a desirable trait, with an almost competitive edge. Well-groomed men are thought to give a far better impression, both socially and professionally, than those who don’t make the effort. But where has this confidence to delve into those areas of beauty traditionally considered feminine come from? And when did male grooming products transition from a frivolity to an essential basic?
With political agendas in the late 20th century breaking boundaries, male and female fashions began to merge for the first time, and the resulting punk movement produced an androgynous generation. Then, with the rise of social media after the turn of the century, along with the influence of the porn and fitness industries, new standards of physical beauty focusing on perfection began to emerge. In a ground-breaking move last year, Covergirl featured James Charles, a YouTuber, as the first man to be the face of any makeup brand; and earlier this year Maybelline followed suit, choosing Manny Gutierrez to front their ‘Big Shot’ mascara advertisement. In a statement, Covergirl said they were aiming to “redefine what it means to be beautiful”, an honourable move showing how global companies are slowly realising the power of diversity.
However, not all brands are quite as progressive, and the reality is that there is still a long way to go with respect to the advertising approach of most companies. While male grooming no longer has the stigma attached to it of previous years, there’s still a sense that beauty needs to be considered a masculine activity in order for men to buy into the idea. It is interesting to note that male-targeted products are referred to as ‘grooming’, a term usually applied to horses or dogs, whereas women’s are ‘beauty’.
This dichotomy has become critical to the marketing of male products, which are generally covered in dark, ‘masculine’ colours and are always clearly labelled ‘for men’, reminding the customer that their manliness has not been compromised by purchasing a facial cream (despite the high chance that the content is identical to those marketed to women). In fact, for one of the deodorant brand Axe’s more memorable advertising campaigns, they decided on the horrifying tagline ‘if you help her choose the clothes someone else will tear, she’s seeing you with braids’—with the suggestion that the body spray in question would make these poor ‘friendzoned’ specimens into ‘real men’.
Gendered products seem to be a clever way of extracting more cash from consumers, but they also help subtly promote the age-old stereotypes of gender we have been trying to leave behind. Adverts targeting men often tap into the traditional view that masculinity is associated with strength and dominance, promising men that this particular moisturizer will help their sexual prowess and financial prospects. Ironically, this means that an industry giving men access to traditionally ‘female’ products, which surely should be helping to broaden our understanding of masculinity, has ended up in some ways actually reinforcing the idea that men should distance themselves from anything ‘feminine’.
Beauty hasn’t always been considered exclusively feminine property. To the ancient Greeks, appreciating beauty was inherently part of the masculine, and Eros—as the god of desire—embodied this idea. Yet somehow we seem to have largely lost that concept. Understandably, brands are eager to tap into this lucrative growing industry, which for many years was only open to half of the population.
However, as agents of popular culture, cosmetics companies have a platform, and arguably responsibility, to inspire positive social change. Men are freer now than ever before to transform themselves in whatever way they want, embracing habits once dismissed as strictly female territory. Our understanding of gender is undergoing a renaissance in the 21st century, and perhaps it is about time that companies saw this diversification of masculinity as something to be celebrated.
A voice for the evidence of the refugee crisis
“It’s the usual suspects that attend these things”, observes Julia Katarina, an accomplished musician and founder of Music for Refugees, having just succeeded in leaving us on the edge of our seats after a consummately beautiful rendition of a Syrian love song. Christ Church’s one-off exhibition, Art and Awareness: A Showcase in Solidarity with Refugees on 12 May, which saw a score of varied and talented performers and artists take to the stage in solidarity with refugees, hoped to tackle the desensitisation and numbness towards humanitarian crises that is so prevalent today.
It is a sobering thought that, of the world’s officially counted 21.3 million refugees, not only are over half under the age of 18 but, more embarrassingly, Europe hosts just six per cent of the world’s displaced. We ourselves must admit that, prior to this exhibition, this harrowing statistic had neither come to our attention nor been within our contemplation, reinforcing the main premise of Julia’s concern: the veritable lack of awareness of the extent of the refugee crisis, and a pressing need to use art as a conduit to inspire awareness, urgency, and action.
Commencing the showcase with poetry and spoken word was Aleppo-born Amir Darwish, who sought refuge in the UK as an asylum seeker during the Second Gulf War. His poem was a response to what he described as the most divisive of questions universally faced by refugees and asylum seekers upon arrival at their final destination: where do you come from? The answer to this question, he tells us, dictates the worth of a refugee’s struggles, sacrifices and sorrow—a ‘wrong’ answer diminishing these efforts to an exercise in futility, cancelling out the pain and effort endured by mere virtue of their birthplace: a simple geographical accident.
The voice afforded to Amir and the collective message he transmitted through his poetry—of the ‘bullet-wounded’, of ‘hungry stomachs’, of ‘single mothers’—elevated his role as one tantamount to a spokesperson for the silenced and the suppressed.
Another contribution derived from the photography of Gideon Mendel, whose prolific works have spanned decades, delving into issues such as apartheid and climate change and, more recently, the refugee crisis. The nature of his address was twofold: first, an exhibition of his photography of the refugee crisis specific to the Calais Jungle, and second, his ruminations on the role of the artist.
One aspect of the exhibition was a series of photographs presenting his findings in the ‘art of collecting’—namely, a hotchpotch of objects discarded across the dismantled migrant camps that he acquired during his time spent there. The vestigial remains of a burnt shirt, the remnants of children’s clothing, a tally of used toothbrushes, a Sudanese sandal, and most poignantly, an array of filth-laden toys and story-books—minus their owners. The photograph of ostensible plant pots which, upon closer inspection, are tear gas canisters painted by children in the Jungle nursery, tells a story of the destruction of innocence. These artefacts of destitution, as Mendel told us, act “almost like evidence” of the suffering endured, with which the public may inform themselves. Mendel’s unique style of photography in the Jungle, prior to his archaeological stint, lay in giving the refugees use of his cameras, so that the point of view of the victims themselves, rather than that of the comfortable photographer, could inhere in his work—giving the refugees an activity, a platform, and, as he put it, a “space to find a photographic voice”.
His eventual transition in the Jungle from photographer to collector, he told us, stemmed from a confrontation over how there are “so many photographers, so many photographs made—the people resented it and you felt like the enemy”. The conflict of interest between the photographer having to make a living and the need for immediate humanitarian help, he opined, amounted to an almost “destructive force” behind the excess of photographers, when married with the complete lack of immigration lawyers in the Jungle, for instance. The art itself is used rebelliously here, as a medium to convey the truth of the refugee crisis, and the discourse between the artists and the audience attempted to distinguish the positive contributor from the officious bystander when cataloguing the horrors of a phenomenon such as the Jungle.
How the ensemble superhero film became king
4 May 2012 was a day that changed the landscape of cinema. Joss Whedon’s Avengers Assemble, the climax of the first phase of the Marvel Cinematic Universe, was an epoch-making, trend-setting, earth-shattering event, the apotheosis of Marvel’s shared universe project. Still one of the best—and most successful—superhero films of all time, it laid the groundwork for a new age of team-up films. But how has Marvel managed to remain king of the genre, despite renewed assaults by Fox’s X-Men and the DC universe?
Many have pinned Avengers Assemble’s staggering success on the work that had already been done to build up its universe: as the sixth film in the MCU, it arrived with each of the major players teed up by one of its predecessors. Such a foundation enabled Marvel to offset many of the major problems which tend to plague ensemble films: instead of forcing the audience to watch cyphers fail to interact in any meaningful way, or endure hours of bland hollow shells going through the motions, its heroes come together fully formed, the audience already understanding their motivations, histories, personalities, and drives.
And yet, it is untenable to argue that this is the only road to success, not least because 2014’s Guardians of the Galaxy was a smash hit which introduced a largely unknown cast of wacky characters. It made excellent use of Chris Pratt’s Peter Quill as a way into its ensemble cast, allowing the audience to get to grips with its universe one alien at a time. Its triumph is rooted in the regard it displays for each member of its cast: they all get their minute to shine, are all imbued with history and inner life, are all constantly forming opinions on the other characters and the group and then reforming them in the light of new circumstances.
This is the thread that links them all, the cloth from which all ensemble films worth watching are cut. Avengers Assemble, despite inheriting a pre-established cast, lets character dynamics play out organically: the ideological tension between Iron Man and Captain America, the science-based bonding of Tony Stark and Bruce Banner, the animosity felt by the team towards Fury’s secrecy all feel like natural consequences of these characters colliding. This is not a team composed of vapid symbols and vacuous icons, devoid of humanity, separated from comprehensible group dynamics in favour of trite iconography.
Marvel’s productions have notable (and well-noted) flaws: their villains are often plot devices where a character should be, their soundtracks rarely do anything more than exist, and the television-side of their universe has entirely failed to reach the heady heights of their silver-screen epics. Nevertheless, their unique character-driven formula has allowed for the creation of the best superhero ensemble films ever made. X-Men eat your heart out.
Getting to grips with the adult cartoon craze
Bojack Horseman. Archer. Rick and Morty. All of them are big-name shows, finding huge adult viewerships. All of them are worthy of the praise and attention lavished upon them. And, most intriguingly, all of them are cartoons. However, despite finding success within a few years of one another, each of them has carved out its own niche in this increasingly crowded sub-genre.
Archer hews most closely to what one might expect from an adult cartoon: a ridiculous, raucous, raunchy spectacle of anarchic violence, the show succeeds thanks to the quality and variety of its comedy. It is a masterclass in the modulation of various comedic disciplines, sometimes dabbling in gross-out humour, always replete with quick-fire dialogue, and perennially bursting at the seams with recurring gags. In fact, the show has developed such a rich tapestry of recurring jokes—be it Sterling shouting “Lana,” or the inevitable refrain of “phrasing” that follows every innuendo—that all it has to do is stitch them together in new ways to produce quality content.
This approach, this drive towards the distillation of pure comedy, is entirely distinct from those taken by Rick and Morty and Bojack Horseman, shows which marry their surreal hilarity with darker, more mature themes.
Rick and Morty, for instance, returns time and time again to Rick’s abusiveness, his oscillation between affection for his family and absolute disregard for their fates, between unlikely hero and vulgar villain. And yet, despite this interlacing of mature drama and madcap comedy, there’s something reticent about Rick and Morty. For all its off-the-wall humour, idiosyncratic gags, and wild plot twists, it seems somewhat unwilling to deal with the full ramifications of its darkest moments.
For me, at least, this is why Bojack Horseman stands above Rick and Morty and Archer. Bojack is, at its core, an exploration of the darkest recesses of the human experience. It is as much tragedy as comedy, as much a tale of depression as it is a colourful cartoon populated by anthropomorphised animals. While Rick and Morty might still have found great success if it had excised its forays into genuine drama, it is entirely impossible to imagine an iteration of Bojack devoid of the strain of sadness that runs throughout it.
Each of these styles has a place within the expanding territory of adult cartoons, and it would be entirely incorrect to suggest that Rick and Morty or Archer have somehow made a mistake by adopting different styles. Bojack Horseman, however, is a masterwork, an effortless blend of comedy and drama, and a series that deserves its place amongst the great shows of recent years, be they animated or not.
Cliché of the week: “Where’s the cue ball going?”
Where indeed is the cue ball going, John Virgo?
Is it really heading directly for the pocket, or have you merely chosen to remind your listeners of your single amusing moment in the hope of a cheap laugh?
What once, and I mean once, was genuine and amusing has now become hackneyed and stale, trotted out for any loose white or misjudged safety.
Even worse, his time in the commentary booth is now spent waiting for just such an occurrence. No matter how fine the pot, how tight the safety, the audience knows that John Virgo is secretly disappointed that the white did not rattle the jaws.
Indeed, John Virgo no longer confines his catchphrase to the white ball.
Cries of “where’s the yellow ball going” and “where’s the black going” now echo through the halls of the Crucible. They haunt even the greatest players with their sheer mediocrity.
Yet Virgo seems somehow not to realise what any comedian will tell you—he is overdoing it and it is no longer funny.
You can hear the arena ripple with exasperation while Willie Thorne purses his lips as the words slip from Virgo’s.
Please, John, it is well and truly the right time to give up the ghost, because quite frankly it’s killing us.

