Expect the worst and hope for the best. That’s what they say, isn’t it? This mantra is one that would’ve been worth following for Brighton and Hove Albion fans watching their team’s Premier League match against Sheffield United at around 12:40pm on Sunday 20th December (this included 2,000 lucky socially distanced attendees at the AMEX Stadium and at least 20,000 others – one being my poor self – yelling at their TVs from their living rooms). Yet with our opponents having only managed a single point from 13 previous games this season, languishing at the bottom of the league table like an overripe satsuma in a Christmas stocking, and now down to 10 players after John Lundstram’s 40th minute red card, we Albion fans could surely have been forgiven for thinking that our second home win in all of 2020 was on its way as a matter of course.
Having been a Brighton fan since the age of seven, however, I probably should’ve foreseen that my team was to spend the next 50 minutes of action passing the ball back and forth outside the Sheffield penalty box, conspiring to let a 20-year-old defender score past us on his debut, and only equalising with three of the 90 minutes remaining, through ex-wonderkid Danny Welbeck’s fifth Premier League goal in three seasons. After the game, fans and statisticians alike remarked that Brighton had chalked up 3.35 ‘Expected Goals’ over the course of the match, a tally far superior to Sheffield United’s 0.92: with our attackers having missed multiple open goals – one hitting the woodwork from about 10 centimetres out – we were well within our rights to have anticipated a better final result for our team.
What exactly are ‘Expected Goals’, then – and why do Brighton have so many of them? The Expected Goals (xG) metric assigns each shot taken during a 90-minute match a probability of resulting in a goal, based on factors such as the shot’s location on the pitch, the pattern of play leading to it, and the body part used to shoot. A team’s total xG for one match is then the sum of those probabilities across every shot taken – against Sheffield United, Brighton created chances good enough to have been expected, on average, to net three times, as opposed to their disappointing single goal in the 1-1 draw. In theory, Brighton are good at getting into positions where they are likely to score – but their lacklustre finishing hugely lets them down, leading to a massive underperformance relative to xG.
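To make the arithmetic concrete, here is a minimal sketch in Python of how a team’s match xG is assembled. The per-shot values and labels are invented purely for illustration; real providers estimate each shot’s probability with models fitted to large historical shot datasets.

```python
# A minimal, illustrative sketch of how a team's match xG is assembled:
# each shot carries an estimated probability of becoming a goal, and the
# team total is simply the sum of those probabilities.
# The shot values below are made up for illustration only.

shots = [
    {"shot": "close-range chance",   "xg": 0.76},  # almost an open goal
    {"shot": "header from a cross",  "xg": 0.35},
    {"shot": "long-range effort",    "xg": 0.08},  # speculative strike
]

team_xg = sum(shot["xg"] for shot in shots)
print(f"Team xG for the match: {team_xg:.2f}")  # 1.19 from these three made-up shots
```

Compare that running total with the actual scoreline and you have the basis for every “X won the xG” argument on Twitter.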
Thanks to its insightful posts comparing Expected Goals with the often wildly different real-life results, the Twitter account ‘The xG Philosophy’ has racked up nearly 90,000 followers over the last few Premier League seasons, becoming a fundamental part of in-game analysis and the customary post-match banter (“Say the line @xGPhilosophy” / “Brighton won the xG” reads one Simpsons-inspired meme that seems to appear after every Albion defeat, much to my chagrin).
The account is run by James Tippett, author of ‘The Expected Goals Philosophy’, a book which puts forward the case for xG as the meaningful stat in modern-day football and details a number of recent Premier League and Championship success stories built around Expected Goals. Chief among them is Brentford’s xG-based scouting model, which has unearthed hidden gems such as Aston Villa’s Ollie Watkins, West Ham’s Said Benrahma, and Brighton’s own Neal Maupay, all of whom are now plying their trade at the very top of English football.
There seems to be a strong case, then, for valuing Expected Goals almost as much as actual ones when it comes to match analysis, player scouting, and even betting on football. Except that this is quite obviously not a watertight method – just look at Brighton’s performance this season. Over the course of 15 Premier League games played at the time of writing, my team has amassed 23.85 Expected Points (derived from both teams’ xG in each individual game), placing us 5th (FIFTH!) in the ‘xG table’, a bonkers parallel universe where we rank higher than big hitters such as Tottenham (22.23), Manchester United (22.28), and Leicester (22.91). Of course, in real life we are almost 11 points worse off, sitting 16th in the table above only Burnley, Fulham, West Brom, and (thank goodness) Sheffield United, and with a real chance of being relegated to the Championship come the end of the season. No wonder my cranberry sauce tasted distinctly bitter on December 25th.
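How do you get from per-match xG to ‘Expected Points’? The figures above don’t spell out the exact method, but one common back-of-the-envelope approach – sketched below purely as an illustration, and an assumption of mine rather than any provider’s documented recipe – is to treat each team’s match xG as the mean of a Poisson goal distribution, simulate the game thousands of times, and award three points for every simulated win and one for every draw.

```python
import math
import random

def expected_points(team_xg, opponent_xg, trials=10_000, seed=42):
    """Estimate a team's expected points from one match by treating each side's
    match xG as the mean of a Poisson goal distribution and simulating the game
    many times. A rough approximation, not any provider's exact method."""
    rng = random.Random(seed)

    def poisson_goals(mean):
        # Knuth's method for drawing a Poisson-distributed goal count.
        limit, count, product = math.exp(-mean), 0, 1.0
        while True:
            product *= rng.random()
            if product <= limit:
                return count
            count += 1

    points = 0
    for _ in range(trials):
        goals_for, goals_against = poisson_goals(team_xg), poisson_goals(opponent_xg)
        if goals_for > goals_against:
            points += 3
        elif goals_for == goals_against:
            points += 1
    return points / trials

# The Sheffield United game, using the xG figures quoted above (3.35 vs 0.92):
print(round(expected_points(3.35, 0.92), 2))  # roughly 2.6 expected points, versus the 1 we actually took
```

Sum those per-match estimates over a season and you get the parallel-universe table in which Brighton sit fifth.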
Interpreting Expected Goals is a tricky business: does Brighton’s huge xG underperformance mean that we’re much better than we think, absolutely crap, or just really unlucky? I’d argue that it’s a bit of each. Our manager Graham Potter has been praised for his attacking style of football, with exciting, pacey players like Tariq Lamptey and Solly March creating lots of chances, no matter who we are playing against. Yet our strikers seem more likely to squander these chances than gobble them up: top scorer Maupay has underperformed his personal xG by a massive 2.43, while deputies Welbeck and Aaron Connolly are also both in the negative for Expected vs actual goals so far this season. This might seem like a matter of fortune, but the pundits who suggest that Brighton’s poor showings for both goals and points are purely down to bad luck are mistaken. Most culpable for the team’s failings are a crippling lack of confidence in front of goal and our recurrent inability to defend or attack set pieces. It might be clichéd to say that in football you need to make your own luck, but this certainly rings true for Brighton’s beleaguered, dispirited squad. The xG metric suggests that things might come good for my team, but at the same time, they might well not – especially if we can’t fix our more deep-rooted psychological and tactical problems. As ever, statistics fail to account for real human emotion and error: in this way, Expected Goals can only ever hope to tell half the story.