People are more sun-aware than ever before, slapping on the sunscreen at every opportunity to protect their skin. So why have we seen increases in the number of people with malignant melanoma, the most dangerous form of sun-induced skin cancer, following the introduction of sunscreens?
Sunscreens were developed to prevent burning: their formulations block UVB, a type of electromagnetic (light) radiation which comes from the sun but, unlike the visible spectrum, can’t be detected by our eyes. UVB is the main cause of skin burning. Sun Protection Factor (SPF) is the main measure of a sunscreen’s effectiveness. It indicates how much longer a person can spend in the sun before their skin becomes reddened. For example, if a person normally becomes red in 30 minutes, then SPF15 allows them to stay outside 15 times longer (7.5 hours), and SPF50 gives them 50 times longer (25 hours).
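That SPF arithmetic can be sketched as a small calculation (the function name and the linear-scaling assumption are mine, for illustration only):

```python
def safe_sun_time_hours(baseline_minutes, spf):
    """Time until reddening, assuming SPF multiplies the baseline burn time linearly."""
    return baseline_minutes * spf / 60

# A person who normally reddens in 30 minutes:
print(safe_sun_time_hours(30, 15))  # 7.5 hours with SPF15
print(safe_sun_time_hours(30, 50))  # 25.0 hours with SPF50
```

Note this is the idealised, on-the-label interpretation: it holds only at the tested application thickness.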
It was logical to assume that skin redness and burning caused skin cancers, and the available evidence supported this: a person who suffers even a single episode of sunburn before the age of 20 has double the lifetime risk of melanoma. The association between intermittent UV burns and blisters in childhood or adolescence and the incidence of melanoma is particularly strong. So why, since the introduction of sunscreens, has the incidence of melanoma increased rather than decreased?
In 1999 an Australian study followed two groups of volunteers for 4.5 years: one applied SPF16 sunscreen daily, while a control group used no sunscreen. The researchers found that sunscreen reduced the incidence of one form of skin cancer but produced no significant reduction in others, including melanoma. Moreover, reviews of all the available evidence on whether sunscreen use prevents melanoma have remained inconclusive.
Let’s consider what might be producing this data.
Using sunscreen affords a false sense of security in the sun. Feeling protected, people spend more time in the sun than they otherwise would, rather than seeking shade. Furthermore, the people who use sunscreen most are likely those at greatest risk of melanoma because of their skin type, which distorts the data.
And what about the sunscreens themselves? Sunscreens are exceptionally good at blocking UVB and preventing burning, but it is unknown which frequencies, if any, cause melanoma. Our skin may still be damaged by something other than UVB. Sunscreens are also tested at a specific application thickness, yet the average consumer applies only 25–50 percent of the test amount, which reduces the protection they provide. Finally, sunscreens can prevent vitamin D synthesis, which requires UVB. Counterintuitively, vitamin D deficiency (due to lack of UVB) has been linked to increased melanoma growth: it is speculated that decreased vitamin D in the skin could allow melanoma to grow, although the evidence for this is very limited.
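To see why under-application matters so much, one commonly cited model treats protection as falling exponentially with applied thickness, so applying half the test amount gives roughly the square root of the labelled SPF. The sketch below illustrates that assumed model; it is an approximation, not a measured result:

```python
def effective_spf(labelled_spf, applied_fraction):
    """Estimate real-world SPF when less than the test thickness is applied.
    Assumes protection falls exponentially with thickness -- a common but
    debated model, used here purely for illustration."""
    return labelled_spf ** applied_fraction

# Applying 50% and 25% of the test amount of an SPF50 sunscreen:
print(round(effective_spf(50, 0.5), 1))   # 7.1 -- far below the labelled SPF50
print(round(effective_spf(50, 0.25), 1))  # 2.7
```

Under this model, the consumer habits described above would leave most people with a small fraction of the protection printed on the bottle.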
Whilst the evidence against sunscreens appears pretty damning, you shouldn’t throw away the bottles just yet. Recent advances in sunscreen development mean they block a broader spectrum of UV radiation (UVA as well as UVB), even at thicknesses closer to what consumers actually apply. As a result, only 20–30 percent of UV radiation reaches the skin, compared with 70–90 percent for older sunscreens.
These newer sunscreens haven’t been in use long enough to say for sure that they are better at preventing melanoma; we will only find out by looking at melanoma incidence in people born after their introduction. In the meantime, the best advice is the old advice: don’t rely on sunscreen alone for protection.