Too far from shore: The fate of the Franklin expedition

– By Robert W. Park and Douglas R. Stenton – 

Background

The basic details are well-known: in May of 1845 two Royal Navy ships, HMS Erebus and HMS Terror, departed Great Britain under the command of Sir John Franklin to attempt the transit of a Northwest Passage. Two years later they reported “All well,” but 11 months after that, in April of 1848, the 129 sailors who had set out from Britain had been reduced to 105 survivors departing their ships, dragging boats mounted on sleds across the ice and snow in a desperate attempt to escape the Arctic. At least 25 of those survivors would perish less than 100 kilometers from the ships, and the furthest any of the survivors are known to have travelled was around 350 kilometers. The tragic outcome of the Franklin expedition has captured and held the public’s imagination for almost 180 years, and the continuing importance of the Franklin story is illustrated by the very public involvement of several government agencies and some of Canada’s industrial and media elite in the 21st-century search for the shipwrecks, and by the fact that Canada’s Prime Minister reserved for himself the 2014 announcement of Parks Canada’s discovery of the Erebus.

Beyond the excitement of the discoveries of the wrecks of both ships, there are at least two reasons why the Franklin expedition still resonates. The first and probably the most pervasive is the idea that it remains a huge mystery. If you search Google for the co-occurrence of the terms “Franklin expedition” and “mystery” you come up with almost 80,000 hits. Calling it a mystery implies that it is not yet understood, or even that it is inexplicable. The second very common rationale, sometimes implicit, is that of course a British naval expedition attempting to sail through the Canadian Arctic Archipelago in wooden ships late in the period known as the ‘Little Ice Age,’ overwintering one or more times along the way, would inevitably end in disaster. Viewed this way, the Franklin expedition is not a mystery but rather a case study in Eurocentric hubris and incompetence in comparison with Inuit whose ancestors had inhabited that region successfully for millennia. As researchers who have spent most of our careers learning about those 4,500 years of Inuit history in Arctic Canada, we find some appeal in that latter perspective. However, neither of these perspectives is really correct. The ultimate cause of the catastrophe is not a mystery because, although there are many interesting details we are still learning, the overall explanation seems quite clear and well understood. Further, the disaster was not inevitable—many comparably equipped and commanded British Navy expeditions in the decades before and immediately after the Franklin expedition proved that fact by overwintering and returning home with minimal mortality.

Our own particular interest in the Franklin expedition focuses on understanding the archaeological record created after the crews deserted the ships. However, in order to understand and learn from that archaeological record, we needed to research the health of the Franklin crews, which is what we focus on here. Much more information on this can be found in Park and Stenton (2019).

Figure 1. The route taken by the Franklin expedition, showing where they were beset in the ice far from shore between September 1846 and April 1848. The positions are the ones reported by the expedition.

Mortality in the Royal Navy

The death of some members of the Franklin expedition over its planned three-year duration would have been expected. A statistical study of health in the entire Royal Navy between 1830 and 1836 (Troubridge, 1841) found that the average yearly death rate for that period from wounds, accidents, and illness was 1.38%. Six Royal Navy Arctic expeditions between 1819 and 1855 which utilized the same basic technologies and strategies as the Franklin expedition had an average annual mortality consistent with that—1.67%—with the highest being just 2.97%. Over three years—the length of time for which the Franklin expedition was provisioned—a cumulative mortality somewhere between 4.14% and 5.01% of its 129-man complement, or five to seven deaths, would therefore have been unremarkable. The 100% mortality on the Franklin expedition in what was probably less than three and a half years is thus clearly extraordinary, but the distinctive timing of the deaths is significant. Three died in the first year (Beattie & Geiger, 1987) but the “All well” message written at the end of the second year suggests few subsequent deaths had occurred. Franklin himself died at the beginning of the third year and by the end of that year 20 more had died, so the cumulative mortality when the ships were deserted was already 18.6%. The remainder died during the trek southwards.
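
For readers who want to check the figures, the expected range follows directly from the published rates; a minimal worked computation, assuming the simple linear compounding the percentages imply:

    \[ 3 \times 1.38\% = 4.14\%, \qquad 3 \times 1.67\% = 5.01\% \]
    \[ 0.0414 \times 129 \approx 5.3, \qquad 0.0501 \times 129 \approx 6.5 \]

That is, roughly five to seven deaths over three years; the 24 deaths by April 1848 (24/129 ≈ 18.6%) were nearly four times that expectation.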

Taking all of that into account, what was different about the Franklin expedition that could account for such massive mortality? The difference or differences must be consistent with (a) a mortality rate over the first two years similar to that in other Arctic expeditions and in the Royal Navy generally, allowing them to write “All well” at the end of that period; and (b) a mortality rate in the third year far higher than that seen in other comparable Arctic expeditions, even ones that remained in the Arctic for several more years than Franklin’s.

Theories Concerning the Franklin Catastrophe

Many hypotheses have been advanced over the years to explain the Franklin catastrophe, including diseases such as scurvy, tuberculosis, and trichinosis, or poisoning by lead, zinc, or botulism. Of these, lead poisoning (Beattie, 1985; Kowal et al., 1990) was the most promising, but recent research showing high lead levels in Royal Navy sailors elsewhere, and no increase in lead over time in Franklin expedition sailors, has eliminated it as a primary factor (Millar et al., 2015; Swanston et al., 2018), and none of the other theories meet the criteria listed above (Park & Stenton, 2019). This brings us back to our central question: what was different about the Franklin expedition that could account for such massive mortality during its third year?

We have concluded that the one significant factor that does distinguish the Franklin expedition from previous and subsequent expeditions is wintering location. As each Arctic open-water sailing season came to an end, the standard procedure was to overwinter close to shore. Indeed, Franklin’s orders instructed him to do this: “…you are to use your best endeavours to discover a sheltered and safe harbour, where the ships may be placed in security for the winter” (Belcher et al., 1855). The expedition did this at the end of the 1845 sailing season, wintering at Beechey Island. But in 1846 they did not, and in September Erebus and Terror were beset in Victoria Strait at least 20 kilometers from the nearest coast, that of King William Island. This was not immediately catastrophic, because nine months later the crews reported “All well,” undoubtedly anticipating that the summer break-up would soon free them to continue their journey. But the 1847 break-up never came, and the ships spent the expedition’s third year still locked in the ice far from land.

Too far from shore

The difference between wintering close to shore or 20 kilometers away turns out to be highly consequential. All British expeditions of this era set out with enough stored food to be self-sufficient for the duration of the time they anticipated spending in the Arctic, which was three years in the Franklin expedition’s case. However, all were also equipped to make, and assiduous in making, efforts to acquire game and fish from autumn to spring, while they were in harbour adjacent to land. Rules were established to provide an incentive for the hunters and to ensure that the resulting fresh food was distributed throughout the entire crew, and records show some expeditions acquiring thousands of pounds of meat and fish this way. Franklin had indicated that he intended to use every available opportunity to obtain game, and we know he did so during their first winter because hunting camps were later found at Beechey Island. But during their second and third winters, when the Erebus and Terror were beset at least 20 kilometers from the nearest coastline, the logistics of acquiring game or fish would have been extremely difficult. It would have been too dangerous to traverse the pack ice to King William Island until it had frozen solid, and by the time that had happened they would have been contending with dwindling hours of daylight, increasing cold, and a seasonal scarcity or absence of terrestrial game. Out on the sea ice near the ships there would have been few opportunities for hunting, apart from the occasional polar bear. There would have been ringed seals nearby, but the techniques used by Inuit to hunt them through their breathing holes would have been quite beyond the capabilities of the British sailors (M’Clintock, 1859).

For these reasons it is probable that the crews did not acquire significant quantities of fresh food throughout the autumn and winter of 1846-7. They clearly travelled to King William Island in the spring of 1847 since that is when they left the “All well” report there. However, the long distance between the ships and shore undoubtedly precluded sending out the numbers of hunting parties dispatched by other expeditions. That inference is supported by the fact that extensive archaeological surveys of the northwest coast of King William Island have found only one Franklin expedition campsite, and not the many hunting camps that might be expected had they been using the coast repeatedly during the entire time they were beset nearby: September 1846 to April 1848. Thus, after leaving Beechey Island, the Franklin crews ate a far higher proportion of stored food than expeditions which were able to hunt near their sheltered harbours. By the time the 105 survivors left the Erebus and Terror the crews had been cut off from significant fresh supplements to their stored food for around 21 months.

Figure 2. Aerial view of part of the Erebus Bay coast. Within the area shown in this photograph the survivors abandoned two boats on sledges, and at least 22 of the original survivors perished here after travelling less than 100 kilometers from the ships (Photo: D.R. Stenton).

Stored food

The Franklin expedition’s unique reliance on stored food suggests that a nutritional deficiency may lie at the root of the catastrophe. The most famous nutritional deficiency affecting the Royal Navy was that of ascorbic acid (vitamin C), the lack of which produces the disease scurvy. However, the very limited number of fatalities from it on other expeditions suggests that the lemon juice they all carried could be an adequately effective antiscorbutic, at least when combined with the additional ascorbic acid that they obtained from fresh game. But a less famous nutritional deficiency associated with stored food is likely more significant. The disease beriberi results from a deficiency in thiamine (vitamin B1). It produces a complicated range of symptoms but initially causes severe weakness and pain in the legs, to such an extent that walking may be difficult or impossible. Thiamine breaks down rapidly, so even stored food that started out with adequate quantities may have become deficient in it after storage in a ship’s hold (Cecil & Woodruff, 1962). Further, thiamine had not yet been discovered, so the expeditions did not carry an appropriate supplement comparable to the lemon juice carried to avoid scurvy.

A vivid example of beriberi caused by stored food was documented in the early 20th century in poor Newfoundland fishing communities. Their winter diets consisted of little more than tea, white flour and biscuits, salt beef, salt pork, salt cod, margarine, molasses, and berries (Aykroyd, 1930), similar to the stored foods used by the Royal Navy. By late spring each year, after several months of consuming that diet, many Newfoundlanders developed the disease. Between 11 and 20 people died of it per year but the majority recovered by early summer due to the renewed consumption of thiamine via fresh foods. From this example it is clear that thiamine deficiency over just a few months can produce debilitating symptoms which can culminate in death if thiamine is not reintroduced to the diet.

Figure 3. Archaeological investigation of one of the Erebus Bay boat places, with the flags marking find spots (Photo: D.R. Stenton).

Catastrophe in the third year

It is thus plausible that Royal Navy expeditions entered the Arctic with stored food supplies whose thiamine content soon deteriorated. But expeditions that were successful in hunting and fishing each autumn and spring were largely able to avoid the debilitating effects of beriberi, whereas the crews of the Erebus and Terror, beset for almost two years far from the nearest hunting areas, would not have been able to supplement their stored supplies with fresh foodstuffs to any significant degree. The timing of the Franklin expedition mortality is consistent with the crews suffering from compounding nutritional deficiencies due to complete reliance on stored provisions commencing with their departure from Beechey Island. The effects of beriberi were probably not yet widely debilitating when the “All well” report was written almost a year later, but they undoubtedly contributed to the deaths that subsequently occurred during the winter of 1847-8. The known early symptoms of beriberi—weakness and pain in the legs, affecting the ability to walk—also provide a grim insight into the experience of the 105 who attempted to escape the Arctic by man-hauling boats on sleds in frigid spring weather. We know that two of the boats and at least 22 sailors were left behind less than 100 kilometers from the ships at a place later named Erebus Bay. Just 25 kilometers further on, the remains of another large group of sailors would be discovered by Inuit at a place later named Terror Bay. It may be that some of these sailors were still alive but could no longer walk, and their colleagues no longer had the strength to pull them on the sleds. Archaeological research has confirmed that these parties had firearms and plenty of ammunition, so if there had been game available they would have been able to shoot it, but at that time of year there would have been little to hunt.

Thus, the ultimate cause of the catastrophe can be linked to wintering in the ice pack. We do not know why in September of 1846 Franklin allowed his ships to become frozen far out in Victoria Strait rather than following his orders to seek a safe harbour along one of the adjacent coastlines. Several scholars have speculated that it must have been inadvertent, the outcome of a failed gamble to traverse the strait ahead of the autumn freeze-up, perhaps under their cutting-edge steam power (Cyriax, 1939; Cookman, 2000). If that is the case, it is evocative that the only skeleton we have been able to identify so far is that of John Gregory, the engineer who was responsible for running Erebus’ steam engine and who was one of the 105 who later attempted to walk out of the Arctic (Stenton et al., 2021). But someday we may learn more details of what happened in September of 1846 if Parks Canada’s ongoing investigations of the wrecks of Erebus and Terror manage to recover legible ships’ logs or journals.


Robert W. Park is an Associate Dean in the University of Waterloo’s Faculty of Arts and a Professor in its Department of Anthropology. For four decades he has participated in archaeological fieldwork in Southern Ontario, Yukon, Northwest Territories, and especially Nunavut.

Douglas R. Stenton, CM, is the former (retired) Director of Heritage for the Government of Nunavut, and an Adjunct Assistant Professor in the Department of Anthropology at the University of Waterloo. He has conducted archaeological fieldwork in what is now Nunavut since 1980 and has led investigations of the 1845 Franklin expedition since 2008.


Acknowledgements

The fieldwork that inspired this research was funded by the Government of Nunavut Department of Culture and Heritage, Nunavut Archaeology Program. We wish to thank the Inuit Heritage Trust and the Hamlet of Gjoa Haven for their support of our research, and the officers and crew of the Canadian Coast Guard Ship Sir Wilfrid Laurier for providing the outstanding logistical support that made it possible for us to explore so thoroughly the region where the Franklin catastrophe unfolded.

References

Aykroyd, W. R. (1930). Beriberi and other food-deficiency diseases in Newfoundland and Labrador. Epidemiology & Infection, 30(3), 357-386. https://doi.org/10.1017/S0022172400010500

Beattie, O. (1985). Elevated bone lead levels in a crewman from the last arctic expedition of Sir John Franklin (1845-1848). In P. D. Sutherland (Ed.), The Franklin Era in Canadian Arctic History 1845–1859 (pp. 141-148). Archaeological Survey of Canada, Canadian Museum of Civilization.

Beattie, O. B., & Geiger, J. (1987). Frozen in Time. Western Producer Prairie Books.

Belcher, E., Richardson, J., Owen, R., Bell, T., Salter, J. W., & Reeve, L. (1855). The last of the Arctic voyages being a narrative of the expedition in H.M.S. Assistance under the command of Captain Sir Edward Belcher, C.B., in search of Sir John Franklin, during the years 1852-53-54. (2 vol.). L. Reeve.

Cecil, S. R., & Woodruff, J. G. (1962). Long-term storage of military rations. Dept. of the Army, Quartermaster Research and Engineering Command, Quartermaster Food and Container Institute for the Armed Forces.

Cookman, S. (2000). Ice blink: the tragic fate of Sir John Franklin’s lost polar expedition. Wiley.

Cyriax, R. J. (1939). Sir John Franklin’s last Arctic expedition: A chapter in the history of the Royal Navy. Methuen & Co., Ltd.

Kowal, W., Beattie, O. B., & Baadsgaard, H. (1990). Did solder kill Franklin’s men? Nature, 343(25), 319-320. https://doi.org/10.1038/343319b0

M’Clintock, F. L. (1859). Discoveries by the Late Expedition in Search of Sir John Franklin and His Party. Proceedings of the Royal Geographical Society of London, 4(1), 2-14. https://doi.org/10.2307/1798820

Millar, K., Bowman, A. W., & Battersby, W. (2015). A re-analysis of the supposed role of lead poisoning in Sir John Franklin’s last expedition, 1845-1848. Polar Record, 51(3), 224-238. https://doi.org/10.1017/S0032247413000867

Park, R. W., & Stenton, D. R. (2019). Use Your Best Endeavours to Discover a Sheltered and Safe Harbour. Polar Record, 55(6), 361-372. https://doi.org/10.1017/S0032247419000573

Stenton, D. R., Fratpietro, S., Keenleyside, A., & Park, R. W. (2021). DNA identification of a sailor from the 1845 Franklin northwest passage expedition. Polar Record, 57, e14. https://doi.org/10.1017/s0032247421000061

Swanston, T., Varney, T. L., Kozachuk, M., Choudhury, S., Bewer, B., Coulthard, I., Keenleyside, A., Nelson, A., Martin, R. R., Stenton, D. R., & Cooper, D. M. L. (2018). Franklin expedition lead exposure: New insights from high resolution confocal x-ray fluorescence imaging of skeletal microstructure. PLOS ONE, 13(8). https://doi.org/10.1371/journal.pone.0202983

Troubridge, T. (1841). Statistical Reports on the Health of the Navy, for the Years 1830, 1831, 1832, 1833, 1834, 1835, and 1836—Part II. House of Commons Parliamentary Papers.

Adaptation planning: An interdisciplinary approach to climate change risk reduction

– By Sarah Kehler and S. Jeff Birchall –

Climate change presents a complex and unprecedented problem. As temperatures rise, the global climate is becoming increasingly unstable. Climate impacts, such as sea level rise and more frequent extreme weather events, are taking an acute toll on ecological and human systems. Although climate change is a global phenomenon, unique and severe consequences occur at the local level. Uncontrollable coastal erosion, devastating overland flooding or atypical temperature extremes can easily overwhelm an unprepared community. Climate impacts are likely to be costly: in 2025, Canada is expected to see a loss of $25B due to climate change, and by 2100, annual losses could be as high as $100B (Sawyer et al., 2022). These risks underscore the importance of adaptation, the process through which communities anticipate and prepare for climate change.

Unique and intensifying local climate impacts are forcing communities to adapt. With their place-based knowledge and authority, local governments, such as municipal and regional governments, are in an ideal position to facilitate this adaptation (Birchall et al., 2023). Urban planning guides local-level decision making around future land use, development and infrastructure. Adaptation planning seeks to integrate climate science into urban planning – a critical step toward preparing for climate change. Decision making that considers future projections can considerably reduce disaster risk, enhance infrastructure resilience and prepare communities for uncertainty (Davoudi et al., 2013).

Anticipatory adaptation provides security. Adaptation mitigates disaster risk and, should an event occur, can substantially reduce response costs and economic losses. In fact, every $1 spent on anticipatory adaptation will provide up to $15 in benefits within 75 years (Sawyer et al., 2022). However, the benefits of adaptation go beyond economics. Adaptation, when implemented equitably, provides food and livelihood security, increases human health and well-being, and conserves biodiversity.

There are two main types of adaptation: hard adaptation and soft adaptation (Table 1). Hard adaptation is structural, focusing on updating infrastructure to withstand climate impacts. Today, most adaptation consists of hard infrastructure measures like sea walls (Kehler & Birchall, 2021). Soft adaptations are non-structural or ecosystem-based, focusing on managing risks through restricting land use and bolstering ecosystem services. Soft measures are increasingly recognized as critical aspects of adaptation; flood risk, for example, can be mitigated through wetland preservation or zoning restrictions.

Table 1: Adaptation approaches. Structural, non-structural and ecosystem-based approaches are defined and classified with examples provided according to climate vulnerabilities. Benefits and drawbacks are presented for each approach. Adapted from: Bonnett & Birchall (2020).

Adapting effectively requires sufficient adaptive capacity – the conditions that enable communities to anticipate and respond to change (Cinner et al., 2018). Local governments must be flexible, cooperate across jurisdictions, and understand the importance of adaptation (Birchall et al., 2023). They must have access to resources, such as money, technology or services, in order to initiate adaptation. Public support, stakeholder engagement and political leadership underpin adaptive capacity; without people advocating for anticipatory action, adaptation is unlikely to occur (Ford & King, 2013). There is growing awareness that implementation of adaptation is often inequitable, with wealthy areas receiving greater support, decreasing the overall adaptive capacity of the community (Kehler & Birchall, 2021).

Adaptation is not a cure-all for climate change. Both hard and soft adaptations come with limits, benefits and drawbacks (Table 1). However, over-reliance on hard adaptation has put communities at risk: climate change will quickly outpace our capacity to adapt through hard measures alone. Despite good intentions, over the long term hard adaptation can carry high maintenance costs and risks of failure, while providing fewer benefits than soft adaptation (Birchall et al., 2022). In some circumstances, the unintended consequences of expensive infrastructure and engineered adaptations can lead to maladaptation, when adaptation measures do not decrease risk but rather increase vulnerability to climate change.

Unfortunately, many communities, overwhelmed by the sheer cost and magnitude of future adaptation demands, lack the consistent public support and bureaucratic efficiencies necessary to meet them (Birchall et al., 2023). These barriers can prevent the planning process from delivering effective local-level adaptation, and increase the risk of maladaptation (Kehler & Birchall, 2021). Avoiding maladaptation requires effective and equitable adaptation. To achieve this, planning must address adaptive capacity constraints and aim to balance hard and soft adaptations. These goals can be pursued simultaneously. For example, by exploring less expensive and minimally disruptive soft options first, communities can avoid maladaptation risks, garner public support and conserve limited resources for when hard measures are unavoidable.

Case study: The challenge of adaptation in the Canadian Arctic

Photo 1: Seawall in Nome, Alaska. An example of a structural adaptation. Photo courtesy of S. Jeff Birchall, taken August 12, 2016, 1am.

While climate change is impacting communities across the globe, the Canadian Arctic currently experiences intensified local-level climate impacts. Recent studies find that, since 1979, Arctic amplification has caused northern regions to warm nearly four times faster than the global average (Rantanen et al., 2022). This intense warming has caused substantial physical impacts, such as permafrost thaw, sea ice loss, coastal erosion and biodiversity loss. Isolation, ecological fragility and remoteness further render northern communities vulnerable to climate change.

Extreme exposure to climate impacts requires intensive adaptation. However, Arctic communities face unique adaptation barriers and maladaptation risks. Many of these barriers and risks remain unknown due to a lack of technical data and personnel, and inadequate public consultation. As climate change worsens, it is becoming apparent that the hard and soft adaptations typically used in warmer climates have little utility in the Arctic. Extreme cold narrows ecological niches, reducing biodiversity and constraining the feasibility of ecosystem-based adaptation. Simultaneously, hard infrastructure adaptation is limited by structural and ecological fragility, and maladaptation often leads to widespread environmental degradation. As a result, infrastructure maintenance and upgrades are becoming increasingly expensive, and communities struggle to afford them due to low property values. When infrastructure goes unmaintained, disaster risk increases, and northern communities, which tend to be isolated and reliant on a single economic industry or on subsistence food production, are vulnerable to even minor disruptions (Birchall et al., 2022).

For Arctic communities, adaptation is complex. Indigenous knowledge systems are crucial to effective adaptation, and may offer insight into current unknowns. However, where mindset barriers and inequality prevent collaboration with the high proportions of marginalized groups in these communities, effective adaptation is restricted (Kehler & Birchall, 2021). Collaboration between Indigenous communities and local governments is necessary to overcome barriers and co-create effective long-term adaptation policies (Birchall & MacDonald, 2019). Adaptation in the Canadian Arctic offers a unique opportunity to address lagging reconciliation and to set an example for integrating non-western knowledge systems into disaster risk reduction.


Sarah Kehler is a PhD student at the University of Alberta; her general area of study is Urban and Regional Planning. She is a research assistant with the Climate Adaptation and Resilience Lab, focusing on barriers to achieving equitable and effective policy for adaptation and resilience to climate change.

S. Jeff Birchall, PhD, RPP, MCIP is an Associate Professor of Local-scale Climate Change Adaptation/Resilience in the School of Urban and Regional Planning, Department of Earth and Atmospheric Sciences, University of Alberta, where he serves as Director of the Climate Adaptation and Resilience Lab. Jeff leads the UArctic Thematic Network on Local-scale Planning, Climate Change and Resilience. Further questions regarding the article can be directed to jeff.birchall@ualberta.ca.

Monitoring Prince Edward Island’s coastline

– UPEI Climate Lab team led by Dr. Adam Fenech, and including Dr. Xander Wang, Don Jardine, Ross Dwyer, Andy MacDonald, Luke Meloche, and Catherine Kennedy – 

Coastal erosion is the primary challenge that climate change presents to Prince Edward Island, through storm surges, sea level rise, and high water levels. The sensitive sand and sandstone shorelines across Prince Edward Island are continually worn away by water, waves, ice, and wind. Sea level measured at Charlottetown, Prince Edward Island rose by 36 centimetres over the past century (1911-2011; Daigle, 2012) and is anticipated to rise by a further 100 centimetres over the next 100 years (IPCC, 2021). In terms of damaging storms, the Intergovernmental Panel on Climate Change (IPCC), the global community’s scientific authority on climate matters, concluded that it was “virtually certain” there had been an increase in intense tropical cyclone activity in the North Atlantic since the 1970s, and “more likely than not” that these intense tropical cyclones would increase in the North Atlantic in the late 21st century (IPCC, 2013). As a result of these anticipated ocean, geological, and storm changes, coastal erosion is expected to continue and likely become more severe, threatening public and private infrastructure at great economic cost along the 1,260 kilometres of coastline on Prince Edward Island (ACZISC, 2005).

The most recent study of coastal erosion rates for every metre of Prince Edward Island’s coastline (Webster and Brydon, 2012), which interpreted aerial photographs of the coastline from 1968 and 2010 using orthorectification and coastline delineation techniques, calculated an average coastal erosion rate of 0.28 metres per year.
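
Taken at face value, that figure implies a substantial cumulative loss over the interval between the two sets of photographs; a rough worked estimate (assuming, for illustration, a uniform island-wide average rate):

    \[ 0.28\ \text{m/yr} \times (2010 - 1968)\ \text{yr} \approx 11.8\ \text{m} \]

or nearly 12 metres of average shoreline recession over the 42-year interval, with individual sites eroding faster or slower than that.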

A quantitative risk assessment of coastal residences (homes, cottages), safety and security infrastructure (roads, bridges, water treatment plants, hospitals, fire departments, etc.) and heritage (churches, graveyards, lighthouses, archaeological sites, parks, etc.) was conducted by the UPEI Climate Lab to determine what Prince Edward Island infrastructure is at risk from coastal erosion. It concluded that over 1,000 residences (houses and cottages), over 40 garages, 8 barns, and almost 450 outbuildings are vulnerable to coastal erosion. Even 17 lighthouses, those maritime cultural icons, were deemed to be at risk. Such scientific results were significant, but risked sitting on a shelf in a scientific report unless communicated effectively to the organizations and communities of Prince Edward Island. But how best to do this?

CoastaL Impacts Visualization Environment (CLIVE)

CLIVE is a geovisual interface that combines available coastal data, historical records, and climate change projections, and translates them into a three-dimensional geovisual information tool that allows users to “fly” over Prince Edward Island, raising and lowering sea levels and toggling coastal erosion rates on and off. Programmed in the Unity 3D engine, CLIVE combines data from an extensive province-wide archive of aerial photographs documenting coastline erosion as far back as 1968 with the latest high-resolution digital elevation data derived from laser surveys known as LiDAR, a remote sensing technology that measures distance by illuminating a target with a laser and analyzing the reflected light.
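
The core query behind such a tool can be illustrated with a toy “bathtub” inundation test over an elevation grid. The sketch below is purely illustrative (a hypothetical 3×3 DEM; CLIVE’s actual rendering and analysis pipeline is not documented here):

    import numpy as np

    # Hypothetical 3x3 LiDAR-derived elevation grid, in metres above
    # current mean sea level; real DEMs have millions of cells.
    dem = np.array([[0.2, 0.8, 1.9],
                    [0.5, 1.2, 2.4],
                    [1.1, 2.0, 3.3]])

    sea_level_rise = 1.0              # metres, a user-selected scenario
    flooded = dem <= sea_level_rise   # True where the raised sea covers land
    print(f"{flooded.mean():.0%} of cells inundated")  # -> 33% for this grid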

A public engagement tour of sixteen towns across Prince Edward Island was held in 2014 and 2018, with each session presenting an introduction to Prince Edward Island’s vulnerability to coastal erosion and sea level rise, introducing CLIVE, examining the vulnerability of local communities and answering questions. Each session was also preceded and concluded by a written survey to gauge attendees’ knowledge, concern and willingness to adapt to coastal erosion and sea level rise. Participants’ concern about coastal erosion was high and, in most cases, increased after they were introduced to CLIVE. Most importantly, these sessions motivated coastal home and cottage owners to respond to their vulnerability by increasing their resilience to the anticipated sea level rise and coastal erosion.

CLIVE has garnered national and international attention, including national print journalism from the Globe and Mail (19 February 2014); national broadcast coverage from the Canadian Broadcasting Corporation (World Report radio, 11 February 2014); international magazine coverage (National Geographic, 16 December 2015); and international television coverage from Al Jazeera. In 2014 CLIVE won an international award from the Massachusetts Institute of Technology for communicating coastal risk and resilience. The CLIVE technology has been exported to the City of Los Angeles and implemented in several counties in Nova Scotia and New Brunswick, as well as across Canada.

This raised awareness of coastal issues prompted the Prince Edward Island government to support the Climate Lab at the University of Prince Edward Island in building a coastal surveillance system that serves as an early warning system for coastal erosion. This system includes peg-line measurements, drone surveys, tidal gauges and climate stations.

Coastal Surveillance System

Figure 1: UPEI Climate Lab cap on peg for measuring coastal erosion year-to-year.

Every summer, one lucky student working at the Climate Lab at the University of Prince Edward Island has the best job on the Island: visiting 200 sites across the province (many with beaches) to take peg-line measurements. This coastline surveillance approach involves physically hammering two 1 metre (m) lengths of 15 millimetre (mm) diameter metal rebar “pins” into the ground in a line, spaced 10 m and 20 m from the coast and oriented roughly normal to it, and then manually measuring the distance to the coastal indicator feature (e.g. cliff or bluff edge) using a measuring tape. Metal caps are hammered onto the ends of the rebar using a rubber mallet just before the desired depth is reached (Figure 1). GPS locations of each pin are recorded using a recreation-grade Garmin eTrex unit for general site mapping and for relocating pins year-to-year. Visiting year-to-year provides early warning of coastal erosion. Our results show that the average coastal erosion at these pin sites varies year-to-year, but the “usual suspects” show annual erosion rates ranging from 1 to 5 metres. And our preliminary analysis of the coastal impacts from Hurricane Fiona of September 23-24, 2022 shows erosion at individual locations of over 10 metres.
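
The year-over-year comparison behind this early warning is simple bookkeeping on the taped distances. A minimal sketch of that calculation, with entirely hypothetical site names and values rather than Climate Lab data:

    # Each reading is the taped distance (m) from a fixed rebar pin to the
    # cliff/bluff edge; a shrinking distance means the edge has retreated.
    readings = {  # site -> {year: pin-to-edge distance in metres}
        "SITE-042": {2020: 9.6, 2021: 8.1, 2022: 3.4},  # hypothetical values
    }

    def annual_erosion(series):
        """Return (year, metres of retreat since the previous visit) pairs."""
        years = sorted(series)
        return [(y2, series[y1] - series[y2])  # positive = edge moved inland
                for y1, y2 in zip(years, years[1:])]

    for site, series in readings.items():
        for year, loss in annual_erosion(series):
            print(f"{site} {year}: {loss:.1f} m of retreat")

A jump like the 4.7 m between the last two hypothetical readings is the kind of signal that, in practice, flags a site for closer attention.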

To survey the full run of the coastline, the UPEI Climate Lab also flies drones to measure coastline change at 90 sites across Prince Edward Island. A DJI Phantom 4 RTK drone (see Figure 2) is flown at an altitude of 50 metres using 75% front and side flight-plan overlaps, and then along the coast with the camera angled at the coastline to give some depth to the imagery. Ground control points (GCPs) are laid out throughout each site before flying to increase the accuracy of the resulting maps produced from the stitched imagery (known as orthomosaics), with the centre of each GCP measured using a Trimble Real Time Kinematic (RTK) Global Positioning System (GPS) unit that provides an accuracy of 2 centimetres (cm). Flying drones at these sites year-to-year provides a good sense of how Prince Edward Island’s coasts are changing (see Figure 3).

Figure 3: Coastal monitoring using drones at Savage Harbour, Prince Edward Island. This 1.3 kilometre coastline has shown a net shore erosion of 4.38 metres from 2016 to 2021. Red colours show erosion over 1.5 metres per year.

A series of fifteen real-time tidal gauges has been installed across Prince Edward Island by the UPEI Climate Lab, working with the PEI Emergency Management Office and the Mi’kmaq Confederacy of Prince Edward Island. These sites provide the Island’s only direct monitoring for an early warning system of rising tidal levels and storm surges. These tidal gauges were instrumental in alerting PEI ports to the timing and magnitude of storm surges from Hurricane Fiona in September 2022, providing the only record of the actual height of the surges (see Figure 4). Complementing these coastal stations is a series of inland climate stations installed by the UPEI Climate Lab to measure temperature, precipitation, wind, atmospheric pressure and solar radiation every 2 to 5 minutes, 24/7, at over 70 locations across Prince Edward Island. These stations support research on climate vulnerability, impacts and adaptation across the Island; support the reconstruction of extreme climate events; and support the ground-truthing of high-resolution regional climate models. But most importantly, these climate stations provide detailed real-time weather and climate information to Islanders in their day-to-day needs as farmers, fishermen, tourism operators, or simply beach goers.

Figure 4: PEI Storm Surge Early Warning System graph of Hurricane Fiona’s record storm surge of 2.4 metres on September 24, 2022. This real-time 24/7 early warning system reports from 15 tidal gauges across Prince Edward Island installed and maintained by the UPEI Climate Lab in partnership with the PEI Emergency Management Office and the Mi’kmaq Confederacy of Prince Edward Island.

Final Words

Climate change is presenting ever greater challenges to Prince Edward Island’s coasts through rising sea levels, storm surges and coastal erosion, but the Climate Lab at the University of Prince Edward Island is keeping watch over the Island’s coastlines, measuring them annually with pegs and drones as an early warning system of coastal change. The Climate Lab also installed and maintains 24/7 surveillance of the Island’s storm surges with 15 tidal gauges, and of the Island’s climate with over 70 climate stations, as a real-time early warning system. With such significant vulnerability to anticipated future coastal threats, the UPEI Climate Lab’s climate change early warning systems need to be maintained and would benefit from further financial support.


Dr. Adam Fenech has worked extensively in the area of climate change for thirty-five years and has edited eight books on climate change. Dr. Fenech is an Associate Professor in the School of Climate Change and Adaptation at the University of Prince Edward Island, where he developed the curriculum for the first undergraduate programme in Applied Climate Change and Adaptation. He is presently the Director of the University of Prince Edward Island’s Climate Research Lab, which conducts research on vulnerability, impacts and adaptation to climate change, and where his virtual reality depiction of sea level rise has won international awards, including one from MIT for communicating coastal science. He maintains the largest fleet of drones at a Canadian university, including the largest drone in the country, with a four-metre wingspan.

Will Greenland really be “green” after losing its ice mass?

– By Xander Wang, Pelin Kinay, Aminur Shah and Quan Dau –

Many believe that, with a warming climate, a long-standing myth about Greenland could become reality: a “green” land, as its name suggests, instead of the white, ice-covered land that exists today. Recent scientific evidence suggests that Greenland’s ice sheets are melting rapidly due to rising air temperatures and warming ocean waters, raising sea levels and threatening coastal areas. United Nations climate actions aim to limit global warming by reducing carbon emissions, which would ultimately slow ice melt in the polar regions, including Greenland. Greenland’s potential future, however, is unknown. The key question remains: can Greenland’s ice cover be stabilized, or will it become a completely “green” land in the future if the carbon emission reduction targets set by the Paris Agreement (2015) are not met?

While past evidence of ice sheet melt in Greenland has been studied, predicting the future of this enormous ice cover is also of significant interest to scientists. Establishing a direct relationship between retreating ice and a warming climate could help substantiate the chain of “carbon emissions – rising temperatures – melting ice cover – sea level rise.”

Une étude récente publiée dans Earth’s Future a examiné comment la largeur spatiale de la calotte glaciaire du Groenland pourrait changer dans le contexte du réchauffement climatique, en utilisant un modèle climatique régional pour quantifier les changements futurs de la calotte glaciaire du Groenland couvrant divers scénarios d’émissions.

En raison des piètres performances des modèles climatiques en matière de simulation des précipitations (y compris le modèle PRECIS utilisé dans cette étude), l’équipe n’a abordé ce sujet qu’en utilisant les estimations des températures futures. L’étude a notamment utilisé la notion de climat de calotte glaciaire pour estimer si une zone sera ou non couverte par une calotte glaciaire.

Selon les auteurs, la couverture climatique de la calotte glaciaire du Groenland diminuerait régulièrement tout au long du siècle dans le cadre des deux scénarios RCP8.5 et RCP4.5, ce qui signifie que la superficie spatiale de la calotte glaciaire diminuerait de 15 % (RCP4.5) et de 25 % (RCP8.5) d’ici la fin du siècle. En comparaison, le scénario à faibles émissions (RCP2.6) offre la possibilité de limiter la perte de la couverture de la calotte glaciaire du Groenland à moins de 10 % d’ici le milieu du siècle, aucune perte supplémentaire n’étant prévue par la suite. Bien que diverses variables de surface influent sur l’évolution du processus d’équilibre de la masse de la surface glaciaire du Groenland, les chercheurs ont décidé de recourir aux projections de température uniquement pour étudier s’il est possible de stabiliser la calotteglaciaire du Groenland, étant donné que PRECIS fonctionne raisonnablement bien pour simuler la température de l’air proche de la surface au-dessus du Groenland.

Par rapport à la période de référence 1970-2000, l’étude prévoit la couverture glaciaire future du Groenland pour trois périodes – les années 2020, 2050 et 2080 – selon trois scénarios d’émissions. « Le scénario à faibles émissions RCP2.6 a le potentiel de stabiliser le réchauffement climatique au Groenland après les années 2050 et d’empêcher toute perte supplémentaire de la couverture glaciaire », concluent les auteurs. La glace qui ne couvre que 65,5 % du pays, selon les estimations de la période de référence, pourrait passer à 56 % dans les années 2050, puis à 57 % dans les années 2080, selon un scénario à faibles émissions. Par conséquent, de faibles émissions pourraient potentiellement limiter le réchauffement du Groenland à moins de 1°C au cours des 30 prochaines années et limiter la perte de la
couverture glaciaire à moins de 10 %.

En revanche, la couverture de glace diminuera continuellement tout au long de ce siècle, car le climat local du Groenland est susceptible de se réchauffer continuellement dans le cadre des scénarios d’émissions moyennes et élevées. Le pire pourrait être attendu dans les années 2080 dans le cadre d’un scénario à fortes émissions, avec seulement environ 40 % du pays couvert par des calottes glaciaires.

Les résultats de cette étude sont essentiels pour comprendre les conséquences de divers scénarios d’émissions de carbone sur la stabilisation ou la limitation du réchauffement au Groenland et donc de la perte de couverture de la calotte glaciaire, qui est liée à l’élévation du niveau des mers. Les résultats de l’étude supposent que les scénarios à fortes et moyennes émissions entraîneraient un réchauffement continu au Groenland et donc une perte importante de la calotte glaciaire. Toutefois, le scénario à faibles émissions présente un fort potentiel de réduction du réchauffement climatique local et de la perte de la calotte glaciaire avant les années 2050. Plus particulièrement, en supposant que le scénario à faibles émissions soit respecté, aucun changement significatif n’est prévu après les années 2050.

Ces conclusions sont importantes, non seulement parce qu’elles permettent aux défenseurs du climat d’espérer que la calotte glaciaire du Groenland sera préservée et que les populations côtières seront protégées de l’élévation du niveau de la mer, mais aussi parce qu’elles incitent toutes les nations à prendre des mesures immédiates pour réduire les émissions de carbone. Il est juste de s’attendre à ce que la couche de glace couverte par le climat de la calotte glaciaire reste en place indéfiniment, mais il est difficile de prévoir quand la couche de glace au-delà de la couverture du climat de la calotte glaciaire commencera à fondre et finira par disparaître, soulignent les auteurs.

Cela va de soi que la véritable terre « verte » est à l’horizon si nous ne prenons aucune mesure pour réduire les émissions de gaz à effet de serre et le réchauffement de la planète, et les plus grandes conséquences sont évidentes. Le Groenland pourrait envier la beauté d’un paysage vert luxuriant, mais le reste du monde souffrira des pires conséquences de l’élévation du niveau de la mer et des inondations côtières.

Les changements dans les nappes glaciaires du Groenland et de l’Antarctique ont un impact sociétal important car ils ont une incidence directe sur le niveau mondial des mers. Lorsque les glaciers et les nappes glaciaires fondent, davantage d’eau pénètre dans l’océan. Heureusement, certaines politiques et actions climatiques sont en place à l’échelle mondiale et locale. Il y a donc lieu d’espérer! Pourtant, il est urgent de prendre des mesures efficaces pour réduire les émissions de carbone afin de ralentir la disparition de la calotte glaciaire du Groenland et de sauver nos communautés côtières de grandes catastrophes.

Nouvelles sur l’article publié suivant : Wang, X., Fenech, A. et Farooque, A. A. (2021). Possibility of stabilizing the Greenland ice sheet. Earth’s Future, 9(7), e2021EF002152. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021EF002152


-By Xander Wang, Pelin Kinay, Aminur Shah, and Quan Dau-

Many people believe that, due to global warming, a longstanding myth about Greenland might become reality: a 'green' land, as its name suggests, instead of the snow-white, ice-covered land that exists now. Recent scientific evidence suggests the ice sheets in Greenland are melting quickly because of rising air temperatures and warming ocean waters, causing sea level rise and threatening coastal areas. UN climate actions aim to limit global warming by lowering carbon emissions, which would eventually lead to less ice melting in the polar regions, including Greenland. Greenland's potential future, however, is unknown. The key question remains: is it possible to stabilize Greenland's ice cover, or will it become a completely 'green' continent in the future if the Paris Agreement's (2015) carbon reduction goals are not met?

While past evidence of melting ice sheets in Greenland has been studied, predicting the future of the huge ice cover is also of significant interest to scientists. Establishing a direct relationship between ice retreat and a warming climate could help confirm the 'carbon emission – rising temperature – melting ice cover – sea level rise' pathway.

A recent study published in Earth's Future explored how the spatial extent of the Greenland ice sheet might change in the context of global warming, using a regional climate model to quantify future changes under various emission scenarios.

Due to the poor performance of climate models in simulating precipitation (including the PRECIS model used in this study), the team addressed this question using future temperature estimates only. In particular, the study employed the concept of an ice cap climate to estimate whether or not an area will be covered by an ice sheet.
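To illustrate the ice cap climate idea, here is a minimal sketch that classifies grid cells using the standard Köppen EF criterion (warmest-month mean temperature below 0°C) and reports the resulting coverage. The criterion and the synthetic temperature array are assumptions for illustration; the study's exact operational rule may differ.

```python
# A minimal sketch of the "ice cap climate" idea, assuming the Köppen EF
# definition (warmest-month mean temperature below 0 degC). `monthly_t` is a
# hypothetical array of projected monthly mean near-surface air temperatures
# with shape (12, ny, nx).
import numpy as np


def ice_cap_coverage(monthly_t: np.ndarray, land_mask: np.ndarray) -> float:
    """Percent of land cells whose warmest month stays below 0 degC."""
    warmest = monthly_t.max(axis=0)        # warmest-month mean, per cell
    icecap = (warmest < 0.0) & land_mask   # EF criterion, land cells only
    return 100.0 * icecap.sum() / land_mask.sum()


# Toy example: 12 months of synthetic temperatures on a 3x3 grid
rng = np.random.default_rng(0)
t = rng.normal(-10.0, 8.0, size=(12, 3, 3))
mask = np.ones((3, 3), dtype=bool)
print(f"ice-cap climate coverage: {ice_cap_coverage(t, mask):.1f}%")
```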

According to the authors, the ice cap climate coverage of Greenland would diminish steadily throughout the century under both RCP8.5 and RCP4.5, meaning that the spatial area of the ice sheet would fall by 15% (RCP4.5) or 25% (RCP8.5) by the end of the century. In comparison, the low-emission scenario (RCP2.6) offers the possibility of limiting the loss of Greenland ice sheet coverage to less than 10% by the middle of this century, with no further loss expected after that. Though various surface variables influence the evolution of Greenland's ice surface mass balance, the researchers decided to use temperature projections only to investigate whether it is possible to stabilize the Greenland ice sheet, given that PRECIS performs reasonably well in simulating near-surface air temperature over Greenland.

Compared with the baseline period of 1970-2000, the study projects future ice coverage over Greenland for three periods – the 2020s, 2050s, and 2080s – under three emission scenarios. "The low-emission scenario of RCP2.6 does have the potential to stabilize the warming climate in Greenland after 2050s and prevent further loss to its ice sheet coverage", the authors conclude. Ice coverage, estimated at 65.5% of the country in the baseline period, could fall to 56% in the 2050s and then recover slightly to 57% in the 2080s under a low-emission scenario. Hence, low emissions could potentially limit the warming in Greenland to below 1°C within the next 30 years and constrain the loss of ice sheet coverage to below 10%.

By contrast, ice coverage will continuously decline throughout this century as the local climate in Greenland is likely to warm up continuously under both medium and high emission scenarios. The worst could be expected in the 2080s under a high-emission scenario, with only around 40% of the country covered by ice caps.

The findings of this study are critical for understanding the implications of various carbon emission scenarios for stabilizing or limiting warming in Greenland, and thus the loss of ice sheet coverage, which is connected to rising sea levels. The results imply that both the high- and medium-emission scenarios would result in ongoing warming in Greenland and thus major ice sheet loss. However, the low-emission scenario has high potential for reducing local climate warming and ice sheet loss before the 2050s. Most notably, assuming the low-emission scenario is realized, no significant changes are projected after the 2050s.

The findings are significant not just for giving climate advocates optimism that the Greenland ice sheet will be preserved and coastal populations will be protected from rising sea levels, but also for pressing all nations to take immediate action to reduce carbon emissions. It is fair to expect that the ice sheet covered by the ice cap climate will remain in place indefinitely, but it is difficult to predict when the ice sheet beyond the ice cap climate coverage will begin to melt and eventually disappear, the authors highlight.

It goes without saying that a truly 'green' land is on the horizon if we take no action to reduce GHG emissions and global warming, and the gravest consequences are obvious: Greenland might enjoy the beauty of a lush green landscape, but the rest of the world will suffer the worst impacts of sea level rise and coastal flooding.

Changes in the Greenland and Antarctic ice sheets have a significant societal impact because they directly affect global sea levels: as glaciers and ice sheets melt, more water enters the ocean. Fortunately, some climate policies and actions are in place at the global and local scales. So, there is hope! Yet effective action is urgently needed to reduce carbon emissions so that we can slow the disappearance of the Greenland ice sheet and save our coastal communities from major disasters.

News on the following published article:

Wang, X., Fenech, A., & Farooque, A. A. (2021). Possibility of stabilizing the Greenland ice sheet. Earth’s Future, 9(7), e2021EF002152. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021EF002152


Dr. Xander Wang is an Associate Professor in the School of Climate Change and Adaptation at the University of Prince Edward Island (UPEI). He is also the Director of the Climate Smart Lab in the Canadian Centre for Climate Change and Adaptation. Dr. Wang was recently elected as a member of the Royal Society of Canada (RSC) College of New Scholars. He has served as the Associate Dean (Interim) in the School of Climate Change and Adaptation and is a core member leading the development of the Canadian Centre for Climate Change and Adaptation at UPEI, a world-leading research and teaching cluster in climate change impacts and adaptation. Before joining UPEI, Dr. Wang worked as an Assistant Professor in the School of Geosciences at the University of Louisiana at Lafayette, US.

Dr. Pelin Kinay is a postdoctoral fellow of the Climate Smart Lab in the Canadian Centre for Climate Change and Adaptation at UPEI. Her research interest focuses on climate change adaptation and its associated impacts on human health, as well as the natural and human-caused variables that influence climate change.

Dr. Aminur Shah is a postdoctoral fellow of the Climate Smart Lab in the Canadian Centre for Climate Change and Adaptation at UPEI. His research interest focuses on vulnerability and risk assessment of social-ecological systems to natural hazards, sustainable flood risk management, sustainability assessment, climate change impacts and adaptation, community risk reduction, and nature-based solutions.

Dr. Quan Dau is a postdoctoral fellow of the Climate Smart Lab in the Canadian Centre for Climate Change and Adaptation at UPEI. His research interest focuses on water science and global climate change, including but not limited to, hydrological cycle, water resources planning and management, remote sensing, artificial intelligence, climate change adaptation, irrigation water management, socio-economic projection, and reservoir operating management.

Bias Correction of Surface Snow Water Equivalent Estimates Using Machine Learning



-By Fraser King-

During Canada's cold winters, snowpacks that aren't consistently plowed or shovelled slowly grow in size and density. From a water-balance perspective, these snowpacks act as ephemeral water towers, waiting for spring temperatures to eventually warm them enough to melt en masse. This snowmelt-derived water is a critical contributor to local water budgets, as it refills aquifers and feeds nearby rivers and lakes. However, rapid snowmelt periods can quickly saturate the soil, leading to surface runoff and flooding. Snowmelt-derived flooding has become increasingly problematic across much of Canada in recent decades as global temperatures continue to rise, leading to millions of dollars in damage to local communities and disruptions to regional ecosystem development and sustainability.

Figure 1: a) Relative bias in SNODAS SWE estimates when compared to in situ estimates from ECCC; and b) a seven-year time series of SWE on ground estimates from SNODAS and ECCC.

The ability to accurately quantify the amount of water stored in snow on the ground is therefore an important component of flood forecasting, allowing local governments to better prepare for, and mitigate, damages caused by future rapid snowmelt events. As discussed in a recent CMOS Bulletin article by Ross D. Brown, the number of snow-observing sites across Canada has dropped by over 50% since 1995, leaving large unobserved gaps across much of the country. Climate models and reanalysis products are powerful tools that can be used to fill these spatiotemporal gaps in observations; however, no model is without bias, error, and uncertainty, which limits our estimates of the true water content of a given snowpack.

In a paper submitted to Hydrology and Earth System Science in 2020, we address some of the aforementioned concerns surrounding model error by bias correcting snow water equivalent (SWE) estimates from the SNOw Data Assimilation System (SNODAS) gridded SWE product. SNODAS is a daily, 1 km modelling and data assimilation dataset produced by the National Oceanic and Atmospheric Administration (NOAA) National Weather Service’s Operational Hydrologic Remote Sensing Center. While this product was primarily developed for use across the continental United States, the northern portion of SNODAS overlaps with southern Ontario. Through a combination of the many sources of observations assimilated by SNODAS, and its complex, physically-based internal model, SNODAS produces some of the highest quality estimates of surface SWE across the region.

However, when compared with independent in situ measurements recorded by the Climate Research Division of Environment and Climate Change Canada (ECCC), SNODAS displays clear spatiotemporal biases in its SWE estimates across much of southern Ontario (Figure 1). Temporally, SNODAS exhibits a strong positive bias pre-2014 (a period which marks a distinct change in the known assimilated datasets), along with strong positive spatial biases as we move further inland, away from the US border.

To address these biases, we explored a suite of increasingly sophisticated statistical bias-correction methods, culminating in the application of a nonlinear machine learning (ML) technique that displayed the best overall skill. Instead of jumping directly into ML, we followed an "Occam's razor" methodological approach, starting with simple, well-validated, and interpretable bias-correction methods such as mean bias subtraction (MBS) and linear regression to develop a performance baseline. The idea is that if a simple method does nearly as well as a more sophisticated ML-based method, we should use the simpler, more explainable technique.
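As a minimal sketch of that baseline-first workflow, the example below applies mean bias subtraction and a linear-regression correction to synthetic data and compares RMSE; the data and bias structure are invented and stand in for the real SNODAS/ECCC pairs.

```python
# Baseline-first bias correction on synthetic SWE data (mm). The "truth"
# (ECCC-like) and biased "SNODAS-like" series are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error


def mbs_correct(snodas: np.ndarray, mean_bias: float) -> np.ndarray:
    """Mean bias subtraction: remove one constant offset everywhere."""
    return snodas - mean_bias


rng = np.random.default_rng(1)
truth = rng.gamma(2.0, 40.0, size=500)                 # "ECCC" SWE (mm)
snodas = 1.2 * truth + 15.0 + rng.normal(0, 10, 500)   # biased "SNODAS" SWE

mean_bias = float(np.mean(snodas - truth))
mbs = mbs_correct(snodas, mean_bias)

slr = LinearRegression().fit(snodas.reshape(-1, 1), truth)
slr_corrected = slr.predict(snodas.reshape(-1, 1))

for name, est in [("raw", snodas), ("MBS", mbs), ("SLR", slr_corrected)]:
    rmse = float(np.sqrt(mean_squared_error(truth, est)))
    print(f"{name}: RMSE = {rmse:.1f} mm")
```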

Figure 2: RMSE and absolute mean bias for each model trained and tested using different spatiotemporal partitions of the full training dataset.

We evaluated four different models: the aforementioned MBS, a simple linear regression (SLR) model, a decision tree (DT), and finally a random forest (RF). Each of these models was fit using the same training datasets over three periods:

  1. December, January, February (DJF)
  2. March, April, May (MAM)
  3. DJF + MAM (i.e., annual)

across two spatial domains (northern vs. southern Ontario). Each model was fit using a set of climate predictor variables (SNODAS SWE on ground, precipitation biases, surface temperature, elevation, year, and day of year) to model ECCC SWE on ground at 391 sites.
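A minimal sketch of such a random forest fit is shown below, using scikit-learn; the column names, synthetic table, and hyperparameters are illustrative assumptions rather than the exact setup used in the paper.

```python
# Sketch of an RF fit on a hypothetical training table with the predictor
# names listed above; the target stands in for ECCC SWE on ground (mm).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

PREDICTORS = ["snodas_swe", "precip_bias", "surface_temp",
              "elevation", "year", "day_of_year"]

# Synthetic table: one row per site-day
rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({p: rng.normal(size=n) for p in PREDICTORS})
df["eccc_swe"] = (df["snodas_swe"] * 0.8 - df["precip_bias"] * 0.3
                  + rng.normal(scale=0.1, size=n))

x_train, x_test, y_train, y_test = train_test_split(
    df[PREDICTORS], df["eccc_swe"], test_size=0.25, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(x_train, y_train)
print(f"R^2 on held-out data: {rf.score(x_test, y_test):.2f}")
```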

Our results indicated that the RF consistently demonstrated the lowest overall RMSE and absolute mean bias over each period and across all regions (Figure 2). The overly simplistic MBS did an excellent job of removing the mean bias (by construction); however, this was accomplished by overcorrecting the bias in some regions and undercorrecting it in others (resulting in the high RMSE for this method in Figure 2). The SLR fared better, with a slightly lower overall RMSE; however, these linear methods were unable to fully account for the nonlinear spatiotemporal bias from Figure 1. The best-performing methods were the ML-based DT and RF, with the RF demonstrating improved performance annually (i.e., improved robustness).

Figure 3: Timeseries comparisons of monthly area-normalized discharge from SNODAS and the bias corrected SWE melt estimates at three river gauges in Ontario.

To further quantify the differences between bias-corrected and uncorrected SWE estimates, we also performed a simple water balance analysis across three watersheds in southern Ontario, anticipating that reductions in mean SWE would produce a more physically consistent fit with in situ melt measurements. Comparing monthly snowmelt estimates (i.e., the negative SWE differences between consecutive monthly means from the bias-corrected and uncorrected SWE datasets) with area-normalized discharge across each basin (Figure 3), we found that the bias-corrected, RF-derived melt estimates were much closer in magnitude to the in situ measurements and did not display the unphysical overestimation typical of SNODAS. These follow-up comparisons are very useful for further validating the robustness of bias-correction models like those explored in this work, and they can identify deficiencies that may be hidden at first glance (e.g., violations of physical constraints that the ML model knows nothing about).
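The melt term itself is straightforward to compute. The sketch below, on invented basin-mean SWE values, takes melt as the SWE decrease between consecutive monthly means, as described above.

```python
# Monthly melt from consecutive basin-mean SWE values; numbers are invented.
import numpy as np

monthly_mean_swe = np.array([60.0, 95.0, 120.0, 80.0, 20.0, 0.0])  # mm, Dec-May


def monthly_melt(swe: np.ndarray) -> np.ndarray:
    """Melt (mm/month) = SWE decrease between consecutive monthly means."""
    d = np.diff(swe)
    return np.where(d < 0.0, -d, 0.0)  # keep only decreases as melt


print(monthly_melt(monthly_mean_swe))  # -> [ 0.  0. 40. 60. 20.]
```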

While the ML-based bias-correction techniques applied here demonstrate good skill and general robustness throughout the region, there are other options in the ML toolbox. In an upcoming study, we plan to evaluate some of these tools through a daily, ten-year, Canada-wide bias correction of temperature, precipitation, and radiation fields from the fifth-generation Canadian Regional Climate Model (CRCM5) (biases shown in Figure 4). With a much larger sample available in this follow-up project, we can experiment with more sophisticated neural network (NN) approaches for spatiotemporal bias correction of each climate variable. These bias-corrected fields can then be used to drive land surface models and, in turn, bias correct surface snow estimates via a proxy correction of the associated climate variables. When trained on the billions of available data points, early results suggest that NN approaches strongly outperform linear methods, and even beat out RF techniques for nonlinear biases like those present in surface temperature.

Figure 4: CRCM5 surface temperature (T2M) and precipitation (PR) biases at a) monthly and b) annual timescales; along with their corresponding climatological mean biases across Canada in c) and d).
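As a rough illustration of what an NN-based corrector can look like, the sketch below fits a small multilayer perceptron to a synthetic nonlinear bias; the architecture, scaling, and data are placeholder assumptions, not the follow-up study's actual configuration.

```python
# Sketch of an NN regressor learning a synthetic nonlinear bias; scikit-learn's
# MLPRegressor stands in for whatever architecture the real study uses.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
x = rng.normal(size=(5000, 6))  # six generic climate predictors
y = np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2 + rng.normal(0, 0.1, 5000)

nn = make_pipeline(
    StandardScaler(),  # NNs train better on standardized inputs
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0),
)
nn.fit(x, y)
print(f"training R^2: {nn.score(x, y):.2f}")
```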

ML has been used across the Geosciences for decades; however, rapid advancements in computing resources, combined with petabytes of now easily accessible observational data, have allowed this field of research to flourish in recent years. While it can be appealing to jump immediately to ML for problems like downscaling or bias correction, we argue that this mindset can be problematic. An Occam's razor approach to such problems gives researchers additional opportunities to save on computational costs (i.e., to avoid expensive model training/hyperparameter-tuning phases) and to develop a more interpretable and explainable model. While we are incredibly optimistic about the future of ML (and especially deep learning) in the Geosciences, we also recommend that future researchers take care by starting with simple methods before digging into their respective machine learning toolboxes.


Fraser King completed his PhD in remote sensing and machine learning of precipitation at the University of Waterloo in December 2022. He is now a postdoctoral research fellow at the University of Michigan, developing machine learning-based snowfall retrieval algorithms and using surface and spaceborne radars to improve our understanding of hydrometeor particle microphysics.

Tornadoes in the Canadian Prairies: 1826-1939



-By Patrick McCarthy and Jay Anderson-

Abstract

Our project was to develop the most comprehensive and accurate database of Canadian prairie tornadoes. We have completed the period from 1826 to 1939. How this was accomplished is presented here.

Previous efforts

The systematic collection of Canadian prairie tornado events began with A.B. Lowe and G.A. McKay (1960). Brad Shannon (1976) extended that database to 1973. During the 1970s, the Prairie Weather Centre (PrWC) in Winnipeg established the Meteorological Service of Canada's (MSC) first summer severe weather program. It was supported by the Regina Weather Office and was later expanded to the Alberta Weather Centre. For verification and case-study purposes, severe weather events, including tornadoes, were collected and archived for the three Prairie Provinces. These reports were often very incomplete, relying on informal contacts and phone calls to weather observers. At the time, it was thought that sightings of only about one-third of tornadoes reached the weather office. In collaboration with the Ontario Weather Centre, damage surveys became part of the PrWC records. Weather spotters were added to improve event detection and description. While the collection of annual events was ongoing, the archiving of historic occurrences remained very limited.

In the early 1980s, Dr. Keith Hage, at the University of Alberta, began an extensive study to identify historic Alberta and Saskatchewan tornado and windstorm occurrences. Michael Newark (1984) developed the first Canada-wide tornado database. His work would eventually include events as far back as the 18th century. Thomas Grazulis (1993) published an extensive American tornado dataset that included a few events that straddled the U.S.—Canada border. Hage (1994) published a second database for historic Alberta tornadoes, windstorms, and lightning fatalities. Dr. Alexander Paul (1995), at the University of Regina, produced a Saskatchewan tornado chronology from 1906 to 1991 (690 tornadoes). Grazulis (2000) published a compilation of Canadian “killer” tornadoes. Hage (2001) published his compilation of Saskatchewan tornadoes (720), windstorms, and lightning fatalities from 1880 to 1984.

While employed at the MSC office in Winnipeg, we began consolidating and correcting these sources, beginning in the early 1990s and continuing to the present. This effort formed the basis of the current MSC Prairie Severe Weather archive, which is updated annually. Eventually, a formal version was made available to the public and researchers (McCarthy, 2011). The database contained roughly 19,000 severe convective events, including 3,100 reported tornadoes dating back to 1826.

Methodology

We noted many problems in the 2010 database. These included vague reports, data gaps, unverified assumptions, missing and inaccurate dates and times, possible duplications, and conflicting death and injury totals. We undertook a project to review and update all the tornado reports in the archive and to add previously undiscovered events to the record.

We worked independently to take advantage of our unique but overlapping research approaches. In 2019, our two datasets were merged and differences in our records were resolved. This 1826-1939 portion of the review took about five years to complete. A small amount of new information has since been added; something that will likely occur on an ongoing basis.

The data-collection phase of the project had many challenges. Prior to 1940, there were no damage surveys, no video evidence, damage and tornado photographs were rare, eyewitness accounts were limited, and construction practices were varied. We are both experienced storm-damage surveyors and we used our experience to assess damage reports, leading to a subjective damage rating when sufficient evidence was available.

A multitude of sources, mostly unofficial and difficult to unearth, were used in the research. These included digital databases, particularly newspapers, books, historic school sources, community/rural municipality/county histories, historical societies, and obituaries. Most events appeared in newspapers, though there was typically little follow-up after the initial reports. Downed telephone and telegraph lines and poor roads often delayed the reports, leading to inaccurate dates in the record. Delayed information seldom made it into the newspapers, allowing initially bad information to linger in articles for weeks. To account for these issues, we searched for descriptions of events for up to a month after their occurrence, and in a few instances, much later.

Eyewitnesses have an expansive view across the open prairie. Tornadoes could be seen from long distances, and reports often resulted in misplaced locations. The "highways" of the prairie were the railroads. Railway station names were often referenced, further misplacing the tornado's position. Many stations and some communities have disappeared over time. We controlled for many of these difficulties by triangulating available reports to narrow the location error. On days with multiple events, the information was plotted using eyewitness accounts, storm information, and damage characteristics to identify storm damage swaths and potential tornado tracks. The NOAA-CIRES 20th Century Reanalysis (V2c) (https://psl.noaa.gov), displayed via the https://meteocentre.com/ website, was also used to help assess storm types, potential storm motion, and storm potential. To validate a tornado occurrence, we used a decision-tree approach similar to that of Sills et al. (2004).
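The geometry behind such triangulation can be expressed compactly: given each observer's position and reported bearing, the least-squares intersection of the sighting lines estimates the event location. The sketch below is an idealized numerical analogue of what was, in practice, a manual plotting exercise; the coordinates and bearings are invented.

```python
# Least-squares intersection of several sighting lines, each defined by an
# observer position and a compass bearing (x east, y north, km).
import numpy as np


def triangulate(observers: np.ndarray, bearings_deg: np.ndarray) -> np.ndarray:
    """Point minimizing squared perpendicular distance to all bearing lines."""
    a = np.zeros((2, 2))
    b = np.zeros(2)
    for p, brg in zip(observers, np.radians(bearings_deg)):
        d = np.array([np.sin(brg), np.cos(brg)])  # unit vector of the sighting
        m = np.eye(2) - np.outer(d, d)            # projector off the line
        a += m
        b += m @ p
    return np.linalg.solve(a, b)


obs = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, -8.0]])  # three farms (km)
brgs = np.array([45.0, 315.0, 0.0])                     # reported bearings
print(triangulate(obs, brgs))  # ~[5, 5] for this configuration
```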

A hybrid of the Fujita (https://www.spc.noaa.gov/faq/tornado/f-scale.html) and Enhanced Fujita (https://www.spc.noaa.gov/faq/tornado/ef-scale.html) scales was used to rate tornado damage. The EF-Scale is too modern for most early events, which lack the necessary details in their descriptions. The F-Scale rating, which allows for a broader interpretation, was primarily used for the database. For events that crossed the U.S.-Canada border, the rating is for damage on the Canadian side only.

The 1826-1939 timeframe was a period of immigration, spreading from east to west across the Northern Plains. There was a slow growth in population, communities, roads, railways, newspapers, etc., which affected the availability of data. For example, most cemeteries did not appear until after 1910 in Saskatchewan and Alberta.

One of our major goals was to provide a more complete account of the number of deaths and injuries. To produce accurate casualty totals, a rigorous effort was made to identify the names of victims, including those who may have succumbed to their injuries well after the event. We used cemetery records, obituaries, ancestry/genealogy records, church records, provincial/state censuses, and death records to track down these victims.

Our effort has culminated in the most complete and descriptive archive of Prairie tornadoes and a record for researchers to build on. Some of our findings from this early compilation are presented in Part 2 of this series.


References

Grazulis, T.P., 1993. Significant Tornadoes, 1880-1991. Volume II: A Chronology of Events. Environmental Films, St. Johnsbury, VT. https://books.google.ca/books?id=yW5NAQAAIAAJ&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

Grazulis, T.P., 2000. Canadian Killer Tornadoes. The Tornado Project, St. Johnsbury, VT.

Hage, K.D., 1994. Alberta Tornadoes, Other Destructive Windstorms and Lightning Fatalities: 1879-1984. Self-published, Spruce Grove, Alberta, Canada.

Kendrew, Wilfrid George, and Balfour W. Currie, 1955. The Climate of Central Canada: Manitoba, Saskatchewan, Alberta and the Districts of Mackenzie and Keewatin. E. Cloutier, Queen's Printer, Ottawa, 152.

McCarthy, P. (ed), 2011. Prairie and Northern Region 1826-2010 Severe Weather Event Database. Prairie and Arctic Storm Prediction Centre, Meteorological Service of Canada, Environment Canada, CD-ROM CD_095.

McKay, G.A., A.B. Lowe, 1960. The tornadoes of western Canada. Bull. Amer. Meteor. Soc., 41, No. 1, pp. 1-8. https://doi.org/10.1175/1520-0477-41.1.1

Newark, M.J., 1984. Canadian Tornadoes, 1950-1979. Atmos.-Ocean, 22, 343-353.

Paul, A.H., 1995. The Saskatchewan Tornado Project. University of Regina, Department of Geography Internal Report.

Raddatz, R.L., R.R. Tortorelli, M.J. Newark, 1983. Manitoba and Saskatchewan Tornado days 1960 to 1982. Environment Canada, Atmospheric Environment Service, Canada Climate Centre, Downsview, ON, CLI-683, 57 pp.

Raddatz, R.L., J.M. Hanesiak, 1991. Climatology of Tornado Days 1960-1989 for Manitoba and Saskatchewan. Climatological Bulletin. 1991, 25, Num 1, pp. 47-59.

Shannon, T.B., 1976. Manitoba Tornadoes 1960-1973. Atmospheric Environment Service, Meteorological Applications Branch, Environment Canada, Project Report No. 29. https://publications.gc.ca/collections/collection_2020/eccc/En57-46-29-eng.pdf

Sills, D.M.L., Scriver, S.J., and King, P.W.S. 2004. The tornadoes in Ontario project (TOP). Preprints, 22nd AMS Conference on Severe Local Storms, Hyannis, Mass., American Meteorological Society, CD-ROM 7B.5. http://www.yorku.ca/pat/research/dsills/papers/SLS22/SLS22_TOP_AMS.pdf


Abstract

Kendrew and Currie (1955) commented on prairie tornadoes in their 1955 publication The Climate of Central Canada: "…they probably occur in Alberta and Manitoba but there is no definite evidence," while noting that Saskatchewan averaged only about one per year. After an exhaustive effort, we have compiled the most complete historical database of Canadian prairie tornadoes before 1940. It includes a total of 589 rated tornadoes, 152 of which have a damage track. The project also yielded a more complete account of tornado deaths and injuries for the period.

Results

Most early events occurred in the more populated eastern prairies. Beginning in the 1880s, farms and communities grew along the railway lines being laid westward across the plains. By 1900, tornado reports in Saskatchewan and Alberta began to increase significantly (Figure 1).

Figure 1: Number of tornado reports per year in the three Prairie Provinces, 1826-1939. The dashed vertical line indicates a break in the annual sequence. There are no reports for 1939.

The 1826-1939 tornado occurrences lie mostly south of the tree line, spread across the prairie region (Figure 2). Tornado reports often trace out the railway lines. This artifact arises because tornado reports were associated with a nearby town, village, or rail station, even though the tornado may have been quite distant. These distributions are well known to Environment Canada's forecasters, though highways have replaced railways as the alignment axis. The broad climatological characteristics of Prairie tornadoes were evident from a very early stage.

Figure 2: The distribution of recorded tornadoes, 1826-1939, plotted over the modern railway network.

Figure 3 shows that the largest number of tornadoes occurred from mid-June to mid-July. Observations of late-evening and overnight tornadoes are limited by darkness, a bias we attempted to overcome by examining the synoptic situation and the reported damage tracks. The northern latitude of the region means that civil twilight ends at 10:30 pm and begins at 4:30 am in late June, leaving only a six-hour period of darkness at the height of summer. This window for observable tornadoes is among the longest in North America.

Figure 3: Number of tornadoes for each day of the year, 1826-1939.

It is the nature of the Prairie tornado record that there are significant stochastic variations in reported events from year to year, ranging from zero to 30 in our sample. This may or may not represent differences in climatological conditions in individual years, as we cannot know how complete our sample is. It is only recently, with the popularity of storm chasing, cell phone cameras, and social media, that annual tornado counts have begun to reflect the actual frequency of events closely enough to derive a useful climatology. Recently, the Northern Tornadoes Project (2020) has demonstrated that intensive damage surveying can reveal a more complete account of tornadoes and their tracks.

We estimated the strength of tornadoes according to a hybrid Fujita – Enhanced Fujita Scale, as described in our Part 1 article. The ability to come up with a definitive value is limited when information is sparse. Table 1 shows the distribution of our F-value estimates. The deficiency in F0 values is most likely due to their weak and transient nature, which makes them less likely to be observed, reported, or newsworthy. The data also suggests that deaths and injuries are more likely with stronger tornadoes.

Most early tornado reports contained only a single observation. Typically, only the more severe events included comments about the path location and length. Individual reports also tended to be limited by reference to a single nearby community. We endeavoured to be more precise by finding additional eyewitness accounts of each event that would allow a track to be established and by triangulating reports to obtain a more accurate position. From this improved dataset, we were able to estimate the tracks of 152 tornadoes in the pre-1940 era. In the case of the 1912 Regina tornado (Figure 4), we were able to find seven tornado tracks in the outbreak that day—a result that is comparable to the 1987 Edmonton tornado event.

One consequence of the research was the uncovering of additional events for inclusion into the severe weather database, including elements such as deaths due to lightning, major hail events, and unusual phenomena such as raining frogs and fish falling from the sky. In a few exceptional cases, a detailed assessment of extreme non-tornadic wind events, such as the 1922 Manitoba derecho, was constructed. These mesoscale convective systems often have embedded and difficult-to-detect tornadic circulations.

Ongoing work

As online digital databases continue to grow, we will likely see further adjustments to this early dataset. Moving forward, we are now examining the period from 1940 to 1979. The current database for this time frame contains over 900 possible tornado events. The goal is to have a comprehensive, reliable, accurate, and public dataset of tornadoes from the first reports on the Canadian Prairies to the present.


References

Kendrew, Wilfrid George, and Balfour W. Currie, 1955. The Climate of Central Canada: Manitoba, Saskatchewan, Alberta and the Districts of Mackenzie and Keewatin. E. Cloutier, Queen's Printer, Ottawa, 152.

Sills, D. M. L., Kopp, G. A., Elliott, L., Jaffe, A. L., Sutherland, L., Miller, C. S., Kunkel, J. M., Hong, E., Stevenson, S. A., & Wang, W. (2020). The Northern Tornadoes Project: Uncovering Canada’s True Tornado Climatology, Bulletin of the American Meteorological Society, 101(12), E2113-E2132. Retrieved Oct 7, 2022, from https://journals.ametsoc.org/view/journals/bams/101/12/BAMS-D-20-0012.1.xml


Patrick McCarthy is a retired Meteorological Service of Canada meteorologist and former Head of the Prairie and Arctic Storm Prediction Centre. He is an active weather history buff, storm chaser, and the Chair of the CMOS Winnipeg Centre.

Jay Anderson is a meteorologist, formerly with Environment Canada, where he worked primarily in Winnipeg and Vancouver over a 34-year career. Since his retirement, he has been working casually as a consultant, primarily in the travel industry, and teaching storm chasing at the University of Manitoba.

High-Resolution Ocean Modelling of British Columbia's Fjords


-By Krysten Rutherford and Laura Bianucci-

The nearshore and coastal ocean is an important region, acting as a buffer between the land and the open ocean. As a result, it experiences climate change impacts from the open ocean and also absorbs impacts from the land and land-use changes. Moreover, it is often the most utilized by humans for recreational, economic, and traditional activities, which can subsequently lead to impacts from, for example, oil spills and aquaculture operations. Scientists across the globe have therefore set out to better understand these regions. One tool at their disposal is high-resolution ocean modelling.

Dr. Laura Bianucci, who currently works for Fisheries and Oceans Canada at the Institute of Ocean Science (Sidney, BC), is one such researcher studying the coastal areas of British Columbia. She has always been interested in understanding coastal ocean processes ranging from continental shelf to nearshore scales, and has been working with related numerical ocean models since graduate school. Initially, her work focused on high-resolution regional models, but it has evolved over the years to focus more and more on questions regarding nearshore processes. She believes that the type of high-resolution model she employs helps us to better understand parts of the coastal ocean that are so important to us as a society and to the ecosystem.

Figure 1: Fish farms in Clayoquot Sound along the west coast of Vancouver Island (photo credit: Glenn Cooper).

As Dr. Bianucci points out, high-resolution models are not necessarily new, but increasing computer power has significantly expanded their capability in recent years. The term "high resolution" can mean many things when it comes to numerical ocean models, since there are many different types of applications for this scientific tool. Here, we are referring to a model's spatial resolution, which is analogous to pixels in a photograph: the resolution of a photograph (or model) depends on the size of the pixels (or model grid cells), with a higher-resolution photograph (model) having smaller pixels (grid cells) to capture more detail. Finding a model's optimal spatial resolution depends largely on balancing its computational expense against the scientific questions being asked and the processes being modelled. Large model domains and/or high-resolution models require more computing power and more storage; in other words, they are more computationally expensive. However, processes or regions cannot be accurately resolved if they occur on scales smaller than the model grid cell width (i.e., the resolution), which must be taken into consideration when developing models for specific applications.
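The steepness of that trade-off is worth spelling out. Assuming an explicit, CFL-limited 2-D model, halving the grid spacing quadruples the cell count and halves the stable time step, so cost grows roughly with the cube of the refinement factor; the sketch below works through the arithmetic.

```python
# Rough cost scaling for a fixed domain and simulation period, assuming an
# explicit, CFL-limited 2-D model; real models differ in the details.
def relative_cost(dx_m: float, ref_dx_m: float = 1000.0) -> float:
    """Cost relative to a reference grid spacing."""
    cells = (ref_dx_m / dx_m) ** 2  # 2-D cell count scaling
    steps = ref_dx_m / dx_m         # CFL: time step shrinks with dx
    return cells * steps


for dx in (1000.0, 100.0, 10.0):
    print(f"dx = {dx:6.0f} m -> ~{relative_cost(dx):,.0f}x the 1 km cost")
```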

For Dr. Bianucci, who studies the intricate nature of the many inlets and fjords of British Columbia's coastline, high resolution often means a model grid cell width of 10 to 100 m. These coastal areas are made up of narrow channels that often cannot be resolved by models with a resolution coarser than 1 km. For reference, some high-resolution regional ocean models have a resolution of 2-10 km, and global Earth System Models (ESMs) often have a resolution of ~100 km. Because her models have such high spatial resolution, they often need to cover a smaller area to limit the computational expense. As such, Dr. Bianucci develops model applications for specific inlet and fjord systems. Currently, she is working on four different high-resolution models, encompassing Queen Charlotte Strait, the Discovery Islands, the west coast of Vancouver Island, and Quatsino Sound (see Figure 2).

Figure 2: Four model domains: Discovery Islands, West Coast Vancouver Island, Quatsino Sound, and Queen Charlotte Strait. The inset shows a zoomed-in model grid for Queen Charlotte Strait as an example of the model resolution (credit: Krysten Rutherford).

The questions that Dr. Bianucci tries to tackle with her projects are multidisciplinary in nature. As such, she often brings together a diverse team of individuals. "Ocean science today is in deep need of interdisciplinary research and, most importantly, interdisciplinary teams. Siloed research is not the right approach when facing a multifaceted challenge like climate change," Dr. Bianucci comments. Given that the regions Dr. Bianucci studies are highly utilized by humans, she also tries to involve as many stakeholders as possible in her projects. She works closely with fisheries managers as well as with First Nation partners who are interested in finding out more about the resilience of their systems, which are changing very rapidly due to climate change. She keeps them up to date on the modelling side, but also integrates them into the implementation of monitoring plans for the various fjords she is studying. Many of these fjords lack observations because it is difficult to visit them frequently and to sample at high enough spatial resolution to constrain the system. The implementation of these monitoring plans, particularly when combined with high-resolution modelling, is therefore crucial to better understanding these systems.

Figure 3: CTD and Niskin deployment off the CME Anderson in Quatsino Sound (photo credit: Glenn Cooper).

Models in general are great tools that can be used to investigate the inner workings of the ocean without altering the real ocean, both at present day and under future conditions, Dr. Bianucci argues. They are powerful tools that help us interpret sparse observations, and they allow scientists to test mechanisms and hypotheses that can be hard to observe in the real ocean. Given the many uses of numerical ocean models, there are many different types of modelling systems available. Currently, Dr. Bianucci uses the Finite Volume Community Ocean Model (FVCOM). This type of model is unstructured, which means that the grid cell size can vary throughout the model domain. This feature is beneficial for the type of modelling that Dr. Bianucci is doing, since it allows her to represent different types of areas, which may need different resolutions, within the same model (e.g. continental shelf vs. fjords).

British Columbia's inlets are often home to many aquaculture operators. The development of these inlet-specific models can help operators and regulators in a variety of ways, such as by helping them plan best management practices and prepare for potential future changes. For example, Dr. Bianucci and her team have implemented particle tracking in her models, which can simulate the dispersal of pathogens or contaminants from aquaculture farms (DFO 2021). Her current work also focuses on hypoxia and deoxygenation, since this is what many fjords along the British Columbia coastline are experiencing. A couple of her recent collaborative studies have found long-term deoxygenation and warming trends in four BC fjords (Jackson et al. 2021) and have explored the seasonal occurrence of near-surface hypoxia in another inlet based on recent measurements (Rosen et al. 2022). She aims to use her models to improve understanding of the dynamics behind these observed low-oxygen events, as well as how these events and dynamics may change under future climate scenarios. Furthermore, by studying several inlets and fjords, Dr. Bianucci hopes to address the spatial diversity of coastal hypoxia in these geomorphologically complex regions. For example, it is not yet fully understood why a subset of the inlets along the west coast of Vancouver Island experience hypoxia while others do not. Dr. Bianucci believes that differences in bathymetry and sill locations, how the inlets are aligned with the prevailing winds, and the type of freshwater forcing reaching the fjords are likely some of the key drivers setting inlet biogeochemical properties. This work is still in progress, but Dr. Bianucci is very much looking forward to seeing the outcomes of her modelling work in this region.
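The idea behind particle tracking can be illustrated simply: virtual particles are advected step by step through the model's velocity field. The sketch below uses a synthetic rotating flow and forward-Euler steps; FVCOM's actual tracking on an unstructured grid is considerably more sophisticated.

```python
# Toy Lagrangian particle tracking in a prescribed velocity field; the
# solid-body rotation here is synthetic, not a model output.
import numpy as np


def velocity(p: np.ndarray) -> np.ndarray:
    """Synthetic solid-body rotation about the origin (m/s)."""
    return 1e-4 * np.array([-p[1], p[0]])


def track(p0: np.ndarray, dt: float, n_steps: int) -> np.ndarray:
    """Advect one particle with forward-Euler steps; returns the path."""
    path = [p0]
    for _ in range(n_steps):
        path.append(path[-1] + dt * velocity(path[-1]))
    return np.array(path)


path = track(np.array([1000.0, 0.0]), dt=60.0, n_steps=1440)  # one day
print(f"start {path[0]}, end {np.round(path[-1], 1)}")
```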

Over the coming years, Dr. Bianucci hopes to expand her models to include even more coastal areas. These models can be used to assess the impacts on and from any given aquaculture farm and to help establish best management practices. She also hopes to use these models to understand how extremes and climate change as a whole will affect the important nearshore regions of British Columbia in terms of hypoxia, acidification, and temperature. It will be incredibly important over the coming years to get a firm grasp on these stressors and how they will affect local fish stocks and aquaculture farms; high-resolution ocean modelling is a formidable tool for the task.

This article was written by Dr. Krysten Rutherford based on an interview with and input from Dr. Laura Bianucci and images from Glenn Cooper.

Krysten Rutherford (she/her) is a postdoctoral fellow at the Institute of Ocean Science in Sidney, BC, and completed her PhD in 2021 at Dalhousie University. She implements and develops high-resolution models to better understand present-day processes and the potential future impacts of climate change on coastal systems.

Laura Bianucci (she/her) is a research scientist at the Institute of Ocean Science in Sidney, BC. Before joining Fisheries and Oceans Canada in 2017, she was a scientist at Pacific Northwest National Laboratory (Seattle, WA, USA) and a postdoc at Dalhousie University. She holds a PhD from the University of Victoria (2010).

References

DFO. 2021. Hydrodynamic Connectivity between Marine Finfish Aquaculture Facilities in British Columbia: in support of an Area Base Management Approach. DFO Can. Sci. Advis. Sec. Sci. Resp. 2021/042.

Jackson, J. M., Bianucci, L., Hannah, C. G., Carmack, E. C., & Barrette, J. (2021). Deep waters in British Columbia mainland fjords show rapid warming and deoxygenation from 1951 to 2020. Geophysical Research Letters, 48, e2020GL091094, https://doi.org/10.1029/2020GL091094.

Rosen S, Bianucci L, Jackson JM, Hare A, Greengrove C, Monks R, Bartlett M and Dick J (2022). Seasonal near-surface hypoxia in a temperate fjord in Clayoquot Sound, British Columbia. Front. Mar. Sci., 9:1000041, doi: 10.3389/fmars.2022.1000041.

Retrospective: Robie W. Macdonald, OC, FRSC (1947-2022)

-By Sophia C. Johannessen and Jules M. Blais-

This retrospective was first published in FACETS on September 22, 2022.

Robie (Rob) Macdonald studied large-scale ocean systems, including the global carbon cycle, climate change, and contaminants, in the Arctic and Pacific oceans. He was recognized internationally as one of the country's foremost experts on the behaviour of contaminants in the marine environment, and he brought new ideas and a broad perspective to every project.

Continue reading

Future Ocean Changes: What Can Ecosystem Models Teach Us?

– By Andrea Bryndum-Buchholz –

Climate change is affecting every aspect of life around the world. The oceans are warming and acidifying, triggering a cascade of consequences for marine life: increased mortality, reduced calcification, and shifting species distributions are just a few of the major changes already observed.

Continue reading
