

A hand-held photogrammetric approach to riparian and stream assessment and monitoring

Joseph Dehnert

Over 37,000 riparian restoration projects have been reported in the United States since 1980, but only 38% of those projects are currently being monitored, and 70% of those being monitored reported that the restoration actions were not accomplishing their intended purposes. Because riparian zones serve as the connection between terrestrial and flowing freshwater ecosystems, it is important to identify and quantify their structural and functional roles in natural systems and to monitor the restoration efforts being implemented. A rigorous understanding of the connection between riparian, geomorphic, and hydraulic processes provides a sound ecological foundation for identifying riparian management objectives and evaluating current and future land-use practices. Vegetative cover in riparian zones is known to influence stream ecosystem function by stabilizing stream banks, providing habitat and food for aquatic and terrestrial biota, and improving water quality. However, quantifying these influences demands a high cost in financing, human resources, and time.

Two of the biggest weaknesses in stream restoration and monitoring are: 1) subjective estimation, and subsequent comparison, of changes in channel form, vegetative cover, and in-stream habitat; and 2) the high costs in financing, human resources, and time necessary to make these estimates. Hand-held Structure from Motion/Multi-View Stereo (SfM/MVS) photogrammetry can address both problems by offering a resource-efficient approach for producing hyperspatial 3D models of a variety of environments. SfM/MVS photogrammetry is the result of cutting-edge advances in computer vision algorithms and discipline-specific research in the geosciences. By expanding the application of hand-held photogrammetric technology to stream assessment and restoration monitoring projects, it should be possible to increase and improve data collection in terms of both accuracy and efficiency. 3D models produced with hand-held cameras at close range may help to further standardize the way stream attributes are measured and quantified.

Stream mapping with cameras has traditionally been done from above via an Unmanned Aerial Vehicle (UAV), and well-established methodologies and workflows exist for UAV studies. However, there is a gap in the literature on implementing similar workflows from an oblique or on-the-ground perspective, especially with consumer-grade sensors such as digital cameras and cell phones. Therefore, the primary goal of this research is to develop a standardized approach to stream assessment and monitoring using 3D visualizations created from consumer-grade, hand-held cameras. Ideally, this approach will allow more frequent surveying and produce 3D models capable of detecting changes in bank structure, instream habitat, and vegetation within heavily vegetated riparian and stream environments. This presentation will include findings on camera selection, software workflows, and the measurements that can reliably be derived from the 3D visualizations of the stream.
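One quantity that ties camera selection to close-range model detail is the ground sample distance (GSD). The sketch below uses the standard pinhole-camera approximation, not the authors' workflow, and all parameter values are hypothetical:

```python
def ground_sample_distance(sensor_width_mm, image_width_px,
                           focal_length_mm, distance_mm):
    """Approximate size of one pixel on the target (mm/pixel),
    from the standard pinhole-camera relation."""
    return (sensor_width_mm * distance_mm) / (focal_length_mm * image_width_px)

# Hypothetical example: full-frame camera (36 mm sensor, 6000 px wide image),
# 24 mm lens, photographing a stream bank from 2 m away.
gsd = ground_sample_distance(36.0, 6000, 24.0, 2000.0)
print(f"{gsd:.2f} mm/pixel")  # prints 0.50 mm/pixel
```

A finer (smaller) GSD sets an upper bound on the detail a hand-held SfM/MVS model can resolve, which is one reason close-range imaging can outperform UAV flights for bank and habitat features.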

A wolf in fox’s clothing? Using stable isotopes to quantify ecological replacement

Tyler James Clark

During the Anthropocene, human-induced extinctions, especially of large predators and herbivores, have resulted in catastrophic ecosystem collapses. Recently, conservation tools such as rewilding (e.g., re-introducing a species to its native habitat) have been used to mitigate the effects of extinction. Another increasingly popular conservation tool, ecological replacement, proposes to introduce functionally similar, non-native species to fill the ecological roles of extinct, native species, thereby restoring ecosystems towards their pre-Anthropocene state. For example, the non-native Aldabra giant tortoise was introduced to the Mauritian island of Ile aux Aigrettes to replace the ecological role of extinct Cylindraspis giant tortoises. However, the use of these tools is controversial, not least because it is difficult to predict the effects of species introduction. Furthermore, inferring the ecological functions of species that were driven to extinction hundreds to thousands of years ago is a major obstacle.

Many species, including predators, impact ecosystems through their diet, so one way to test ecological replacement is to compare how the diets of non-native potential ecological replacements overlap with those of their extinct counterparts. Stable isotopes are forms of an element with the same number of protons but different numbers of neutrons, and they are integrated into tissues (e.g., bones, feathers, hair) from the food an animal consumes. Because stable isotope ratios in the tissues of predators reliably reflect what those predators are eating, isotopic diet overlap between modern and historical tissue samples can test the effectiveness of ecological replacement as a conservation tool. In the food available to predators, ratios of carbon-13 increase from terrestrial to marine food sources, and ratios of nitrogen-15 increase up the food chain.
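One illustrative way to quantify such overlap (an assumption for demonstration, not necessarily the metric used in this work) is the fraction of one group's (δ13C, δ15N) values that fall inside the other group's 95% covariance ellipse:

```python
import numpy as np

def frac_inside_ellipse(points_a, points_b, chi2_95=5.991):
    """Fraction of group A's (d13C, d15N) points lying inside group B's
    95% covariance ellipse (Mahalanobis distance test; 5.991 is the
    chi-square 95% quantile for 2 degrees of freedom)."""
    mu = points_b.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(points_b, rowvar=False))
    d = points_a - mu
    m2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)  # squared Mahalanobis distances
    return float(np.mean(m2 <= chi2_95))

# Hypothetical values standing in for fox (A) and wolf (B) hair samples
rng = np.random.default_rng(0)
wolves = rng.normal([-18.0, 12.0], [1.0, 0.8], size=(9, 2))
foxes = rng.normal([-18.2, 11.8], [1.0, 0.8], size=(32, 2))
print(frac_inside_ellipse(foxes, wolves))
```

A value near 1 would indicate that the modern samples sit almost entirely within the historical isotopic niche, as reported for the fox and wolf samples in this study.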

To exemplify this approach with stable isotopes, we tested whether non-native South American grey foxes (SAGF) act as de facto ecological replacements for extinct Falkland Islands wolves (FIW). Prior to its extirpation by humans, the FIW was the only native terrestrial predator and land mammal inhabiting the Falkland Islands, South Atlantic. SAGFs were introduced to the Falklands in the 1920s for fur farming but rapidly became feral. Due to their impact on native birds and farmed sheep, they have been eradicated from three islands in the Falklands, and complete eradication has been suggested. However, if SAGFs act functionally like FIWs, the Falklands ecosystem could be closer to its pre-Anthropocene state with them than without.

We therefore measured stable isotopes in 32 hair samples collected from SAGFs on Weddell Island, Falklands, and in hair samples of FIWs from 8 of the 9 known specimens held in museums throughout the world. We found that the stable isotope values of modern SAGFs almost completely overlap those of extinct FIWs. Our results indicate that SAGFs could act as an unintended ecological replacement for FIWs. Given the burgeoning interest in ecological replacement as a conservation tool, our technique is one way in which this method can be assessed objectively and quantitatively. Without such techniques, it is unclear whether these conservation tools can adequately restore degraded ecosystems to their pre-Anthropocene state.

Aphasia VR

Kristina D. Mahagamage, MFA Graduate Candidate, Media Arts, University of Montana, Missoula

Aphasia Intervention using Virtual Reality Technology

Co-Authors/Editors:

Michael Musick, Media Arts, University of Montana

Jenna Griffin, Speech, Language, Hearing and Occupational Sciences, University of Montana


Virtual Reality technology can help healthcare clinicians work with persons with Aphasia in therapeutic exercises that simulate real-world experiences. “Aphasia is an impairment of language, affecting the production or comprehension of speech and the ability to read or write” (“Aphasia Definitions - National Aphasia Association”). Virtual Reality (VR) is a groundbreaking technology that allows people to interact with computer-generated (CG) simulations in real-time. In the study Virtual Reality for Stroke Rehabilitation, Kate Laver and colleagues compare the results and methodology of using VR to improve upper limb activity in post-stroke patients (Laver). By examining similar VR methods, the intention is to design and develop an immersive experience that simulates real-world challenges that persons with Aphasia may face, under the guidance of a speech therapist or clinician.


By taking current therapeutic methods for Aphasia intervention and applying them in a simulated immersive experience, users/patients can practice communication skills and address deficits in a simulated environment. VR provides a safe space for users/patients to practice real-world communication skills.


The development of this experience is both a technical and a design challenge. It requires an understanding of who the user base is and how to cater to that user. Interactivity is key. Developing an interactive therapy experience in Unreal Engine 4 (UE4), a game engine, allows for the design of experience functionality while also creating stunning, high-resolution environments. Multi-user functionality is also enabled so that clinicians may be present in the simulation.

In this project, users/patients will experience a coffee shop environment. The user/patient is seated at a table with a menu in front of them. The menu is simple, with a single word. A CG character or clinician then interacts with the user/patient. When the word is said correctly, the patient receives what they asked for. These exercises are then followed up with different words to further the experience. The goal is for the user/patient to feel confident through the interaction and improve their communication skills.


Virtual technologies are currently used in Aphasia therapy for stroke-related communication disabilities. The most prominent example is EVA Park, designed and developed at the University of London. EVA Park is a compelling experience for persons with Aphasia. It has an astoundingly positive effect on people, with a "high-rating of enjoyment" (“EVA – Evaluating the Effects of a Virtual Communication Environment for People with Aphasia”).

While EVA Park uses virtual worlds accessed through standard computer interfaces (i.e. a screen, keyboard, and mouse), this experience's originality will focus on practice and training using a head-mounted display (HMD), such as an Oculus or Vive, for a completely immersive experience. The objective is to help users/patients build confidence in speaking and reading.


Virtual Reality technologies are already being applied in medical intervention and rehabilitation, and their potential could lead to medical breakthroughs. This experience presents a simulated real-world situation for users/patients by building on the virtual therapy methods used today.

Creating a virtual reality simulation for intervention can improve confidence and communication skills for users/patients with Aphasia. The Aphasia VR project can supply different methods of intervention by providing a practical option for isolated people. It can increase confidence and improve quality of life. Clinicians also have the opportunity to observe and understand the conflicts and challenges patients may face in a real-world environment, and to adjust therapy if needed.

Works Cited

“Aphasia Definitions - National Aphasia Association.” National Aphasia Association, Accessed 21 Jan. 2021.

Laver, Kate, et al. “Virtual Reality for Stroke Rehabilitation.” Stroke, 15 Dec. 2011.

“EVA – Evaluating the Effects of a Virtual Communication Environment for People with Aphasia.” EVA, Accessed 21 Jan. 2021.

Blocking a specific ion channel reduces inflammation in lung cells

Becky Kendall, The University of Montana

Silicosis caused by exposure to silica dust is a serious occupational health threat in the United States and worldwide. For example, artificial stone workers are exposed to dust with a higher percentage of silica than natural stone, creating a more significant exposure in a shorter period of time and placing these workers at elevated risk of developing silicosis. Silicosis is an incurable lung disease marked by chronic inflammation. When silica dust is inhaled, the cells responsible for clearing it out of the lungs, macrophages, become damaged and initiate a cycle of chronic inflammation that results in lung and systemic diseases. Because the mechanisms leading to inflammation from silica are poorly understood, effective treatments for silicosis are not available. It is known that as a silica particle is taken up by a macrophage, it moves into a degradative organelle (the lysosome), where it causes damage that results in leakage of lysosomal contents into the rest of the cell, initiating a cycle of cell death and chronic inflammation. Therefore, determining the mechanisms of this leakage is crucial to understanding and preventing chronic inflammation. I propose that this early step in the initiation of inflammation can be prevented by blocking specific ion channels (in this case, a K+ channel) within the organelle. To test this hypothesis, macrophages were plated in 96-well plates and pre-treated with the K+ ion channel blocker paxilline 30 minutes prior to 24-hr exposures to silica (50 µg/mL). Cell death and inflammatory activity were assessed. For comparison, macrophages were also pre-treated with an alternate K+ ion channel blocker, iberiotoxin, which only blocks ion channels at the cell surface. Lactate dehydrogenase (LDH) is released when a cell membrane is damaged and was measured as an indicator of cell death in these experiments.
Interleukin 1 beta (IL-1β) is a cell signaling protein that sends inflammatory signals to immune cells and was measured as an indicator of activation of the inflammation pathway in macrophages. Pre-treatment of the cells with paxilline before silica exposure resulted in reduced cell death and significantly reduced inflammation in two types of macrophages. In contrast, iberiotoxin did not reduce cell death or inflammation. In conclusion, blocking a specific K+ ion channel decreases silica-caused cell death and inflammation in macrophages, offering valuable insight into the mechanisms of inflammation for further development of silicosis therapeutics and prevention.
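The abstract reports LDH release as the cell-death readout but does not give the normalization used; a common plate-reader calculation (the absorbance values below are hypothetical) expresses cytotoxicity relative to untreated (spontaneous) and fully lysed (maximum) controls:

```python
def percent_cytotoxicity(experimental, spontaneous, maximum):
    """Standard LDH-release normalization: percent of maximal
    (detergent-lysed) LDH release, after subtracting the
    spontaneous release of untreated cells."""
    return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

# Hypothetical absorbance readings from a 96-well plate
print(percent_cytotoxicity(0.55, 0.10, 1.00))  # prints 50.0
```

Normalizing against both controls lets treatments such as paxilline and iberiotoxin be compared across plates and macrophage types.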

Controls on the magnitude of bedrock recharge in two paired, snow-dominated watersheds

Kimberly Bolhuis, University of Montana, Missoula


Mountains in arid and semi-arid regions receive a disproportionately large amount of precipitation compared to their bounding valley aquifers due to orographic effects, but little is known about how precipitation is partitioned and transported within the mountain block. Recent research has illuminated dynamic interactions between soil and bedrock reservoirs, showing that bedrock permeability is potentially a major control on the volume of bedrock groundwater recharge. This study focuses on i) evaluating differences in bedrock effective porosity, and ii) estimating how recharge magnitude is influenced by bedrock effective porosity. Two paired watersheds underlain by different bedrock lithologies were studied in the Lubrecht Experimental Forest near Greenough, MT. One watershed is underlain by 1.4 Ga argillite, while the other is dominated by 65 Ma granite. Bedrock well hydrographs were used to estimate bedrock recharge at the hillslope scale from snowmelt and storm events. Bedrock permeability and effective porosity were estimated and compared using Mean Residence Time (MRT) estimates, core sample analysis, and outcrop fracture mapping. Annual bedrock recharge ranged between 400 and 850 mm in the argillaceous catchment and between 15 and 50 mm in the granite catchment. Mean residence time is approximately 4.74 years for the argillite watershed and 2.12 years for the granite watershed. Mean effective porosity is 1.11% for the watershed underlain by argillite and 0.45% for the granitic catchment. Bedrock well hydrographs in the granitic catchment exhibit a flashy response to input and slow recession, while hydrographs in the argillite catchment show rapid response and recession. The difference in recharge magnitudes, inferred storage, bedrock permeability, and well recession between the catchments implies that bedrock strongly influences each catchment’s subsurface flow dynamics.
This research expands current knowledge of the magnitude and controls on recharge to the bedrock reservoir in mountainous terrain, demonstrating the importance of understanding the role that bedrock plays in partitioning and transmitting flow through mountain blocks.
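The abstract does not state how recharge was computed from the well hydrographs; one widely used approach consistent with the description is the water-table fluctuation method, R = Sy × Δh, summing head rises scaled by specific yield. The sketch below treats effective porosity as a stand-in for specific yield, and all numbers are hypothetical:

```python
def wtf_recharge_mm(heads_m, specific_yield):
    """Water-table fluctuation estimate of recharge (mm):
    sum the positive head rises (m) and scale by specific yield."""
    rises = [b - a for a, b in zip(heads_m, heads_m[1:]) if b > a]
    return 1000.0 * specific_yield * sum(rises)

# Hypothetical daily water levels (m) in a bedrock well during snowmelt
heads = [10.00, 10.40, 10.90, 10.70, 11.20, 11.00]
print(wtf_recharge_mm(heads, 0.011))  # argillite-like effective porosity of 1.1%
```

Under this method, the roughly 2.5-fold difference in effective porosity between the catchments directly scales any given head rise into a different recharge volume, which is one reason porosity contrasts matter for the recharge magnitudes reported above.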

Cough Desensitization Treatment: Randomized Controlled Trial Update

Jane Reynolds

Cough Desensitization Treatment: Updates from a randomized controlled trial


Chronic cough is estimated to impact approximately 11% of Americans1 and is one of the leading reasons that patients seek medical evaluation2. For approximately 20% of patients, the typical treatments for medical causes of chronic cough fail, and they are said to have refractory chronic cough (RCC)3. RCC has negative implications for a patient’s health-related quality of life4. Cough Desensitization Treatment (CDT) is a novel intervention for RCC that combines behavioral cough suppression therapy with systematic desensitization of the cough reflex using controlled amounts of capsaicin (a known cough stimulant). Fourteen patients with RCC participated in a single-blind, randomized, placebo-controlled trial evaluating the effectiveness of CDT. Participants in the treatment group completed six sessions in which they were coached to suppress the “urge-to-cough” elicited by incremental doses of nebulized capsaicin vapor. The placebo group inhaled subthreshold doses of capsaicin vapor that did not elicit an urge-to-cough sensation and practiced suppression techniques. Change in health-related quality of life was assessed with the Leicester Cough Questionnaire (LCQ)5 at one-week and three-weeks post-treatment. Additional outcomes included urge-to-cough (UTC) testing (perceived UTC and cough frequency when presented with certain stimuli and tasks that commonly cause cough in RCC) and 24-hour cough frequency monitoring. At one-week post-treatment, 6/8 treatment participants and 2/6 control participants achieved a clinically meaningful improvement in LCQ scores. This change was maintained at the three-week follow-up. A mixed effects linear regression model revealed very weak evidence of a difference in LCQ over time between the two groups (F(2,24) = 1.58, p-value = 0.23). However, the differences were in the direction of higher mean LCQ in the treatment group. Follow-up contrasts were conducted to estimate the change vs. baseline in each group.
There was strong evidence of change in the treatment group at one-week post (t(24) = 4.24, p-value < .001) and three-weeks post (t(24) = 4.60, p-value < .001), with estimated mean increases of 2.95 and 3.20 LCQ points higher compared to baseline, respectively. There was very weak evidence of change in the control group at one-week post (t(24) = 1.47, p-value = .310) but moderate evidence of a change at three-weeks post (t(24) = 2.17, p-value = .08), with increases of 1.18 and 1.75 points vs. baseline, respectively. There was strong evidence of a difference over time between the treatment and control groups in total coughs produced during UTC testing (Chi-square(2)=18.0, p-value = 0.0001). Follow-up contrasts were conducted to estimate the change vs. baseline in each group. There was strong evidence of a change in mean cough counts during UTC testing for both groups at one-week post and three-weeks post (p-value
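The change-versus-baseline contrasts above come from the fitted mixed-effects model; as an illustrative simplification (not the authors' analysis code), the analogous single-group test reduces to a one-sample t statistic on change scores:

```python
import math
from statistics import mean, stdev

def change_t(change_scores):
    """One-sample t statistic testing whether the mean change
    from baseline differs from zero."""
    n = len(change_scores)
    se = stdev(change_scores) / math.sqrt(n)  # stdev is the sample SD
    return mean(change_scores) / se

# Hypothetical LCQ change scores for a small treatment group
print(round(change_t([1.0, 2.0, 3.0]), 3))  # prints 3.464
```

The mixed model used in the study additionally pools variance across groups and time points, which is why its contrasts share a common error degrees of freedom (24) across comparisons.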

Keywords: chronic cough; cough hypersensitivity; cough reflex

1. Song, W., Chang, Y., Faruqi, S., Kim, J., Kang, M., Kim, S., Jo, E., Kim, M., Plevkova, J., Park, H., Cho, S., Morice, A. (2015). The global epidemiology of chronic cough in adults: a systematic review and meta-analysis. European Respiratory Journal, 45(5), 1479-1481. doi: 10.1183/09031936.00218714

2. Centers for Disease Control and Prevention. (2017). National Hospital Ambulatory Medical Care Survey.

3. Haque, R., Usmani, O., & Barnes, P. (2005). Chronic idiopathic cough: A discrete clinical entity? Chest, 127(5), 1710–1713. doi: 10.1378/chest.127.5.1710

4. French, C., Irwin, R., Curley, F., & Krikorian, C. (1998). Impact of chronic cough on quality of life. Archives of Internal Medicine, 158, 1657-1661. doi: 10.1001/archinte.158.15.1657

5. Birring, S., Prudon, B., Carr, A., Singh, S., Morgan, M., & Pavord, I. (2003). Development of a symptom specific health status measure for patients with chronic cough: Leicester Cough Questionnaire (LCQ). Thorax 58(4), 339-343. doi: 10.1136/thorax.58.4.339

Developing potential land surface temperature as an indicator of forest regeneration potential in the western United States

Robin Rank, University of Montana, Missoula

Forests face accelerating threats due to increases in the severity and frequency of drought and heat stress associated with climate change. In particular, as rates of disturbance (e.g., wildfire) increase, and the climatological and hydrological suitability of landscapes for forest regeneration diminishes, vegetation shifts from forest to non-forest are expected to become more frequent and widespread. Most efforts at predicting these shifts have used correlative approaches, which lack the ability to generalize to novel temporal or spatial conditions. To reduce uncertainty when predicting forest regeneration, researchers need mechanistic models and predictors of seedling mortality based on physiological processes and hydraulic function. Here, we introduce a novel environmental metric: potential land surface temperature (pLST), an estimate of land surface temperature in the hypothetical absence of overstory vegetation. Land surface temperature (LST) is a radiometric measure of the energy balance at the Earth’s surface: it is governed by net radiation and soil moisture, has strong effects on seedling physiology, and can be remotely sensed. Therefore, it is a key variable in many studies of how vegetation, hydrology, and climate interact. To develop pLST as a metric of forest regeneration suitability, we produce pLST estimates using a spatially explicit, physics-based ecohydrologic model and evaluate the skill of these estimates at predicting forest cover in watersheds distributed across the western U.S. We find that forest cover predictions based on modeled pLST can accurately reproduce satellite-measured patterns of forest cover throughout the western U.S., although the accuracy of these predictions is lower in the Southwest. This work leverages advances in ecohydrologic modeling and remote sensing, along with an easily observable and intuitive climate metric, to produce information about forest regeneration suitability that can be used by forest managers.

Effect of home-based Tai Chi, Yoga or conventional balance exercise on functional balance and mobility among persons with idiopathic Parkinson's disease: An experimental study

Arva Mufaddal Gadly
Brammatha Pandian
Dr. Arul Selvan

Background and Purpose: Individuals with Parkinson’s Disease (PD) invariably experience functional decline in a number of motor and non-motor domains affecting posture, balance, and gait. Numerous clinical studies have examined the effects of various types of exercise on motor and non-motor problems, but gaps remain in our understanding of these therapies and their effects on delaying or slowing dopamine neuron degeneration. Recently, Tai Chi and Yoga have both gained popularity as complementary therapies, since both have components for mind and body control. The aim of this study was to determine whether 8 weeks of home-based Tai Chi or Yoga was more effective than regular balance exercise for functional balance and mobility.

Methods: Twenty-seven individuals with Idiopathic Parkinson’s disease (Modified Hoehn and Yahr stage 2.5-3) were randomly assigned to either a Tai Chi, Yoga, or conventional exercise group. All participants were evaluated for functional balance and mobility using the Berg Balance Scale, the Timed Ten-Meter Walk test, and the Timed Up and Go test before and after eight weeks of training.

Results: The results were analyzed using a two-way mixed ANOVA, which showed a significant main effect of time, F(1,24) = 74.18, p < 0.001, ηp² = 0.76, for overall balance on the Berg Balance Scale. There was also a significant main effect of time on mobility overall, F(1,24) = 77.78, p < 0.001, ηp² = 0.76, for the Timed Up and Go test and F(1,24) = 48.24, p < 0.001, ηp² = 0.67, for the Ten-Meter Walk test. There was a significant time × group interaction for balance, F(2,24) = 8.67, p = 0.001, ηp² = 0.420. With respect to mobility, both the Timed Up and Go test, F(2,24) = 5.92, p = 0.008, ηp² = 0.330, and the Ten-Meter Walk test, F(2,24) = 10.40, p = 0.001, ηp² = 0.464, showed significant interactions. However, there was no significant main effect of group for either balance or mobility.
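The reported effect sizes are consistent with partial eta squared recovered from each F statistic and its degrees of freedom, ηp² = F·df1 / (F·df1 + df2); a quick check (illustrative, not the authors' analysis script):

```python
def partial_eta_squared(F, df_effect, df_error):
    """Partial eta squared recovered from an F statistic
    and its effect/error degrees of freedom."""
    return (F * df_effect) / (F * df_effect + df_error)

# Reproduce the reported effect sizes from the reported F values
print(round(partial_eta_squared(74.18, 1, 24), 2))  # prints 0.76 (balance, time)
print(round(partial_eta_squared(8.67, 2, 24), 2))   # prints 0.42 (balance, time x group)
```

Every F and ηp² pair in the paragraph above satisfies this identity, which is a useful sanity check when reporting or reviewing ANOVA results.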

Conclusion: The findings of this study suggest that Tai Chi and Yoga are well adhered to and are attractive options for a home-based setting. As any form of physical activity is considered beneficial for individuals with PD, Tai Chi, Yoga, or conventional balance exercises could be used as therapeutic interventions to optimize balance and mobility. Further studies are needed to understand the mind-body benefits of Tai Chi and Yoga, either as multicomponent physical activity or as individual therapies, in various stages of Parkinson’s Disease.

Forest resilience to fire in the western US: a test case for using satellite metrics to assess spatial and temporal patterns of forest recovery

Marie Johnson

Fire and other climate-driven disturbances are shaping the future of our forests. Given the vital ecosystem services provided by forests, it is crucial that we manage forests for their continued existence under changing climate conditions. This requires the immediate investigation of forest resilience, the ability of an ecosystem to withstand a disturbance by regaining its function, structure, and composition before a threshold is crossed and the ecosystem transitions to a new state. It has been widely recognized in ecology that resilience should be the primary goal of forest restoration and management; however, few studies have been designed to comprehensively examine forest resilience to wildfire across broad ecoregions. The objective of this research is to gain a comprehensive understanding of forest resilience at different spatial scales across the western US. To achieve this goal, we have developed a way to combine multiple metrics of ecosystem recovery into a forest ecosystem resilience framework. We have begun our work by characterizing the recovery of ecosystem functioning of an 11,400-acre forest in northwestern Montana, 15 years postfire. We will continue this work by using several different remote sensing platforms to examine the functional, structural, and compositional recovery of forest ecosystems following wildfire. This methodology will be expanded to forests across the western US to help identify predictors of resilience and inform management practices.

Natal Stream Characteristics Associated with Migratory Westslope Cutthroat Trout

Troy W. Smith


Westslope Cutthroat Trout, hereafter cutthroat, are a native trout species of conservation concern in Montana that have historically exhibited a diversity of life histories within a population. This includes migratory individuals that move from larger rivers to their natal streams to spawn, as well as resident individuals that remain in natal streams for their lifetime. Due to habitat degradation and interbreeding (hybridizing) with nonnative Rainbow Trout, cutthroat are becoming rare in large river systems and increasingly occur as isolated resident populations in headwater systems. This loss of migratory individuals can lead to extirpation of these isolated, local populations. Rock Creek in western Montana is of conservation and ecological interest because it has retained a population of migratory, non-hybridized cutthroat. Understanding the characteristics of streams that promote migratory individuals and the persistence of the migratory life history within a population would help prioritize conservation of habitat that maintains life history diversity.

Based on ecological theory, we hypothesized that factors such as stream habitat quality (temperature, flow, pool depth) and competition (fish density) would be associated with migratory behavior. From 2018 to 2020, we implanted 73 radio telemetry tags and tracked cutthroat movements to determine their migratory life history and where they spawned. We collected habitat and fish population data on 37 tributary streams within the Rock Creek drainage. We used a generalized linear model to evaluate the relationship between the number of migratory individuals and certain tributary characteristics. Our results show that natal stream size is related to the abundance of migratory individuals. This study is unique in that we were able to capture and track a large number of individuals over a large landscape, which is uncommon for these types of studies. By understanding the drivers behind migratory cutthroat populations, this study will directly help managers make crucial conservation decisions about protection of watersheds and guide decisions about reconnecting habitats to provide migratory pathways.
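A generalized linear model for counts of migratory individuals versus stream characteristics is typically a Poisson regression with a log link (the exact specification used in the study is not stated); a minimal sketch fitted by iteratively reweighted least squares on synthetic data, not the study's data:

```python
import numpy as np

def fit_poisson_glm(X, y, n_iter=30):
    """Poisson regression with a log link, fitted by
    iteratively reweighted least squares (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu        # working response
        XtW = X.T * mu                 # Poisson variance equals the mean
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

# Synthetic example: migratory counts increase with log stream size
rng = np.random.default_rng(1)
log_size = np.log(rng.uniform(1.0, 10.0, size=200))
X = np.column_stack([np.ones(200), log_size])
y = rng.poisson(np.exp(0.5 + 1.0 * log_size))
intercept, slope = fit_poisson_glm(X, y)
print(round(slope, 2))  # should recover a slope near the true value of 1.0
```

A positive fitted slope on stream size corresponds to the study's finding that larger natal streams support more migratory individuals.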

Physical Activity, Injury, and Burnout in Collegiate Esports Players

Hailee M. Eckman, University of Montana, Missoula

Purpose: Esports is the competitive application of various video games, where teams or individuals compete against one another with varying levels of skill. Esports play can emphasize professional or personal development in the individual player and is gaining significant popularity around the world, with an estimated audience of 395 million in 2018 (Nagorsky & Wiemeyer, 2020). With this increase in popularity, collegiate teams have been established to cultivate community and develop player skills in a parallel fashion to collegiate athletic teams. These esports players develop skills by training many hours a week in a similar manner to their traditional athletic counterparts, including structured drills and competition scrimmages. Esports athletes also aspire to compete at the professional level and be financially compensated for their playing skills. In an attempt to elevate player skill, there has been a greater emphasis on physical and mental training in professional esports players (DiFrancisco-Donoghue, et al., 2019; Polous, et al., 2020). The combination of increased training load and greater emphasis on competing at a high level can lead to overuse injury and burnout. The purpose of this study is to understand physical activity and the potential effects it has on esports-related injuries, as well as the effects and prevalence of burnout in collegiate esports players.

Methods: All participants were active members of the University of Montana Esports Gaming Team and completed an anonymous online survey. The survey included five demographic questions, eight esports training questions to better understand training load and frequency, a 15-question battery to understand burnout symptoms, six questions to describe physical activity habits and preferences, and eight questions to understand if an injury related to esports training had ever occurred. Questions were compiled from previous surveys exploring the physical activity habits, training loads, and injuries in professional esports. The burnout questions were adapted from Raedeke and Smith’s Athlete Burnout Questionnaire (2001). Data collection is ongoing, with first responses completed in December 2020. Responses are compiled and quantified by descriptive statistics. Secondary analyses include identifying correlations between metrics of training, physical activity, injury, and burnout.

Originality: While there has been some research focused on the physical activity patterns, potential injuries, and burnout in professional esports players, there is minimal literature available concerning collegiate esports players. These findings can facilitate some improved understanding of esports athletes; however, there is greater value in understanding the consequences of collegiate esports training on the bodies and minds of athletes trying to improve their performance.

Significance: As athletic trainers diversify to caring for non-traditional athletes, the findings of the present study may reveal the need to include esports athletes in that care. Regardless of who may treat esports athletes, the present findings are a valuable first step toward better understanding the physical and emotional toll of deliberate training in esports. Future research will focus on better understanding how specific traits of esports athletes, such as wanting to pursue a professional career, affect injury and burnout rates, and how regular physical activity may mitigate these negative consequences of intentional training.

Quantifying Subsurface Parameter Uncertainties with Surrogate Modeling and Environmental Tracers

Nicholas E. Thiros, University of Montana, Missoula
Payton Gardner, University of Montana, Missoula

Identifying the uncertainty in predictions made by groundwater flow and transport numerical models is critical for effective water resource management and contaminated site remediation. In this work, we combine physics-based groundwater reactive transport modeling with data-driven machine learning techniques to quantify hydrogeologic model uncertainties for a site in Wyoming, USA. We train a deep artificial neural network (ANN) on a training dataset that consists of groundwater hydraulic head and environmental tracer concentration (3H, SF6, and CFC-12) fields generated using a high-fidelity groundwater reactive transport model. Inputs of the training dataset and reactive transport model include variable and uncertain hydrogeologic properties, recharge rates, and stream boundary conditions. Using the trained ANN as a surrogate to reproduce the input-output response of the reactive transport model, we quantify the full posterior distributions of the hydrogeologic parameters and hydraulic forcing conditions using Markov chain Monte Carlo (MCMC) calibration and field observations of groundwater hydraulic heads and environmental tracers. The coupling of the physics-based reactive transport model with the machine learning surrogate model allows us to efficiently quantify model uncertainties, which is typically computationally intractable using reactive transport models alone. This technique can help improve hydrogeologists' ability to assimilate field observations into subsurface numerical model calibration procedures and improve prediction uncertainty quantification.
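
The surrogate-plus-MCMC workflow described above can be sketched in miniature. The following is a toy illustration only: a polynomial fit stands in for the deep ANN surrogate, a one-parameter analytic function stands in for the reactive transport model, and every name, value, and prior here is hypothetical rather than taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the expensive "high-fidelity model": maps one
# hydrogeologic parameter (e.g., log-K) to a predicted observation.
def forward_model(log_k):
    return 2.0 * log_k + 0.5 * log_k**2

# Step 1: build a training set by running the expensive model.
train_x = np.linspace(-2, 2, 50)
train_y = forward_model(train_x)

# Step 2: fit a cheap surrogate (a polynomial here, in place of the ANN).
surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=3))

# Step 3: Metropolis MCMC against a noisy synthetic "field observation",
# evaluating only the cheap surrogate inside the sampler.
obs, obs_sigma = forward_model(1.0) + 0.1, 0.3

def log_post(theta):
    if not -2 <= theta <= 2:          # uniform prior bounds
        return -np.inf
    resid = (obs - surrogate(theta)) / obs_sigma
    return -0.5 * resid**2            # Gaussian likelihood, flat prior

theta, lp = 0.0, log_post(0.0)
samples = []
for _ in range(5000):
    prop = theta + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop     # accept the proposal
    samples.append(theta)

posterior = np.array(samples[1000:])  # discard burn-in
print(posterior.mean(), posterior.std())
```

The sampler concentrates around the parameter value consistent with the observation, and its spread reflects the observation noise; with a real reactive transport model, the surrogate makes the thousands of likelihood evaluations affordable.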

Quantifying the effects of Beaver Dam Analogs on fish habitat and communities

Andrew B. Lahr, University of Montana Wildlife Biology Program

Beaver-based restoration has been shown to be an effective strategy for turning incised streams into complex and dynamic systems. However, uncertainties remain regarding the pros and cons of beaver and beaver dam analogs (BDAs) as they pertain to whole-system effects. Concerns focus on whether these impoundments increase temperatures beyond the thermal tolerances of cold-water species, reduce stream connectivity, and exacerbate the negative effects of non-native species. To provide more context for these concerns, we began a multiyear Before-After-Control-Impact (BACI) study on three pairs of streams (one restored with BDAs and one control) across western Montana. In each stream, we measured changes in physical habitat (e.g., summertime water temperature, hydrologic residence time, instream habitat) and fish characteristics (e.g., movement, species composition).

Thus far we have completed one season of pre-restoration (2019) and one season of post-restoration (2020) data collection. Generally, the effects of the restoration varied by site. For example, in one set of sites there was evidence of an increase in the residence time of water post-restoration, leading to increased baseflows downstream of the BDAs, while we did not observe this effect in the other two restored streams. Additionally, we observed more variation in mean and maximum temperatures, as well as warmer temperatures, in the restored sites post-restoration compared with the controls. Even though we observed higher temperatures in the restored sites, this did not translate to increased temperatures downstream of the restoration reach. Here we present these and other preliminary findings from our field efforts.
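
The BACI design above contrasts the before-to-after change at a restored stream with the change at its paired control, which removes year-to-year variation shared by both streams. A minimal sketch of that contrast, using entirely hypothetical temperature values rather than the study's data:

```python
import numpy as np

# Hypothetical mean summer temperatures (°C) for one stream pair,
# three measurements per stream per period.
temps = {
    ("impact",  "before"): np.array([14.1, 14.6, 13.9]),
    ("impact",  "after"):  np.array([15.8, 16.2, 15.5]),
    ("control", "before"): np.array([14.0, 14.3, 14.1]),
    ("control", "after"):  np.array([14.5, 14.9, 14.4]),
}

# BACI effect = (impact change) - (control change).
impact_change = temps[("impact", "after")].mean() - temps[("impact", "before")].mean()
control_change = temps[("control", "after")].mean() - temps[("control", "before")].mean()
baci_effect = impact_change - control_change
print(f"BACI effect on mean temperature: {baci_effect:+.2f} °C")
```

Here both streams warmed after restoration, but the restored stream warmed more, so the BACI effect attributes only the excess warming to the treatment.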

Relating Geodetic Deflection Under Hydrologic Loading to Stream Discharge Through Storage-Discharge Relationships

Noah B. Clayton, MA Geosciences Candidate

Dr. W. Payton Gardner, University of Montana

Dr. Hilary Martens, University of Montana

Climate change impacts and increasing demands for water require better methods for estimating water resources at small to intermediate watershed scales (~1500 km²). In this study, we analyze relationships between GPS vertical deflection under hydrologic loading and stream discharge to investigate temporal changes in terrestrial water storage in watersheds with strong seasonal hydrologic events. Using publicly available GPS time series from UNAVCO and UNR and stream discharge time series from USGS, we isolate vertical GPS deflections resulting from hydrologic loading, use that deflection as a proxy for changes in watershed storage with daily to weekly temporal resolution, and investigate relationships between terrestrial water storage and discharge during streamflow recession periods. We compare terrestrial water storage inferred through conventional storage-discharge relationships to GPS measurements as a proxy for changes in storage to develop knowledge of the spatial and temporal patterns of storage and discharge in our studied watersheds. Our results indicate that geodetic deflection can potentially be used as a fundamental constraint on a watershed's hydrologic behavior. The geodetic deflection measurement provides unprecedented insight into antecedent storage conditions and/or evapotranspiration, which could lead to significantly improved streamflow prediction and water resource estimates.
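
The core reasoning above can be illustrated with a toy recession: during a recession with no recharge, storage change is the negative integral of discharge, and vertical deflection is assumed, for illustration only, to respond linearly to that stored load. All units, values, and the loading coefficient below are hypothetical, not derived from the study's watersheds.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic recession: discharge decays exponentially after a melt event.
t = np.arange(0, 120.0)                      # days
Q = 50.0 * np.exp(-t / 40.0)                 # discharge, m^3/s (toy units)

# With no recharge, storage change is -∫ Q dt (daily steps here).
dS = -np.cumsum(Q) * 86400.0                 # m^3

# Toy elastic response: deflection scales linearly with the stored load,
# plus GPS measurement noise. The 1e-8 coefficient is invented.
deflection = 1e-8 * dS + rng.normal(0, 0.3, t.size)   # mm

# Regress deflection on storage change to recover the loading coefficient,
# i.e., calibrate deflection as a storage proxy.
slope, intercept = np.polyfit(dS, deflection, 1)
print(f"recovered loading coefficient: {slope:.2e} mm per m^3")
```

Once such a coefficient is constrained, the deflection time series can be read directly as a storage time series and compared against storage inferred from storage-discharge recession analysis.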

Should we use climate analogs to predict climate impacts? A contemporary validation.

Svetlana Yegorova, University of Montana, Franke College of Forestry
Solomon Z. Dobrowski, University of Montana - Missoula
Sean A. Parks, USDA Forest Service

There is a need for location-specific, fine-grain climate impact information to inform regional and local-level climate adaptation. However, such information is hard to obtain from existing models, either because sufficient data are lacking or because model output is too coarse to be useful. Analog impact models (AIMs) provide an alternative approach. AIMs rely on climate analogs (locations in space and time that have similar climate) to forecast future climate impacts at a spatial grain as fine as a four-kilometer pixel. Analog impact models assume that locations with equivalent climate (climate analogs) also share other important characteristics, such as vegetation type, primary productivity, and disturbance regimes. AIMs provide spatially resolved, fine-grain predictions of climate impacts, which have the potential to inform location-specific climate adaptation. The use of AIMs is growing, yet there is little information on the quality of AIM predictions. Validating AIMs is a challenge, as future climate impacts cannot be observed in the present and compared to AIM predictions. We evaluate AIMs by testing their performance on climate analogs in space in the reference climate period.

We identify spatial climate analogs in the western US for the 1961-1990 period using 30-year normals of four climate variables (mean maximum temperature of the warmest month, minimum temperature of the coldest month, actual evapotranspiration, and climatic water deficit). We evaluate AIM performance by comparing remotely sensed Landsat tree-canopy data at each pixel of interest (i.e., the observed value) to the tree cover at its candidate analog pixels (i.e., the predicted value) at increasing levels of climatic dissimilarity. We find that the AIM predicts tree cover well: the slope of the linear fit of predicted vs. observed cover is 0.78 (R² = 0.78) for the climatically closest analogs. Model bias increases and precision decreases with increasing climatic dissimilarity between the focal and analog pixels. Tree cover is often overpredicted for pixels with low tree cover, suggesting that recent disturbance may drive the error at the low-cover end. Our study supports the utility of climate analogs as a climate impact assessment tool and details the effects of climatic dissimilarity, the number of climate analogs considered, and the spatial distribution of analogs on prediction quality.
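
The nearest-analog evaluation described above can be sketched with synthetic data: each pixel gets four standardized climate variables, its analog is the climatically nearest other pixel, and the analog's tree cover serves as the prediction. The data, relationships, and magnitudes below are invented for illustration and do not reproduce the study's results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: each pixel has 4 climate normals and a tree-cover value.
n = 500
climate = rng.normal(size=(n, 4))            # e.g., Tmax, Tmin, AET, deficit
tree_cover = np.clip(40 + 15 * climate[:, 2] - 10 * climate[:, 3]
                     + rng.normal(0, 3, n), 0, 100)

# Standardize climate variables so distances weight them equally.
z = (climate - climate.mean(0)) / climate.std(0)

# For each focal pixel, the analog is the climatically nearest *other* pixel.
d = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=2)
np.fill_diagonal(d, np.inf)
analog = d.argmin(axis=1)

predicted = tree_cover[analog]               # analog's cover = AIM prediction
observed = tree_cover

# Slope and fit of predicted vs. observed, as in the abstract's evaluation.
slope = np.polyfit(observed, predicted, 1)[0]
r = np.corrcoef(observed, predicted)[0, 1]
print(f"slope={slope:.2f}, R^2={r**2:.2f}")
```

Repeating the evaluation with analogs at increasing climatic distance (the k-th nearest instead of the nearest) is the natural way to reproduce the abstract's dissimilarity analysis.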

The relationship between Indigenous Cultural Connectedness and health among elders living on the Flathead Indian Reservation

Mattea M. Grant, University of Montana, Missoula

Research Gap. Strengths-based approaches to wellness among Native American (NA) communities link increased cultural connectedness to protective health behaviors (King et al., 2019; Chandler et al., 2003; Garret et al., 2014), yet it is unclear how modern health behavior practices relate to this concept. An improved understanding of how current health behaviors relate to cultural connectedness may inform strengths-based strategies to support healthy behaviors. Therefore, the research question addressed in the proposed research is, “How is cultural connectedness related to health behavior among NAs?”

Methods. I will use qualitative methods and a theoretical framework to conduct a secondary analysis of interview data among NA older adults (ages 50 years and older) living on the Flathead American Indian Reservation (n=21). Interviews were conducted by the Growing Older Staying Strong research study team.

Aim 1. Select a theoretical framework and analysis plan. I will conduct a comprehensive scientific literature review of cultural connectedness and health promotion among indigenous populations. I will select methods that align with expert guidance for public health research with NA populations.

Aim 2. Apply the theoretical framework and perform analysis.

Aim 3. Disseminate findings. I will prioritize local community dissemination activities and scientific products. This will include presentations at local tribal council and culture committee meetings. I will also present one scientific abstract at a national conference and prepare one manuscript for peer review and publication.

Results. Findings from this research project will include themes arising from the voices and experiences of NA older adults, describing cultural connectedness in relation to current physical activity practices. Themes may be used to generate new hypotheses and may inform strategies for supporting cultural connectedness while increasing health behaviors through public health interventions.

Impact Statement. As a Native American graduate student in public health, I will gain excellent training and research experience from the proposed project. Activities included in this proposal are designed to increase my knowledge of theoretical approaches in my field of interest, improve my skill in qualitative methods, and build my confidence in community and scientific dissemination. Upon completion of this research, I will have identified ways in which cultural connectedness relates to health behaviors, contributed to a growing foundation of strengths-based approaches for health promotion among NA communities, and laid a pathway for my future research directions.

Using classification trees to identify bumble bees

Rebekah Brassfield
Janene Lichtenberg, Salish Kootenai College

Identification of bumble bees is a time-consuming process that often involves lethal collection of specimens for microscopic identification. With declining bumble bee populations, new methods of identification are in demand. High-resolution cameras have provided taxonomists with an alternative to lethal captures. High-definition photographs provide a clear source of key identifying features, but they still require the evaluation of hundreds of combinations of features to correctly identify specimens.

Machine learning with decision trees can classify large amounts of data in short periods of time. Much like taxonomists, the program learns to identify a specimen using combinations of characteristics. Two machine learning methods were compared to assess the effectiveness and accuracy of bumble bee species identification. Random forest modeling (RFM) builds a forest of decision trees and classifies data based on the consensus of the trees in the forest. Recursive partitioning (RP) builds a single tree that classifies data by sorting similar characteristics together until a majority is reached.

In this experiment, RFM and RP were used to identify species of the bumble bee genus Bombus using characteristics such as thorax pattern, face length, and abdominal segment colors. RFM had an accuracy of 93% while RP had an accuracy of 88%. Ecological observation data are often not equally distributed among species, and this can affect the accuracy of some models. RFM showed no bias toward species with fewer observations in the data, suggesting it may be a better model for data sets with an unequal distribution of observations. RP models showed significant bias toward species with more observations, suggesting RP was less likely to correctly predict species it encountered less frequently. My study indicates that machine learning may streamline identification of bumble bees and reduce or remove the need for lethal collections.
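
The RFM-versus-RP comparison can be sketched with scikit-learn, which is an assumption on our part since the abstract does not name its software; the traits and species below are synthetic integer codes, not real Bombus data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic stand-in for coded morphological traits: thorax pattern,
# face length, and abdominal segment colors, each as an integer code.
n = 1200
X = rng.integers(0, 4, size=(n, 3))
# Species is determined by the trait combination, with a little label noise.
y = (X[:, 0] + 2 * X[:, 1] + X[:, 2]) // 3
noise = rng.random(n) < 0.05
y[noise] = rng.integers(0, y.max() + 1, noise.sum())

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Random forest: a consensus of many trees (the abstract's RFM).
rfm = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
# A single recursively partitioned tree (the abstract's RP).
rp = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

print(f"RFM accuracy: {rfm.score(X_te, y_te):.2f}")
print(f"RP accuracy:  {rp.score(X_te, y_te):.2f}")
```

Per-class accuracy (e.g., via `sklearn.metrics.classification_report`) is the natural way to check the abstract's point about bias toward well-sampled species.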

White ash trees (Fraxinus americana) originally from colder, drier areas are more susceptible to emerald ash borer (Agrilus planipennis)

Timber E. Burnette
Laurel J. Haavik
Justin G. A. Whitehill
Vaughn B. Salisbury
James M. Fischer
Lina Madilao
Kistie B. Brunsell
Joerg Bohlmann

Trees are dying prematurely across the globe at alarming rates, often due to drought and/or insects. Insect attacks can range in severity, with some tree species surviving (e.g., the American beech and beech bark disease) and some species becoming essentially extinct (e.g., the American chestnut and the chestnut blight). Tree responses may also vary because of genetic differences among “families” of trees (i.e., intraspecific variation). It is thus hard to predict how tree species will respond to insect attacks. To this end, we examined tree canopy dieback (leaf loss), drought stress, wound healing, and chemical defense responses of 44 families of white ash (Fraxinus americana) from across the species range to a recent emerald ash borer (Agrilus planipennis) outbreak in an arid common garden in Kansas. At our site, white ash families originally from drier, colder areas (northern USA) exhibited more canopy dieback than families from wetter, warmer regions (southern USA). We also found that tree families that were sensitive to drought and/or slower to heal bark wounds were more likely to show canopy dieback. Overall, we are the first to demonstrate significant intraspecific variation in white ash canopy dieback due to emerald ash borer and to connect it to the temperature and precipitation conditions of the original seed sources. If this canopy dieback pattern holds for mortality, southern white ash may be suitable for reintroduction to the American north, where 99% of white ash has already died.