Authors: Morgan Heimbigner, Cali Caughie, Phoebe Bean, Samantha Russell, Madison Goldstein
Faculty Mentor: Stuart Hall Department of Psychology University of Montana 32 Campus Drive, Missoula, MT 59812
Objective: Wilderness therapy is a model of treatment intervention which utilizes components of traditional therapy in combination with outdoor expeditionary principles. The current pilot project seeks to understand the role of executive function, attachment, resiliency, and hope in wilderness therapy treatment outcomes in a population of at-risk youth (13-17 years of age) taking part in the 5-week Inner Roads treatment program run out of Missoula, Montana. The study aims to gather data to better inform Inner Roads program development and to add to the literature informing wilderness therapy effectiveness for youth populations.
Participants and Methods: The participants (n=11) completed a series of surveys and cognitive tests at intake and discharge from the program. The surveys targeted behavioral and emotional factors such as resilience (YOQ, CYRM-R), attachment (AAQ), and hope (BHS). The cognitive tests targeted executive function (TMT, COWA). Student participants additionally engaged in 20-minute semi-structured interviews with clinical researchers at intake, discharge, and two weeks post-discharge.
Results: All measured domains improved from pre- to post-intervention with one exception (COWA). This aligns with our hypotheses, as well as the goals of the program. The most significant positive main effect of intervention occurred on the somatic subscale of the YOQ (t=3.198, p=.019).
Conclusion: The data show overall positive improvement in participants' cognitive, behavioral, and emotional states from pre- to post-intervention. However, only one of the measured outcomes reached statistical significance, which is expected given the study's small sample size and correspondingly low statistical power. Now that a pilot study has been conducted, the next step is future studies with more participants and improved methods.
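The pre/post comparison reported above (t=3.198, p=.019) is a paired t-test; a minimal sketch follows. The scores below are illustrative placeholders, not the study's actual YOQ data.

```python
# Paired (pre/post) t-test, as used for the YOQ somatic subscale above.
# The scores are invented for illustration; lower = fewer symptoms.
from scipy import stats

pre  = [10, 12, 11, 13, 9, 14, 10, 12, 11, 13, 12]   # intake scores
post = [8, 9, 10, 9, 7, 11, 8, 11, 8, 11, 8]         # discharge scores

t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```

With n=11 paired observations, as in the pilot, the test has 10 degrees of freedom, which is one reason power is low for modest effects.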
Samantha C. King
Electrochemical and spectrophotometric pH of the Clark Fork River were studied to assess the data quality of both methods. There have been very few comparisons between spectrophotometric pH and electrochemical pH. This comparison is important because pH electrodes remain the primary method for measuring freshwater pH, yet their data quality is often questionable. Using the spectrophotometric method for measuring pH could potentially improve pH data quality significantly, because spectrophotometric pH relies on absorbance measurements that are highly reproducible, whereas glass-electrode measurements have various sources of inaccuracy that are difficult to quantify. Water samples were collected from August 30, 2017 to January 17, 2020 along the Clark Fork River from near Deer Lodge down to Missoula. Using a pH electrode (YSI, Inc.), the electrochemical pH was measured at the time the samples were collected. The samples were then brought back to our lab, and the spectrophotometric measurements were carried out on a benchtop spectrophotometer (Cary 300, Varian) using a 10 cm path length optical cell. A total of 326 samples have been analyzed. The data were compared by graphing the spectrophotometric pH vs. the electrochemical pH, as well as the difference between the two pH methods vs. spectrophotometric pH. Our preliminary analysis shows that the electrochemical pH is usually higher than the spectrophotometric pH and that the spectrophotometric method is more precise: the standard deviation for the electrochemical method is ±0.18 pH units, while that for the spectrophotometric method is ±0.11 pH units. The next step in this study is to devise experiments to determine the sources of these offsets.
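The method comparison described above amounts to computing the electrode-minus-spectrophotometric offset for each sample and examining its mean and spread. A minimal sketch, using fabricated readings rather than Clark Fork River data:

```python
# Compare paired pH readings from the two methods. The values below are
# invented examples, NOT measured river data.
import numpy as np

spec_pH = np.array([8.10, 8.15, 8.05, 8.20, 8.12])  # spectrophotometric
elec_pH = np.array([8.25, 8.30, 8.15, 8.40, 8.20])  # glass electrode

offset = elec_pH - spec_pH  # positive when the electrode reads higher
print(f"mean offset: {offset.mean():+.2f} pH units")
print(f"spread of offsets (1 sd): {offset.std(ddof=1):.2f} pH units")
```

Plotting `offset` against `spec_pH` would reproduce the difference-vs-spectrophotometric-pH comparison described in the abstract.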
Carly R. Andlauer
As a headwaters state, changes in water across Montana's landscape can have major implications for the nearly two-thirds of the land area of the conterminous United States drained by its two major river basins. Monitoring watersheds in Montana provides important metrics for drought detection, agricultural management, and other natural resource management decisions. The Montana Mesonet, a wireless network of 79 meteorological, soil moisture, and groundwater monitoring stations, was created in 2016 to provide monitoring data. This network is supported through federal, state, tribal, and private partnerships. Each station records measurements at 15-minute intervals, including soil moisture, soil-water conductivity, precipitation, relative humidity, temperature, wind speed and direction, solar radiation, and groundwater levels. Developing information about soil characteristics will put sensor data into a context applicable for land management decisions. As such, this project-in-progress focuses on the development of soil water retention curves and soil texture characterizations to enhance data collected by the Montana Mesonet. To assess soil characteristics, triplicate soil cores were collected at each station site at four depths. These cores are used for two lab analyses. Soil water retention curves were developed across a moisture gradient using tensiometers, balances, and potentiometers. Lab-measured soil water retention curves allow in-situ soil moisture time series to be converted to soil water potential, a biologically meaningful variable used to assess plant water stress. So far, approximately one-third of sites have soil water retention curves and soil-water-potential time series generated at all depths. Soil texture will be determined by assessing the particle size distribution of a sample using the integral suspension pressure (ISP) method.
The addition of these soil characteristics to the extensive Mesonet data will allow for site-specific historic, near real-time, and forecasted modeling of plant-available water, groundwater recharge, calculations of drought indices, and other practical applications to inform management decisions.
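One common parametric form for the soil water retention curves described above is the van Genuchten model, which maps matric suction to volumetric water content; fitting its parameters to the lab measurements is what allows moisture time series to be converted to water potential. The sketch below uses generic illustrative parameter values, not fitted Mesonet values.

```python
# van Genuchten retention curve: volumetric water content vs. suction.
# Parameter values here are generic illustrations, not Mesonet fits.
def van_genuchten(psi, theta_r, theta_s, alpha, n):
    """theta_r/theta_s: residual and saturated water contents;
    alpha (1/kPa) and n: shape parameters fitted to lab data;
    psi: matric suction (kPa)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(psi)) ** n) ** m

# At zero suction the soil is saturated (content = theta_s):
print(van_genuchten(0.0, theta_r=0.05, theta_s=0.45, alpha=0.1, n=1.6))
```

Inverting a fitted curve of this kind (content to suction) is one way an in-situ soil moisture reading can be translated into soil water potential.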
Cody T. Norberg, University of Montana
Surface mass loads, such as oceans, atmosphere, glaciers, snowpack, and freshwater, exert forces on Earth's surface, causing the crust to change shape, a process known as crustal deformation. The response of Earth's crust to these mass loads is often easily predicted and removed from crustal measurements, except for the atmospheric response, which varies much more with weather conditions. The Global Positioning System (GPS), which transmits radio signals between satellites and ground receivers, can monitor millimeter-level changes in the crust's shape. These data are recorded by over 1200 GPS stations of the Plate Boundary Observatory network across the western United States. However, methods for processing raw GPS data differ among analysis centers, producing different positions. One difference in GPS processing methods is the treatment of satellite signals delayed in the troposphere, the region of the atmosphere with the most mass. This research is significant for helping develop a set of standard methods for GPS processing and therefore making GPS more accurate. We investigated inconsistencies in three-dimensional (east, north, up) positions from five datasets created by the following three processing centers: NASA's Jet Propulsion Laboratory, the Nevada Geodetic Laboratory, and the UNAVCO consortium. We used the software program LoadDef to model the response of Earth's surface to changes in atmospheric pressure at each station, which was then compared with observed GPS positions. We find a trend that GPS datasets produced with highly simplified assumptions about tropospheric delays have smaller reductions in root-mean-square scatter after correcting for atmospheric loading. Up to 50% of the scatter in the residual GPS time series normally considered noise can be explained by crustal deformation responses to atmospheric mass loading.
We conclude that differences in GPS time-series solutions can be partially explained by the temporal treatment of tropospheric signal delays during initial processing of raw GPS data.
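The RMS-scatter comparison described above reduces to a simple operation: subtract the modeled atmospheric-loading displacement from the observed position time series and compare scatter before and after. A minimal sketch with synthetic series (not real GPS or LoadDef output):

```python
# Compare RMS scatter of a position time series before and after removing
# a modeled loading signal. Both series are synthetic illustrations.
import numpy as np

t = np.linspace(0.0, 4.0 * np.pi, 500)
loading_model = 3.0 * np.sin(t)           # modeled crustal response (mm)
other_signal  = 0.5 * np.sin(3.0 * t)     # everything the model misses (mm)
observed = loading_model + other_signal   # stand-in for an observed "up" series

def rms(x):
    return np.sqrt(np.mean(x ** 2))

before = rms(observed)
after  = rms(observed - loading_model)
print(f"RMS reduction: {100.0 * (before - after) / before:.1f}%")
```

In the study, a smaller reduction for a given dataset suggests that some of the atmospheric-loading signal was already absorbed (or distorted) by that center's tropospheric-delay treatment.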
Hayden A. Hunter, Bitterroot College
Using a watershed approach to study nonpoint source pollution over six years in the Bitterroot Valley, we at Bitterroot College have identified two wells with dangerous nitrate levels and a consistent loading of 8 pounds per day emerging as spring water from the aquifer beneath Hamilton. Water quality assessments of nutrients and pesticides nationwide show that nonpoint source pollution often leads to impairment of surface water fisheries and drinking water. Excess nitrogen causes eutrophication in the Bitterroot River. Nitrate levels in drinking water above 10 parts per million are potentially dangerous to newborn infants: nitrate is converted to nitrite in the digestive tract, which reduces the oxygen-carrying capacity of blood and can cause death. To prevent economic damage to our fishery (worth more than $30 million annually) and to protect human health, we are launching a community awareness project. We propose further nitrate assessment of nonpoint source pollution in valley wells and from spring flows down-gradient of Hamilton in 2020. We will create a community educational flyer comparing results from all six years of study and report our analysis at UMCUR. All of our water samples are collected in March each year (2015-2020) using standardized methodology. Ten percent of our samples are duplicates and blanks for quality assurance and quality control. All of our water samples are analyzed by Energy Labs Inc., a professional environmental water quality laboratory. Nitrate (NO3) and nitrite (NO2) background levels in natural groundwater systems should be less than 1 ppm (U.S. Geological Survey). Nonpoint source pollution in our valley originates from scattered surface locations and percolates into groundwater, which can emerge in the river. The nitrate pollution comes from excess fertilizers, pastures, streets, faulty septic systems, and leaking sewers, all sources that can also carry pharmaceuticals and pesticides.
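A mass loading like "8 pounds per day" is the product of concentration and discharge, with a unit conversion. The sketch below shows the arithmetic with illustrative values (the concentration and spring flow are hypothetical, not the measured Hamilton values):

```python
# Convert a nitrate concentration and a discharge into a daily mass load.
# The example concentration and flow are illustrative, not field data.
MG_PER_LB = 453_592.37     # milligrams per pound
SECONDS_PER_DAY = 86_400

def load_lb_per_day(conc_mg_per_L, flow_L_per_s):
    return conc_mg_per_L * flow_L_per_s * SECONDS_PER_DAY / MG_PER_LB

# e.g. a spring discharging 4.2 L/s at 10 mg/L nitrate:
print(f"{load_lb_per_day(10.0, 4.2):.1f} lb/day")  # -> 8.0 lb/day
```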
Blaik Hopewell, University of Montana, Missoula
Nonactin is a cyclic macrotetrolide antibiotic produced by Streptomyces griseus consisting of two monomers of (+)-nonactic acid and two monomers of (-)-nonactic acid. Following extraction and methanolysis of nonactin and its related nactins, e.g. monactin, methyl nonactate can be purified as a 70:30 mixture of the (-) and (+) enantiomers. Methyl nonactate contains four chiral centers, giving a total of 16 possible stereoisomers. The (+) enantiomer has initially been shown to have a lower minimum inhibitory concentration (MIC) when tested against methicillin-resistant Staphylococcus aureus and other Gram-positive pathogens compared to its complementary (-) enantiomer. This difference in activity between the enantiomers suggests chiral selectivity of the target. So far, hundreds of antibiotic candidates have been synthesized from the core methyl nonactate structure, with three compounds (31G11, 31G12, and 31E11) showing high biological activity in non-optically-pure mixtures. These compounds are being made optically pure for eight of the most synthetically accessible isomers. The optimal stereoisomer will be determined by comparing the observed MIC values of each optically pure isomer when tested against several Gram-positive and Gram-negative bacteria.
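Reading an MIC off a dilution series is mechanically simple: it is the lowest tested concentration at which no growth is observed. A minimal sketch, with hypothetical concentrations and growth flags (not data from these compounds):

```python
# Determine the MIC from a broth-microdilution series.
# Concentrations (ug/mL) and growth results are hypothetical.
def mic(results):
    """results: dict mapping concentration -> growth observed (bool).
    Returns the lowest concentration with no growth, or None."""
    inhibitory = [conc for conc, grew in results.items() if not grew]
    return min(inhibitory) if inhibitory else None

series = {0.5: True, 1.0: True, 2.0: False, 4.0: False, 8.0: False}
print(mic(series))  # -> 2.0
```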
Keri D. Nauman, University of Montana, Missoula
Atmospheric volatile organic compounds (VOCs) are precursors of fine particulate matter and ground-level ozone, both of which threaten public health and are regulated by the U.S. EPA. Due to the decrease in emissions from traditional transportation vehicles and power plants over the last four decades, volatile chemical products (VCPs) have become an emerging contributor to VOC emissions in the urban atmosphere. These VCPs are typically found in household products as inactive ingredients, including cleaning agents, aerosol sprays, personal/hygienic care products, printer ink, pesticides, etc. Once these products are used, the VCPs volatilize and participate in reactions that form ground-level ozone and fine particulate matter. Despite their prevalence in our daily lives, these VCPs and their contribution to VOC emissions remain understudied and uncharacterized. This study aims to develop analytical methods for measuring and quantifying two VCPs using a high-resolution proton-transfer-reaction time-of-flight mass spectrometer (PTR-ToF-MS). PTR-ToF-MS can provide real-time VOC concentration measurements in ambient air. The quantification of VCPs and their mass analogues will allow for their detection in these ambient air samples. We focus on characterizing measurements of propylene glycol and diethylene glycol, which are commonly found in VCPs and have some of the highest chemical production volumes. We will present preliminary results for their instrument sensitivities, fragmentation pathways, potential interferences, and humidity dependencies. We will discuss how altering various instrument parameters can improve these sensitivities and explore the optimal conditions for atmospheric measurements of these species using PTR-ToF-MS.
Anna K. Schmautz, University of Montana, Missoula
Terminal metal-oxo species are important intermediates in many biological systems, such as the mononuclear non-heme iron oxygenases that utilize Fe(IV)-oxo intermediates to initiate oxidative transformations. The formation of a stable high-valent metal-oxo complex for late transition metals is synthetically challenging, and terminal metal-oxo ligands for metals in groups nine through twelve are rare. In this work, we synthesized a mononuclear nickel complex in the +2, +3, and +4 oxidation states, supported by a dianionic, tridentate ligand. We produced a putative Ni(IV) complex by a single sequence reaction between Ni(III)OH and an oxidant, ceric ammonium nitrate (CAN). The reactivity of this proposed Ni(IV) complex is still under investigation as we move to determine the structure and characterization of the complex. Our ongoing efforts aim to determine the C-H bond functionalization capability of the Ni(IV) complex.
Joseph P. Pennington, University of Montana
Cytochrome c (Cytc), a membrane-associated redox-active protein, functions as an electron shuttle in the electron transport chain. Additionally, it participates in the intrinsic pathway for apoptosis, exhibiting peroxidase activity when interacting with the mitochondrial membrane lipid, cardiolipin. The electrostatic binding site of Cytc, site A, has historically been attributed to positively charged lysines, in particular the Lys 72/73 and Lys 86/87 subclusters. These charge subclusters are thought to be the predominant site of interaction between cardiolipin and Cytc at pH 8, as they create zones of positive charge on the protein surface that interact with the negative charge of the cardiolipin headgroup. Recent research indicated that single, pairwise, and quadruple mutations of these Lys residues to the uncharged amino acid Ala in yeast iso-1-Cytc caused little change in binding to cardiolipin, indicating that site A may be more extensive than previously believed. In order to determine the importance of each subcluster to the overall binding affinity of site A in a higher eukaryote with a complete apoptotic pathway, double mutants of human Cytc, Lys86/87Ala and Lys72/73Ala, were created using PCR-based mutagenesis, characterized through dideoxy sequencing, and then expressed in Escherichia coli. The binding constant, Kd, will be measured using fluorescence correlation spectroscopy (FCS) for mutants containing zinc-substituted heme in the presence of nanodiscs containing 100% cardiolipin. By comparing these measurements with previous results for wild-type human Cytc, the importance of each of the subclusters to overall binding affinity can be determined from the differences in Kd. Based on previous results with yeast iso-1-Cytc, it is expected that introduction of pairwise mutations will have little effect on Kd, contradicting the notion of a primary charge subcluster.
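Extracting a Kd from binding data typically means fitting a one-site isotherm, fraction bound = [L] / (Kd + [L]). The sketch below fits simulated data generated from a known Kd; it is not FCS data and the concentrations are arbitrary illustrations.

```python
# Fit a one-site binding isotherm to recover Kd. Data are simulated from
# a known Kd (noiseless), purely to illustrate the fitting step.
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, kd):
    """Fraction of protein bound at ligand (nanodisc) concentration L."""
    return L / (kd + L)

lipid = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 25.0, 50.0])  # arbitrary units
true_kd = 5.0
bound = one_site(lipid, true_kd)

(kd_fit,), _ = curve_fit(one_site, lipid, bound, p0=[1.0])
print(f"fitted Kd = {kd_fit:.2f}")
```

Comparing fitted Kd values between wild-type and each double mutant is then a direct readout of each subcluster's contribution to binding affinity.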
Eleanor I. Serviss, University of Montana, Missoula
There is an increasing demand for infrared cameras aboard small Unmanned Aircraft Systems (sUAS) for research in wildlife biology, fire, geology, land management practices, and emergency services. The University of Montana’s Autonomous Aerial Systems Office regularly flies missions with infrared cameras on sUAS. Because sUAS generally use infrared cameras without onboard temperature control systems due to weight restrictions, temperature measurements are highly affected by internal camera temperature and the cameras need regular calibration. This project documents an attempt to calibrate one of these cameras without accessing the camera’s internal thermistor.
For calibration, an aluminum mirror reflected the camera’s internal thermal radiation back at its own sensors. A blackbody source provided a known temperature to calibrate against. Around half of the Vue Pro R camera’s temperature measurements of the blackbody were beyond the manufacturer’s uncertainty (5℃).
The study revealed that without calibration, the Vue Pro R has too high a measurement uncertainty for many applications in wildlife biology. A lightweight infrared camera that fits on a sUAS and better accounts for internal temperature is needed.
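A blackbody-based calibration of the kind described above can, in its simplest form, be an empirical linear correction fitted between camera readings and known blackbody temperatures. The readings below are invented, not Vue Pro R data, and real calibrations must also account for internal camera temperature.

```python
# Fit a linear correction from camera readings of a blackbody at known
# temperatures. All values are invented illustrations.
import numpy as np

blackbody_C = np.array([10.0, 20.0, 30.0, 40.0, 50.0])  # true temperatures
camera_C    = np.array([13.5, 24.1, 34.0, 44.8, 55.2])  # camera readings

slope, intercept = np.polyfit(camera_C, blackbody_C, 1)
corrected = slope * camera_C + intercept
print(f"max residual after correction: "
      f"{np.max(np.abs(corrected - blackbody_C)):.2f} C")
```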
Amanda J. Kotila, The University Of Montana
Droughts are one of the most prevalent natural hazards. They cause severe economic losses and water and food insecurity. While droughts can be observed, the mechanisms that cause a drought to occur and persist are poorly understood. Our aim is to better understand drought behavior by using principal component analysis to determine droughts' causes and to gain further insight into their onset, duration, and effects.
A drought can be defined as an area and duration of below-average precipitation relative to that area's mean precipitation for that time of year. A useful tool for determining and analyzing a drought is the Standardised Precipitation-Evapotranspiration Index (SPEI). The SPEI uses temperature, precipitation, and evapotranspiration information to calculate the net precipitation anomaly across the globe. Using consecutive-in-time SPEI maps, one can observe the spatio-temporal onset, duration, and end of a drought.
While the SPEI can be used to identify and observe droughts, we used Principal Component Analysis (PCA) to try to understand the teleconnections between droughts and their causes. PCA is a statistical method that creates new, uncorrelated variables (Principal Components, PCs), allowing clearer interpretation of large datasets of dependent variables while minimizing information loss. The contribution of each PC to the dataset is given by its variance. By applying PCA to the SPEI dataset for the continental United States, we found that the first four PCs explained more than 50% of the variance. Studying the trendline of each PC through time, we then compared the trendlines of other climatic forcings to identify possible causes of drought. From this comparison, we found that the first (primary) PC is influenced by the Southern Oscillation (driver of the El Niño cycle) and the second PC is influenced by the North American Monsoon, which occurs annually and affects the Southwestern states, Texas, and Colorado. These results not only show some of the climatic teleconnections to past droughts, but also show that PCA is a viable method for discovering such teleconnections.
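The PCA step described above can be sketched on a toy space-time anomaly matrix (rows = time steps, columns = grid cells). The two planted spatial modes below merely stand in for patterns like the ENSO and monsoon influences; this is not the SPEI dataset.

```python
# PCA on a synthetic space-time anomaly matrix with two planted modes.
# The patterns and time series are arbitrary illustrations.
import numpy as np
from sklearn.decomposition import PCA

t = np.linspace(0.0, 6.0 * np.pi, 120)                 # 120 "time steps"
pattern1 = np.array([1.0, 0.8, 0.2, -0.5, -1.0])       # dominant spatial mode
pattern2 = np.array([0.1, -0.6, 1.0, 0.4, -0.2])       # secondary spatial mode
X = np.outer(np.sin(t), pattern1) + 0.3 * np.outer(np.cos(2.0 * t), pattern2)

pca = PCA(n_components=4)
pca.fit(X)
print(pca.explained_variance_ratio_.round(3))
```

Because the toy data contain exactly two modes, the first two PCs capture essentially all of the variance; for real SPEI fields, the PC time series (`pca.transform(X)`) are what get compared against climatic forcings such as the Southern Oscillation index.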