Have you ever been involved in a survey research project and, once data starts to come in, thought, “Now that I have this data, what’s the best way to showcase the information?” It’s a common challenge amongst researchers and one we’ve successfully tackled by evaluating the survey’s components through a series of questions outlined below.
We all know that surveys are a versatile and cost-effective way to collect data of all different types. They are considered a relatively simple research tool and are used across all industries in various forms. For example, the agricultural industry relies heavily on grower/farmer surveys to compile data on management practices as part of stewardship and outreach efforts. However, once the data are collected, the question becomes how to efficiently analyze and present them. There are multiple clues found in the collection process itself that factor into how the data can best be reported:
Does the data require anonymization? A basic rule in survey data collection is that information related to identity or privacy protection must be altered or removed in such a way that the subject can no longer be linked to the survey data. We have found that developing a strong data anonymization process provides an additional comfort level for survey subjects who provide data in a voluntary survey format. For example, sometimes the survey can be completely anonymized with no identifying information collected. Other times, a secondary survey can be completed if a list of participants is required. If identifying information is collected, however, we’ve found that the assignment of protected and coded identifiers provides a mechanism to anonymize the dataset.
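For illustration, the coded-identifier step might look like the following Python sketch. The field names (`name`, `acres`, `tillage`) and the code format are hypothetical, and in practice the lookup table would be stored separately from the survey dataset under access control:

```python
import hashlib
import secrets

def assign_coded_ids(responses, salt=None):
    """Replace the identifying field in each record with a salted, coded identifier.

    `responses` is a list of dicts that each carry a 'name' key plus survey answers.
    Returns the anonymized records and a lookup table kept apart from the dataset.
    """
    salt = salt or secrets.token_hex(16)  # random salt so codes cannot be re-derived
    lookup, anonymized = {}, []
    for record in responses:
        code = "R-" + hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:8]
        lookup[code] = record["name"]  # stored separately, under access control
        anonymized.append({"id": code, **{k: v for k, v in record.items() if k != "name"}})
    return anonymized, lookup

# Hypothetical single response from a grower survey
anon, lookup = assign_coded_ids([{"name": "Jane Grower", "acres": 120, "tillage": "no-till"}])
```

The anonymized records carry only the coded `id`, so reports and graphs can be built from them without exposing any participant's identity.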
What was the sample size and response type? Data analysis from survey results requires an understanding of the sample size and response types, whether in the form of numerical or text data. Multiple choice responses and closed- or open-ended text responses must be considered differently for analysis and reportability. Summary statistics, including evaluation for normal distribution, as well as descriptive statistics and significance testing, can be applied to survey response data during this step.
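A minimal sketch of that workflow in Python, using SciPy, might look like this. The Likert-scale responses and the two-group comparison are hypothetical; the point is that the normality check guides which significance test is appropriate:

```python
import numpy as np
from scipy import stats

# Hypothetical Likert-scale responses (1-5) from two groups of participants
group_a = np.array([3, 4, 4, 5, 2, 4, 3, 5])
group_b = np.array([4, 3, 4, 5, 2, 3, 4, 4])

# Descriptive statistics
print(f"group A: n={group_a.size}, mean={group_a.mean():.2f}, sd={group_a.std(ddof=1):.2f}")

# Shapiro-Wilk normality check guides the choice of significance test
_, p_norm = stats.shapiro(group_a)

if p_norm > 0.05:
    stat, p = stats.ttest_ind(group_a, group_b)      # parametric comparison
else:
    stat, p = stats.mannwhitneyu(group_a, group_b)   # nonparametric fallback
print(f"between-group p = {p:.3f}")
```

Ordinal multiple-choice data often fails a normality check, which is why a nonparametric fallback is worth building in from the start.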
How large and complex is the data? If the survey data is particularly large or complex, various database tools are available for data organization and analysis. Beyond the use of simple spreadsheets, we have used tools such as the ArcGIS Survey123 application from the ESRI Geospatial Cloud. If we’re compiling records from various databases for processing, we have applied python scripts to extract data and store them in Microsoft SQL databases. The development of application programming interfaces (APIs) then allows us to bridge the gap between our database and output reports and web tools.
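The extract-and-store step can be sketched as follows. This is a simplified illustration: the CSV fields and analyte values are invented, and SQLite stands in for the production Microsoft SQL database, but the pattern of parsing source exports and loading them into one queryable table is the same:

```python
import sqlite3
import csv
import io

# Hypothetical CSV export from one of several source databases
raw = io.StringIO("site_id,analyte,value\nW-01,nitrate,2.4\nW-02,nitrate,3.1\n")

conn = sqlite3.connect(":memory:")  # sqlite stands in for the production SQL database
conn.execute("CREATE TABLE results (site_id TEXT, analyte TEXT, value REAL)")
conn.executemany(
    "INSERT INTO results VALUES (?, ?, ?)",
    [(r["site_id"], r["analyte"], float(r["value"])) for r in csv.DictReader(raw)],
)

# An API layer would then run queries like this one to feed reports and web tools
rows = conn.execute("SELECT analyte, AVG(value) FROM results GROUP BY analyte").fetchall()
print(rows)
```

Once the records live in one schema, the same query serves a PDF report, a dashboard, or an API endpoint.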
As a part of the data analysis step, it is also critical to understand the uncertainty in the data set. Questions to consider may include:
Was the sample size sufficient?
Was random or non-random sampling applied?
What was the response rate?
How did the people who agreed to participate differ from those who declined?
How was the survey administered (paper, electronically, in-person, or telephone interview), and
How could the administration format impact participation or responses?
How will the final report be presented? Our goal is to always present survey results in a format that clearly and concisely communicates the data and aligns with the aim of the research. How we accomplish this goal boils down to our end audience. Will it be presented within a scientific setting with in-the-know users or will it have a more general audience, such as a company’s C-Suite Executives? Do we need to provide supporting data or can the results stand on their own?
We strive to make the information instantly understandable and, as such, often employ a mix of graphic and pure data presentations. For example, we have found that customized end-user reports with graphical data presentation have been of particular use for growers and farmers participating in agricultural surveys. We have been able to graph metrics over time for a single grower’s performance, as well as comparisons to other growers, in a way that maintains the anonymity of all survey contributors. You likely see similar graphical presentations on your monthly electric bill, which serves as a simple mechanism to communicate trends over time and comparisons.
Answering the above questions will help ease the challenges associated with presenting survey data which, in turn, results in more straightforward communication and greater data usage.
By Raghu Vamshi, Senior Geospatial Scientist; email@example.com
Ensuring the safety of drinking water from potential contaminants is a shared priority for the US Environmental Protection Agency, environmental researchers, and industries alike. To this end, it is mission critical to stay well versed in the newest EPA methodologies including a deep understanding of the underlying databases, development tools, and assumptions.
The Safe Drinking Water Act (SDWA) was passed by Congress in 1974 and amended for updated actions in 1986 and 1996 to regulate the country’s public drinking water supply. Drinking water assessments for conventional pesticides are a critical part of SDWA. Just a few months ago, EPA formally announced two new methods updating the tools for assessing drinking water exposure from surface water sources. The method changes provided by EPA are designed to:
Build new scenarios (a combination of crop, soil type, and weather data) for use in EPA’s Pesticide in Water Calculator, the standard water exposure model for both drinking water and aquatic wildlife;
Better account for variability in the agricultural area within a watershed that may contribute to a drinking water intake (Percent Cropped Area (PCA)) and incorporate data on the amount of a pesticide applied within a watershed for each use (Percent Crop Treated (PCT));
Outline methods to confidently use surface water monitoring data;
Derive and integrate pesticide-specific sampling bias factors to address temporal challenges with available monitoring data; and,
Use a weight-of-evidence approach to evaluate the relevance of monitoring sites to drinking water watersheds to address spatial limitations with available monitoring data.
Those involved in conducting drinking water assessments may have found the new approaches challenging and data intensive. Our geospatial scientists have meticulously worked through these challenges and are able to apply and even expand upon EPA’s methods to address chemical-specific challenges that arise in drinking water assessments. Our familiarity with the datasets and methodology used by EPA in these updates allowed us to apply the standard scenarios and use the source datasets to evaluate focused refinement opportunities. Waterborne scientists will continue to watch this space for future EPA method changes and work to apply similar tools for ecological risk assessments and endangered species assessments.
It is an understatement to say that the COVID-19 pandemic has taken a toll across the globe. In response, we’ve recently been applying our expertise in geospatial analysis and modeling to investigate SARS-CoV-2 viral load in wastewater and surface waters. We provide a unique tool for wastewater-based epidemiology in assessing COVID-19 and forecasting changes due to its spread across communities. This study is the first of its kind to provide national-scale quantitative data to examine movement and fate of residual RNA fragments as well as the efficiency of wastewater treatment plants in removing the virus. Stay tuned for the release of our collaborative publication.
In a technology-driven world, finding the answers to questions for many of us has become as simple as a targeted Google search. For Zack Stone, our Senior Computer Engineer, that means constructing web tools and databases to host the information we gather here at Waterborne Environmental and applying programming tools to extract the answers we’re looking for.
Zack started with Waterborne as an Engineering Intern ten years ago. He first worked with our field sampling team but quickly became a source of technical and database support for our offices. Now he works on massive datasets and develops innovative web tools and digital agriculture programs. In his words, he is “the gatekeeper for all things related to structured query language (SQL) databases.” Nothing goes in or out without his say.
On a day-to-day basis, Zack is knee deep in data, finding solutions for importing, sorting, and scrubbing data. Sometimes this includes data from our water monitoring work, or nation-wide agricultural studies. Other times, he’s working with survey data from farmers and growers, building databases and reporting on agricultural impact progress over time. He has worked to create digital houses for all of the information gathered in studies here at Waterborne and incorporates other relevant and publicly-available databases. His work has brought cohesion to Waterborne’s long history of environmental studies, creating a reference tool for precise representations of the locations and chemicals we have studied over the years.
Not only does Zack have a keen eye for innovation and data solutions, he also has a clear knack for understanding the needs of the end-user. Bridging the complexity of web-based data tools and the need for simplicity in the spatial and graphical output, he certainly fills a necessary niche within the industries of agriculture and personal care products, and we know that his skill set is directly applicable to the needs of other industries as well.
Environmental concern is at the heart of our work at Waterborne, and there are few better examples than the work of our Senior Geospatial Scientist, Raghu Vamshi. From his research in endangered species assessments, monitoring site selection and analysis of environmental fate of agricultural chemicals to evaluating down-the-drain chemicals and personal care products in surface waters, Raghu’s 15 years with Waterborne have certainly bolstered our ability to study, create, and apply solutions to environmental challenges. He has a passion for the environment, proud that his work gives him the ability to explore the world from his computer screen and provide solutions to local and international problems alike.
Raghu holds a Bachelor’s degree in Agriculture and a Master’s degree in Geographic Information Systems (GIS) from Texas Tech University. After graduate school, he spent three years as a GIS Analyst for Paradigm Alliance Inc. in Wichita, KS. There, he spatially analyzed socio-demographic information to help corporations select the best locations for new buildings, from shopping malls to banks. While the challenge and need for innovation of this role aligned with his tastes, Raghu had a goal to apply his knowledge to the agricultural industry with a stronger focus on the environmental impacts. In 2006, our paths aligned and Raghu took on a new role as a Project GIS Specialist at Waterborne, relocating from Kansas to Virginia to be in close proximity to our Leesburg, VA headquarters.
In his years with Waterborne, Raghu demonstrated dedication and innovation at every turn. His passion for the environmental work clearly aligns with his commitment to “enable those in decision-making roles to have robust tools and data to make better informed decisions.” Internally, that has meant sifting through databases of geographical and crop records to help narrow down the site selection process for field studies, both at the national and international level.
Externally, Raghu has been leading efforts in our personal care products work, targeting down-the-drain environmental fate of products ranging from shampoo to UV filters in sunscreens. The US and many other countries have wastewater treatment mechanisms in place to remove or reduce these down-the-drain chemicals. However, countries that lack the appropriate wastewater treatment facilities often struggle to address the concerns of down-the-drain chemicals. In China and India, for example, there are locations where products from millions of consumers enter waterways through waste disposal or plumbing with little to no means for filtration or chemical containment. Not only does this pose a significant human health risk through drinking water pollution, but environmental contamination risks are at play as well. Several key environmental questions come to mind…What is the environmental fate of disposed chemicals and the potential effects to plant and animal species? What can we do to stop that flow of pollution and remediate the contamination that has already happened? How far has the damage spread and how much further will it travel? Does the sewage from coasts float hundreds of miles away to damage coral reefs? These are the sorts of questions that Raghu’s work helps to answer, acknowledging that the human use of personal care products must be supported by environmental impact assessments for the larger sustainability needs.
Raghu humbly works to make the world a better place for all of us, including his wife and two children. While he’s facing the same challenges as most parents during these unprecedented times, it’s pretty great to see how he lights up when talking about his family. We get to see a similar sense of passion and joy in his work at Waterborne. His demeanor and level of dedication clearly have made him a wonderful husband and father, but also a tremendous asset to Waterborne as a company. We all aim to match his enthusiasm in working more and more towards the global needs of our environment.
It may not have the same feel as an Indiana Jones adventure or an expedition of old, but our work in both Geographic Information Systems (GIS) and database management for monitoring still allows us to explore the globe and experience far-off regions…through our computers. With our normal business travel suspended due to the pandemic, we’ve become the most modern of explorers, trotting the globe with the click of a mouse.
While not quite the same as true field work, our scientists are finding these “business trips” rewarding work. One day we’ll find ourselves engrossed in geographical or monitoring data, learning about soil types, local climate, or other environmental intricacies of a particular place. Another day we’ll be introduced to a small US town with an interesting name that triggers our curiosity. The experience can feel like a road trip when you see an interesting highway sign and have the urge to pull off and explore!
GIS traveling recently took Dr. Gerco Hoogeweg, our Principal Soil and Water Quality Scientist, to the Sahel region of Western Africa. In a geospatial analysis project, Gerco examined the co-occurrence between a pre-emergent herbicide use and potentially vulnerable soils in this region. This trip took him on a mapping and GIS journey covering over 400 million hectares of land in the region!
A trip closer to home led to the discovery of a small coastal town named Slaughter Beach in Delaware. The name alone piqued our interest and, after a bit of digging, we found that, without concrete proof, multiple stories have been built around the name. The town is a massive spawning site for horseshoe crabs in the spring, and lore has it that when waves hit the spawning crabs on the beach, the crabs were flipped over and killed by exposure to the elements. Another story—and hopefully the true version for the crabs’ sake—is that the surname of the first postmaster in the settlement was Slaughter. Either way – the “slaughter of the crabs” or the “Slaughter of the post office” – the town name alone urged us to explore.
No one on our team is complaining about our chance to thoroughly explore Europe. Through our work with GIS ecoregion and soil crosswalks, we conduct similarity analyses on soil climates across continents to find similar soil conditions and ecological regions. By identifying similar conditions occurring in the US and Europe, we can help our clients minimize the need for time-consuming and costly field or laboratory studies. While working with these similarity indexes, we learn about the climates and soil characteristics across Europe and the US.
Even though many of us are still stuck working from home, we’re excited to see where our data travel will take us next.
In response to the growing needs of our clients, Waterborne data scientists recently saw an opportunity to develop a Monitoring WebTool that aids in management decisions related to modeling, field investigations, and stewardship activities.
Our WebTool combines monitoring data with a spatial component that can be rendered on the fly from GIS mapping data, with basic statistical outputs. The interface provides the end user with a number of options for filtering the data, based on location or compound of interest. By integrating field study data and national datasets that have been scrubbed by our scientists, users are given an in-depth view of their product chemistry within the United States.
Our WebTool allows for large sets of monitoring data to be spatially and tabularly contextualized, which provides a multitude of beneficial options for our clients. Product use and chemical presence identified by location help to inform regulatory decisions and the need for any further exposure modeling work or additional field studies. The tool allows for faster identification of potentially vulnerable areas and provides value in time and cost savings associated with delays in the decision-making process.
Stewardship managers can use the tool directly with growers to identify locations for targeting stewardship activities. The data can also be combined with the ‘boots on the ground’ approaches so that end users can examine data trends over a given timeframe and view the impacts of stewardship activities in the watersheds. When best management practices and other measures are put in place such as label changes, for example, we can examine how the changes are reflected in the monitoring data. Our Monitoring WebTool literally puts these answers at the fingertips of our clients in an easy-to-use visual presentation.
The application of GIS is not a completely novel approach, and many spatial tools have been used to generate static images (e.g., maps) that are not easily updated with new data. Since this tool is web-based, updated data are provided in real time, giving our clients an independent decision-making tool. Our data scientists see web-based tools like this one, coupled with machine learning, as the future direction for environmental data delivery. This gives us the ability to work with our clients in the development of customized features and filters to address specific higher-tier data needs.
To discuss your specific monitoring needs and how our WebTool can provide customized assistance to your organization, please contact Zack Stone (firstname.lastname@example.org) with questions or comments.
Although the implications of a global pandemic certainly impacted some of our field study activities, we made wise use of our time by expanding training across our field studies team. Since we work in a regulated industry, keeping up on Good Laboratory Practices (GLP) training is imperative to ensure that defined processes and standards of quality are being followed and maintained, even when not in the field. To that end, our Field Studies team buckled down on the following training:
First Aid. As anyone who has worked in the field knows, there are times when we are miles away from the nearest person, much less a medical facility. With this in mind, our Field Studies team reviewed our First Aid protocols and any team member who needed it completed a First Aid Certification program.
Technology. Software training is also critical to make sure our staff are up-to-date on the latest and greatest technologies. Our field studies staff took part in a variety of software training that has included Esri ArcGIS Desktop, ArcGIS Pro, and Basic Datalogger/Loggernet.
Project Management. Our client projects are always top of mind, and finding better ways to manage their processes while mentoring our up-and-coming staff members is part of our culture. 2020 provided us with an outstanding opportunity to complete internal project management courses.
With so many of our staff members trained in such diverse areas, we’re certainly looking forward to the upcoming 2021 field season!
Bathym… what?! You read correctly. Bathymetry, a form of hydroacoustics, may seem like a strange word but it represents a common field study measurement of water depth, typically in oceans, seas, or lakes. Using echolocation, where sound waves are sent out and returned, bathymetry surveys are used to map the beds of bodies of water to establish depth and any underwater features (e.g., underwater canyons, the Mid-Atlantic Ridge, underwater volcanoes).
Typically with bathymetry, equipment is attached to a survey boat and the boat drives across the area to be surveyed. That equipment uses echolocation to measure the time it takes sound waves to travel from the boat to the bed and back. From that round-trip time and the speed of sound in water, the depth can be determined.
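The arithmetic is simple: depth is half the round-trip distance. A minimal sketch, assuming a typical sound speed of 1,500 m/s (the real value varies with temperature and salinity, and survey software applies that correction):

```python
SPEED_OF_SOUND_WATER = 1500.0  # m/s; a typical value, varies with temperature and salinity

def depth_from_echo(round_trip_seconds):
    """Depth is half the round-trip distance the acoustic pulse travels."""
    return SPEED_OF_SOUND_WATER * round_trip_seconds / 2.0

print(depth_from_echo(0.02))  # a 20 ms echo implies roughly 15 m of water
```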
Waterborne first acquired bathymetry technology in 2014 with the primary intention of using it for conducting flow measurements on our stream and large river monitoring stations. Since then, we have used it to conduct bathymetric surveys in ponds, perform velocity profiling in streams, and conduct cross-sectional surveys. Recently, Waterborne conducted a study where an Acoustic Doppler Current Profiler (ADCP) was used for a bathymetric survey in a pond.
In our pond study, ADCP was used in conjunction with a GPS receiver mounted in a tethered boat. The ADCP collected depth data with three separate acoustic beams that were averaged to resolve a single depth point via software at a rate of several times a second. A separate software suite was then used to connect the depth data back to its associated GPS location. Repeated surveys were then referenced to a common elevation datum, which helped us determine sedimentation and erosion patterns.
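Conceptually, the per-ping processing reduces to averaging the beam readings and matching each averaged point to the nearest-in-time GPS fix. The sketch below is a hypothetical simplification of what the vendor software suites do; timestamps, depths, and coordinates are invented:

```python
def beam_average(depths):
    """Resolve one depth point from the separate acoustic beam readings."""
    return sum(depths) / len(depths)

def pair_with_gps(depth_pings, gps_fixes):
    """Match each averaged ping to its closest-in-time GPS fix.

    depth_pings: list of (timestamp_s, (d1, d2, d3)) beam readings
    gps_fixes:   list of (timestamp_s, lat, lon)
    Returns (lat, lon, depth) points ready for surface interpolation.
    """
    points = []
    for t, beams in depth_pings:
        _, lat, lon = min(gps_fixes, key=lambda fix: abs(fix[0] - t))
        points.append((lat, lon, beam_average(beams)))
    return points
```

Referencing repeated surveys to a common elevation datum then amounts to comparing these (lat, lon, depth) surfaces between visits.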
We’ve found that hydroacoustic technology, including bathymetry, has enormous potential. For example, we’ve deployed it to collect surrogate datasets for constituents such as suspended sediment and other constituents that attach themselves to sediment (e.g., certain pesticides and nutrients). With this technology at our disposal, our team has the ability to investigate characteristics of waterbodies in many different ways. The future is looking bright for this science with a funny name!
Applying creative approaches to overcome the innate hurdles associated with complex study designs is another day at the office for the experts on our field studies team. Never was this more true than when one of our clients asked for help in satisfying the new EPA goals surrounding the effects of nutrient transport in outlet water. Conventional study work would have unnecessarily wasted precious time and resources and was certainly not in our client’s best interest.
Excess nutrient transport from agricultural settings has contributed to a hypoxic zone in the Mississippi River basin and Gulf of Mexico, which, in turn, led the EPA to form the Hypoxia Task Force with a goal to reduce the size of the hypoxic zone to less than 5,000 km² by 2035. The Task Force also has an interim goal of a 20% reduction of nitrogen and phosphorus loading by 2025. Unsurprisingly, these environmental goals have created an urgent need for studies to better understand nutrient transport in agricultural landscapes.
At the client’s request, a Waterborne team led by our Senior Agricultural Engineer Greg Goodwin established a large-scale field study to observe the effects of treatments on the nitrate-nitrogen concentration in outlet water of a tile-drained field in a corn and soybean rotation. The study design included 37 discrete tile monitoring stations, making for quite a large monitoring effort! Our team went to work to evaluate possible monitoring approaches that could be employed in this case. It didn’t take long to realize that a conventional water sampling approach with individual samples and physical sample collection would waste precious study resources in time and cost. A novel approach was certainly needed in this case!
After careful consideration of possible approaches, the team landed on an automated pass-through study design. This approach made use of dataloggers controlling water-level sensors and pumps at each of the 37 separate tiles. The team also set up two analysis stations capable of radio-communicating with the dataloggers. This setup allowed for individual tile station pumps to push water samples to the appropriate analysis station where it was then fed over a nitrate sensor set to automatically read and store sample results. The analysis stations were also equipped with full weather stations and tipping-bucket rain gauges. Data was transmitted via cellular modem and accessible to the project team in real-time.
Application of this automated approach drastically improved sampling frequencies and gave the team the ability to collect data that wouldn’t have been feasible with individual physical water samples, including back-to-back flow events, all without the need for site visits. Improvement of the sampling frequency allowed our data scientists to see trends that we may not have otherwise captured with more infrequent sampling intervals. The approach also decreased time and costs associated with manual labor, sample transport, and analysis. The real-time availability of the analyte data helped the project team make faster study decisions.
In addition to the automated approach applied to this study, the design allowed for the use of sophisticated data analysis techniques. We used Aquarius to identify and remove erroneous measurements using USGS-approved processes. The design also allowed us to apply a correlation matrix to the tiles and then use Pearson’s correlation coefficients to identify stronger and weaker correlations of tile flow measurements to each other. These correlations were then used to select the tiles for assignment of one of the four application rates used in the study. Ultimately, the increase in replicates allowed by the automated design offered us unique statistical power in this study to assign treatments from a baseline comparison of replicate responses.

We’re continuously working to bring our clients options for novel study designs, whether in the form of new equipment, automated processes, out-of-the-box solutions, or assessment of statistical power within a study. If you have any questions about how a novel field study design could help with your specific needs, please feel free to contact Greg Goodwin at email@example.com.
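As a closing illustration, the Pearson screening of tile flows described above can be sketched with NumPy. The flow values below are invented, and with only three columns this is a toy version of the 37-tile matrix, but the mechanics are the same:

```python
import numpy as np

# Hypothetical flow records (rows = sampling events, columns = tile stations)
flows = np.array([
    [1.2, 1.1, 0.4],
    [2.3, 2.0, 0.6],
    [0.8, 0.9, 0.5],
    [3.1, 2.8, 0.7],
])

# Pearson correlation matrix between tile stations
corr = np.corrcoef(flows, rowvar=False)

# Tiles that track each other closely (high r) can be distributed across the
# four application rates so treatment groups start from comparable baselines
print(np.round(corr, 2))
```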