
Filter — Tag: Computational Tools and Services

84 results

Wildfires affect streams and rivers when they burn vegetation and scorch the ground, making floods more likely and reducing water quality. Public managers, first responders, fire scientists, and hydrologists need timely information before and after a fire to plan for floods and water treatment. This project will create a method to combine national fire databases with the StreamStats water web mapping application to help stakeholders make informed decisions. When the project is finished, people will be able to use StreamStats to estimate post-wildfire peak flows in streams and rivers for most of the United States (where data is available). There will also be tools that allow users to trace upstream and...
Deep learning is a computer analysis technique inspired by the human brain’s ability to learn. It involves several layers of artificial neural networks to learn and subsequently recognize patterns in data, forming the basis of many state-of-the-art applications from self-driving cars to drug discovery and cancer detection. Deep neural networks are capable of learning many levels of abstraction, and thus outperform many other types of automated classification algorithms. This project developed software tools, resources, and two training workshops that will allow USGS scientists to apply deep learning to remotely sensed imagery and to better understand natural hazards and habitats across the Nation. The tools and...
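As an illustration of the core operation inside such networks, here is a minimal, pure-Python sketch of the 2-D convolution (strictly, cross-correlation, as most deep learning frameworks implement it) that a single layer applies to imagery. The image and edge-detecting kernel are toy data, not from the project.

```python
def conv2d(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation) of a 2-D list
    `image` with a 2-D list `kernel`."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            # Sum of element-wise products over the kernel window.
            row.append(sum(image[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel applied to an image with a sharp left/right contrast:
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1],
          [1, -1]]
print(conv2d(image, kernel))  # [[0, -2, 0], [0, -2, 0]] — edge detected
```

In a trained network the kernel weights are learned rather than hand-set, and many such layers are stacked to build up the levels of abstraction described above.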
Inventories of landslides and liquefaction triggered by major earthquakes are key research tools that can be used to develop and test hazard models. To eliminate redundant effort, we created a centralized and interactive repository of ground failure inventories that currently hosts 32 inventories generated by USGS and non-USGS authors and designed a pipeline for adding more as they become available. The repository consists of (1) a ScienceBase community page where the data are available for download and (2) an accompanying web application that allows users to browse and visualize the available datasets. We anticipate that easier access to these key datasets will accelerate progress in earthquake-triggered ground...
CDI helped fund development of the USGS Geo Data Portal in 2010. In 2012, CDI funded two projects to increase the functionality of the Geo Data Portal. The Resources section below contains links to the Geo Data Portal website and deliverables from the 2012 projects. Description of the Geo Data Portal from the Geo Data Portal documentation home: The USGS Geo Data Portal (GDP) project provides scientists and environmental resource managers access to downscaled climate projections and other data resources that are otherwise difficult to access and manipulate. This user interface demonstrates an example implementation of the GDP project web-service software and standards-based data integration strategy. A user...
This project created a mobile application to collect nationally consistent data on fish passage barriers in the United States, meeting needs for hydrologic and ecological assessments and conservation planning decisions.
Benefits
- Meets a high-priority need for hydrological and ecological assessments
- Makes data available to conservation planners
- Expands USGS scientific and technical support of the National Fish Habitat Action Plan
Deliverables
- Presentation given at a CDI-hosted webinar (September 2012)
- Available for both iPhone (iOS 6) and Android (3.0 or higher)
- Uses the geolocation services provided with HTML5 to correlate location with an online data entry form
- Adds the ability to attach photos acquired in the field...
Over the last few years, the ISO 19115 family of metadata standards has become the predominantly accepted worldwide standard for sharing information about the availability and usability of scientific datasets among researchers. The U.S. interests in the ISO standard have also been growing as global-scale science demands participation with the broader international community; however, adoption has been slow because of the complexity and rigor of the ISO metadata standards. In addition, support for the standard in current implementations has been minimal. In 2009, the Alaska Data Integration Working Group members (ADIwg) mobilized to jointly address common data integration efforts. Beginning in 2012, ADIwg started...
Recent open data policies of the Office of Science and Technology Policy (OSTP) and Office of Management and Budget (OMB), which were fully enforceable on October 1, 2016, require that federally funded information products (publications, etc.) be made freely available to the public, and that the underlying data on which the conclusions are based must be released. A key and relevant aspect of these policies is that data collected by USGS programs must be shared with the public, and that these data are subject to the review requirements of Fundamental Science Practices (FSP). These new policies add a substantial burden to USGS scientists and science centers; however, the upside of working towards compliance with...
Autonomous Underwater Vehicles (AUVs) are instruments that collect water-quality, depth, and other data in waterbodies. They produce complex and massive datasets. There is currently no standard method to store, organize, process, quality-check, analyze, or visualize these data. The Waterbody Rapid Assessment Tool (WaterRAT) is a Python application that processes and displays water-quality data with interactive two-dimensional and three-dimensional figures, but it runs offline with few capabilities and for just one study site. This project will transition WaterRAT to an online application that the public can easily use to view all AUV data. A database of all AUV datasets will be developed to improve accessibility,...
ScienceCache was originally developed as a mobile device data collection application for a citizen science project. ScienceCache communicates with a centralized database that facilitates near real-time use of collected data that enhances efficiency of data collection in the field. We improved ScienceCache by creating a flexible, reliable platform that reduces effort required to set up a survey and manage incoming data. Now, ScienceCache can be easily adapted for citizen science projects as well as restricted to specific users for private internal research. We improved scEdit, a web application interface, to allow for creation of more-complex data collection forms and survey routes to support scientific studies....
Insect pests cost billions of dollars per year globally, negatively impacting food crops and infrastructure and contributing to the spread of disease. Timely information regarding developmental stages of pests can facilitate early detection and control, increasing efficiency and effectiveness. To address this need, the USA National Phenology Network (USA-NPN) created a suite of “Pheno Forecast” map products relevant to science and management. Pheno Forecasts indicate, for a specified day, the status of the insect’s target life cycle stage in real time across the contiguous United States. These risk maps enhance decision-making and short-term planning by both natural resource managers and members of the public. ...
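The summary does not state the model behind Pheno Forecasts, but insect life-cycle forecasts of this kind are commonly driven by accumulated growing degree days (GDD). A minimal sketch under that assumption, with an entirely hypothetical base temperature, life-stage threshold, and temperature series:

```python
def growing_degree_days(t_min, t_max, base=50.0):
    """Simple-average growing degree days for one day (temperatures in
    degrees F), a common heat-accumulation metric in insect phenology."""
    return max(((t_min + t_max) / 2.0) - base, 0.0)

def accumulate_gdd(daily_temps, base=50.0, threshold=100.0):
    """Accumulate GDD over a season; return the first day the
    (hypothetical) life-stage threshold is crossed and the running
    total, or (None, total) if it is never reached."""
    total = 0.0
    for day, (t_min, t_max) in enumerate(daily_temps, start=1):
        total += growing_degree_days(t_min, t_max, base)
        if total >= threshold:
            return day, total
    return None, total

# Ten gradually warming spring days (min/max in degrees F):
temps = [(40 + 1.5 * d, 70 + 1.5 * d) for d in range(10)]
print(accumulate_gdd(temps, threshold=100.0))  # (10, 117.5)
```

A gridded forecast map would run this same accumulation per pixel using gridded daily temperatures, then color each pixel by its predicted life-cycle status.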
2012 Updates (from the FY12 Annual Review) The NWIS Web Services Snapshot represents the next generation of data retrieval and management. The newest Snapshot tool allows instant access to NWIS data from four different web services through ArcGIS, software available to all USGS scientists in all mission areas. Increased data retrieval efficiency reduces the steps required to retrieve and compile water data from multiple sites from what can be more than 30 steps to just a few clicks. As an end-user education tool, it promotes use of NWIS data from both web services and the NWIS database, which increases the production of scientific research and analysis that uses NWIS data. The Snapshot database design enables efficient...
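As a concrete example of the kind of web-service access the Snapshot tool wraps, here is a sketch that builds a request URL for the public NWIS instantaneous-values service; the site number and seven-day window are illustrative choices, not part of the Snapshot tool itself.

```python
from urllib.parse import urlencode

# Public NWIS instantaneous-values web-service endpoint.
NWIS_IV = "https://waterservices.usgs.gov/nwis/iv/"

def nwis_iv_url(sites, parameter_cd="00060", period="P7D"):
    """Build a request URL for recent instantaneous values.
    00060 is the USGS parameter code for discharge (cubic feet/sec);
    period uses ISO 8601 durations (P7D = the last 7 days)."""
    query = {
        "format": "json",
        "sites": ",".join(sites),
        "parameterCd": parameter_cd,
        "period": period,
    }
    return NWIS_IV + "?" + urlencode(query)

url = nwis_iv_url(["01646500"])  # Potomac River near Washington, D.C.
print(url)
# Fetching is then one call, e.g.: json.load(urllib.request.urlopen(url))
```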
Droughts are becoming more frequent and severe, and this trend is expected to continue in the coming century. Drought effects on natural resources include reduced water availability for plants and humans, as well as increased insect, disease, and vegetation mortality. Land managers need more information regarding how water availability may change and how drought will affect their sites in the future. We developed an online, interactive application that allows natural resource managers to access site-specific, observed historical and predicted future water availability. Users are able to set information that affects water balance, including soil texture and vegetation composition. With these inputs, as well as site-specific...
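The project's water-balance model is not specified in this summary; a minimal single-bucket sketch shows the kind of calculation involved, with soil texture represented only by a water-holding capacity and all numbers hypothetical:

```python
def bucket_water_balance(precip, pet, capacity, initial=0.0):
    """Minimal single-bucket soil water balance (all values in mm).
    Each step: add precipitation, spill anything above the soil's
    water-holding capacity as runoff, then evaporate up to potential
    ET. `capacity` is where soil texture would enter a real model."""
    storage = initial
    history = []
    for p, e in zip(precip, pet):
        storage += p
        runoff = max(storage - capacity, 0.0)
        storage -= runoff
        aet = min(storage, e)  # actual ET is limited by stored water
        storage -= aet
        history.append((round(storage, 2), round(runoff, 2)))
    return history

# Five days: a storm, then a dry spell, on a low-capacity (sandy) soil.
print(bucket_water_balance(precip=[30, 0, 0, 10, 0],
                           pet=[4, 4, 4, 4, 4],
                           capacity=25.0))
# [(21.0, 5.0), (17.0, 0.0), (13.0, 0.0), (19.0, 0.0), (15.0, 0.0)]
```

The application described above would drive a far more detailed model with site-specific climate, but the inputs users set (soil texture, vegetation) play the same role as `capacity` and `pet` here.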
Artificial Intelligence (AI) is revolutionizing ecology and conservation by enabling species recognition from photos and videos. Our project evaluates the capacity to expand AI for individual fish recognition for population assessment. The success of this effort would facilitate fisheries analysis at an unprecedented scale by engaging anglers and citizen scientists in imagery collection. This project is one of the first attempts to apply AI towards fish population assessment with citizen science.
Advances in information technology now provide large volume, high-frequency data collection which may improve real-time biosurveillance and forecasting. But, big data streams present challenges for data management and timely analysis. As a first step in creating a data science pipeline for translating large datasets into meaningful interpretations, we created a cloud-hosted PostgreSQL database that collates climate data served from PRISM (https://climatedataguide.ucar.edu/climate-data) and water-quality data from the National Water Quality Portal (https://www.waterqualitydata.us/) and NWIS (https://waterdata.usgs.gov/nwis; fig 1). Using Python-based code, these data streams are queried and updated every 24 hours,...
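A minimal sketch of the collation step described above, using Python's built-in sqlite3 as a stand-in for the cloud-hosted PostgreSQL database (table and column names are illustrative, and the scheduled 24-hour querying is omitted). Keying rows by site, date, and parameter lets a daily job re-run safely, replacing revised values instead of duplicating them:

```python
import sqlite3

# In-memory database standing in for the cloud-hosted PostgreSQL instance.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    site_id TEXT, obs_date TEXT, parameter TEXT, value REAL,
    PRIMARY KEY (site_id, obs_date, parameter))""")

def upsert(rows):
    """Insert new observations; a re-pulled (site, date, parameter)
    row replaces the earlier value rather than duplicating it."""
    conn.executemany(
        "INSERT OR REPLACE INTO observations VALUES (?, ?, ?, ?)", rows)
    conn.commit()

# First daily pull, then a re-run that revises one value and adds a day.
upsert([("01646500", "2020-06-01", "water_temp", 22.1)])
upsert([("01646500", "2020-06-01", "water_temp", 22.3),
        ("01646500", "2020-06-02", "water_temp", 22.8)])
rows = conn.execute(
    "SELECT obs_date, value FROM observations ORDER BY obs_date").fetchall()
print(rows)  # [('2020-06-01', 22.3), ('2020-06-02', 22.8)]
```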
ScienceCache is a scientific geocaching mobile application framework that targets two user groups for citizen science data collection: youth and geocachers. By melding training and games into the hunt for place-based data collection sites and incorporating photo uploads as data and authentication, new volunteers can collaborate in robust data collection. Scientists build a project on the administrative Web site app, specifying locations or goals for new data collection sites, clues for established sites, questions to answer, measurements, or other activities for the site based on their individual data needs. The project builds on the success of the USA National Phenology Network (NPN) and the ScienceBase project...
Note 9/22/18: The Adopt a Pixel concept has been incorporated into NASA's Globe Observer App (Land Cover Tool). Find out more and download the app at https://observer.globe.gov/. *** Adopt a Pixel-Data Infrastructure (AaP-DI) provides the basis for a new data acquisition system for ground reference data. These data will be used to complement existing and future remote sensing collections by providing geospatially tagged ground-based landscape imagery and land cover of an exact location from six different viewing aspects. The goal is for AaP-DI to enable citizen participation in Landsat science. Deliverables: The Adopt a Pixel web interface (http://adoptapixel.cr.usgs.gov) is a data upload portal that allows citizen...
The purpose of this project was to establish and support a USGS Mobile Environment website supporting portable hardware devices, application development, and application delivery. Developing a framework to fully support this endeavor will require input and involvement from Core Science Systems, Enterprise Information, Science Quality and Integrity, the Office of Communication, Publishing, and the mobile community.
Benefits
- One-stop shop providing detailed support information across USGS Mission Areas
- Actual functioning mobile applications, built collectively
Deliverables
- Trained Mobile Community Workshop held July 17–19, 2012
- Presentation given at a CDI-hosted webinar (September 2012)
- Provides Support,...
The National Land Cover Database (NLCD) serves as the definitive Landsat-based, 30-meter resolution, land cover database for the Nation. NLCD supports a wide variety of Federal, State, local, and nongovernmental applications that seek to assess ecosystem status and health, understand the spatial patterns of biodiversity, predict effects of climate change, and develop land management policy. However, access to NLCD products for the USGS community and the public is a concern due to large file sizes, limited download options, and the expectation that users must download and analyze multiple land cover products in order to answer even basic land cover change questions. Therefore, the goal of the NLCD Evaluation, Visualization...
This project developed a set of raster utility classes and layer types for inclusion in OpenLayers to allow for statistical analysis, manipulation, and additional rendering functionality for raster data sources. The deliverables are patches for the OpenLayers development branch that include the new functionality, examples and documentation to demonstrate its use, and comprehensive unit test coverage. The intention was to get this newly developed functionality into the next stable release of OpenLayers. An additional component is an HTML5 toolkit for the open-source JavaScript mapping framework OpenLayers. These tools are especially useful for USGS web mapping needs. This effort delivered a new set of classes within...
Geotagged photographs have become a useful medium for recording, analyzing, and communicating Earth science phenomena. Despite their utility, many field photographs are not published or preserved in a spatial or accessible format—oftentimes because of confusion about photograph metadata, a lack of stability, or user customization in free photo sharing platforms. After receiving a request to release about 1,210 geotagged geological field photographs of the Grand Canyon region, we set out to publish and preserve the collection in the most robust (and expedient) manner possible (fig. 6). We leveraged and reworked existing metadata, JavaScript, and Python tools and developed a toolkit and proposed workflow to display...
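One recurring task when publishing geotagged photographs is converting the EXIF GPS degrees/minutes/seconds representation into the signed decimal degrees that mapping tools expect. A small sketch of that conversion (the coordinates are hypothetical, and this is not necessarily the project's own toolkit code):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds plus the N/S or
    E/W reference letter into signed decimal degrees for mapping."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    # South latitudes and west longitudes are negative by convention.
    return -value if ref in ("S", "W") else value

# A hypothetical geotag in the Grand Canyon region:
lat = dms_to_decimal(36, 6, 0.0, "N")
lon = dms_to_decimal(112, 6, 0.0, "W")
print(lat, lon)  # 36.1 -112.1
```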