Similar documents

20 similar documents found (search time: 93 ms).
1.
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement over whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive Metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effects of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
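The informal (GLUE) side of the comparison is simple enough to sketch in code. A minimal sketch, assuming a toy linear-reservoir model, a Nash-Sutcliffe informal likelihood and a behavioral threshold of 0.7; all of these are illustrative stand-ins, not the paper's actual model or settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(k, forcing):
    # Toy linear reservoir (storage drained at rate 1/k) standing in
    # for a conceptual watershed model.
    q, s = np.zeros_like(forcing), 0.0
    for t, p in enumerate(forcing):
        s += p
        q[t] = s / k
        s -= q[t]
    return q

def nse(obs, sim):
    # Nash-Sutcliffe efficiency, a common informal GLUE likelihood.
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic "observations" from a known parameter plus noise.
forcing = rng.exponential(2.0, size=200)
obs = simulate(5.0, forcing) + rng.normal(0.0, 0.05, size=200)

# GLUE: Monte Carlo sampling over a uniform prior; keep only
# "behavioral" parameter sets and weight them by their score.
ks = rng.uniform(1.0, 20.0, size=2000)
scores = np.array([nse(obs, simulate(k, forcing)) for k in ks])
behavioral = scores > 0.7
weights = scores[behavioral] / scores[behavioral].sum()
sims = np.array([simulate(k, forcing) for k in ks[behavioral]])

# Likelihood-weighted 2.5%/97.5% prediction bounds at each time step.
lower, upper = np.zeros_like(obs), np.zeros_like(obs)
for t in range(len(obs)):
    order = np.argsort(sims[:, t])
    cdf = np.cumsum(weights[order])
    lower[t] = sims[order, t][np.searchsorted(cdf, 0.025)]
    upper[t] = sims[order, t][np.searchsorted(cdf, 0.975)]
```

The formal (DREAM) alternative replaces the subjective threshold and informal score with a statistical likelihood sampled by MCMC; the bounds themselves are computed the same way, which is one reason the two methods can produce similar total uncertainty estimates.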

3.
Abstract

Hydrological models are commonly used to perform real-time runoff forecasting for flood warning. Their application requires catchment characteristics and precipitation series that are not always available. An alternative approach is nonparametric modelling based only on runoff series. However, the following questions arise: Can nonparametric models produce reliable forecasts? Can they perform as reliably as hydrological models? We performed probabilistic forecasting one, two and three hours ahead for a runoff series, with the aim of ascribing a probability density function to the predicted discharge using time series analysis based on stochastic dynamics theory. The derived dynamic terms were compared to a hydrological model, LARSIM. Our procedure was able to forecast the 1-, 2- and 3-h-ahead discharge probability functions within a 95% confidence interval of about 1.40 m³/s in range, with relative errors (%) in the range [−30, 30]. The LARSIM model and the best nonparametric approaches gave similar results, but the range of relative errors was larger for the nonparametric approaches.

Editor D. Koutsoyiannis; Associate editor K. Hamed

Citation Costa, A.C., Bronstert, A. and Kneis, D., 2012. Probabilistic flood forecasting for a mountainous headwater catchment using a nonparametric stochastic dynamic approach. Hydrological Sciences Journal, 57 (1), 10–25.

4.
ABSTRACT

This paper is the outcome of a community initiative to identify major unsolved scientific problems in hydrology motivated by a need for stronger harmonisation of research efforts. The procedure involved a public consultation through online media, followed by two workshops through which a large number of potential science questions were collated, prioritised, and synthesised. In spite of the diversity of the participants (230 scientists in total), the process revealed much about community priorities and the state of our science: a preference for continuity in research questions rather than radical departures or redirections from past and current work. Questions remain focused on the process-based understanding of hydrological variability and causality at all space and time scales. Increased attention to environmental change drives a new emphasis on understanding how change propagates across interfaces within the hydrological system and across disciplinary boundaries. In particular, the expansion of the human footprint raises a new set of questions related to human interactions with nature and water cycle feedbacks in the context of complex water management problems. We hope that this reflection and synthesis of the 23 unsolved problems in hydrology will help guide research efforts for some years to come.

5.
Abstract

The uncertainty associated with a rainfall–runoff and non-point source loading (NPS) model can be attributed to both the parameterization and model structure. An interesting implication of the areal nature of NPS models is the direct relationship between model structure (i.e. sub-watershed size) and sample size for the parameterization of spatial data. The approach of this research is to find structural limitations in scale for the use of the conceptual NPS model, then examine the scales at which suitable stochastic depictions of key parameter sets can be generated. The overlapping regions are optimal (and possibly the only suitable regions) for conducting meaningful stochastic analysis with a given NPS model. Previous work has sought to find optimal scales for deterministic analysis (where, in fact, calibration can be adjusted to compensate for sub-optimal scale selection); however, analysis of stochastic suitability and uncertainty associated with both the conceptual model and the parameter set, as presented here, is novel; as is the strategy of delineating a watershed based on the uncertainty distribution. The results of this paper demonstrate a narrow range of acceptable model structure for stochastic analysis in the chosen NPS model. In the case examined, the uncertainties associated with parameterization and parameter sensitivity are shown to be outweighed in significance by those resulting from structural and conceptual decisions.

Citation Parker, G. T., Rennie, C. D. & Droste, R. L. (2011) Model structure and uncertainty for stochastic non-point source modelling applications. Hydrol. Sci. J. 56(5), 870–882.

6.
State‐of‐the‐art methods for the assessment of building fragility consider the structural capacity and seismic demand variability in the estimation of the probability of exceeding different damage states. However, questions remain regarding the appropriate treatment of such sources of uncertainty from a statistical significance perspective. In this study, material, geometrical and mechanical properties of a number of building classes are simulated by means of a Monte Carlo sampling process in which the statistical distribution of the aforementioned parameters is taken into consideration. Record selection is performed in accordance with hazard‐consistent distributions of a comprehensive set of intensity measures, and issues related with sufficiency, efficiency, predictability and scaling robustness are addressed. Based on the appraised minimum number of ground motion records required to achieve statistically meaningful estimates of response variability conditioned on different levels of seismic intensity, the concept of conditional fragility functions is presented. These functions translate the probability of exceeding a set of damage states as a function of a secondary sufficient intensity measure, when records are selected and scaled for a particular level of primary seismic intensity parameter. It is demonstrated that this process allows a hazard‐consistent and statistically meaningful representation of uncertainty and correlation in the estimation of intensity‐dependent damage exceedance probabilities. Copyright © 2016 John Wiley & Sons, Ltd.

7.
ABSTRACT

This paper assesses how various sources of uncertainty propagate through the uncertainty cascade from emission scenarios through climate models and hydrological models to impacts, with a particular focus on groundwater aspects from a number of coordinated studies in Denmark. Our results are similar to those from surface water studies showing that climate model uncertainty dominates the results for projections of climate change impacts on streamflow and groundwater heads. However, we found uncertainties related to geological conceptualization and hydrological model discretization to be dominant for projections of well field capture zones, while the climate model uncertainty here is of minor importance. How to reduce the uncertainties on climate change impact projections related to groundwater is discussed, with an emphasis on the potential for reducing climate model biases through the use of fully coupled climate–hydrology models.
Editor D. Koutsoyiannis; Associate editor not assigned

8.
This study attempts to assess the uncertainty in the hydrological impacts of climate change using a multi-model approach combining multiple emission scenarios, GCMs and conceptual rainfall-runoff models to quantify uncertainty in future impacts at the catchment scale. Until relatively recently, the uncertainties associated with hydrological models had been given less attention in impact assessments. In order to examine the role of hydrological model uncertainty (parameter and structural uncertainty) in climate change impact studies, a multi-model approach based on the Generalised Likelihood Uncertainty Estimation (GLUE) and Bayesian Model Averaging (BMA) methods is presented. Six sets of regionalised climate scenarios derived from three GCMs, two emission scenarios, and four conceptual hydrological models were used within the GLUE framework to define the uncertainty envelope for future estimates of streamflow, while the GLUE output is also post-processed using BMA, where the probability density function from each model at any given time is modelled by a gamma distribution with heteroscedastic variance. The investigation on four Irish catchments shows that the role of hydrological model uncertainty is remarkably high and should therefore be routinely considered in impact studies. Although the GLUE and BMA approaches used here differ fundamentally in their underlying philosophy and representation of error, both methods show comparable performance in terms of ensemble spread and predictive coverage. Moreover, the median prediction for future streamflow shows progressive increases in winter discharge and progressive decreases in summer discharge over the coming century.
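The BMA predictive density described here, a gamma pdf per ensemble member with variance growing with the forecast, mixed with BMA weights, can be sketched as follows. The weights, member forecasts and variance factor below are invented for illustration; in practice they would be estimated from calibration data:

```python
import math

def gamma_pdf(x, shape, scale):
    # Gamma density with mean shape*scale and variance shape*scale**2.
    return x ** (shape - 1.0) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

weights = [0.5, 0.3, 0.2]   # assumed BMA weights (sum to 1)
means = [12.0, 15.0, 9.0]   # assumed member streamflow forecasts (m^3/s)
c = 2.0                     # assumed heteroscedasticity: variance = c * mean

def bma_pdf(x):
    # BMA predictive density: weighted mixture of per-member gamma pdfs,
    # each parameterized so its mean matches the member forecast.
    total = 0.0
    for w, mu in zip(weights, means):
        shape, scale = mu / c, c   # gamma mean = shape * scale = mu
        total += w * gamma_pdf(x, shape, scale)
    return total
```

The heteroscedastic link (variance proportional to the forecast mean) is one common choice; the paper's exact variance model may differ.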

9.
Abstract

The term “environmental flows” is now widely used to reflect the hydrological regime required to sustain freshwater and estuarine ecosystems, and the human livelihoods and well-being that depend on them. The definition suggests a central role for ecohydrological science to help determine a required flow regime for a target ecosystem condition. Indeed, many countries have established laws and policies to implement environmental flows with the expectation that science can deliver the answers. This article provides an overview of recent developments and applications of environmental flows on six continents to explore the changing role of ecohydrological sciences, recognizing its limitations and the emerging needs of society, water resource managers and policy makers. Science has responded with new methods to link hydrology to ecosystem status, but these have also raised fundamental questions that go beyond ecohydrology, such as who decides on the target condition of the ecosystem? Some environmental flow methods are based on the natural flow paradigm, which assumes the desired regime is the natural “unmodified” condition. However, this may be unrealistic where flow regimes have been altered for many centuries and are likely to change with future climate change. Ecosystems are dynamic, so the adoption of environmental flows needs to have a similar dynamic basis. Furthermore, methodological developments have been made in two directions: first, broad-scale hydrological analysis of flow regimes (assuming ecological relevance of hydrograph components) and, second, analysis of ecological impacts of more than one stressor (e.g. flow, morphology, water quality). All methods retain a degree of uncertainty, which translates into risks, and raises questions regarding trust between scientists and the public. Communication between scientists, social scientists, practitioners, policy makers and the public is thus becoming as important as the quality of the science.
Editor Z.W. Kundzewicz

Citation Acreman, M.C., Overton, I.C., King, J., Wood, P., Cowx, I.G., Dunbar, M.J., Kendy, E., and Young, W., 2014. The changing role of ecohydrological science in guiding environmental flows. Hydrological Sciences Journal, 59 (3–4), 433–450.

10.
Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster–Shafer (D–S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D–S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D–S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D–S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster–Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. 
These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D–S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D–S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change.
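Dempster's rule of combination, the core of the D–S evidence synthesis described above, is compact to implement over a small frame of discernment. The drought/normal/wet frame mirrors the streamflow classes, but the two basic probability assignments below are invented numbers for illustration, not values from the study:

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule: combine two basic probability assignments (bpa's)
    # defined on subsets (frozensets) of the same frame of discernment.
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def belief(m, hyp):
    # Bel(H): total mass committed to subsets of H.
    return sum(v for s, v in m.items() if s <= hyp)

def plausibility(m, hyp):
    # Pl(H): total mass not contradicting H.
    return sum(v for s, v in m.items() if s & hyp)

D, N, W = frozenset({"drought"}), frozenset({"normal"}), frozenset({"wet"})
frame = D | N | W
m_gcm1 = {D: 0.5, N: 0.2, frame: 0.3}  # mass on the full frame = ignorance
m_gcm2 = {D: 0.4, W: 0.3, frame: 0.3}
m = dempster_combine(m_gcm1, m_gcm2)
bel_d, pl_d = belief(m, D), plausibility(m, D)
```

The gap between `bel_d` and `pl_d` is the quantitative expression of remaining ignorance about the drought hypothesis, which is exactly the advantage over a single-number probability that the abstract highlights.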

11.
Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis
The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell–McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang–Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

12.
In this paper we extend the generalized likelihood uncertainty estimation (GLUE) technique to estimate spatially distributed uncertainty in models conditioned against binary pattern data contained in flood inundation maps. Untransformed binary pattern data already have been used within GLUE to estimate domain‐averaged (zero‐dimensional) likelihoods, yet the pattern information embedded within such sources has not been used to estimate distributed uncertainty. Where pattern information has been used to map distributed uncertainty it has been transformed into a continuous function prior to use, which may introduce additional errors. To solve this problem we use here ‘raw’ binary pattern data to define a zero‐dimensional global performance measure for each simulation in a Monte Carlo ensemble. Thereafter, for each pixel of the distributed model we evaluate the probability that this pixel was inundated. This probability is then weighted by the measure of global model performance, thus taking into account how well a given parameter set performs overall. The result is a distributed uncertainty measure mapped over real space. The advantage of the approach is that it both captures distributed uncertainty and contains information on global likelihood that can be used to condition predictions of further events for which observed data are not available. The technique is applied to the problem of flood inundation prediction at two test sites representing different hydrodynamic conditions. In both cases, the method reveals the spatial structure in simulation uncertainty and simultaneously enables mapping of flood probability predicted by the model. Spatially distributed uncertainty analysis is shown to contain information over and above that available from global performance measures. Overall, the paper highlights the different types of information that may be obtained from mappings of model uncertainty over real and n‐dimensional parameter spaces. 
Copyright © 2002 John Wiley & Sons, Ltd.
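The distributed extension described above reduces to a likelihood-weighted average of each run's binary inundation prediction at every pixel. A minimal sketch, with random stand-ins for the Monte Carlo ensemble and the global performance measure (the real measure would come from comparing each run against the observed flood-extent map):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Monte Carlo ensemble: 500 model runs over a 20 x 30 raster.
# wet[j] is the binary inundation map predicted by parameter set j;
# F[j] is that run's zero-dimensional (global) fit to the observed extent.
n_runs, ny, nx = 500, 20, 30
wet = rng.random((n_runs, ny, nx)) < 0.4   # stand-in binary predictions
F = rng.random(n_runs)                     # stand-in global performance

# Likelihood-weighted probability that each pixel is inundated:
#   P_i = sum_j F_j * I_ij / sum_j F_j
weights = F / F.sum()
p_inundated = np.tensordot(weights, wet.astype(float), axes=1)
```

Because the weights carry the global likelihood, the resulting map both captures distributed uncertainty and remains conditioned on overall model performance, which is the paper's key point.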

13.
Abstract

This study presents a new methodology for estimation of input data measurement-induced uncertainty in simulated dissolved oxygen (DO) and nitrate-nitrogen (NO3-N) concentrations using the Hydrological Simulation Program–FORTRAN (HSPF) model and data from the Amite River, USA. Simulation results show that: (1) a multiplying factor of 1.3 can be used to describe the maximum error in temperature measurements; similarly, a multiplying factor of 1.9 was estimated to accommodate the maximum of ±5% error in rainfall measurements; (2) the uncertainty in simulated DO concentration due to positive temperature measurement errors can be described with a normal distribution, N(0.062, 0.567); (3) the uncertainty in simulated NO3-N concentration due to rainfall measurement errors follows a generalized extreme value distribution; and (4) the probability density functions can be utilized to determine the measurement-induced uncertainty in simulated DO and NO3-N concentrations according to the risk level acceptable in water quality management.

Editor D. Koutsoyiannis

Citation Patil, A. and Deng, Z.-Q., 2012. Input data measurement-induced uncertainty in watershed modelling. Hydrological Sciences Journal, 57 (1), 118–133.

14.
Abstract

A model based on analytical development and numerical solution is presented for estimating the cumulative distribution function (cdf) of the runoff volume and peak discharge rate of urban floods using the joint probability density function (pdf) of rainfall volume and duration together with information about the catchment's physical characteristics. The joint pdf of rainfall event volume and duration is derived using the theory of copulas. Four families of Archimedean copulas are tested in order to select the most appropriate to reproduce the dependence structure of those variables. Frequency distributions of runoff event volume and peak discharge rate are obtained following the derived probability distribution theory, using the functional relationship given by the rainfall–runoff process. The model is tested in two urban catchments located in the cities of Chillán and Santiago, Chile. The results are compared with the outcomes of continuous simulation in the Storm Water Management Model (SWMM) and with those from another analytical model that assumes storm event duration and volume to be statistically independent exponentially distributed variables.

Citation Zegpi, M. & Fernández, B. (2010) Hydrological model for urban catchments – analytical development using copulas and numerical solution. Hydrol. Sci. J. 55(7), 1123–1136.
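Sampling dependent (volume, duration) pairs from an Archimedean copula, here Clayton as one plausible family among the four tested, uses conditional inversion on the uniform scale followed by a marginal transform. The copula parameter and the exponential marginal parameters below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def clayton_sample(theta, n, rng):
    # Conditional-inversion sampler for the Clayton copula
    # C(u, v) = (u**-theta + v**-theta - 1)**(-1/theta), theta > 0.
    u = rng.random(n)
    t = rng.random(n)
    v = ((t ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

# Dependent uniforms, then assumed exponential marginals for rainfall
# event volume (mean 20 mm) and duration (mean 4 h); both illustrative.
u, v = clayton_sample(theta=2.0, n=5000, rng=rng)
volume = -20.0 * np.log(1.0 - u)
duration = -4.0 * np.log(1.0 - v)
```

Feeding such joint samples through the rainfall-runoff functional relationship is what yields the derived distributions of runoff volume and peak discharge; the independent-exponential benchmark model corresponds to sampling `u` and `v` separately.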

15.
We investigate the impact of different rupture and attenuation models for the Cascadia subduction zone by simulating seismic hazard models for the Pacific Northwest of the U.S. at 2% probability of exceedance in 50 years. We calculate the sensitivity of hazard (probabilistic ground motions) to the source parameters and the attenuation relations for both intraslab and interface earthquakes and present these in the framework of the standard USGS hazard model that includes crustal earthquakes. Our results indicate that allowing the deep intraslab earthquakes to occur anywhere along the subduction zone increases the peak ground acceleration hazard near Portland, Oregon by about 20%. Alternative attenuation relations for deep earthquakes can result in ground motions that differ by a factor of two. The hazard uncertainty for the plate interface and intraslab earthquakes is analyzed through a Monte Carlo logic tree approach and indicates a seismic hazard exceeding 1 g (0.2 s spectral acceleration) consistent with the U.S. National Seismic Hazard Maps in western Washington, Oregon, and California, and an overall coefficient of variation that ranges from 0.1 to 0.4. Sensitivity studies indicate that the paleoseismic chronology and the magnitude of great plate interface earthquakes contribute significantly to the hazard uncertainty estimates for this region. Paleoseismic data indicate that the mean earthquake recurrence interval for great earthquakes is about 500 years and that it has been 300 years since the last great earthquake. We calculate the probability of such a great earthquake along the Cascadia plate interface to be about 14% when considering a time-dependent model and about 10% when considering a time-independent Poisson model during the next 50-year interval.
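The closing probability estimates can be reproduced in outline. The time-independent figure follows directly from P = 1 − exp(−T/μ) with μ = 500 years and T = 50 years. The renewal calculation below uses an assumed lognormal recurrence model with aperiodicity 0.5, which is purely illustrative and not the paper's calibrated time-dependent model, so it should not be expected to reproduce the 14% figure:

```python
import math

def lognormal_cdf(x, median, sigma):
    # CDF of a lognormal with the given median and log-space sigma.
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (sigma * math.sqrt(2.0))))

mu, T, elapsed = 500.0, 50.0, 300.0

# Time-independent Poisson model: about 0.095, i.e. roughly 10%,
# regardless of how long ago the last event occurred.
p_poisson = 1.0 - math.exp(-T / mu)

# Illustrative time-dependent renewal model: probability of an event in
# the next T years, conditioned on 300 quiet years (assumed sigma = 0.5).
F = lambda x: lognormal_cdf(x, median=mu, sigma=0.5)
p_renewal = (F(elapsed + T) - F(elapsed)) / (1.0 - F(elapsed))
```

The contrast between the two numbers is the point: a renewal model's answer depends on the elapsed time and the assumed recurrence distribution, while the Poisson answer does not.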

16.
ABSTRACT

The scientific literature has focused on uncertainty as randomness, while limited credit has been given to what we call here the “seventh facet of uncertainty”, i.e. lack of knowledge. This paper identifies three types of lack of understanding: (i) known unknowns, which are things we know we don’t know; (ii) unknown unknowns, which are things we don’t know we don’t know; and (iii) wrong assumptions, things we think we know, but we actually don’t know. Here we discuss each of these with reference to the study of the dynamics of human–water systems, which is one of the main topics of Panta Rhei, the current scientific decade of the International Association of Hydrological Sciences (IAHS), focusing on changes in hydrology and society. In the paper, we argue that interdisciplinary studies of socio-hydrological dynamics leading to a better understanding of human–water interactions can help in coping with wrong assumptions and known unknowns. Also, being aware of the existence of unknown unknowns, and their potential capability to generate surprises or black swans, suggests the need to complement top-down approaches, based on quantitative predictions of water-related hazards, with bottom-up approaches, based on societal vulnerabilities and possibilities of failure.
Editor D. Koutsoyiannis; Associate editor S. Weijs

18.
Abstract

The effect of land-use or land-cover change on stream runoff dynamics is not fully understood. In many parts of the world, forest management is the major land-cover change agent. While the paired catchment approach has been the primary methodology used to quantify such effects, it is only possible for small headwater catchments where there is uniformity in precipitation inputs and catchment characteristics between the treatment and control catchments. This paper presents a model-based change-detection approach that includes model and parameter uncertainty as an alternative to the traditional paired-catchment method for larger catchments. We use the HBV model and data from the HJ Andrews Experimental Forest in Oregon, USA, to develop and test the approach on two small (<1 km2) headwater catchments (a 100% clear-cut and a control) and then apply the technique to the larger 62 km2 Lookout catchment. Three different approaches are used to detect changes in stream peak flows using: (a) calibration for a period before (or after) change and simulation of runoff that would have been observed without land-cover changes (reconstruction of runoff series); (b) comparison of calibrated parameter values for periods before and after a land-cover change; and (c) comparison of runoff predicted with parameter sets calibrated for periods before and after a land-cover change. Our proof-of-concept change detection modelling showed that peak flows increased in the clear-cut headwater catchment, relative to the headwater control catchment, and several parameter values in the model changed after the clear-cutting. Some minor changes were also detected in the control, illustrating the problem of false detections. For the larger Lookout catchment, moderately increased peak flows were detected. Monte Carlo techniques used to quantify parameter uncertainty and compute confidence intervals in model results and parameter ranges showed rather wide distributions of model simulations. 
While this makes change detection more difficult, it also demonstrated the need to explicitly consider parameter uncertainty in the modelling approach to obtain reliable results.

Citation Seibert, J. & McDonnell, J. J. (2010) Land-cover impacts on streamflow: a change-detection modelling approach that incorporates parameter uncertainty. Hydrol. Sci. J. 55(3), 316–332.

19.
Estimating and mapping spatial uncertainty of environmental variables is crucial for environmental evaluation and decision making. For a continuous spatial variable, estimation of spatial uncertainty may be conducted in the form of estimating the probability of (not) exceeding a threshold value. In this paper, we introduced a Markov chain geostatistical approach for estimating threshold-exceeding probabilities. The differences of this approach compared to the conventional indicator approach lie in its nonlinear estimators (Markov chain random field models) and its incorporation of interclass dependencies through transiograms. We estimated threshold-exceeding probability maps of clay layer thickness through simulation (i.e., using a number of realizations simulated by Markov chain sequential simulation) and interpolation (i.e., direct conditional probability estimation using only the indicator values of sample data), respectively. To evaluate the approach, we also estimated those probability maps using sequential indicator simulation and indicator kriging interpolation. Our results show that (i) the Markov chain approach provides an effective alternative for spatial uncertainty assessment of environmental spatial variables, and the probability maps from this approach are more reasonable than those from conventional indicator geostatistics, and (ii) the probability maps estimated through sequential simulation are more realistic than those through interpolation, because the latter display some uneven transitions caused by spatial structures of the sample data.
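Whatever simulation engine produces the realizations, turning a stack of them into a threshold-exceeding probability map reduces to an indicator transform followed by an average across the stack. A minimal sketch, with gamma-distributed random fields standing in for actual Markov chain sequential simulation output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stack of 200 simulated realizations of clay-layer
# thickness (m) on a 40 x 40 grid; real realizations would honor the
# sample data and spatial structure (transiograms), these do not.
realizations = rng.gamma(shape=4.0, scale=0.5, size=(200, 40, 40))

threshold = 2.0
# Indicator-transform each realization, then average across the stack:
#   p(x) = Pr[thickness(x) > threshold]
indicators = (realizations > threshold).astype(float)
p_exceed = indicators.mean(axis=0)
```

The interpolation route discussed in the abstract estimates these conditional probabilities directly from the sample indicators instead of averaging realizations, which is why it can show the uneven transitions the authors report.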

20.
Water science data are a valuable asset that both underpins the original research project and bolsters new research questions, particularly in view of the increasingly complex water issues facing Canada and the world. Whilst there is general support for making data more broadly accessible, and a number of water science journals and funding agencies have adopted policies that require researchers to share data in accordance with the findable, accessible, interoperable, reusable (FAIR) principles, there are still questions about effective management of data to protect their usefulness over time. Incorporating data management practices and standards at the outset of a water science research project will enable researchers to efficiently locate, analyse and use data throughout the project lifecycle, and will ensure the data maintain their value after the project has ended. Here, some common misconceptions about data management are highlighted, along with insights and practical advice to assist established and early career water science researchers as they integrate data management best practices and tools into their research. Freely available tools and training opportunities made available in Canada through Global Water Futures, The Gordon Foundation DataStream, the Digital Research Alliance of Canada Portage Network, Compute Canada, and university libraries, among others are compiled. These include webinars, training videos, and individual support for the water science community that together enable researchers to protect their data assets and meet the expectations of journals and funders. The perspectives shared here have been developed as part of the Global Water Futures programme's efforts to improve data management and promote the use of common data practices and standards in the context of water science in Canada. Ten best practices are proposed that may be broadly applicable to other disciplines in the natural sciences and can be adopted and adapted globally.  
