A potential cause for concern in the model fit is the wide uncertainty intervals around several of the estimated parameters.

To develop an adequate model to predict viral transport in plant tissue, it is necessary to couple mathematical assumptions with an understanding of the underlying bio-geochemical processes governing virus removal, plant growth, growth conditions and virus-plant interactions. For example, although a simple transport model without AD could predict the viral load in the lettuce at harvest, it failed to capture the initial curvature in the viral load in the growth medium. An alternative to the AD hypothesis that could capture this curvature is the existence of two populations of viruses, as used in Petterson et al., one decaying more slowly than the other. However, a closer examination of the double exponential model revealed that it is not time invariant. This means that the time taken to decay from a concentration C1 to C2 is not unique and depends on the history of the events that occurred. Other viral decay models, such as the ones used in Peleg and Penchina, face the same issue. Incorporating AD made the model time invariant, so that the time to decay between two given concentrations is always the same. This model fitting experience showcases how mathematics can guide the understanding of biological mechanisms. The hypothesis of two different NoV populations is less plausible than that of viral attachment to and detachment from the hydroponic tank. While it appears that incorporating the AD mechanism does not significantly improve the prediction of viral load in the lettuce shoot at harvest, this is a consequence of force fitting the model to data under the given conditions. Changing the conditions, for example by reducing the viral attachment rate to the tank wall, would cause a model without AD to underestimate the virus load in the lettuce shoot.
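To make the time-invariance argument concrete, the sketch below (with hypothetical rate constants, not the fitted values from this study) computes the time needed for a two-population (biexponential) mixture to decay to half of its current concentration when measurement starts at different times; the answer depends on the starting time, because the surviving mixture becomes enriched in the slow-decaying population, whereas single first-order decay would always return ln(2)/k.

```python
# Minimal illustrative sketch (not the study's code): why a double exponential
# (two-population) decay model is not time invariant, unlike first-order decay.
# All rate constants and fractions below are hypothetical.
import numpy as np
from scipy.optimize import brentq

k1, k2 = 0.5, 0.05      # hypothetical fast/slow decay rates (1/day)
f = 0.7                 # hypothetical fraction of the fast-decaying population

def biexp(t):
    """Relative total concentration of the two virus populations at time t."""
    return f * np.exp(-k1 * t) + (1 - f) * np.exp(-k2 * t)

def time_to_halve(t0):
    """Time needed for the concentration at t0 to fall to half its value."""
    target = 0.5 * biexp(t0)
    return brentq(lambda dt: biexp(t0 + dt) - target, 1e-9, 500.0)

# The apparent 'half-life' grows the later measurement starts.
for t0 in (0.0, 5.0, 20.0):
    print(f"start day {t0:4.1f}: time to halve = {time_to_halve(t0):6.2f} d")
# For pure first-order decay, ln(2)/k would be returned regardless of t0.
```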

Through this model fitting exercise, we also acknowledge that the model can be significantly improved with new insights on virus-plant interactions and more data on viral transport through plants. However, there is significant uncertainty in the data as well, suggesting that the transport process itself is noise prone. Moreover, from the perspective of risk assessment, the variability between dose-response models is higher than the within dose-response model variability. Since within dose-response model variability stems from uncertainty in viral loads at harvest, among other factors, the wide intervals do not exert a bigger effect than the discordance between different dose-response models. Some parameters are identifiable to a good degree through model fitting, but there is a large degree of uncertainty in the viral transport efficiencies and the AD kinetic parameters. While this could be a consequence of fitting a limited number of data points with several parameters, the viral load at harvest and the risk estimates were well constrained. This combination of large variation in parameters and ‘usefully tight quantitative predictions’ is termed the sloppiness of parameter sensitivities, and has been observed in physics and systems biology. Well-designed experiments may simultaneously reduce uncertainty in the parameters as well as in the predictions, thereby increasing confidence in the model. One possible experiment to reduce parameter uncertainty is recording the transpiration and growth rate to fit Eq. 6 independently and acquire a_t and b_t.

An interesting outcome of our analysis is the strong association of risk with plant growth conditions. The health risks from consuming lettuce irrigated with recycled wastewater are highest for hydroponically grown lettuce, followed by soil-grown lettuce under Sc2, and lowest for soil-grown lettuce under Sc1. This difference in risk estimates stems to a large degree from the difference in AD kinetic constants. Increasing k_att,s will decrease risk, as more viruses become attached to the growth medium, while increasing k_det,s will have the opposite effect, as more detached viruses are available for uptake by the plant.
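To illustrate the direction of these effects, the sketch below integrates a deliberately simplified attachment-detachment mass balance for the growth medium with a constant plant-uptake sink; the equation structure and every parameter value are illustrative assumptions, not the fitted model of this study. Raising k_att lowers the free virus pool available for uptake, while raising k_det raises it.

```python
# Illustrative sketch only: a simplified AD mass balance for free (C) and
# attached (S) virus in the growth medium, plus a cumulative plant-uptake sink.
# Parameter values are hypothetical and chosen for demonstration.
import numpy as np
from scipy.integrate import solve_ivp

def ad_model(t, y, k_att, k_det, k_uptake, k_decay):
    C, S, uptaken = y
    dC = -k_att * C + k_det * S - k_uptake * C - k_decay * C
    dS = k_att * C - k_det * S
    dU = k_uptake * C                 # cumulative virus taken up by the plant
    return [dC, dS, dU]

def cumulative_uptake(k_att, k_det, days=14, C0=1e5):
    sol = solve_ivp(ad_model, (0, days), [C0, 0.0, 0.0],
                    args=(k_att, k_det, 0.05, 0.1))
    return sol.y[2, -1]

print(f"baseline uptake:       {cumulative_uptake(0.5, 0.05):10.1f}")
print(f"double k_att (lower):  {cumulative_uptake(1.0, 0.05):10.1f}")
print(f"double k_det (higher): {cumulative_uptake(0.5, 0.10):10.1f}")
```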

The combined effect of the AD parameters depends on their magnitudes and is portrayed in Supplementary Fig. S5. This result indicates that a better understanding of the virus interaction with the growth environment can lead to an improved understanding of risk. More importantly, this outcome indicates that soil plays an important role in the removal of viruses from irrigation water through adsorption of viral particles. An investigation focused on understanding the influence of soil composition on viral attachment will help refine the transport model. The risk predicted by this dynamic transport model is greater than the EPA annual infection risk benchmark as well as the WHO annual disease burden benchmark. The reasons for this outcome are many-fold. First, there is significant variability in the reported internalization of viruses in plants. In searching for data to model NoV transport in plants, we filtered the existing data using the following criteria: 1) human NoV used as the seed agent, and 2) quantitative viral measurements in the growth medium and in different parts of the plant. Based on these criteria, the data from DiCaprio et al. represent the best available data on viral internalization and transport in lettuce. However, it is also important to note that a similar study by Urbanucci et al. did not observe human NoV internalization in lettuce. This discrepancy could be due to the specific subspecies of the plant and the growth conditions used in the studies. In addition, minor changes such as damage to the roots or a decrease in the humidity of the growing environment can promote pathogen internalization. Alternatively, tracking viral transport through the growth medium and plant is challenging, and may yield false results due to reaction inhibition during genome amplification and poor detection limits. The risk outcome of this study is conservative because it assumes an individual consumes the wastewater-irrigated lettuce daily for an entire year. This assumption and the corresponding higher risk estimates are only applicable to a small portion of consumers, while most consumers in the U.S. are likely to have a more diverse diet.
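The conservative daily-consumption assumption is what aggregates a per-serving infection probability into an annual risk. The sketch below shows that aggregation step; the exponential dose-response form and every numerical value are illustrative placeholders rather than the dose-response models or fitted loads used in this study.

```python
# Minimal sketch of aggregating a daily infection probability into an annual
# risk under the assumption of daily consumption for a year. The dose-response
# form and all numbers are illustrative placeholders, not this study's values.
import numpy as np

viral_load = 10.0        # hypothetical NoV genome copies per gram of lettuce
serving = 50.0           # hypothetical serving size, g/day
r = 0.00255              # hypothetical exponential dose-response parameter

dose = viral_load * serving
p_daily = 1.0 - np.exp(-r * dose)          # probability of infection per serving
p_annual = 1.0 - (1.0 - p_daily) ** 365    # 365 independent daily exposures

print(f"daily infection risk : {p_daily:.2e}")
print(f"annual infection risk: {p_annual:.2e}")
print(f"exceeds the commonly cited 1e-4 annual benchmark: {p_annual > 1e-4}")
```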

While the model outcomes presented here represent the best attempt given the available data, it is also possible that the internalization observed by DiCaprio et al. is an extreme case and typical internalization is lower. As previously discussed by others, risk estimates from different NoV dose-response models differ by orders of magnitude. This study primarily aims to introduce a viral transport model without advocating any one dose-response model. We hope that future refinement of pathogen dose-response models will reduce the variability in risk estimates. The risk of consuming lettuce grown in soil as predicted by Sales-Ortells et al. is higher than our predictions, although the results of DiCaprio et al. were used in both studies. This is a consequence of our considering the greater adsorption capability of soil, which is not reflected when assuming a simple input:output ratio. Using different inoculating concentrations of NoV, body weight and consumption rate distributions also contributed to the difference in outcomes, but to a lesser extent. Parameters for crisp head lettuce were obtained from several different sources, each possibly using a different sub-variety of crisp head. Yet, global sensitivity analysis showed insensitivity of the risk estimates to several assumed and fitted parameters, lending confidence to the approaches taken to parametrize the model. The importance of capturing the dynamics of viral transport is underscored by the sensitivity to t_li,h in the hydroponic case and t_ht,s in the soil cases. This suggests that, given no change in lettuce consumption, changes in the irrigation schedule can affect the risk outcome. Such arguments were not possible with the approach of Sales-Ortells et al. In soil-grown lettuce, the high sensitivity to k_p indicates the role of plant-specific processes in mediating the risk outcome.

In addition to a transport model predicting the NoV load in lettuce, we explored strategies to reduce the risk of NoV gastroenteritis by increasing the holding time of the produce after harvest or by using larger hydroponic culture volumes. Although neither strategy could significantly alleviate the risks, the process highlights two strengths of modeling: 1) it provides mathematical support for arguments that would otherwise be less convincing; and 2) it predicts the outcomes of experiments without the physical resources required to perform them. For instance, the model can be used to explore alternative irrigation schedules to reduce the risk from NoV internalization. Modeling also helps encapsulate our understanding of the system and generate hypotheses. For example, simple first-order decay did not reproduce the trend observed in the water, which suggests that additional mechanisms are at play. We postulated the attachment of virus particles to the walls of the hydroponic system as one possible mechanism and examined the fit of the model. Although viral attachment to glass and other materials has been observed before, here it stands as a hypothesis that can be tested. In addition to generating and testing hypotheses, some of our model assumptions raise broader questions for future research. For example, it was assumed that viruses are transported at the rate of transpiration from the growth medium to the roots, yet not much is known regarding the role of roots in the internalization of viruses. Investigating the defense mechanisms of plant roots against passive viral transport, i.e., through rhizosphere microbiome interactions, may shed light on the broader understanding of plant-microbe interactions. The question of extending this model to other pathogen and plant systems draws attention to the dearth of data available to enable such efforts. While modeling another virus may not require changes to the model, understanding transport in other plants can be challenging.
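As a worked illustration of the post-harvest holding-time strategy, the sketch below applies first-order decay to the harvested viral load and recomputes the annual risk for several holding times; the decay rate, dose-response form and all numbers are illustrative assumptions rather than the study's fitted values.

```python
# Illustrative sketch of the post-harvest holding-time strategy: first-order
# decay of the viral load on harvested lettuce, followed by the same simple
# annual-risk calculation as in the earlier sketch. All values are hypothetical.
import numpy as np

k_decay = 0.1                 # hypothetical first-order decay rate post harvest, 1/day
load_at_harvest = 10.0        # hypothetical NoV genome copies per gram at harvest
serving, r = 50.0, 0.00255    # hypothetical serving size (g) and dose-response parameter

def annual_risk(load_per_gram):
    dose = load_per_gram * serving
    p_daily = 1.0 - np.exp(-r * dose)
    return 1.0 - (1.0 - p_daily) ** 365

for holding_days in (0, 2, 5, 10):
    load = load_at_harvest * np.exp(-k_decay * holding_days)
    print(f"holding {holding_days:2d} d -> annual risk {annual_risk(load):.2e}")
# Because the annual risk saturates near 1 for high daily risks, modest holding
# times change the outcome little, consistent with the discussion above.
```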

The data required include models for growth rate and transpiration, plant growth characteristics such as density and water content, as well as internalization studies to determine transport efficiencies. However, from the perspective of risk management, lettuce may be used as the worst-case estimate of risk in water reuse owing to its high consumption and the minimal pathogen inactivation by cooking. This worst-case scenario can be used to set water quality standards for irrigation water used in the production of fresh produce eaten raw. The models can also be extended to include pathogen transport to the plant tissue from manure/biosolids that are used as organic fertilizer.
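To make these data requirements concrete, the sketch below collects them into a single container; the field names, grouping and placeholder values are our own illustrative organization, not a structure defined by the model or taken from the study.

```python
# Illustrative checklist (our own organization, not a structure from the study)
# of the inputs needed to extend the transport model to another crop.
from dataclasses import dataclass

@dataclass
class CropModelInputs:
    # Growth and water-use models (e.g., coefficients of the transpiration fit, Eq. 6)
    growth_rate_params: dict        # fitted coefficients of the growth model
    transpiration_params: dict      # fitted coefficients of the transpiration model
    # Plant characteristics
    tissue_density: float           # g/cm^3
    water_content: float            # fraction of fresh weight
    # Virus-specific inputs from internalization studies
    transport_efficiencies: dict    # e.g., medium-to-root, root-to-shoot
    ad_kinetics: dict               # attachment/detachment constants for the growth medium

# Example instantiation with placeholder values only:
lettuce_like = CropModelInputs(
    growth_rate_params={"a": 0.1, "b": 0.01},
    transpiration_params={"a_t": 0.05, "b_t": 0.002},
    tissue_density=0.5, water_content=0.95,
    transport_efficiencies={"medium_to_root": 0.01, "root_to_shoot": 0.001},
    ad_kinetics={"k_att": 0.5, "k_det": 0.05},
)
```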