While decision trees are a popular formal and quantitative method for determining an optimal decision from a finite set of choices, for all but very simple problems they are computationally intractable. For this reason, Influence Diagrams (IDs) have been used as a more compact and efficient alternative. However, most algorithmic solutions assume that all chance variables are discrete, whereas in practice many are continuous. For such Hybrid IDs (HIDs), the current state-of-the-art algorithms suffer from various limitations on the kinds of inference that can be performed. This paper presents a novel method that overcomes a number of these limitations. The method solves a HID by transforming it to a Hybrid Bayesian Network (HBN) and carrying out inference on this HBN using Dynamic Discretization (DD). It generates a simplified decision tree from the propagated HBN to compute and present the optimal decisions under different decision scenarios. To provide satisfactory performance, the method uses 'inconsistent evidence' to model functional and structural asymmetry. By using the entire marginal probability distribution of the continuous utility and chance nodes, rather than expected values alone, our method also enhances decision analysis by making it possible to consider statistics other than expected utility, such as measures of risk. We illustrate our method using the oil wildcatter example and its variations with continuous nodes. For illustration, we also use a financial score that combines risk and return measures. (C) 2018 Elsevier Inc. All rights reserved.
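To make the final point concrete, the following is a minimal, hypothetical Python sketch (not part of the paper, and not its algorithm) of how access to a full discretized marginal distribution of a utility node, rather than only its expectation, supports additional statistics such as a risk measure. All numbers and the trade-off weight are invented for illustration.

```python
# Hypothetical discretized marginal utility distribution for one decision
# scenario, given as (utility value at interval midpoint, probability mass)
# pairs. In the paper's method, such marginals would come from dynamic
# discretization inference on the HBN; here they are made up.
marginal = [(-50.0, 0.10), (0.0, 0.15), (40.0, 0.30), (80.0, 0.30), (120.0, 0.15)]

# Classic criterion: expected utility.
expected_utility = sum(u * p for u, p in marginal)

# With the full distribution available, a risk measure such as the standard
# deviation of utility can also be computed.
variance = sum(p * (u - expected_utility) ** 2 for u, p in marginal)
risk = variance ** 0.5

# A hypothetical risk-adjusted score combining return and risk; the 0.5
# trade-off weight is arbitrary and is not the financial score used in the paper.
score = expected_utility - 0.5 * risk
```

Under such a scheme, two decision alternatives with equal expected utility could still be ranked differently once their dispersion is taken into account.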