Probabilistic Deep Learning shows how probabilistic deep learning models give readers the tools to identify and account for uncertainty and potential errors in their results. Starting by applying the maximum likelihood principle that underlies curve fitting to deep learning, readers move on to the Python-based TensorFlow Probability framework and set up Bayesian neural networks that can state their own uncertainty.

Key Features
· The maximum likelihood principle that underlies deep learning applications
· Probabilistic DL models that can indicate the range of possible outcomes
· Bayesian deep learning that allows for the uncertainty occurring in real-world situations
· Applying probabilistic principles to variational auto-encoders

Aimed at readers experienced with developing machine learning or deep learning applications.

About the technology
Probabilistic deep learning models are better suited to dealing with the noise and uncertainty of real-world data, a crucial factor for self-driving cars, scientific results, the financial industry, and other accuracy-critical applications.

About the authors
Oliver Dürr is professor for data science at the University of Applied Sciences in Konstanz, Germany. Beate Sick holds a chair for applied statistics at ZHAW and works as a researcher and lecturer at the University of Zurich and as a lecturer at ETH Zurich. Elvis Murina is a research assistant, responsible for the extensive exercises that accompany this book.

Dürr and Sick are both experts in machine learning and statistics. They have supervised numerous bachelor's, master's, and PhD theses on the topic of deep learning, and have planned and conducted several postgraduate and master's-level deep learning courses. All three authors have been working with deep learning methods since 2013 and have extensive experience in both teaching the topic and developing probabilistic deep learning models.
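
To give a flavour of the approach, here is a minimal sketch of the kind of model the book works with in TensorFlow Probability: a small Keras regression network whose last layer is a DistributionLambda, so the network outputs a full Normal distribution and is fitted by maximum likelihood, i.e. by minimising the negative log-likelihood. The layer sizes, learning rate, and variable names are illustrative assumptions, not code from the book.

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # Illustrative sketch: a regression network that outputs a Normal
    # distribution, so it reports both a prediction (the mean) and an
    # uncertainty estimate (the standard deviation).
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(1,)),
        tf.keras.layers.Dense(2),  # two raw outputs: mean and scale
        tfp.layers.DistributionLambda(
            lambda t: tfd.Normal(loc=t[..., :1],
                                 scale=1e-3 + tf.math.softplus(t[..., 1:]))),
    ])

    # Maximum likelihood fit: the loss is the negative log-likelihood of
    # the observed targets under the predicted distribution.
    negloglik = lambda y, rv_y: -rv_y.log_prob(y)
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
                  loss=negloglik)
    # model.fit(x_train, y_train, epochs=500)  # x_train, y_train: your data

Because the trained model returns a distribution rather than a single number, it can indicate a range of possible outcomes for each input instead of a point prediction.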