Bayesian reasoning and machine learning / David Barber.
By: Barber, David
Publisher: Cambridge ; New York : Cambridge University Press, 2012
Description: xxiv, 697 p. : ill. ; 26 cm
ISBN: 9780521518147
Subject(s): Machine learning | Bayesian statistical decision theory | COMPUTERS / Computer Vision & Pattern Recognition
DDC classification: 006.3/1
LOC classification: QA267 .B347 2012
Other classification: COM016000
Online resources: Cover image | Contributor biographical information | Publisher description | Table of contents only

Item type | Current location | Home library | Call number | Status | Date due | Barcode | Item holds
---|---|---|---|---|---|---|---
 | COLLEGE LIBRARY | COLLEGE LIBRARY SUBJECT REFERENCE | 006.31 B233 2012 | Available | | CITU-CL-44002 |
Browsing COLLEGE LIBRARY shelves, shelving location: SUBJECT REFERENCE
Call number | Title
---|---
006.3 Xi4 2002 | Probabilistic reasoning in multiagent systems : a graphical models approach /
006.30151 G3465 2023 | Practical mathematics for AI and deep learning /
006.3023 G829 2007 | Careers in artificial intelligence /
006.31 B233 2012 | Bayesian reasoning and machine learning /
006.31 B413 2015 | Machine learning : hands-on for developers and technical professionals /
006.31 C839 2005 | Fuzzy modeling and genetic algorithms for data mining and exploration /
006.31 G675 2018 | Machine learning : a constraint-based approach /
Includes bibliographical references and index.
Machine generated contents note: Preface; Part I. Inference in Probabilistic Models: 1. Probabilistic reasoning; 2. Basic graph concepts; 3. Belief networks; 4. Graphical models; 5. Efficient inference in trees; 6. The junction tree algorithm; 7. Making decisions; Part II. Learning in Probabilistic Models: 8. Statistics for machine learning; 9. Learning as inference; 10. Naive Bayes; 11. Learning with hidden variables; 12. Bayesian model selection; Part III. Machine Learning: 13. Machine learning concepts; 14. Nearest neighbour classification; 15. Unsupervised linear dimension reduction; 16. Supervised linear dimension reduction; 17. Linear models; 18. Bayesian linear models; 19. Gaussian processes; 20. Mixture models; 21. Latent linear models; 22. Latent ability models; Part IV. Dynamical Models: 23. Discrete-state Markov models; 24. Continuous-state Markov models; 25. Switching linear dynamical systems; 26. Distributed computation; Part V. Approximate Inference: 27. Sampling; 28. Deterministic approximate inference; Appendix. Background mathematics; Bibliography; Index.
"Machine learning methods extract value from vast data sets quickly and with modest resources. They are established tools in a wide range of industrial applications, including search engines, DNA sequencing, stock market analysis, and robot locomotion, and their use is spreading rapidly. People who know the methods have their choice of rewarding jobs. This hands-on text opens these opportunities to computer science students with modest mathematical backgrounds. It is designed for final-year undergraduates and master's students with limited background in linear algebra and calculus. Comprehensive and coherent, it develops everything from basic reasoning to advanced techniques within the framework of graphical models. Students learn more than a menu of techniques, they develop analytical and problem-solving skills that equip them for the real world. Numerous examples and exercises, both computer based and theoretical, are included in every chapter. Resources for students and instructors, including a MATLAB toolbox, are available online"-- Provided by publisher.
"Vast amounts of data present amajor challenge to all thoseworking in computer science, and its many related fields, who need to process and extract value from such data. Machine learning technology is already used to help with this task in a wide range of industrial applications, including search engines, DNA sequencing, stock market analysis and robot locomotion. As its usage becomes more widespread, no student should be without the skills taught in this book. Designed for final-year undergraduate and graduate students, this gentle introduction is ideally suited to readers without a solid background in linear algebra and calculus. It covers everything from basic reasoning to advanced techniques in machine learning, and rucially enables students to construct their own models for real-world problems by teaching them what lies behind the methods. Numerous examples and exercises are included in the text. Comprehensive resources for students and instructors are available online"-- Provided by publisher.