Many-Sorted Algebras for Deep Learning and Quantum Technology


Author: Charles R. Giardina, Ph.D.
Publisher: Morgan Kaufmann
Edition: 1st
Publication Date: 2024-02-19
Language: English
Print Length: 422 pages
ISBN-10: 0443136971
ISBN-13: 9780443136979


Book Description

Many-Sorted Algebras for Deep Learning and Quantum Technology presents a precise and rigorous description of the basic concepts of quantum technologies and of how they relate to deep learning and quantum theory. The ongoing merger of quantum theory and deep learning techniques creates a need for a source that gives readers insight into the algebraic underpinnings of these disciplines. Although analytical, topological, probabilistic, and geometrical concepts are employed in many of these areas, algebra forms the principal thread, and this thread is exposed using many-sorted algebras. The book includes hundreds of well-designed examples that illustrate the intriguing concepts found in quantum systems, accompanied by numerous visual displays. In particular, the polyadic graph shows the types, or sorts, of objects used in quantum or deep learning, and it illustrates all the inter- and intra-sort operations needed in describing algebras; in brief, it provides the closure conditions. Throughout the book, all laws or equational identities needed in specifying an algebraic structure are precisely described.

  • Includes hundreds of well-designed examples that illustrate the intriguing concepts in quantum systems
  • Provides a precise description of all laws or equational identities needed in specifying an algebraic structure
  • Illustrates all the inter- and intra-sort operations needed in describing algebras


Review

Presents the algebraic underpinnings of basic concepts in quantum theory and shows how they relate to deep learning and quantum technologies


About the Author

Charles R. Giardina was born in the Bronx, NY, on December 29, 1942. He received the B.S. degree in mathematics from Fairleigh Dickinson University, Rutherford, NJ, and the M.S. degree in mathematics from Carnegie Institute of Technology, Pittsburgh, PA. He also received the M.E.E. degree in 1969 and the Ph.D. degree in mathematics and electrical engineering in 1970 from Stevens Institute of Technology, Hoboken, NJ.

Dr. Giardina was Professor of Mathematics, Electrical Engineering, and Computer Science at Fairleigh Dickinson University from 1965 to 1982. From 1982 to 1986, he was a Professor at the Stevens Institute of Technology, and from 1986 to 1996 a Professor at the College of Staten Island, City University of New York. From 1996, he was with Bell Telephone Laboratories, Whippany, NJ, USA.

His research interests include digital signal and image processing, pattern recognition, artificial intelligence, and the constructive theory of functions. Dr. Giardina has authored numerous papers in these areas, as well as several books, including Mathematical Models for Artificial Intelligence and Autonomous Systems (Prentice Hall); Matrix Structure Image Processing (Prentice Hall); Parallel Digital Signal Processing: A Unified Signal Algebra Approach (Regency); Morphological Methods in Image and Signal Processing (Prentice Hall); Image Processing Continuous to Discrete: Geometric, Transform, and Statistical Methods (Prentice Hall); and A Unified Signal Algebra Approach to Two-Dimensional Parallel Digital Signal Processing (Chapman and Hall/CRC Press).

Amazon page
