Deep learning with relational logic representations [electronic resource] / Gustav Šír.
- Author: Šír, Gustav.
- Other title: Frontiers in artificial intelligence and applications ;
- Published: Amsterdam : IOS Press, 2022.
- Series: Frontiers in Artificial Intelligence and Applications ; volume 357
- Subjects: Deep learning (Machine learning), Electronic books.
- ISBN: 9781643683430 (electronic bk.), 9781643683423
- URL:
Click here to view the e-book
- General note: Purchased in 2023 (ROC year 112) by the Taiwan Academic E-Book & Database Consortium.
- Reader tags:
- System number: 000306544 | MARC record
Holdings information

Deep learning has been used with great success in a number of diverse applications, ranging from image processing to game playing, and the fast progress of this learning paradigm has even been seen as paving the way towards general artificial intelligence. However, current deep learning models are still fundamentally limited in many ways. This book, ‘Deep Learning with Relational Logic Representations’, addresses the limited expressiveness of the tensor-based representation used in standard deep learning by generalizing it to relational representations based on mathematical logic. This is the natural formalism for the relational data omnipresent in the interlinked structures of the Internet and relational databases, as well as for the background knowledge often present in the form of relational rules and constraints, neither of which standard neural networks can properly exploit.

The book introduces a new declarative deep relational learning framework called Lifted Relational Neural Networks, which generalizes standard deep learning models to the relational setting by means of a ‘lifting’ paradigm known from Statistical Relational Learning. The author explains how this approach enables effective end-to-end deep learning with relational data and knowledge, introduces several enhancements and optimizations to the framework, and demonstrates its expressiveness with various novel deep relational learning concepts, including efficient generalizations of popular contemporary models such as Graph Neural Networks. Demonstrating the framework, including its computational efficiency, across various learning scenarios and benchmarks, the book will be of interest to all those interested in the theory and practice of advancing the representations of modern deep learning architectures.
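The ‘lifting’ idea mentioned in the abstract can be illustrated with a minimal, purely hypothetical sketch (not the book's actual framework or API): a single weighted relational rule acts as a template that is grounded over relational data, so that every grounding of the rule shares the same learned weight, much as a convolutional filter is shared across image positions.

```python
# Hypothetical sketch of "lifted" relational modelling: one weighted rule
# template h(X) := sigma(w * sum_Y edge(X, Y)) is grounded over a small
# relational dataset, yielding one neuron per constant X. The shared
# weight w across all groundings is the essence of lifting.
# All names and numbers here are illustrative only.
import math

# Relational data: ground facts edge(s, t) with associated feature values.
edges = {("a", "b"): 1.0, ("b", "c"): 1.0, ("a", "c"): 0.5}

# Single shared weight of the lifted rule template.
w = 0.8

def sigma(x):
    """Logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

def h(x, edges, w):
    """Ground the rule template for constant x and evaluate the resulting
    neuron: aggregate all matching facts edge(x, Y), then activate."""
    total = sum(v for (s, _), v in edges.items() if s == x)
    return sigma(w * total)

print(round(h("a", edges, w), 3))  # grounded over edge(a,b), edge(a,c) -> 0.769
print(round(h("b", edges, w), 3))  # grounded over edge(b,c) -> 0.69
```

In a real framework of this kind the grounded graph would be differentiable, so the shared weights of the templates are trained end-to-end by backpropagation over all groundings at once.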




