TY - GEN
T1 - Boosting-Based Construction of BDDs for Linear Threshold Functions and Its Application to Verification of Neural Networks
AU - Tang, Yiping
AU - Hatano, Kohei
AU - Takimoto, Eiji
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
N2 - Understanding the characteristics of neural networks is important but difficult due to their complex structures and behaviors. Some previous work proposes to transform neural networks into equivalent Boolean expressions and apply verification techniques for characteristics of interest. This approach is promising since rich results of verification techniques for circuits and other Boolean expressions can be readily applied. The bottleneck is the time complexity of the transformation. More precisely, (i) each neuron of the network, i.e., a linear threshold function, is converted to a Binary Decision Diagram (BDD), and (ii) the BDDs are further combined into some final form, such as a Boolean circuit. For a linear threshold function with n variables, an existing method takes O(n 2^{n/2}) time to construct an ordered BDD of size O(2^{n/2}) consistent with some variable ordering. However, it is non-trivial to choose a variable ordering producing a small BDD among the n! candidates. We propose a method to convert a linear threshold function to a specific form of a BDD based on the boosting approach in the machine learning literature. Our method takes O(2^n poly(1/ρ)) time and outputs a BDD of size O((n^2/ρ^4) ln(1/ρ)), where ρ is the margin of some consistent linear threshold function. Our method does not need to search for good variable orderings and produces a smaller expression when the margin of the linear threshold function is large. More precisely, our method is based on our new boosting algorithm, which is of independent interest. We also propose a method to combine the resulting BDDs into the final Boolean expression representing the neural network. In our experiments on verification tasks of neural networks, our methods produce smaller final Boolean expressions, on which the verification tasks are done more efficiently.
AB - Understanding the characteristics of neural networks is important but difficult due to their complex structures and behaviors. Some previous work proposes to transform neural networks into equivalent Boolean expressions and apply verification techniques for characteristics of interest. This approach is promising since rich results of verification techniques for circuits and other Boolean expressions can be readily applied. The bottleneck is the time complexity of the transformation. More precisely, (i) each neuron of the network, i.e., a linear threshold function, is converted to a Binary Decision Diagram (BDD), and (ii) the BDDs are further combined into some final form, such as a Boolean circuit. For a linear threshold function with n variables, an existing method takes O(n 2^{n/2}) time to construct an ordered BDD of size O(2^{n/2}) consistent with some variable ordering. However, it is non-trivial to choose a variable ordering producing a small BDD among the n! candidates. We propose a method to convert a linear threshold function to a specific form of a BDD based on the boosting approach in the machine learning literature. Our method takes O(2^n poly(1/ρ)) time and outputs a BDD of size O((n^2/ρ^4) ln(1/ρ)), where ρ is the margin of some consistent linear threshold function. Our method does not need to search for good variable orderings and produces a smaller expression when the margin of the linear threshold function is large. More precisely, our method is based on our new boosting algorithm, which is of independent interest. We also propose a method to combine the resulting BDDs into the final Boolean expression representing the neural network. In our experiments on verification tasks of neural networks, our methods produce smaller final Boolean expressions, on which the verification tasks are done more efficiently.
KW - Binary decision diagram
KW - Boosting
KW - Convolutional neural network
KW - Verification
UR - http://www.scopus.com/inward/record.url?scp=85174306750&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85174306750&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-45275-8_32
DO - 10.1007/978-3-031-45275-8_32
M3 - Conference contribution
AN - SCOPUS:85174306750
SN - 9783031452741
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 477
EP - 491
BT - Discovery Science - 26th International Conference, DS 2023, Proceedings
A2 - Bifet, Albert
A2 - Lorena, Ana Carolina
A2 - Ribeiro, Rita P.
A2 - Gama, João
A2 - Abreu, Pedro H.
PB - Springer Science and Business Media Deutschland GmbH
T2 - 26th International Conference on Discovery Science, DS 2023
Y2 - 9 October 2023 through 11 October 2023
ER -