TY - GEN
T1 - ICE-based refinement type discovery for higher-order functional programs
AU - Champion, Adrien
AU - Chiba, Tomoya
AU - Kobayashi, Naoki
AU - Sato, Ryosuke
N1 - Funding Information:
The authors thank He Zhu for his help in benchmarking DOrder, and the reviewers for their constructive feedback. This work was supported by JSPS KAKENHI Grant Number JP15H05706.
Publisher Copyright:
© The Author(s) 2018.
PY - 2018
Y1 - 2018
N2 - We propose a method for automatically finding refinement types of higher-order functional programs. Our method is an extension of the Ice framework of Garg et al. for finding invariants. In addition to the usual positive and negative samples in machine learning, their Ice framework uses implication constraints, which consist of pairs (x, y) such that if x satisfies an invariant, so does y. From these constraints, Ice infers inductive invariants effectively. We observe that the implication constraints in the original Ice framework are not suitable for finding invariants of recursive functions with multiple function calls. We thus generalize the implication constraints to those of the form ((x1, ⋯, xk), y), which means that if all of x1, ⋯, xk satisfy an invariant, so does y. We extend their algorithms for inferring likely invariants from samples, verifying the inferred invariants, and generating new samples. We have implemented our method and confirmed its effectiveness through experiments.
AB - We propose a method for automatically finding refinement types of higher-order functional programs. Our method is an extension of the Ice framework of Garg et al. for finding invariants. In addition to the usual positive and negative samples in machine learning, their Ice framework uses implication constraints, which consist of pairs (x, y) such that if x satisfies an invariant, so does y. From these constraints, Ice infers inductive invariants effectively. We observe that the implication constraints in the original Ice framework are not suitable for finding invariants of recursive functions with multiple function calls. We thus generalize the implication constraints to those of the form ((x1, ⋯, xk), y), which means that if all of x1, ⋯, xk satisfy an invariant, so does y. We extend their algorithms for inferring likely invariants from samples, verifying the inferred invariants, and generating new samples. We have implemented our method and confirmed its effectiveness through experiments.
UR - http://www.scopus.com/inward/record.url?scp=85045836666&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85045836666&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-89960-2_20
DO - 10.1007/978-3-319-89960-2_20
M3 - Conference contribution
AN - SCOPUS:85045836666
SN - 9783319899596
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 365
EP - 384
BT - Tools and Algorithms for the Construction and Analysis of Systems - 24th International Conference, TACAS 2018, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2018, Proceedings
A2 - Beyer, Dirk
A2 - Huisman, Marieke
PB - Springer Verlag
T2 - 24th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2018, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2018
Y2 - 14 April 2018 through 20 April 2018
ER -