Verifying an abstract model of a system gives us valuable insights, but only guarantees safety of the real system insofar as the implementation is faithful to the model; if it is not, all bets are off. One way to address the gap between models and implementations is to automatically generate a verified runtime monitor from the model, which dynamically checks that the real system obeys the assumptions of the model and takes failsafe measures if it does not. This is the approach taken by ModelPlex, but it alone does not provide a full answer, because the monitors are expressed as formulas in differential dynamic logic, which are not obviously executable since they contain computations over real numbers.

In this work we first lower these monitors toward execution by soundly translating them to compute with interval arithmetic. We then soundly generate a concrete implementation in CakeML and compile it soundly to machine code. We have tested the resulting code on a Raspberry Pi-controlled ground robot.
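As a rough illustration of the interval-arithmetic idea (a minimal sketch, not the VeriPhy implementation), a monitor condition over real numbers can be checked conservatively by evaluating each subterm as an interval and only reporting "safe" when the condition provably holds for every value in the intervals; the monitor condition `v * t <= d` below is hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        # Interval multiplication: the result bounds all pairwise products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def definitely_le(self, other: "Interval") -> bool:
        # True only if every x in self satisfies x <= y for every y in other,
        # so numerical uncertainty can never produce a false "safe" verdict.
        return self.hi <= other.lo

# Hypothetical monitor condition v * t <= d
# (e.g., "distance traveled stays below the distance to the obstacle").
def monitor_ok(v: Interval, t: Interval, d: Interval) -> bool:
    return (v * t).definitely_le(d)
```

The direction of conservatism matters: when the intervals are too wide to decide the condition, the monitor rejects and triggers the failsafe, which is the safe default.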

The interval arithmetic proofs are performed in Isabelle/HOL, while the initial model safety proof is performed in KeYmaera X. To reduce the trusted base, we build on a prior formalization of differential dynamic logic to obtain a verified proof checker for proofs exported from KeYmaera X.

**Brandon Bohrer**, Yong Kiam Tan, Stefan Mitsch, Magnus O. Myreen, and André Platzer.

VeriPhy: Verified Controller Executables from Verified Cyber-Physical System Models.

39th ACM SIGPLAN Conference on Programming Language Design and Implementation,

**PLDI 2018**, Philadelphia, PA, June 18-22, 2018, ACM, 2018. © The authors.

To appear. Preprint