Retrofitting a type system onto a real-world dynamic expression language


Presented as part of the 2021 HCSS conference

ABSTRACT

Customers use AWS IoT Events to monitor their equipment for failures. To do so, they create an event detector in AWS IoT Events, modeled as a state machine, that triggers actions when such failures occur. The state machine consists of states and transitions predicated on conditions written in the IoT Events expression language. Each state machine takes an input value sent by a customer’s IoT device, substitutes the input values into its expressions, evaluates them, and triggers other AWS services as its output.
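The evaluation cycle described above can be sketched in a few lines. This is a toy illustration, not the AWS IoT Events implementation: the state names, the threshold condition, and the action name are all hypothetical, and real detector models are defined as JSON documents with conditions written in the IoT Events expression language rather than Python lambdas.

```python
# Toy sketch (hypothetical names throughout) of one detector step:
# substitute the incoming input value into each transition's condition,
# evaluate it, and trigger an action when a condition holds.

detector = {
    "state": "Normal",
    "transitions": [
        # (condition over the input value, next state, action to trigger)
        (lambda temp: temp > 90, "Overheated", "send_alert"),
    ],
}

def step(detector, temp):
    """Advance the detector one step on a new input value."""
    for condition, next_state, action in detector["transitions"]:
        if condition(temp):          # condition evaluated on the input
            detector["state"] = next_state
            return action            # e.g. invoke another AWS service
    return None                      # no transition fired

action = step(detector, 95)
```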

Customers of AWS IoT Events create and edit their detector model in the console. As they update the detector model, their work is checked for common errors via a troubleshooting “Analysis” feature integrated into the console. This feature allows customers to catch syntactic errors, missing-data errors, and type errors directly in the console where they are editing their detector model. Each reported error and warning guides the customer to the location in the detector model where the potential issue exists, so they can understand and fix the issue on the same console page.

One category of errors and warnings reported during this edit/analyze/edit cycle is type errors. The type checker used by the Analysis feature dynamically creates type inference rules from the type rules of the IoT Events expression language. It infers types for terms in the customer’s detector model and reports not only type errors but also the inferred types of those terms. The Analysis feature helps customers understand and fix type errors without asking them to (1) declare types for their inputs, (2) understand the type rules for the expression language, or (3) understand the type inference rules used by the type checker. Customers spend no additional effort to run the type checker or fix its reported issues, thereby making the type checker “disappear” in the customer’s debugging experience.
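The kind of inference described above can be illustrated with a toy checker for a tiny expression fragment. This is only a sketch of the general idea (inferring input types from how inputs are used, rather than from declarations); the AST shape, type names, and error messages here are invented for illustration and do not reflect the actual IoT Events type checker.

```python
# Toy type inference sketch: input types start as Unknown and are
# refined from context, so customers never declare them explicitly.
# Expressions are tuples, e.g. (">", ("input", "temp"), ("num", 90)).

def infer(expr, env):
    """Return the inferred type of expr, recording input types in env."""
    kind = expr[0]
    if kind == "num":
        return "Number"
    if kind == "str":
        return "String"
    if kind == "input":                  # an undeclared detector input
        return env.setdefault(expr[1], "Unknown")
    if kind == ">":                      # comparison: both operands numeric
        for side in expr[1:]:
            t = infer(side, env)
            if t == "Unknown":           # refine the input's type from use
                env[side[1]] = "Number"
            elif t != "Number":
                raise TypeError(f"'>' expects Number, got {t}")
        return "Boolean"
    raise ValueError(f"unknown expression kind: {kind}")

env = {}
result = infer((">", ("input", "temp"), ("num", 90)), env)
```

After checking the expression, the checker can report both the type of the whole condition and the types it inferred for each input, which is the information the Analysis feature surfaces to the customer.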

Vaibhav Sharma is an Applied Scientist at Amazon Web Services (AWS). Over the last year, Vaibhav has worked on applying automated reasoning to the Internet of Things (IoT) domain. Prior to AWS, Vaibhav obtained a PhD in Computer Science from the University of Minnesota, during which he worked on program synthesis using scalable symbolic execution.

Bilal Khan is a Software Development Manager in the AWS IoT team, leading software development for AWS IoT Events, a managed service that enables companies to continuously monitor their equipment and fleets of devices for failure or changes in operation and trigger alerts to respond when such events occur. Prior to Amazon, he held roles ranging from leading strategic initiatives at a mid-size embedded-systems manufacturer building remote monitoring solutions to building O&M solutions for large and complex cellular communication networks at Nokia and Motorola. Outside of work he enjoys the outdoors in the beautiful PNW.

License: CC-2.5