
ICST 2016

Keynote Speakers
Betty H.C. Cheng, Professor, Software Engineering and Network Systems Laboratory; BEACON: NSF Science and Technology Center for the Study of Evolution in Action; Department of Computer Science and Engineering, Michigan State University

Title: Dealing with Uncertainty for High-Assurance Self-Adaptive Systems

This presentation will overview several research projects that investigate novel ways to model, analyze, and mitigate uncertainty arising in three different aspects of cyber-physical systems. First, uncertainty about the physical environment can lead to suboptimal, and sometimes catastrophic, results as the system tries to adapt to unanticipated or poorly understood environmental conditions. Second, uncertainty in the cyber environment can lead to unexpected and adverse effects, including not only performance impacts (load, traffic, etc.) but also potential threats or overt attacks. Finally, uncertainty can exist within the components themselves and in how they interact upon reconfiguration, including unexpected and unwanted feature interactions. Each of these sources of uncertainty can potentially be identified at a different stage, design time or run time, but its mitigation might be done at the same stage or at a different one.

Based on the related literature and our investigations, we argue that the following three overarching techniques are essential and warrant further research to provide enabling technologies for addressing uncertainty during both stages: model-based development, automated assurance techniques, and self-adaptation. Furthermore, we posit that in order to go beyond incremental improvements to current software engineering techniques, we need to infuse these three areas with successful techniques and inspirations from other disciplines, such as control theory, machine learning, and biology.
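
As a concrete illustration of the self-adaptation theme (this sketch is not from the talk), the following minimal Python example shows a monitor-analyze-plan-execute (MAPE) loop, one common control structure for self-adaptive systems. All names, the threshold, and the configurations in it are hypothetical placeholders.

# Illustrative sketch only (not from the talk): a minimal monitor-analyze-plan-execute
# (MAPE) loop, one common way to structure self-adaptation. All names, the threshold,
# and the configurations are hypothetical.

import random

THRESHOLD = 0.8                     # hypothetical bound the system must stay under
NORMAL_CONFIG = {"sampling_rate_hz": 10}
DEGRADED_CONFIG = {"sampling_rate_hz": 5}

def read_sensors():
    # Monitor: sample the (noisy, hence uncertain) environment.
    return {"utilization": random.uniform(0.0, 1.0)}

def analyze(observation):
    # Analyze: does the observed state violate the assurance goal?
    return observation["utilization"] > THRESHOLD

def plan(violated):
    # Plan: choose a reconfiguration intended to restore acceptable behavior.
    return DEGRADED_CONFIG if violated else NORMAL_CONFIG

def execute(config):
    # Execute: apply the chosen configuration to the running system.
    print("applying configuration:", config)

if __name__ == "__main__":
    for _ in range(5):              # a few adaptation cycles
        execute(plan(analyze(read_sensors())))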

Biography:

Betty H.C. Cheng is a professor in the Department of Computer Science and Engineering at Michigan State University. Her research interests include self-adaptive systems, requirements engineering, model-driven engineering, automated software engineering, and harnessing evolutionary computation to address software engineering problems. These research areas are used to support the development of high-assurance adaptive systems that must continuously deliver acceptable behavior, even in the face of environmental and system uncertainty. Example applications include intelligent transportation and vehicle systems. She collaborates extensively with industrial partners in her research projects in order to ensure the real-world relevance of her research and to facilitate technology exchange between academia and industry. Previously, she was awarded a NASA/JPL Faculty Fellowship to investigate the use of new software engineering techniques for a portion of the shuttle software. She works extensively with industrial collaborators, including one sabbatical with the Motorola Software Labs investigating automated analysis techniques for specifications of telecommunication systems. She was awarded an international faculty scholarship to explore research techniques for specifying and managing uncertainty in high-assurance systems. She is currently on sabbatical, where she is launching new projects in the areas of model-driven approaches to sustainability, cyber security for automotive systems, and feature interaction detection and mitigation for autonomic systems, all in the context of operating under uncertainty while maintaining assurance objectives. Her research has been funded by several federal funding agencies, including NSF, ONR, DARPA, NASA, AFRL, and ARO, as well as by numerous industrial organizations. She serves on the editorial boards of the Requirements Engineering Journal, Software and Systems Modeling, and IEEE Transactions on Software Engineering. She was the Technical Program Co-Chair for the IEEE International Conference on Software Engineering (ICSE 2013), the premier and flagship conference for software engineering.

She received her BS from Northwestern University in 1985 and her MS and PhD from the University of Illinois at Urbana-Champaign in 1987 and 1990, respectively, all in computer science. She may be reached at the Department of Computer Science and Engineering, Michigan State University, 3115 Engineering Building, 428 S. Shaw Lane, East Lansing, MI 48824; chengb@cse.msu.edu; www.cse.msu.edu/~chengb.

Adam Porter, Professor of Computer Science, Department of Computer Science and University of Maryland Institute for Advanced Computer Studies (UMIACS), University of Maryland at College Park; Executive Director, Fraunhofer Center for Experimental Software Engineering

Title: What got us here will not get us there: Trends and challenges in testing tomorrow’s systems.

Looking back over the last 40 years or so, the software testing community has much to be proud of. Despite alternating periods of struggle and success, considered the hot topic at one point but out of fashion the next, the software testing research “enterprise” on the whole has been unarguably successful: creating foundational theory and concepts, building practical tools, methods, and techniques, and effectively transferring its research into practice.

As with any enterprise, however, success isn’t permanent. Over time the world changes, and the assumptions, goals, and strategies that once brought success may prove less effective in the future. For instance, as distributed architectures with weak data consistency models become more common, the way we construct test oracles will certainly have to change. Researchers may want to begin thinking now about what these and many other changes mean for software testing.

In other industries, enterprises regularly review their objectives, the environment in which they operate, and their own strengths and weaknesses to help them plan for future success. One such process is sometimes referred to as a SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis.

In an industry changing as fast as software development, I believe that the software testing community could benefit greatly from this kind of introspection and analysis. Hoping to spark a community-wide conversation, in this talk I’ll provide my personal view of the objectives underlying the software testing enterprise and discuss some of the trends and challenges that I believe will shape our enterprise’s future success.

Biography:

Since 1991, Dr. Porter has been a professor of computer science at the University of Maryland and the University of Maryland Institute for Advanced Computer Studies (UMIACS). He is also currently serving as the Scientific and Executive Director of the Fraunhofer Center for Experimental Software Engineering, a UMD-affiliated applied research and technology transition center.

Dr. Porter is an award-winning teacher and researcher whose work has generally focused on developing tools and techniques for large-scale software development and quality assurance. Specifically, his research has focused on developing empirical methods for identifying and eliminating bottlenecks in industrial development processes, experimentally evaluating fundamental software engineering hypotheses, and building tools that demonstrably improve fundamental software development processes, such as software inspection and software testing. He also created and runs one of the world's largest Massive Open Online Courses (MOOCs), on Mobile Application Development for the Android Platform, which has had over 800,000 student registrations from nearly every country on the planet.

A. Prasad Sistla, Professor of Computer Science, Department of Computer Science, University of Illinois at Chicago

Title: Runtime Verification of Cyber Physical Systems.

With the increasing use of cyber-physical systems in society, it is essential to ensure the safe functioning of such systems, especially in safety-critical settings. In general, thorough design verification of such systems is not feasible, and testing does not guarantee complete correctness. Runtime verification, or monitoring, is a complementary approach that also provides an additional layer of safety. Runtime verification of such systems is challenging because the underlying state of the system is not always observable and the state variables of the system include both continuous and discrete variables. Furthermore, the behavior of such systems is probabilistic in nature, due to noise in the sensors and to other uncertainties (such as failures) that are modeled probabilistically. We consider the problem of monitoring such a system by observing its outputs to determine whether the underlying computation of the system satisfies a given temporal property. We will outline the challenges and possible approaches for monitoring temporal properties of such systems.
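
As a concrete, highly simplified illustration of runtime monitoring (this sketch is not from the talk), the Python example below checks a single bounded-response temporal property over a stream of observed outputs. It assumes a fully observable, non-probabilistic trace, whereas the talk addresses the harder setting of hidden state and probabilistic behavior; the event names and the deadline K are hypothetical.

# Illustrative sketch only: a runtime monitor for the bounded-response property
# "every 'request' output is followed by a 'response' within K subsequent outputs".
# It assumes a fully observable, noise-free trace; real CPS monitors must also cope
# with hidden state and probabilistic behavior. All names are hypothetical.

K = 3  # hypothetical response deadline, measured in observed outputs

def monitor(outputs):
    # Returns "violation" as soon as a request goes unanswered for K outputs,
    # otherwise "ok so far" after the whole trace has been consumed.
    pending = None                      # outputs remaining for the oldest open request
    for event in outputs:
        if pending is not None:
            if event == "response":
                pending = None          # request answered in time
            else:
                pending -= 1
                if pending < 0:
                    return "violation"
        if event == "request" and pending is None:
            pending = K                 # start the deadline for this request
    return "ok so far"

print(monitor(["idle", "request", "idle", "response"]))       # -> ok so far
print(monitor(["request", "idle", "idle", "idle", "idle"]))   # -> violation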

Biography:

Dr. Sistla obtained his Ph.D. in Computer Science/Applied Mathematics from Harvard University. Prior to that, he obtained an M.E. degree in Computer Science from the Indian Institute of Science, Bangalore, India. He is currently a Professor in the Department of Computer Science at the University of Illinois at Chicago. Prof. Sistla has done extensive research in the areas of model checking, formal methods, and database systems. He published some of the earliest papers employing model-checking-based techniques in the verification of concurrent systems. Dr. Sistla has served on the editorial boards of leading computer science journals and on the program committees of many important computer science conferences. His research has been funded by leading organizations such as NSF, AFOSR, and DARPA.
