
Organized by:
Missouri University of
Science & Technology

Systems Engineering Graduate Program
Smart Engineering Systems Laboratory
600 W. 14th St.
Rolla, MO 65409-0370
Phone: 573-341-6576
Email: complexsystems@mst.edu

2017 Conference Plenary, Luncheon, and Banquet Speakers



Mika Sato-Ilic, University of Tsukuba
J. David Schaffer, Institute for Justice and Well-Being, Binghamton University
Charlie Dagli, Research Staff, Lincoln Laboratory
Fred Highland, University of Maryland, Baltimore County
Charles Catlett, Urban Center for Computation and Data
Nicole Levy, French National Conservatory of Arts and Crafts


 Speakers in alphabetical order: 

  Charlie Dagli, PhD
Research Staff
Lincoln Laboratory, MIT






Dr. Charlie K. Dagli has been a member of the research staff in the Human Language Technology Group at MIT Lincoln Laboratory since January 2010. His primary research interests are in the areas of multimedia understanding, machine learning, and network analysis. Prior to joining Lincoln Laboratory, he held positions at Hewlett-Packard Laboratories, Ricoh Innovations, and State Farm Corporate Research. He was the recipient of the Best Student Paper award at the 2006 ACM International Conference on Image and Video Retrieval and holds three patents for technologies in computer vision and multimedia analysis.

Dr. Dagli received the BS degree from Boston University in 2001, and the MS and PhD degrees from the University of Illinois, Urbana-Champaign, in 2003 and 2009, respectively, all in electrical and computer engineering.


  Charles Catlett, Ph.D.
Urban Center for Computation and Data
Chicago, IL
  Using Data Analytics, Computational Modeling, and Embedded Systems to Understand Cities as Complex Systems


 Urbanization is one of the great challenges and opportunities of this century, inextricably tied to global challenges ranging from climate change to sustainable use of energy and natural resources, and from personal health and safety to accelerating innovation and education. There is a growing science community—spanning nearly every discipline—pursuing research related to these challenges. The availability of urban data has increased over the past few years, in particular through open data initiatives, creating new opportunities for collaboration between academia and local government in areas ranging from scalable data infrastructure to tools for data analytics, along with challenges such as replicability of solutions between cities, integrating and validating data for scientific investigation, and protecting privacy. For many urban questions, however, new data sources will be required with greater spatial and/or temporal resolution, driving innovation in the use of sensors in mobile devices as well as embedding intelligent sensing infrastructure in the built environment. Collectively these data sources also hold promise to begin to integrate computational models associated with individual urban sectors such as transportation, building energy use, or climate. Catlett will discuss the work that Argonne National Laboratory and the University of Chicago are doing in partnership with the City of Chicago and other cities through the Urban Center for Computation and Data, focusing in particular on new opportunities related to embedded systems and experience to date with the Array of Things project in Chicago and partner cities.


Charles Catlett, Ph.D. is the founding director of the Urban Center for Computation and Data, UrbanCCD, which brings social, physical, and computational scientists together with artists, architects, technologists, and policy makers to explore science-based approaches to opportunities and challenges related to the understanding, design, and sustainable operation of cities.  To this end UrbanCCD brings expertise, tools, and resources to bear from computational modeling, data analytics, and embedded systems. He is also a Senior Computer Scientist at Argonne National Laboratory and a Senior Fellow at the Computation Institute of the University of Chicago and Argonne National Laboratory.

From 2007 to 2011 he was the Chief Information Officer at Argonne National Laboratory, and from 2004 to 2007 he was Director of the National Science Foundation's TeraGrid initiative - a nationally distributed supercomputing facility involving fifteen universities and federal laboratories.  From 1999 to 2004 Charlie directed the design and deployment of I-WIRE, a dedicated fiber optic network funded by the State of Illinois, which connects research institutions in the Chicago area and downstate Illinois to support advanced research and education.  

Before joining the University of Chicago and Argonne in 2000, Charlie was Chief Technology Officer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.  Beginning at NCSA’s founding in 1985 he participated in the development of NSFNET, one of several early national networks that evolved into what we now experience as the Internet. During the exponential growth of the web following the release of NCSA’s Mosaic web browser, his team developed and supported NCSA’s scalable web server infrastructure.

Recognized in 2014 as one of Chicago’s “Tech 50” technology leaders by Crain’s Chicago Business, Charlie is a Computer Engineering graduate of the University of Illinois at Urbana-Champaign.


  Fred Highland
Graduate Faculty, Systems Engineering
University of Maryland,
Baltimore County
  Complexity in Engineering Large Scale Censuses and Surveys


A population census is one of the largest and most complex operations a government performs. It requires years of planning, the development of hundreds of systems, and the coordination of thousands of workers to produce results that are critical to running the government and the nation for many years into the future. While seemingly a simple data collection and analysis task, a census or survey must address a number of statistical, social, political, and economic issues, taking it beyond complicated into the realm of complex systems. In addition, economic and social pressures are forcing statistical organizations to conduct less intrusive and more cost-effective surveys by employing generalized survey systems, adaptive survey design, and administrative records, which further increase potential complexity issues. The talk will discuss the census and survey process, the complexity issues that arise in creating and operating large scale national survey systems, the trend toward administrative record usage and adaptive survey design, and the complex system engineering challenges that all of these bring.


Fred Highland is an adjunct instructor in the Systems Engineering program at the University of Maryland, Baltimore County. He has over 38 years of experience in software and systems technology with Lockheed Martin, where he was a Senior Fellow and Master Systems Architect responsible for the architecture and development of complex systems including the NASA Space Shuttle, Census data collection systems, artificial intelligence applications, and big data analytics. His research interests include the application of complex systems to systems engineering and architecture, as well as neurocomputing using polychronous wavefront computing. Mr. Highland received his M.S. in Computer Science from the University of Houston and a B.S. in Computer Science from the University of Rhode Island.

  Nicole Levy
Professor of Software Engineering
French National Conservatory of Arts and Crafts
Paris, France
  Quality-driven Reference Architecture Incremental Design: an Industrial Experience



The objective is to describe a methodology for defining a Reference Architecture that will ease the further development of complex systems in a given domain. The Reference Architecture contains variation points, a first step toward the design of a Software Product Line. Early product quality considerations are taken into account, based on the ISO/IEC 26550 reference model guidelines for Software Product Line engineering, introducing required qualities as variation criteria. A bottom-up strategy is followed, starting from an existing product. The logical view of the Reference Architecture is developed incrementally in a four-step process. The first step shows the architecture with its main functional components. The second step establishes traceability between the functional components and the quality requirements the functionalities need to perform their tasks properly, expressed also as non-functional components or implicit functionalities. This link is maintained in the third step, when components are agglomerated or separated to populate the components of the architectural style. The fourth step presents the Reference Architecture configuration with the variability model, which is defined by considering similarities among the different tasks performed by the functional and non-functional components. An industrial experience in the Human Resources domain is presented: a Vacation Request System that takes different regulations into account.


Prof. Nicole Levy has been a full professor of software engineering at the French National Conservatory of Arts and Crafts (Cnam), Paris, and a member of its Study and Research Centre for Informatics and Communications (Cédric) since September 2010. Prior to this assignment, she was a member of the Computer Science Laboratory (PRiSM) at the University of Versailles Saint-Quentin-en-Yvelines, where she led a research group. She also directed the University of Versailles engineering school, ISTY. She started her academic career as an assistant professor at the University of Nancy 1 and was a member of the Lorraine research laboratory in computer science and its applications (LORIA).

Her research interests include using formal methods to specify complex systems and software architectures. She is mostly interested in development and reconfiguration processes for software architectures based on both functional and non-functional properties.


  Mika Sato-Ilic, PhD
Professor, Faculty of Engineering,
Information and Systems
University of Tsukuba
  Modeling New Complex Data Structures
Monday, October 30, 2017. Morning Plenary, 8:00 am - 9:00 am


As the world advances towards a new era of innovative information within the frame of Cyber Physical Systems, data analysts are under pressure to interpret the unprecedentedly complex and massive data observed. This new complex data structure calls for the development of a methodology for innovative data analysis to extract the efficient latent structure of the data. In the methodology of data analysis, measuring, quantifying, and fusing data are essential; therefore, “space” and “scale” play an essential role in analyzing data. It is from these two perspectives of “space” and “scale” that this presentation will introduce methodologies of data analyses for adapting complex data structures.


Prof. Mika Sato-Ilic is a Professor in the Faculty of Engineering, Information and Systems at the University of Tsukuba. She also holds the position of Vice President of the National Statistics Center, Japan. She is the founding Editor-in-Chief of the International Journal of Knowledge Engineering and Soft Data Paradigms, an Associate Editor of IEEE Transactions on Fuzzy Systems, Neurocomputing, and Information Sciences, and a Regional Editor of the International Journal on Intelligent Decision Technologies, and she serves on the editorial boards of several other journals. She has been a Council member of the International Association for Statistical Computing (a Section of the International Statistical Institute) and is a Senior Member of the IEEE, where she has held several positions, including Vice-Chair of the Fuzzy Systems Technical Committee of the IEEE Computational Intelligence Society; she has also served on several IEEE committees, including the administrative committee, and as program co-chair and special sessions co-chair. Her academic output includes 4 books, 12 book chapters, and over 120 journal and conference papers. Her research interests include the development of methods for data mining, multidimensional data analysis, multi-mode multi-way data theory, pattern classification, and computational intelligence techniques, for which she has received several academic awards.


  J. David Schaffer, Ph.D.
Visiting Research Professor
Institute for Justice and Well-Being
Binghamton University

  A Tail of Three Bio-inspired Computing Paradigms
Tuesday, October 31, 2017. Morning Plenary, 8:00 am - 9:00 am


I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the “mother of all adaptive processes.” Some variants on the basic algorithms will be sketched, along with some lessons we have gleaned from three decades of working with EC. Next come neural networks, computational approaches that have long been studied as possible ways to make “thinking machines”, an old dream of man’s, based upon the only known existing example of intelligence. I will then give a brief overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature’s highly sophisticated defense mechanism, which has emerged in two major stages: the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.
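As a minimal illustration of the first paradigm, a generational evolutionary algorithm can be sketched in a few lines of Python. This is a toy sketch, not anything from the talk itself: the OneMax bit-counting fitness function, truncation selection, and all parameter values below are illustrative assumptions.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

GENOME_LEN = 20     # bits per candidate solution
POP_SIZE = 30       # candidates per generation
GENERATIONS = 50
MUTATION_RATE = 0.05

def fitness(genome):
    # OneMax: count the 1-bits; a classic toy objective for EC demos.
    return sum(genome)

def mutate(genome):
    # Flip each bit independently with small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # One-point crossover: splice a prefix of one parent onto a suffix of the other.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]  # truncation selection: keep the fitter half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children       # survivors plus offspring form the next generation
    return max(pop, key=fitness)

best = evolve()  # population converges toward the all-ones genome
```

Selection, variation, and inheritance are the whole loop; the many EC variants alluded to above differ mainly in how each of those three steps is realized.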


Dr. Schaffer recently retired as a Research Fellow after 25 years with Philips Research. He now advises graduate students and initiates research projects at Binghamton University in the domains of bioinformatics, evolving intelligent machines, and Alzheimer’s disease. He believes that evolutionary computation is one of the most valuable technologies for mastering complexity. Dr. Schaffer holds a B.S. in Aerospace Engineering from Notre Dame, an M.S. in Systems Engineering from Widener University, and a Ph.D. in Electrical Engineering from Vanderbilt. He has published about 100 peer-reviewed papers, serves on the editorial board of the Evolutionary Computation Journal and on the steering committee for the Evolutionary Multi-objective Optimization conference series, and holds forty-three issued US patents. In 2012, he was named a Pioneer in Evolutionary Computation by the IEEE Computational Intelligence Society.