
Theory of Modeling and Simulation

By: , Posted on: July 27, 2018

The centrality of Deep Learning to AI is reaching its zenith, with a growing range of applications ready to fundamentally restructure information and communication technologies. However, there is also a developing realization that while deep learning is a very efficient way of fitting simplified neural models to voluminous data, it lacks the structures to develop and employ causal models the way human intuition tends to work. Models developed for simulation are intrinsically causal in nature, with embedded intricate relationships that enable them to predict and explain the behavior of complex systems. While modeling and simulation (M&S) is an empirical activity, and has been labelled the tool of last resort in the past, a theory has been developing to provide the right conceptual framework for its conduct. This theory will be available to AI developers when progress based on data-centric deep learning plateaus.

A consensus on the fundamental status of the theory of modeling and simulation is emerging: some recognize the need for a theoretical foundation for M&S as a science. Such a foundation is necessary to foster the development of M&S-specific methods and the use of such methods to solve real-world problems faced by practitioners. For others, there is the acknowledgment that certain of the theory's basic distinctions, such as the separation, and inter-relation, of models and simulators, are at least alternatives to be considered in addressing core M&S research challenges.

The third edition of Theory of Modeling and Simulation: Discrete Event & Iterative System Computational Foundations is dedicated to the proposition that the theory of M&S is an essential component, and organizing structure, for the body of knowledge necessary to successfully conduct M&S activities. A prime emphasis of the new edition is the central role of iterative specification of systems, which provides a solid foundation for the computational approach to predicting and explaining complex systems. While earlier editions introduced iterative specification as the common form of specification for unifying continuous and discrete systems, this edition employs it more fundamentally throughout the book.

Recently, one of the authors of “Theory of Modeling and Simulation” gave a brown-bag lunch talk to M&S-related personnel at Mitre Corp., which manages federally funded research and development centers supporting several U.S. government agencies. In the talk, titled “Recent Advances in the Theory of Modeling and Simulation: Implications for Computational Emergence”, Bernard P. Zeigler reviewed recent research results in the theoretical basis of modeling and simulation that are covered in the book. He pointed out how the theory is yielding new insights into computational representations of complex real-world systems.

The theory shows how simulation models can be constructed from iteratively specified components with well-specified input/output couplings. The Discrete Event System Specification (DEVS) formalism provides a well-known instance of such constructions. The theory provides existence conditions under which compositions of component systems form a well-defined system of systems, a central problem for those constructing simulations that bring together M&S assets from diverse sources with different kinds of underlying modeling formalisms.
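
To make the idea concrete, here is a minimal sketch in Python of a DEVS-style atomic component: a processor that holds a job for a fixed service time and then emits it, expressed through the standard time-advance, output, and internal/external transition functions. The class and function names are illustrative assumptions for this post, not the book's notation or an established DEVS library API; a real DEVS simulator would also supply the coordinator that schedules and couples such components.

# Illustrative sketch of a DEVS atomic model (hypothetical names,
# not a real DEVS library API): a processor that holds a job for a
# fixed service time and then emits it as output.

INFINITY = float("inf")

class Processor:
    """Atomic DEVS-style model: accepts a job, processes it, outputs it."""

    def __init__(self, service_time=2.0):
        self.service_time = service_time
        self.phase = "idle"      # state: "idle" or "busy"
        self.job = None

    def time_advance(self):
        # ta(s): how long the model remains in its current state
        return self.service_time if self.phase == "busy" else INFINITY

    def output(self):
        # lambda(s): output emitted just before an internal transition
        return self.job if self.phase == "busy" else None

    def internal_transition(self):
        # delta_int(s): processing finished, return to idle
        self.phase, self.job = "idle", None

    def external_transition(self, elapsed, job):
        # delta_ext(s, e, x): accept a new job if idle (ignore it otherwise)
        if self.phase == "idle":
            self.phase, self.job = "busy", job


# A hand-rolled event sequence standing in for a DEVS simulator/coordinator:
if __name__ == "__main__":
    p = Processor(service_time=2.0)
    p.external_transition(elapsed=0.0, job="job-1")  # input event at t = 0
    t = p.time_advance()                             # next internal event after 2.0
    print(t, p.output())                             # -> 2.0 job-1
    p.internal_transition()

In a coupled model, the coordinator would route this component's output to the input of another component according to the declared couplings, which is exactly the composition construction whose well-definedness the theory addresses.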

Zeigler also discussed implications for the emergence of unanticipated phenomena within the constraints of computation theory. This is a subject of intense interest both to model builders who are trying to avoid negative unintended consequences from composed models and to those seeking to observe truly novel positive behavior from such compositions. Click the links below for the pair of videos, which replicate the talk.

Part I and Part II

Need a copy? Visit elsevier.com and use discount code STC317 at checkout to save up to 30% on your very own copy!

About the author:
Bernard P. Zeigler is a Professor of Electrical & Computer Engineering at the University of Arizona and co-director of the Arizona Center for Integrative Modeling and Simulation. He is the author of numerous books and publications, and a Fellow of both the IEEE and the Society for Modeling and Simulation International. Prof. Zeigler is currently heading a project for the Joint Interoperability Test Command (JITC), where he is leading the design of the future architecture for large distributed simulation events for the Joint Distributed Engineering Plant (JDEP). He is also developing DEVS-methodology approaches for testing mission-thread end-to-end interoperability and combat effectiveness of Defense Department acquisitions and transitions to the Global Information Grid with its Service Oriented Architecture (GIG/SOA).

