Theoretical Foundations for Software
 

Theory work helps enhance our understanding of basic principles. Much interest has developed around the theoretical foundations of software, which have direct and indirect implications for cyber security. The research cited here appeared in 2014 and covers such topics as malware propagation and mutant measurements.

 

Shigen Shen; Hongjie Li; Risheng Han; Vasilakos, A.V.; Yihan Wang; Qiying Cao, "Differential Game-Based Strategies for Preventing Malware Propagation in Wireless Sensor Networks," Information Forensics and Security, IEEE Transactions on, vol. 9, no. 11, pp. 1962-1973, Nov. 2014. doi: 10.1109/TIFS.2014.2359333 Wireless sensor networks (WSNs) are prone to propagating malware because of special characteristics of sensor nodes. Considering the fact that sensor nodes periodically enter sleep mode to save energy, we develop traditional epidemic theory and construct a malware propagation model consisting of seven states. We formulate differential equations to represent the dynamics between states. We view the decision-making problem between system and malware as an optimal control problem; therefore, we formulate a malware-defense differential game in which the system can dynamically choose its strategies to minimize the overall cost whereas the malware intelligently varies its strategies over time to maximize this cost. We prove the existence of the saddle-point in the game. Further, we attain optimal dynamic strategies for the system and malware, which are bang-bang controls that can be conveniently operated and are suitable for sensor nodes. Experiments identify factors that influence the propagation of malware. We also determine that optimal dynamic strategies can reduce the overall cost to a certain extent and can suppress the malware propagation. These results support a theoretical foundation to limit malware in WSNs.

Keywords:  bang-bang control; differential games; invasive software; telecommunication control; telecommunication security; wireless sensor networks; WSN; bang-bang controls; decision-making problem; differential equations; differential game-based strategy; malware propagation model; malware propagation prevention; malware-defense differential game; optimal control problem; optimal dynamic strategy; overall cost minimization; saddle-point; sensor node characteristics; sleep mode; traditional epidemic theory; wireless sensor networks; Control systems; Games; Grippers; Malware; Silicon; Wireless sensor networks; Differential game; Malware propagation; epidemic theory; wireless sensor networks (ID#: 15-3659)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6905838&isnumber=6912034
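The flavor of the approach, epidemic dynamics suppressed by a bang-bang defense control, can be sketched numerically. The toy model below is purely illustrative: a two-state susceptible/infected system with Euler integration and a threshold-switched control, not the paper's seven-state model or its game-theoretic solution, and every parameter value is an assumption.

```python
# Illustrative sketch only: SIS-style malware spread with a bang-bang defense
# control that applies either full cleaning effort or none.

def simulate(beta=0.5, gamma=0.1, u_max=0.4, threshold=0.2,
             i0=0.01, dt=0.01, steps=4000):
    """Euler-integrate susceptible (s) and infected (i) fractions; the defense
    switches its extra recovery effort between 0 and u_max on a threshold."""
    s, i = 1.0 - i0, i0
    peak = i
    for _ in range(steps):
        u = u_max if i >= threshold else 0.0   # bang-bang control
        new_infections = beta * s * i
        recoveries = (gamma + u) * i           # cleaned nodes become susceptible again
        s += dt * (recoveries - new_infections)
        i += dt * (new_infections - recoveries)
        peak = max(peak, i)
    return i, peak

if __name__ == "__main__":
    _, peak_plain = simulate(u_max=0.0)        # no defense
    _, peak_ctrl = simulate()                  # bang-bang defense
    print(peak_plain, peak_ctrl)
```

With these illustrative parameters the uncontrolled epidemic settles near an endemic level of 0.8, while the controlled run holds the infected fraction near the switching threshold, the qualitative effect the paper's experiments report.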

 

Baraldi, A.; Boschetti, L.; Humber, M.L., "Probability Sampling Protocol for Thematic and Spatial Quality Assessment of Classification Maps Generated From Spaceborne/Airborne Very High Resolution Images," Geoscience and Remote Sensing, IEEE Transactions on, vol. 52, no. 1, pp. 701-760, Jan. 2014. doi: 10.1109/TGRS.2013.2243739 To deliver sample estimates provided with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary but not sufficient conditions: 1) All inclusion probabilities must be greater than zero in the target population to be sampled. If some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed. 2) The inclusion probabilities must be: a) knowable for nonsampled units and b) known for those units selected in the sample: since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas, if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne very high resolution images, where: 1) an original Categorical Variable Pair Similarity Index (proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and 2) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines.
Like a decision-tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session, the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps, and in agreement with theoretical expectations, visual (qualitative) evidence and quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ by related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with OQIs claimed for SIAM™ by related works, make the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems initiative and the QA4EO international guidelines.

Keywords: decision trees; geographic information systems; geophysical image processing; image classification; measurement uncertainty; probability; quality assurance; remote sensing; sampling methods; DigitalGlobe; Global Earth Observation System of Systems; QA4EO international guidelines; Quality Assurance Framework for Earth Observation guidelines; QuickBird-2 image; SIAM preclassification maps; Satellite Image Automatic Mapper; WorldView-2 images; categorical variable pair similarity index; decision-tree; inclusion probability; measurement uncertainty; probability sampling protocol; procedural knowledge; quality assessment; spaceborne/airborne very high resolution images; structural knowledge; subsymbolic object-based spatial quality indicators; symbolic pixel-based thematic quality indicators; thematic maps; Accuracy; Earth; Estimation; Guidelines; Indexes; Protocols; Spatial resolution; Contingency matrix; error matrix; land cover change (LCC) detection; land cover classification; maps comparison; nonprobability sampling; ontology; overlapping area matrix (OAMTRX); probability sampling; quality indicator of operativeness (OQI); spatial quality indicator (SQI); taxonomy; thematic quality indicator (TQI) (ID#: 15-3660)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6479283&isnumber=6675822
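The abstract's observation that "the inclusion probability determines the weight attached to each sampling unit" is the classical Horvitz-Thompson construction from survey sampling. A minimal generic sketch (standard sampling arithmetic, not the paper's protocol; the numbers are illustrative):

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Unbiased estimate of a population total: each sampled unit is weighted
    by the inverse of its inclusion probability. Condition 1 of the abstract
    (all inclusion probabilities strictly positive) is checked explicitly."""
    if any(p <= 0 for p in inclusion_probs):
        raise ValueError("every unit must have a positive inclusion probability")
    return sum(y / p for y, p in zip(sample_values, inclusion_probs))

# Simple random sampling of n = 5 units from N = 10 gives pi = n/N = 0.5 for
# every unit, so the estimator reduces to N times the sample mean.
estimate = horvitz_thompson_total([1, 3, 5, 7, 9], [0.5] * 5)
```

If any inclusion probability were zero or unknown, the weights would be undefined, which is exactly why the abstract lists positivity and knowability as necessary conditions.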

 

Cardoso, L.S.; Massouri, A.; Guillon, B.; Ferrand, P.; Hutu, F.; Villemaud, G.; Risset, T.; Gorce, J.-M., "CorteXlab: A Facility For Testing Cognitive Radio Networks In A Reproducible Environment," Cognitive Radio Oriented Wireless Networks and Communications (CROWNCOM), 2014 9th International Conference on, pp. 503-507, 2-4 June 2014. While many theoretical and simulation works have highlighted the potential gains of cognitive radio, several technical issues still need to be evaluated from an experimental point of view. Deploying complex heterogeneous system scenarios is tedious, time consuming and hardly reproducible. To address this problem, we have developed a new experimental facility, called CorteXlab, that allows complex multi-node cognitive radio scenarios to be easily deployed and tested by anyone in the world. Our objective is not to design new software defined radio (SDR) nodes, but rather to provide comprehensive access to a large set of high performance SDR nodes. The CorteXlab facility offers a 167 m² electromagnetically (EM) shielded room and integrates a set of 24 universal software radio peripherals (USRPs) from National Instruments, 18 PicoSDR nodes from Nutaq and 42 IoT-Lab wireless sensor nodes from Hikob. CorteXlab is built upon the foundations of the SensLAB testbed and is based on the free and open-source toolkit GNU Radio. Automation in scenario deployment, experiment start, stop and results collection is performed by an experiment controller, called Minus. CorteXlab is in its final stages of development and is already capable of running test scenarios. In this contribution, we show that CorteXlab is able to easily cope with the usual issues faced by other testbeds, providing a reproducible experiment environment for CR experimentation.

Keywords: Internet of Things; cognitive radio; controllers; electromagnetic shielding; software radio; testing; wireless sensor networks; CorteXlab facility; Hikob; IoT-Lab wireless sensor nodes; Minus; National Instruments; Nutaq; PicoSDR nodes; SDR nodes; SensLAB; cognitive radio networks; complex heterogeneous system scenarios; complex multinode cognitive radio scenarios; controller; electromagnetically shielded room; open-source toolkit GNU Radio; reproducible environment; software defined radio; testing facility; universal software radio peripherals; Cognitive radio; Field programmable gate arrays; Interference; MIMO; Orbits; Wireless sensor networks (ID#: 15-3661)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6849736&isnumber=6849647

 

Chang, Lichen, "Convergence of Physical System And Cyber System Modeling Methods For Aviation Cyber Physical Control System," Information and Automation (ICIA), 2014 IEEE International Conference on, pp. 542-547, 28-30 July 2014. doi: 10.1109/ICInfA.2014.6932714 Recent attention to aviation cyber physical systems (ACPS) is driven by the need for seamless integration of the design disciplines that dominate physical world and cyber world convergence. System convergence is a major obstacle to good ACPS design, owing to the lack of an adequate scientific theoretical foundation for the subject. The absence of a good understanding of the science of aviation system convergence is not due to neglect, but rather due to its difficulty. Most complex aviation system builders have abandoned any science or engineering discipline for system convergence; they simply treat it as a management problem. Aviation system convergence is almost totally absent from software engineering and engineering curricula. Hence, system convergence is particularly challenging in ACPS, where fundamentally different physical and computational design concerns intersect. In this paper, we propose an integrated approach to system convergence of aviation cyber physical systems based on multiple dimensions, views, paradigms and tools. This model-integrated development approach addresses the development needs of cyber physical systems through the pervasive use of models: the physical world and the cyber world can be specified and modeled together, converged entirely, and their models integrated seamlessly. The effectiveness of the approach is illustrated by means of one practical case study: specifying and modeling aircraft systems.
In this paper, we specify and model aviation cyber-physical systems by integrating Modelica, ModelicaML and the Architecture Analysis & Design Language (AADL): the physical world is modeled with Modelica and ModelicaML, and the cyber part with AADL and ModelicaML.

Keywords: Aerospace control; Aircraft; Analytical models; Atmospheric modeling; Convergence; Mathematical model; Unified modeling language; AADL; Aviation Cyber Physical System; Dynamic Continuous Features; Modelica; Modelicaml; Spatial-Temporal Features (ID#: 15-3662)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6932714&isnumber=6932615

 

Hummel, M., "State-of-the-Art: A Systematic Literature Review on Agile Information Systems Development," System Sciences (HICSS), 2014 47th Hawaii International Conference on, pp. 4712-4721, 6-9 Jan. 2014. doi: 10.1109/HICSS.2014.579 Principles of agile information systems development (ISD) have attracted the interest of practice as well as research. The goal of this literature review is to validate, update and extend previous reviews in terms of the general state of research on agile ISD. Besides including categories such as the employed research methods and data collection techniques, the importance of theory is highlighted by evaluating the theoretical foundations and contributions of former studies. Since agile ISD is rooted in the IS as well as software engineering discipline, important outlets of both disciplines are included in the search process, resulting in 482 investigated papers. The findings show that quantitative studies and the theoretical underpinnings of agile ISD are lacking. Extreme Programming is still the most researched agile ISD method, and more efforts on Scrum are needed. In consequence, multiple research gaps that need further research attention are identified.

Keywords: software prototyping; Scrum; agile ISD; agile information systems development; data collection techniques; extreme programming; software engineering discipline; Abstracts; Data collection; Interviews; Programming; Systematics; Testing (ID#: 15-3663)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6759181&isnumber=6758592

 

Ammann, P.; Delamaro, M.E.; Offutt, J., "Establishing Theoretical Minimal Sets of Mutants," Software Testing, Verification and Validation (ICST), 2014 IEEE Seventh International Conference on, pp. 21-30, March 31-April 4, 2014. doi: 10.1109/ICST.2014.13 Mutation analysis generates tests that distinguish variations, or mutants, of an artifact from the original. Mutation analysis is widely considered to be a powerful approach to testing, and hence is often used to evaluate other test criteria in terms of mutation score, which is the fraction of mutants that are killed by a test set. But mutation analysis is also known to provide large numbers of redundant mutants, and these mutants can inflate the mutation score. While mutation approaches broadly characterized as reduced mutation try to eliminate redundant mutants, the literature lacks a theoretical result that articulates just how many mutants are needed in any given situation. Hence, there is, at present, no way to characterize the contribution of, for example, a particular approach to reduced mutation with respect to any theoretical minimal set of mutants. This paper's contribution is to provide such a theoretical foundation for mutant set minimization. The central theoretical result of the paper shows how to efficiently minimize mutant sets with respect to a set of test cases. We evaluate our method with a widely-used benchmark.

Keywords: minimisation; program testing; set theory; mutant set minimization; mutation analysis; mutation score; redundant mutants; test cases; Benchmark testing; Computational modeling; Context; Electronic mail; Heuristic algorithms; Minimization; Mutation testing; dynamic subsumption; minimal mutant sets (ID#: 15-3664)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6823862&isnumber=6823846
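The redundancy the paper targets can be illustrated with a toy kill matrix: if every test that kills mutant A also kills mutant B, then any test set that kills A necessarily kills B, so B adds nothing. A hedged sketch of minimization under that subset ordering (the data and function are illustrative, not the paper's algorithm):

```python
def minimal_mutants(kills):
    """Given a kill matrix {mutant: set of tests that kill it}, return one
    representative per minimal kill set. A mutant whose kill set strictly
    contains another mutant's kill set is redundant; equivalent mutants
    (killed by no test) are dropped."""
    reps = {}
    for mutant, tests in kills.items():
        ts = frozenset(tests)
        if ts:                               # skip equivalent mutants
            reps.setdefault(ts, mutant)      # one representative per distinct set
    return sorted(m for ts, m in reps.items()
                  if not any(other < ts for other in reps))

# m2's kill set {t1} is contained in m1's and m3's, so those are redundant;
# m4 is kept because no other kill set is a strict subset of {t2, t3}.
kills = {"m1": {"t1", "t2"}, "m2": {"t1"}, "m3": {"t1", "t2", "t3"},
         "m4": {"t2", "t3"}, "m5": set()}
```

A test set that kills the surviving representatives kills every other mutant in the matrix, which is the sense in which the retained mutants are "minimal" with respect to the given test cases.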

 

Achouri, A.; Hlaoui, Y.B.; Jemni Ben Ayed, L., "Institution Theory for Services Oriented Applications," Computer Software and Applications Conference Workshops (COMPSACW), 2014 IEEE 38th International, pp. 516-521, 21-25 July 2014. doi: 10.1109/COMPSACW.2014.86 In this paper, we present our approach for the transformation of workflow applications based on institution theory. The workflow application is modeled with a UML Activity Diagram (UML AD); then, for formal verification purposes, the graphical model is translated into an Event-B specification. Institution theory is used at two levels. First, we define local semantics for UML AD and Event-B specifications using a categorical description of each. Second, we define an institution comorphism to link the two institutions. The theoretical foundations of our approach can thus be studied within a single mathematical framework. The resulting Event-B specification, after applying the transformation approach, is used for the formal verification of functional properties and of the absence of problems such as deadlock. Additionally, via the institution comorphism, we define the semantic correctness and coherence of the model transformation.

Keywords: Unified Modeling Language; diagrams; formal specification; formal verification; programming language semantics; software engineering; UML AD; UML activity diagram; Event-B specification; formal verification; graphical model; institution comorphism; institution theory; local semantics; semantic correctness; service oriented applications; workflow applications; Context; Grammar; Manganese; Semantics; Syntactics; System recovery; Unified modeling language; Event-B; Formal semantics; Institution theory; Model transformation; UML Activity Diagram (ID#: 15-3665)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6903182&isnumber=6903069

 

Lerchner, H.; Stary, C., "An Open S-BPM Runtime Environment Based on Abstract State Machines," Business Informatics (CBI), 2014 IEEE 16th Conference on, vol. 1, pp. 54-61, 14-17 July 2014. doi: 10.1109/CBI.2014.24 The paradigm shift from traditional BPM to subject-oriented BPM (S-BPM) rests on identifying independently acting subjects. As such, they can perform arbitrary actions on arbitrary objects. Abstract State Machines (ASMs) work on a similar basis. Exploring their capabilities with respect to representing and executing S-BPM models strengthens the theoretical foundations of S-BPM, and thus the validity of S-BPM tools. Moreover, it enables coherent intertwining of business process modeling with the execution of S-BPM representations. In this contribution we introduce the framework and roadmap for exploring the ASM approach in the context of S-BPM. We also report the major result, namely the implementation of an executable workflow engine with an Abstract State Machine interpreter based on an existing abstract interpreter model for S-BPM (applying the ASM refinement concept). This workflow engine serves as a baseline and reference implementation for further language and processing developments, such as simulation tools, as it has been developed within the Open-S-BPM initiative.

Keywords: business data processing; finite state machines; program interpreters; workflow management software; ASM approach; Open S-BPM runtime environment; S-BPM model; S-BPM tools; abstract interpreter model; abstract state machine interpreter; business process modeling; executable workflow engine; subject-oriented BPM; Abstracts; Analytical models; Business; Engines; Mathematical model; Semantics; Abstract State Machine; CoreASM; Open-S-BPM; Subject-oriented Business Process Management; workflow engine (ID#: 15-3670)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6904137&isnumber=6904121
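The ASM execution cycle such an engine builds on, firing all enabled guarded updates simultaneously and stopping when a step changes nothing, can be sketched generically. This is textbook ASM semantics, not the paper's CoreASM-based interpreter, and it assumes each step's update set is consistent (no two rules writing conflicting values):

```python
def run_asm(state, rules, max_steps=1000):
    """Run a basic Abstract State Machine. Each step collects the updates of
    every rule whose guard holds on the current state, then applies them all
    at once; execution halts at a fixpoint (a step that changes nothing)."""
    state = dict(state)
    for _ in range(max_steps):
        updates = {}
        for guard, update in rules:
            if guard(state):
                updates.update(update(state))
        if all(state.get(k) == v for k, v in updates.items()):
            return state                     # fixpoint reached
        state.update(updates)
    raise RuntimeError("no fixpoint reached within max_steps")

# A one-rule machine that counts to 3 and halts.
counter_rules = [(lambda s: s["n"] < 3, lambda s: {"n": s["n"] + 1})]
```

Guards and updates are plain functions of the state, which mirrors how an S-BPM subject's behavior can be expressed as guarded transitions over its local state.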

 

Poberezhskiy, Y.S.; Poberezhskiy, G.Y., "Impact of the Sampling Theorem Interpretations on Digitization and Reconstruction in SDRs and CRs," Aerospace Conference, 2014 IEEE, pp. 1-20, 1-8 March 2014. doi: 10.1109/AERO.2014.6836423 Sampling and reconstruction (S&R) are used in virtually all areas of science and technology. The classical sampling theorem is the theoretical foundation of S&R. However, for a long time, only sampling rates and ways of representing sampled signals were derived from it. The fact that the design of S&R circuits (SCs and RCs) is based on a certain interpretation of the sampling theorem was mostly forgotten. The traditional interpretation of this theorem was selected at the time of the theorem's introduction because it offered the only feasible way of realizing S&R then. At that time, its drawbacks did not manifest themselves. By now, this interpretation has largely exhausted its potential and inhibits further progress in the field. This tutorial expands the theoretical foundation of S&R. It shows that the traditional, indirect interpretation can be replaced by a direct one, or by various combinations of the direct and indirect interpretations, enabling the development of novel SCs and RCs (NSCs and NRCs) with advanced properties. The tutorial explains the basic principles of NSC and NRC design, their advantages, as well as the theoretical problems and practical challenges of their realization. The influence of NSCs and NRCs on the architectures of SDRs and CRs is also discussed.

Keywords: analogue-digital conversion; cognitive radio; signal reconstruction; signal representation; signal sampling; software radio; CR; NRC design; NSC design; S&R circuits; SDR; cognitive radio; sampled signal representation; sampling and reconstruction; sampling rates; sampling theorem interpretation; software defined radio; Band-pass filters; Bandwidth; Barium; Baseband; Digital signal processing; Equations; Interference (ID#: 15-3671)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6836423&isnumber=6836156
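The classical theorem at issue states that a signal bandlimited to B Hz is fully determined by uniform samples taken at a rate above 2B, and is recovered by sinc interpolation. A minimal numeric sketch of that traditional reconstruction (truncated to a finite record, so it is only approximate away from the sample instants; the signal and rates are illustrative):

```python
import math

def sinc(u):
    """Normalized sinc: sin(pi*u) / (pi*u), with sinc(0) = 1."""
    return 1.0 if u == 0.0 else math.sin(math.pi * u) / (math.pi * u)

def reconstruct(samples, T, t):
    """Whittaker-Shannon interpolation from uniform samples x[n] = x(n*T)."""
    return sum(x * sinc((t - n * T) / T) for n, x in enumerate(samples))

# A 1 Hz tone sampled at 8 Hz (well above the 2 Hz Nyquist rate).
T = 0.125
samples = [math.sin(2 * math.pi * n * T) for n in range(400)]
```

At a sample instant the interpolation returns the sample exactly (all other sinc terms vanish); between samples, a mid-record point is recovered to within the truncation error of the finite sum.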

Chen, Qingyi; Kang, Hongwei; Zhou, Hua; Sun, Xingping; Shen, Yong; Jin, YunZhi; Yin, Jun, "Research on Cloud Computing Complex Adaptive Agent," Service Systems and Service Management (ICSSSM), 2014 11th International Conference on, pp. 1-4, 25-27 June 2014. doi: 10.1109/ICSSSM.2014.6943342 It has gradually been realized in industry that the increasing complexity of cloud computing, arising from the interaction of technology, business, society and the like, cannot simply be resolved by research on information technology alone; it must be explained and studied from a systematic and scientific perspective on the basis of the theory and methods of complex adaptive systems (CAS). Addressing basic problems in the CAS theoretical framework, this article investigates the definition of an active adaptive agent constituting the cloud computing system, and proposes a service-agent concept and basic model through commonality abstraction at two basic levels, cloud computing technology and business, thus laying a foundation for further research on cloud computing complexity as well as for multi-agent-based cloud computing environment simulation.

Keywords: Adaptation models; Adaptive systems; Business; Cloud computing; Complexity theory; Computational modeling; Economics; cloud computing; complex adaptive system; service agent (ID#: 15-3672)

URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6943342&isnumber=6874015
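At its simplest, the multi-agent simulation the article motivates is a loop of agents that repeatedly observe their environment and adapt. The service agent below, which nudges its offered capacity toward observed demand, is entirely illustrative: the class, names and dynamics are assumptions for the sketch, not the article's model.

```python
class ServiceAgent:
    """Toy adaptive agent: adjusts its offered capacity toward observed demand."""

    def __init__(self, capacity=10.0, rate=0.5):
        self.capacity = capacity
        self.rate = rate                     # adaptation speed, in (0, 1]

    def step(self, demand):
        # move a fixed fraction of the remaining gap each round
        self.capacity += self.rate * (demand - self.capacity)

def simulate(agents, demands, rounds=50):
    """Run each agent against its own demand stream for a number of rounds."""
    for _ in range(rounds):
        for agent, demand in zip(agents, demands):
            agent.step(demand)
    return [agent.capacity for agent in agents]
```

Each agent converges geometrically to its demand level; richer CAS behavior would come from letting agents observe and react to one another rather than to a fixed signal.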


Note:

Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to news@scienceofsecurity.net for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.