Publications
2021

Benjamin Völker, Philipp M. Scholl, Bernd Becker: A Feature and Classifier Study for Appliance Event Classification. 3rd EAI International Conference on Sustainable Energy for Smart Cities (EAI SESC 2021), Springer, 2021.
Abstract: The shift towards an advanced electricity metering infrastructure gained traction through several smart meter roll-outs during the last decade, which increased the interest in Non-Intrusive Load Monitoring (NILM). Nevertheless, adoption is low, not least because the algorithms cannot simply be integrated into existing smart meters due to the resource constraints of the embedded systems. We evaluated 27 features and four classifiers regarding their suitability for event-based NILM in a standalone and combined feature analysis. Active power was found to be the best scalar feature and Waveform Approximation the best multidimensional feature. We propose the feature set [P, cosPhi, TRI, WFA] in combination with a Random Forest classifier. Together, these lead to F1-scores of up to 0.98 on average across four publicly available datasets. Still, feature extraction and classification remain computationally lightweight and allow processing on resource-constrained embedded systems.

Benjamin Völker: Acquiring, congregating, and processing high-frequency electricity data in distributed environments. PhD thesis, 2021.
Abstract: The energy consumption of a home depends on the behavior of its inhabitants, offering a promising energy-saving potential. However, this potential can only be unfolded to its full extent if the consumption of each individual appliance is known. Non-Intrusive Load Monitoring (NILM) offers a retrofittable way to obtain individual appliance consumption data. If such data are combined with eco-feedback techniques, they can help users better understand their electricity usage and ultimately save energy.
Researching NILM algorithms, and in particular developing the underlying supervised machine-learning techniques, requires adequate datasets with corresponding ground-truth data and methodologies to create more labeled data when needed. Adding detailed labels to such datasets is a time-consuming and error-prone process, and deploying a supervised NILM system typically requires a dedicated training procedure, hampering widespread adoption. This thesis presents several strategies to address these challenges in order to improve the adoption of NILM. In particular, a set of requirements is presented for acquiring and congregating high-frequency electrical measurements in distributed environments. These are handled by a novel recording framework comprised of a central recording director and two prototype Data Acquisition Systems (DAQs), one for aggregated and one for plug-level data. The developed methodologies allow the DAQs to deliver highly accurate and time-synchronized data while using rather inexpensive components. To add precise and descriptive labels to such data, a semi-automatic labeling method is developed and evaluated on two publicly available datasets. The method improves labeling efficiency by up to 74% and has been integrated into a novel labeling tool implemented as a web application. The framework and labeling tool have been used to collect and label the Fully-labeled hIgh-fRequency Electricity Disaggregation (FIRED) dataset. It contains 101 days of 8 kHz aggregated current and voltage measurements of the 3-phase electricity supply of a typical residential apartment in Germany. The data also include synchronized 2 kHz plug-level readings of 21 individual appliances, other environmental sensor measurements, and descriptive event labels for all appliances, resulting in a complete and versatile residential electricity dataset.
Furthermore, several domain-specific features and classifiers are evaluated regarding their suitability for (event-based) NILM targeted at resource-constrained systems. Data-cleaning methods are evaluated which remove the steady-state energy consumption of other appliances from the aggregated data of a given appliance event. As plug-level data deliver less noisy individual appliance data, it is shown that including such data during training yields a performance gain for appliance classification algorithms. Finally, a novel supervised NILM system is proposed and evaluated which uses a combination of aggregated and individual appliance data to improve and aid the training process while requiring only minimal user interaction.

Lukas Mentel, Karsten Scheibler, Felix Winterer, Bernd Becker, Tino Teige: Benchmarking SMT Solvers on Automotive Code. GI/ITG/GMM Workshop "Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen", 2021.
Abstract: Using embedded systems in safety-critical environments requires rigorous testing of the components these systems are composed of. For example, the software running on such a system has to be evaluated regarding its code coverage; in particular, unreachable code fragments have to be avoided according to the ISO 26262 standard. Software model checking makes it possible to detect such dead code automatically. While a recent case study compares several academic software model checkers with the commercial test and verification tool BTC EmbeddedPlatform (BTC EP), we focus on a lower level, i.e. the back-end solvers within BTC EP. Therefore, we evaluate the performance of off-the-shelf SMT solvers supporting the theory of floating-point numbers as well as the theory of bitvectors on floating-point-dominated benchmark instances originating from the automotive domain. Furthermore, we compare these off-the-shelf SMT solvers with the back-end solvers used by BTC EP.
Karsten Scheibler, Felix Winterer, Tobias Seufert, Tino Teige, Christoph Scholl, Bernd Becker: ICP and IC3. Design, Automation and Test in Europe Conference (DATE), 2021.
Abstract: If embedded systems are used in safety-critical environments, they need to meet several standards. For example, in the automotive domain the ISO 26262 standard requires that the software running on such systems does not contain unreachable code. Software model checking is one effective approach to automatically detect such dead code. Being used in a commercial product, iSAT3 already performs very well in this context. In this paper we integrate IC3 into iSAT3 in order to improve its dead-code detection capabilities even further.

Felix Winterer, Tobias Seufert, Karsten Scheibler, Tino Teige, Christoph Scholl, Bernd Becker: ICP and IC3 with Stronger Generalization. GI/ITG/GMM Workshop "Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen", 2021.
Abstract: Most recently, IC3 was integrated into the SMT solver iSAT3. iSAT3+IC3 is thus the first IC3 variant based on interval abstraction and Interval Constraint Propagation (ICP). As strong generalization is one of the key aspects for the IC3 algorithm to be successful, we integrate two additional generalization schemes from the literature into iSAT3+IC3: Inductive Generalization and Counterexamples To Generalization (CTG). Furthermore, we evaluate the benefits and drawbacks of different variants of these methods in the context of interval abstraction and ICP.

Benjamin Völker, Marc Pfeifer, Florian Wolling, Philipp M. Scholl, Bernd Becker: Introducing MILM - A Hybrid Minimal-Intrusive Load Monitoring Approach. The Twelfth ACM International Conference on Future Energy Systems (e-Energy '21), ACM, 2021.
Abstract: The shift towards an advanced electricity metering infrastructure has gained traction because of several smart meter roll-outs. This accelerated research in Non-Intrusive Load Monitoring (NILM) techniques, which highly benefit from the temporal resolution improvements achieved by smart meters. Nevertheless, industrial adoption is low, not least because the achieved disaggregation performance is rather poor for unsupervised approaches. This work sketches a way to utilize intrusive sensors in combination with a standard NILM system to enhance training and maximize the overall system's performance while minimizing the number of required intrusive sensors.

Pascal Pieper, Ralf Wimmer, Gerhard Angst, Rolf Drechsler: Minimally Invasive HW/SW Co-debug Live Visualization on Architecture Level. ACM Press, 2021.
Abstract: We present a tool that allows developers to debug hardware, software, and their interaction in an early design stage. We combine a SystemC virtual prototype (VP) with an easily configurable, interactive graphical user interface and a standard software debugger. The graphical user interface visualizes the internal state of the hardware, while the software debugger monitors and manipulates the state of the software. This co-visualization supports design understanding and live debugging of the HW/SW interaction. We demonstrate its usefulness with a case study in which we debug an OLED display driver running on a RISC-V VP.
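The interval constraint propagation (ICP) at the core of iSAT3, mentioned in the ICP and IC3 entries above, repeatedly narrows variable intervals through the constraints until a fixed point is reached. A minimal, hypothetical sketch of one contraction step for a single constraint z = x + y (the real solver handles a far richer constraint language):

```python
# Minimal sketch of one interval-constraint-propagation (ICP) contraction
# step for the constraint z = x + y, as used (in far more general form)
# by interval-based solvers such as iSAT3. All names are illustrative.

def contract_sum(x, y, z):
    """Narrow the (lo, hi) intervals of x, y, z under the constraint z = x + y.

    Each direction of the constraint prunes one variable:
      z in x + y,  x in z - y,  y in z - x.
    """
    def intersect(a, b):
        lo, hi = max(a[0], b[0]), min(a[1], b[1])
        if lo > hi:
            raise ValueError("empty interval: constraint unsatisfiable")
        return (lo, hi)

    z = intersect(z, (x[0] + y[0], x[1] + y[1]))  # forward propagation
    x = intersect(x, (z[0] - y[1], z[1] - y[0]))  # backward to x
    y = intersect(y, (z[0] - x[1], z[1] - x[0]))  # backward to y
    return x, y, z

# Example: x in [0, 10], y in [0, 10], z in [12, 15]
# pruning tightens x and y to [2, 10]
x, y, z = contract_sum((0, 10), (0, 10), (12, 15))
```

Iterating such contractions over all constraints, interleaved with interval splitting, is the essence of ICP-based solving.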
Tobias Paxian, Pascal Raiola, Bernd Becker: On Preprocessing for Weighted MaxSAT. VMCAI - 22nd International Conference on Verification, Model Checking, and Abstract Interpretation, Springer, volume 12597, pages 556-577, 2021.
Abstract: Both techniques can be used by all MaxSAT solvers, though our focus lies on Pseudo-Boolean-constraint-based MaxSAT solvers. Experimental results show the effectiveness of both techniques on a large set of benchmarks from a hardware security application and from the 2019 MaxSAT Evaluation. In particular, for the hardest of the application benchmarks, the solver Pacose with GBMO and TrimMaxSAT performs best compared to the MaxSAT Evaluation solvers of 2019. For the benchmarks of the 2019 MaxSAT Evaluation, we show that the proposed techniques let the top solver combination solve significantly more instances.

Marc Pfeifer, Benjamin Völker, Sebastian Böttcher, Sven Köhler, Philipp M. Scholl: Teaching Embedded Systems by Constructing an Escape Room. Proceedings of the 52nd ACM Technical Symposium on Computer Science Education (SIGCSE '21), Virtual Event, ACM, 2021.
Abstract: Embedded systems form the basis of many of today's consumer products and industrial applications, and they are increasingly connected. To teach this topic, we created a course with the overarching goal of designing and constructing an automated escape room. This provided the motivation for the students to learn the engineering and soft skills required for building networked embedded systems. The game was open to faculty members and friends of the students after course completion. By splitting the building process into multiple tasks, such as individual puzzles, the presented concept encourages inter- and intra-group work, including conceptualizing, designing, and developing reliable, connected embedded systems.
In this paper we first present the motivation, context, and pedagogical approach of the course. We then describe the course structure and conclude with experiences from constructing an escape room as a group.

Benjamin Völker, Andreas Reinhardt, Anthony Faustine, Lucas Pereira: Watt's up at Home? Smart Meter Data Analytics from a Consumer-Centric Perspective. Energies, volume 14, issue 3, 2021.
Abstract: The key advantage of smart meters over traditional metering devices is their ability to transfer consumption information to remote data-processing systems. Besides enabling the automated collection of a customer's electricity consumption for billing purposes, the data collected by these devices make many novel use cases possible. However, the large majority of such services are tailored to improve the operation of the power grid as a whole. For example, forecasts of household energy consumption or photovoltaic production allow for improved power-plant generation scheduling. Similarly, the detection of anomalous consumption patterns can indicate electricity theft and serve as a trigger for corresponding investigations. Even though customers can directly influence their electrical energy consumption, the range of use cases to the users' benefit remains much smaller than those that benefit the grid in general. In this work, we therefore review the range of services tailored to the needs of end customers. By briefly discussing their technological foundations and their potential impact on future developments, we highlight the great potential of utilizing smart meter data from a user-centric perspective. We furthermore present several open research challenges in this domain, arising from the shortcomings of state-of-the-art data communication and processing methods. We expect their investigation to lead to significant advancements in data-processing services and ultimately raise the customer experience of operating smart meters.
2020

Benjamin Völker, Marc Pfeifer, Philipp M. Scholl, Bernd Becker: A Framework to Generate and Label Datasets for Non-Intrusive Load Monitoring. Energies, 2020.
Abstract: In order to reduce the electricity consumption in our homes, a first step is to make users aware of it. Raising such awareness, however, requires pointing users to the specific appliances that unnecessarily consume electricity. A retrofittable and scalable way to provide appliance-specific consumption is offered by Non-Intrusive Load Monitoring methods. These methods use a single electricity meter to record the aggregated consumption of all appliances and disaggregate it into the consumption of each individual appliance using advanced algorithms, usually based on machine learning. Since these approaches are often supervised, labeled ground-truth data need to be collected in advance. Labeling the on-phases of devices is already a tedious process, but if further information about internal device states is required (e.g., the intensity of an HVAC), manual post-processing quickly becomes infeasible. We propose a novel data collection and labeling framework for Non-Intrusive Load Monitoring. The framework comprises the hardware and software required to record and (semi-automatically) label the data. The hardware setup includes a smart-meter device to record aggregated consumption data and multiple socket meters to record appliance-level data. Labeling is performed in a semi-automatic post-processing step guided by a graphical user interface, which reduced the labeling effort by 72% compared to a manual approach. We evaluated our framework and present the FIRED dataset. The dataset features uninterrupted, time-synced aggregated and individual device voltage and current waveforms with distinct state-transition labels for a total of 101 days.
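Semi-automatic labeling of this kind hinges on detecting appliance events in the power signal, typically with a two-window change test. A simplified, hypothetical Python sketch of such a detector (a basic Gaussian mean-shift log-likelihood ratio with fixed noise level, standing in for the more elaborate detectors used in these works; all names are illustrative):

```python
# Simplified sketch of window-based appliance-event detection on an
# aggregated active-power signal. The statistic is a Gaussian mean-shift
# log-likelihood ratio with a fixed, assumed noise sigma; real detectors
# are more elaborate. All names and parameters are illustrative.

def detect_events(power, window=5, sigma=2.0, threshold=10.0):
    """Return sample indices where the mean power level shifts."""
    events = []
    for i in range(window, len(power) - window):
        pre = power[i - window:i]      # samples before the candidate event
        post = power[i:i + window]     # samples after the candidate event
        mu0 = sum(pre) / window
        mu1 = sum(post) / window
        # LLR of the post-window under "mean changed" vs "mean unchanged"
        llr = window * (mu1 - mu0) ** 2 / (2 * sigma ** 2)
        if llr > threshold:
            events.append(i)
    return events

# Example: a 100 W appliance switching on at sample 10
signal = [5.0] * 10 + [105.0] * 10
events = detect_events(signal)
```

A practical tool would additionally merge the cluster of adjacent detections around each true switching instant into a single labeled event.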
Benjamin Völker, Marc Pfeifer, Philipp M. Scholl, Bernd Becker: Annoticity: A Smart Annotation Tool and Data Browser for Electricity Datasets. 5th International Workshop on Non-Intrusive Load Monitoring, ACM, 2020.
Abstract: The growing demand for eco-feedback and smart living concepts accelerated the development of Non-Intrusive Load Monitoring (NILM) algorithms during the last decade. Comparing and evaluating these algorithms still remains challenging due to the absence of common benchmark datasets and missing best practices for their application. Although multiple datasets were recorded for the purpose of comparing NILM algorithms, many researchers still have to record their own dataset to meet the requirements of their specific application. Adding ground-truth labels to these datasets is a cumbersome and time-consuming process, as it requires an expert to visually inspect all the data manually. Therefore, we propose the Annoticity inspection and labeling tool, which simplifies visualizing and labeling electricity data. We use an event detector based on the log-likelihood ratio test, which achieved an F1 score of 90.07% in our experiments. Preliminary results indicate that the effort of generating event labels is reduced by 80.35% using our tool.

Uwe Wagschal, Thomas Metz, Thomas Waldvogel, Bernd Becker, Linus Feiten, Samuel Weishaupt: Das »Debat-O-Meter« als neues Tool in der E-Partizipation. In: Demokratie im 21. Jahrhundert, Nomos, pages 341-368, 2020. ISBN: 978-3-7489-2150-9.

Benjamin Völker, Marc Pfeifer, Philipp M. Scholl, Bernd Becker: FIRED: A Fully-labeled hIgh-fRequency Electricity Disaggregation Dataset. The 7th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation, Virtual Event, Japan, ACM, 2020.
Abstract: As more and more homes are equipped with smart electricity meters, home owners gain the ability to monitor their total electricity consumption on a daily or hourly basis. Techniques such as load forecasting, load disaggregation, and activity recognition try to provide even better insights into our electricity consumption, highlight saving potential, or improve our daily living. Publicly available datasets are used to develop and evaluate these techniques. We identified a lack of high-frequency, fully labeled electricity datasets in the residential domain and present the FIRED dataset. It contains 52 days of 8 kHz aggregated current and voltage readings of the 3-phase supply of a typical residential apartment in Germany. The dataset also contains synchronized ground-truth data as 2 kHz readings of 21 individual appliances, as well as room temperature readings and fully labeled state changes of the lighting system, resulting in a complete and versatile residential electricity dataset.
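Raw current and voltage waveforms, as shipped by datasets like FIRED, allow deriving per-window quantities such as active power and the power factor (the cosPhi feature named elsewhere in this list). A minimal, illustrative sketch of that computation (pure Python; names are assumptions, not the dataset's API):

```python
import math

# Minimal sketch: derive active power P, apparent power S and the power
# factor (cos phi) from one window of synchronously sampled voltage and
# current, as provided by high-frequency datasets such as FIRED.
# Function and variable names are illustrative.

def power_metrics(voltage, current):
    """Return (P, S, cos_phi) for equally long sample windows."""
    n = len(voltage)
    p = sum(v * c for v, c in zip(voltage, current)) / n   # active power (W)
    v_rms = math.sqrt(sum(v * v for v in voltage) / n)
    i_rms = math.sqrt(sum(c * c for c in current) / n)
    s = v_rms * i_rms                                      # apparent power (VA)
    return p, s, p / s

# Example: one full 50 Hz cycle, current lagging voltage by 60 degrees
n, f, fs = 1000, 50.0, 50000.0
u = [325.0 * math.sin(2 * math.pi * f * k / fs) for k in range(n)]
i = [2.0 * math.sin(2 * math.pi * f * k / fs - math.pi / 3) for k in range(n)]
p, s, cos_phi = power_metrics(u, i)   # cos_phi close to cos(60°) = 0.5
```

For exact results the window should cover an integer number of mains cycles, which is why such pipelines usually align windows to zero crossings.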
Pascal Raiola, Tobias Paxian, Bernd Becker: Minimal Witnesses for Security Weaknesses in Reconfigurable Scan Networks. IEEE European Test Symposium, 2020.

Tobias Paxian, Bernd Becker: Pacose: An Iterative SAT-Based MaxSAT Solver. MaxSAT Evaluation 2020: Solver and Benchmark Descriptions, volume 16, page 12, 2020.

Pascal Raiola, Tobias Paxian, Bernd Becker: Partial (Un-)Weighted MaxSAT Benchmarks: Minimizing Witnesses for Security Weaknesses in Reconfigurable Scan Networks. MaxSAT Evaluation 2020, volume 14, page 44, 2020.

Tobias Paxian, Mael Gay, Devanshi Upadhyaya, Bernd Becker, Ilia Polian: SAT Benchmarks of AutoFault Attacking AES, LED and PRESENT. SAT Competition 2020, volume 20, page 79, 2020.

Leonore Winterer, Sebastian Junges, Ralf Wimmer, Nils Jansen, Ufuk Topcu, Joost-Pieter Katoen, Bernd Becker: Strategy Synthesis for POMDPs in Robot Planning using Game-Based Abstractions. IEEE Transactions on Automatic Control, 2020.
Abstract: We study synthesis problems with constraints in partially observable Markov decision processes (POMDPs), where the objective is to compute a strategy for an agent that is guaranteed to satisfy certain safety and performance specifications. Verification and strategy synthesis for POMDPs are, however, computationally intractable in general. We alleviate this difficulty by focusing on planning applications and exploiting typical structural properties of such scenarios; for instance, we assume that the agent has the ability to observe its own position inside an environment. We propose an abstraction-refinement framework which turns such a POMDP model into a (fully observable) probabilistic two-player game (PG). For the obtained PGs, efficient verification and synthesis tools allow determining strategies with optimal safety and performance measures, which approximate optimal schedulers on the POMDP.
If the approximation is too coarse to satisfy the given specifications, a refinement scheme improves the computed strategies. As a running example, we use planning problems where an agent moves inside an environment with randomly moving obstacles and restricted observability. We demonstrate that the proposed method advances the state of the art by solving problems several orders of magnitude larger than those that can be handled by existing POMDP solvers. Furthermore, this method gives guarantees on safety constraints, which is not supported by the majority of existing solvers.

Leonore Winterer, Ralf Wimmer, Nils Jansen, Bernd Becker: Strengthening Deterministic Policies for POMDPs. Proceedings of the 12th NASA Formal Methods Symposium (NFM), 2020.
Abstract: The synthesis problem for partially observable Markov decision processes (POMDPs) is to compute a policy that satisfies a given specification. Such policies have to take the full execution history of a POMDP into account, rendering the problem undecidable in general. A common approach is to use a limited amount of memory and randomize over potential choices. Yet this problem is still NP-hard and often computationally intractable in practice. A restricted problem is to use neither history nor randomization, yielding policies that are called stationary and deterministic. Previous approaches to compute such policies employ mixed-integer linear programming (MILP). We provide a novel MILP encoding that supports sophisticated specifications in the form of temporal logic constraints and can handle an arbitrary number of such specifications. Yet randomization and memory are often mandatory to achieve satisfactory policies. First, we extend our encoding to deliver a restricted class of randomized policies. Second, based on the results of the original MILP, we employ a preprocessing of the POMDP to encompass memory-based decisions.
The advantages of our approach over state-of-the-art POMDP solvers lie (1) in the flexibility to strengthen simple deterministic policies without losing computational tractability and (2) in the ability to enforce the provable satisfaction of arbitrarily many specifications. The latter point allows taking trade-offs between performance and safety aspects of typical POMDP examples into account. We show the effectiveness of our method on a broad range of benchmarks.

2019

Benjamin Völker, Marc Pfeifer, Philipp M. Scholl, Bernd Becker: A Versatile High Frequency Electricity Monitoring Framework for Our Future Connected Home. EAI International Conference on Sustainable Energy for Smart Cities, Braga, Portugal, Springer, pages 221-231, 2019.
Abstract: In our homes, many devices are powered by electricity without us knowing the specific amount. As electricity production has a large, negative environmental impact, we should be more aware of how devices consume power and how we can adapt our daily routine to decrease our electricity requirements. Methods such as Non-Intrusive Load Monitoring (NILM) can provide the user with precise device-level electricity data by measuring at a single point in a house's electricity network. However, the time resolution of most off-the-shelf power meters is not sufficient for NILM, or the meters are locked down for security reasons. Therefore, we have developed our own versatile energy metering framework, which consists of a high-frequency electricity metering device, a versatile backend for data processing, and a web app for data visualization. The developed hardware is capable of sampling at up to 32 kHz, while the software framework allows extracting other power-related metrics such as harmonic content. The system's applications range from providing transparent electricity usage to the user up to generating load forecasts with fine granularity.
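The harmonic content mentioned as one of the framework's derived metrics can be obtained by evaluating the DFT only at multiples of the mains frequency, which is cheaper than a full FFT when few harmonics are needed. A minimal, illustrative sketch (pure Python; names are assumptions, not the framework's API):

```python
import math

# Minimal sketch: extract harmonic content from a high-frequency current
# waveform with a direct DFT at the harmonic frequencies only. All names
# are illustrative; a production framework would use an FFT or Goertzel.

def harmonic_magnitudes(samples, fs, fundamental, num_harmonics):
    """Return amplitudes of the fundamental and its first harmonics."""
    n = len(samples)
    mags = []
    for h in range(1, num_harmonics + 1):
        freq = h * fundamental
        re = sum(x * math.cos(2 * math.pi * freq * k / fs)
                 for k, x in enumerate(samples))
        im = sum(x * math.sin(2 * math.pi * freq * k / fs)
                 for k, x in enumerate(samples))
        mags.append(2.0 * math.hypot(re, im) / n)  # single-sided amplitude
    return mags

# Example: 50 Hz fundamental (1 A) plus a third harmonic (0.2 A),
# as a non-linear load might draw; ten full cycles at 10 kHz
fs, n = 10000.0, 2000
i = [1.0 * math.sin(2 * math.pi * 50 * k / fs)
     + 0.2 * math.sin(2 * math.pi * 150 * k / fs) for k in range(n)]
mags = harmonic_magnitudes(i, fs, 50.0, 3)
```

Because the window spans an integer number of cycles, the returned amplitudes are exact up to floating-point error.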
Benjamin Völker, Philipp M. Scholl, Bernd Becker: Semi-Automatic Generation and Labeling of Training Data for Non-Intrusive Load Monitoring. e-Energy '19: Proceedings of the Tenth ACM International Conference on Future Energy Systems, ACM, pages 17-23, 2019.
Abstract: User awareness is one of the main drivers to reduce unnecessary energy consumption in our homes. This awareness, however, requires individual energy data of the devices we own. A retrofittable way to get this data is to use Non-Intrusive Load Monitoring methods. Most of these methods are supervised and require collecting labeled ground-truth data in advance. Labeling the on-phases of devices is already a tedious process, but if further information about internal device states is required (e.g., the intensity of an HVAC), manual labeling methods are infeasible. We propose a novel data collection and labeling method for Non-Intrusive Load Monitoring. This method uses intrusive sensors directly connected to the monitored devices. A post-processing step classifies the connected devices into four categories and exposes internal state sequences in a semi-automatic way. We evaluated our labeling method with a sample dataset, comparing the number of recognized events and states and the classified device category. The event detector achieved a total F1 score of 86.52% for devices which show distinct states in their power signal. Using our framework, the overall labeling effort is cut by more than half (to 42%).

Leonie Feldbusch, Felix Winterer, Johannes Gramsch, Linus Feiten, Bernd Becker: Gamification in a Classroom Response System for Academic Teaching. 11th International Conference on Computer Supported Education (CSEDU), 2019.
Abstract: The classroom response system SMILE (SMartphones In LEctures) is regularly used in academic lectures.
Among other features, it enables lecturers to start quizzes that students can answer anonymously on their smartphones. This aims at both activating the students and giving them feedback about their understanding of the current content of the lecture. But even though many students use SMILE at the beginning of a course, the number of active participants tends to decrease as the term progresses. This paper reports the results of a study on incorporating gamification into SMILE to increase the students' motivation and involvement. Game elements such as scores, badges, and a leaderboard have been integrated, paired with a post-processing feature enabling students to repeat SMILE quizzes outside of the lectures. The evaluations show that the gamification approach increased participation in SMILE quizzes significantly.

Marc Pfeifer, Philipp M. Scholl, Rainer Voigt, Bernd Becker: Active Stereo Vision with High Resolution on an FPGA. Proceedings of the 27th IEEE Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM), pages 118-126, 2019.
Abstract: We present a novel FPGA-based active stereo vision system, tailored for use in a mobile 3D stereo camera. For the generation of a single 3D map, the matching algorithm is based on a correlation approach, where multiple stereo image pairs instead of a single one are processed to guarantee an improved depth resolution. To efficiently handle the large amounts of incoming image data, we adapt the algorithm to the underlying FPGA structures, e.g., by making use of pipelining and parallelization. Experiments demonstrate that our approach provides high-quality 3D maps at least three times more energy-efficiently (5.5 fps/W) than comparable approaches executed on CPU and GPU platforms.
Implemented on a Xilinx Zynq-7030 SoC, our system provides a computation speed of 12.2 fps at a resolution of 1.3 megapixels and a 128-pixel disparity search space. As such, it outperforms the currently best passive stereo systems of the Middlebury Stereo Evaluation in terms of speed and accuracy. The presented approach is therefore well suited for mobile applications that require a highly accurate and energy-efficient active stereo vision system.

Christoph Scholl, Jie-Hong Roland Jiang, Ralf Wimmer, Aile Ge-Ernst: A PSPACE Subclass of Dependency Quantified Boolean Formulas and Its Effective Solving. Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI), Honolulu, HI, USA, AAAI Press, 2019.
Abstract: Recently a DQBF proof calculus based on a notion of fork extension, in addition to resolution and universal reduction, was proposed by Rabe in 2017. We show that this calculus is in fact incomplete for general DQBFs, but complete for a subclass of DQBFs in which any two existential variables have either identical or disjoint dependency sets over the universal variables. We further characterize this DQBF subclass as Sigma_3^P-complete in the polynomial-time hierarchy. Essentially using fork extension, a DQBF in this subclass can be converted to an equisatisfiable 3QBF with only a linear increase in formula size. We exploit this conversion for effective solving of this DQBF subclass and point out its potential as a general strategy for DQBF quantifier localization. Experimental results show that the method outperforms state-of-the-art DQBF solvers on a number of benchmarks, including the 2018 DQBF evaluation benchmarks.
Philipp M. Scholl, Benjamin Völker, Bernd Becker, Kristof Van Laerhoven: A multi-media exchange format for time-series dataset curation. In: Human Activity Sensing: Corpus and Applications, Springer International Publishing, pages 111-119, 2019. ISBN: 9783030130015.
Abstract (excerpt): ... can be applied to optimize storage and transmission overhead.

Uwe Wagschal, Thomas Waldvogel, Thomas Metz, Samuel Weishaupt, Linus Feiten, Bernd Becker, Kamal Singh: Alle gegen Alle? Die Mehrpersonendebatte der kleinen Parteien in der Analyse. In: Die Bundestagswahl 2017: Analysen der Wahl-, Parteien-, Kommunikations- und Regierungsforschung, Wiesbaden, pages 461-482, 2019. ISBN: 978-3-658-25050-8.
Abstract: The chapter analyzes a TV debate ahead of the 2017 German federal election with a multi-person panel involving the "small" parties. Based on real-time response and survey data, the chapter formulates five central findings after in-depth analysis: 1) The further apart the debate participants are politically, the more polarized and conflict-laden the debate. 2) Supporters who align themselves with a political camp also rate the representatives of that camp considerably better. 3) Debate participants from other political camps are systematically rated worse. 4) No particular bias introduced by the moderation is found for the debate under study. 5) The direct influence of TV debates on voting decisions is limited.
Ilia Polian, Mael Gay, Tobias Paxian, Matthias Sauer, Bernd Becker: Automatic Construction of Fault Attacks on Cryptographic Hardware Implementations. In: Automated Methods in Cryptographic Fault Analysis, Springer, pages 151-170, 2019.

Steven Carr, Nils Jansen, Ralf Wimmer, Alexandru Constantin Serban, Bernd Becker, Ufuk Topcu: Counterexample-Guided Strategy Improvement for POMDPs using Recurrent Neural Networks. Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI), Macao, China, 2019.
Abstract: We study strategy synthesis for partially observable Markov decision processes (POMDPs). The particular problem is to determine strategies that provably adhere to (probabilistic) temporal logic constraints. This problem is computationally intractable and theoretically hard. We propose a novel method that combines techniques from machine learning and formal verification. First, we train a recurrent neural network (RNN) to encode POMDP strategies. The RNN accounts for memory-based decisions without the need to expand the full belief space of a POMDP. Secondly, we restrict the RNN-based strategy to represent a finite-memory strategy and implement it on a specific POMDP. For the resulting finite Markov chain, efficient formal verification techniques provide provable guarantees against temporal logic specifications. If the specification is not satisfied, counterexamples supply diagnostic information, which we use to improve the strategy by iteratively training the RNN. Numerical experiments show that the proposed method elevates the state of the art in POMDP solving by up to three orders of magnitude in terms of solving times and model sizes.
Maël Gay, Tobias Paxian, Devanshi Upadhyaya, Bernd Becker, Ilia Polian: Hardware-Oriented Algebraic Fault Attack Framework with Multiple Fault Injection Support 2019 Workshop on Fault Diagnosis and Tolerance in Cryptography (FDTC), IEEE, pages: 25 - 32 Abstract: The evaluation of fault attacks on security-critical hardware implementations of cryptographic primitives is an important concern. In this regard, we have created a framework for the automated construction of fault attacks on hardware realizations of ciphers. The framework can be used to quickly evaluate any cipher implementation, including any optimisations. It takes the circuit description of the cipher and the fault model as input. The output of the framework is a set of algebraic equations, such as conjunctive normal form (CNF) clauses, which is then fed to a SAT solver. We consider both attacking an actual implementation of a cipher on a field-programmable gate array (FPGA) platform using a fault injector and the evaluation of an early design of the cipher using idealized fault models. We report the successful application of our hardware-oriented framework to a collection of ciphers, including the advanced encryption standard (AES) and the lightweight block ciphers LED and PRESENT. The corresponding results and a discussion of the impact of different fault models on our framework are presented. Moreover, we report significant improvements compared to similar frameworks, such as speedups and more advanced features. Our framework is the first algebraic fault attack (AFA) tool to evaluate the state-of-the-art ciphers LED-64, PRESENT and full-scale AES using only hardware-oriented structural cipher descriptions.
Aile Ge-Ernst, Christoph Scholl, Ralf Wimmer: Localizing Quantifiers for DQBF 2019 Proceedings of the 2019 Conference on Formal Methods in Computer Aided Design (FMCAD), Clark W. Barrett, Jin Yang Abstract: Dependency quantified Boolean formulas (DQBFs) are a powerful formalism, which subsumes quantified Boolean formulas (QBFs) and allows an explicit specification of dependencies of existential variables on universal variables. Driven by the needs of various applications that can be encoded by DQBFs in a natural, compact, and elegant way, research on DQBF solving has emerged in the past few years. However, most works focus on closed DQBFs in prenex form (where all quantifiers are placed in front of a propositional formula), and non-prenex DQBFs have hardly been studied in the literature. In this paper we provide a formal definition for the syntax and semantics of non-closed non-prenex DQBFs and prove useful properties enabling quantifier localization. Moreover, we make use of our theory by integrating quantifier localization into a state-of-the-art DQBF solver. Experiments with prenex DQBF benchmarks, including those from the QBFEVAL'18 competition, clearly show that quantifier localization pays off in this context. Benjamin Thiemann, Linus Feiten, Pascal Raiola, Bernd Becker, Matthias Sauer: On Integrating Lightweight Encryption in Reconfigurable Scan Networks 2019 IEEE European Test Symposium Abstract: The applicability of the method is demonstrated by an FPGA-based implementation. We report on the performance of the developed instrument wrapper, which is empirically shown to have only a small impact on the workflow in terms of hardware overhead, operational costs and test time overhead. Pascal Raiola, Benjamin Thiemann, Jan Burchard, Ahmed Atteya, Natalia Kapstova, Hans-Joachim Wunderlich, Bernd Becker, Matthias Sauer: On Secure Data Flow in Reconfigurable Scan Networks 2019 Conf.
on Design, Automation and Test in Europe Ralf Wimmer, Christoph Scholl, Bernd Becker: The (D)QBF Preprocessor HQSpre – Underlying Theory and Its Implementation 2019 Journal on Satisfiability, Boolean Modeling and Computation, volume: 11, issue: 1, pages: 3 - 52 Abstract: Preprocessing turned out to be an essential step for SAT, QBF, and DQBF solvers to reduce/modify the number of variables and clauses of the formula before the formula is passed to the actual solving algorithm. These preprocessing techniques often reduce the computation time of the solver by orders of magnitude. In this paper, we present the preprocessor HQSpre, which was developed for Dependency Quantified Boolean Formulas (DQBFs) and which generalizes different preprocessing techniques for SAT and QBF problems to DQBF. We give a presentation of the underlying theory together with detailed proofs as well as implementation details contributing to the efficiency of the preprocessor. HQSpre has been used with great success by the winners of the DQBF track and, even more interestingly, the QBF tracks of QBFEVAL'18. Sebastian Weydner-Volkmann, Linus Feiten: Vertrauensstiftende Videoüberwachung? 2019 digma: Zeitschrift für Datenrecht und Informationssicherheit, volume: 19, issue: 4, pages: 218 - 221 Pascal Raiola, Michael A.
Kochte, Ahmed Atteya, Laura Rodríguez Gómez, Hans-Joachim Wunderlich, Bernd Becker, Matthias Sauer: Detecting and Resolving Security Violations in Reconfigurable Scan Networks 2018 IEEE International Symposium on On-Line Testing and Robust System Design Benjamin Völker, Philipp M Scholl, Bernd Becker: Monitoring - A Hybrid Approach 2018 e-Energy '18 Proceedings of the Ninth International Conference on Future Energy Systems/Karlsruhe, Germany, pages: 436 - 438 Abstract: research directions for hybrid load monitoring systems Ahmed Atteya, Michael Kochte, Matthias Sauer, Pascal Raiola, Bernd Becker, Hans-Joachim Wunderlich: Online Prevention of Security Violations in Reconfigurable Scan Networks 2018 IEEE European Test Symposium Johanna Sepulveda, Damian Aboul-Hassan, Georg Sigl, Bernd Becker, Matthias Sauer: Towards the Formal Verification of Security Properties of a Network-on-Chip Router 2018 IEEE European Test Symposium Felix Neubauer, Jan Burchard, Pascal Raiola, Jochen Rivoir, Bernd Becker, Matthias Sauer: Efficient Generation of Parametric Test Conditions for AMS Chips with an Interval Constraint Solver 2018 IEEE VLSI Test Symposium (VTS'18) Abstract: The characterization of analog-mixed signal (AMS) silicon requires a suitable pattern set able to exercise the parametric operational space to – among other tasks – validate the correct
(specified) working behaviour of the device under test. As experience shows, most of the unexpected problems occur for very specific value combinations of a few test condition variables that were not expected to have an influence. Additionally, restrictions on the operational conditions have to be taken into account. We present a method to efficiently create a set of test conditions to cover such a constrained search space with a user-defined density. First, an initial test condition set is generated using quasirandom Sobol sequences. Secondly, we analyse the test conditions to identify
and fill uncovered areas in the parameter space using the in-house interval constraint solver iSAT3. The applicability of the method is demonstrated by experimental results on a 19-dimensional search space using a realistic set of constraints. Linus Feiten, Karsten Scheibler, Bernd Becker, Matthias Sauer: Using different LUT paths to increase area efficiency of RO-PUFs on Altera FPGAs 2018 TRUDEVICE Workshop, Dresden Abstract: Furthermore, the number of PUF response bits can be increased even further without extra hardware or software if a certain degree of correlation among the response bits is accepted. The degree of correlation is proportional to the growth of response bits, so a design decision can be made. To make this decision, a metric to quantify the degree of correlation is needed, which has so far been mostly neglected in the analyses of PUF implementations. Felix Neubauer, Jan Burchard, Pascal Raiola, Jochen Rivoir, Bernd Becker, Matthias Sauer: High-Coverage AMS Test Space Optimization by Efficient Parametric Test Condition Generation 2018 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Abstract: The characterization of analog-mixed signal (AMS) silicon requires a suitable pattern set able to exercise the parametric operational space to – among other tasks – validate the correct (specified) working behaviour of the device under test. As experience shows, most of the unexpected problems occur for very specific value combinations of a few test condition variables that were not expected to have an influence. Additionally, restrictions on the operational conditions have to be taken into account. We present a method to efficiently create a set of test conditions to cover such a constrained search space with a user-defined density. First, an initial test condition set is generated using quasirandom Sobol sequences.
Secondly, we analyse the test conditions to identify and fill uncovered areas in the parameter space using the in-house interval constraint solver iSAT3. The applicability of the method is demonstrated by experimental results on a 19-dimensional search space using a realistic set of constraints. Uwe Wagschal, Thomas Metz, Thomas Waldvogel, Bernd Becker, Linus Feiten, Samuel Weishaupt: Online real-time-response measurement in real life settings: the Debat-O-Meter 2018 Cologne 20th General Online Research Conference (GOR) Abstract: Political debates have attracted substantial public and academic attention due to their prominent role in election campaigns. Reaching a tremendous number of potential voters, TV debates concentrate the electoral contest into a one-evening format that covers all major topics and main contenders. As "miniature campaigns" they reach many undecided voters and serve as (often the only) source of direct campaign information for voters less sophisticated or attentive to politics. In recent years, Real-Time Response Measurement has gained international importance in political communication research. Leonore Winterer, Sebastian Junges, Ralf Wimmer, Nils Jansen, Ufuk Topcu, Joost-Pieter Katoen, Bernd Becker: Abstraktions-basierte Verifikation von POMDPs im Motion-Planning-Kontext 2018 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” Abstract: We study path-planning problems from robotics and the computation of strategies that provide provable guarantees for safety properties. An agent is to reach a target location with high probability without colliding with an obstacle that moves probabilistically through the space. The problem is made difficult by the fact that the agent can only partially observe its environment. This scenario can be modelled in a natural way as a so-called partially observable Markov decision process (POMDP). Since many interesting properties are undecidable for POMDPs, the work presented here develops a game-based abstraction method to compute good approximations for such path-planning problems and safety properties. Ralf Wimmer, Karina Wimmer, Christoph Scholl, Bernd Becker: Analysis of Incomplete Circuits using Dependency Quantified Boolean Formulas In: Advanced Logic Synthesis 2018, Springer, André Ignacio Reis and Rolf Drechsler (eds.), pages: 151 - 168, ISBN: 978-3-319-67294-6 Christoph Scholl, Ralf Wimmer: Dependency Quantified Boolean Formulas: An Overview of Solution Methods and Applications 2018 Oxford, UK Proceedings of the 21st Int'l Conf. on Theory and Applications of Satisfiability Testing (SAT), Olaf Beyersdorff, Christoph M. Wintersteiger, volume: 10929, pages: 3 - 16 Abstract: Dependency quantified Boolean formulas (DQBFs), as a generalization of quantified Boolean formulas (QBFs), have received considerable attention in research in recent years. Here we give an overview of the solution methods developed for DQBF so far. The exposition is complemented with a discussion of various applications that can be handled with DQBF solving. Pascal Raiola, Michael A.
Kochte, Ahmed Atteya, Laura Rodríguez Gómez, Hans-Joachim Wunderlich, Bernd Becker, Matthias Sauer: Design of Reconfigurable Scan Networks for Secure Data Transmission 2018 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Tobias Paxian, Sven Reimer, Bernd Becker: Dynamic Polynomial Watchdog Encoding for Solving Weighted MaxSAT 2018 International Conference on Theory and Applications of Satisfiability Testing, Springer, volume: 10929, pages: 37 - 53 Abstract: Our experimental results show that our encoding and algorithm are competitive with state-of-the-art encodings as utilized in QMaxSAT (2nd place in the last MaxSAT Evaluation 2017). Our encoding dominates two of the QMaxSAT encodings, and at the same time is able to solve unique instances. We integrated our new encoding into QMaxSAT and adapted the heuristic to choose between the only remaining encoding of QMaxSAT and our approach. This combined version solves 19 (4%) more instances in overall 30% less run time on the benchmark set of the MaxSAT Evaluation 2017. Compared to each encoding of QMaxSAT used in the evaluation, our encoding leads to an algorithm that is on average at least 2x faster. Sebastian Junges, Nils Jansen, Ralf Wimmer, Tim Quatmann, Leonore Winterer, Joost-Pieter Katoen, Bernd Becker: Finite-State Controllers of POMDPs via Parameter Synthesis 2018 Proceedings of the 34th Conference on Uncertainty in Artificial Intelligence (UAI), AUAI Press, pages: 519 - 529 Abstract: We study finite-state controllers (FSCs) for partially observable Markov decision processes (POMDPs) that are provably correct with respect to given specifications. The key insight is that computing (randomised) FSCs on POMDPs is equivalent to (and computationally as hard as) synthesis for parametric Markov chains (pMCs).
This correspondence enables using black-box techniques to compute correct-by-construction FSCs for POMDPs for a wide range of properties. Our experimental evaluation on typical POMDP problems shows that we are competitive with state-of-the-art POMDP solvers. Steven Carr, Nils Jansen, Ralf Wimmer, Jie Fu, Ufuk Topcu: Human-in-the-Loop Synthesis for Partially Observable Markov Decision Processes 2018 Proceedings of the 2018 Annual American Control Conference (ACC), pages: 762 - 69 Abstract: We study planning problems where autonomous agents operate inside environments that are subject to uncertainties and not fully observable. Partially observable Markov decision processes (POMDPs) are a natural formal model to capture such problems. Because of the potentially huge or even infinite belief space in POMDPs, synthesis with safety guarantees is, in general, computationally intractable. We propose an approach that aims to circumvent this difficulty: in scenarios that can be partially or fully simulated in a virtual environment, we actively integrate a human user to control an agent. While the user repeatedly tries to safely guide the agent in the simulation, we collect data from the human input. Via behavior cloning, we translate the data into a strategy for the POMDP. The strategy resolves all nondeterminism and non-observability of the POMDP, resulting in a discrete-time Markov chain (MC). The efficient verification of this MC gives quantitative insights into the quality of the inferred human strategy by proving or disproving given system specifications. For the case that the quality of the strategy is not sufficient, we propose a refinement method using counterexamples presented to the human. Experiments show that by including humans in the POMDP verification loop we improve the state of the art by orders of magnitude in terms of scalability. Yuliya Butkova, Ralf Wimmer, Holger Hermanns: Markov Automata on Discount! 2018 19.
International GI/ITG Conference on Measurement, Modelling and Evaluation of Computing Systems (MMB), Reinhard German et al., volume: 10740 Abstract: Markov automata (MA) are a rich modelling formalism for complex systems combining compositionality with probabilistic choices and continuous stochastic timing. Model checking algorithms for different classes of properties involving probabilities and rewards have been devised for MA, opening up a spectrum of applications in dependability engineering and artificial intelligence, reaching out into economy and finance. In the latter, more general contexts, several quantities of considerable importance are based on the idea of discounting reward expectations, so that the near future is more important than the far future. This paper introduces the expected discounted reward value for MA and develops effective iterative algorithms to quantify it, based on value- as well as policy-iteration. To arrive there, we reduce the problem to the computation of expected discounted rewards and expected total rewards in Markov decision processes. This allows us to adapt well-known algorithms to the MA setting. Experimental results clearly show that our algorithms are efficient and scale to MA with hundreds of thousands of states. Tobias Paxian, Sven Reimer, Bernd Becker: Pacose: An Iterative SAT-based MaxSAT Solver 2018 MaxSAT Evaluation 2018 Jan Horáček, Jan Burchard, Bernd Becker, Martin Kreuzer: Integrating Algebraic and SAT Solvers 2017 International Conference on Mathematical Aspects of Computer and Information Sciences (MACIS) 2017 Isabel Dahlhausen, Tobias Schubert: Von der Teilnehmendenanalyse zur lernförderlichen Gestaltung: Wie können wir E-Learning lebendiger machen? 2017 16.
Internationale ILIAS-Konferenz Jan Burchard, Maël Gay, Ange-Salomé Messeng Ekossono, Jan Horáček, Bernd Becker, Tobias Schubert, Martin Kreuzer, Ilia Polian: Attacks 2017 Fault Diagnosis and Tolerance in Cryptography (FDTC) 2017 Abstract: A prototype of the framework AutoFault, which automatically constructs fault-injection attacks for hardware realizations of ciphers, is presented. AutoFault can be used to quickly evaluate the resistance of security-critical hardware blocks to fault attacks and the adequacy of implemented countermeasures. The framework takes as inputs solely the circuit description of the cipher and the fault(s) and produces an algebraic formula that can be handed over to an external solver. In contrast to previous work, attacks constructed by AutoFault do not incorporate any cipher-specific cryptanalytic derivations, making the framework accessible to users without a cryptographic background. We report the successful application of AutoFault in combination with a state-of-the-art SAT solver to LED-64 and to small-scale AES. To the best of our knowledge, this is the first time that a state-of-the-art cipher (LED-64) was broken by a fault attack with no prior manual cryptanalysis whatsoever. Michael A. Kochte, Matthias Sauer, Laura Rodríguez Gómez, Pascal Raiola, Bernd Becker, Hans-Joachim Wunderlich: Specification and verification of security in reconfigurable scan networks 2017 IEEE European Test Symposium Jan Burchard, Dominik Erb, Sudhakar M. Reddy, Adit D. Singh, Bernd Becker: TSO Tests 2017 IEEE VLSI Test Symposium (VTS'17) Uwe Wagschal, Bernd Becker, Thomas Metz, Thomas Waldvogel, Linus Feiten: Real-time evaluation of political debates at home and abroad with the Debat-O-Meter 2017 Berlin 19th General Online Research Conference (GOR) Abstract: Due to their role in election campaigns, televised debates between political candidates (“TV-Duelle”) have attracted substantial scholarly attention.
A large body of work has used physical feedback devices to measure viewers’ reactions to a debate in real time. However, this approach is limited to lab-based settings, with potential negative ramifications for, e.g., external validity. Therefore, a method freeing researchers from the need to use physical devices is called for. Jan Burchard, Felix Neubauer, Pascal Raiola, Dominik Erb, Bernd Becker: Evaluating the Effectiveness of D-Chains in SAT based ATPG 2017 IEEE Latin American Test Symposium (LATS'17) Abstract: With the ever increasing size of today’s Very-Large-Scale-Integration (VLSI) designs, new approaches for test pattern generation become more and more popular. One of the best known methods is SAT-based automatic test pattern generation (ATPG) which, in contrast to classical structural ATPG, first generates a mathematical representation of the problem in form of a Boolean formula. A specialized solver evaluates this representation to determine the testability of faults and extracts a test pattern in case a satisfying assignment was found. To increase the solving speed, the concept of D-chains was introduced, which adds additional information to the mathematical model. In return, this forces the solver to only consider assignments that might lead to a valid test pattern and thus reduces the search space. With the advent of incremental solving, new concepts like the backward D-chain or, even more recently, an indirect D-chain were introduced. However, none
of the previous publications tried to analyze and evaluate which of these methods is the most beneficial. In this paper we present a thorough investigation of the different D-chain concepts to evaluate which is the best method for different problems. In addition, we propose a new indirect D-chain algorithm with two extensions. Our experimental results show that, depending on the incorporated D-chain, the runtime can be reduced tremendously. Matthias Sauer, Pascal Raiola, Linus Feiten, Bernd Becker, Ulrich Rührmair, Ilia Polian: Sensitized Path PUF: A Lightweight Embedded Physical Unclonable Function 2017 Conf. on Design, Automation and Test in Europe Felix Neubauer, Karsten Scheibler, Bernd Becker, Ahmed Mahdi, Martin Fränzle, Tino Teige, Tom Bienmüller, Detlef Fehrer: Accurate Dead Code Detection in Embedded C Code by Arithmetic Constraint Solving (Extended Abstract) 2017 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” Abstract: Unreachable code fragments in software, despite not having a negative impact on the functional behavior, can have adverse effects in other areas, such as code optimization or coverage-based code validation and certification. Especially the latter is important in industrial, safety-critical environments, where detecting such dead code is a major goal to adjust the coverage of software tests. In the context of embedded systems we focus on C programs which are reactive, control-oriented, and floating-point dominated. Such programs are either automatically generated from Simulink-plus-Stateflow models or hand-written according to coding guidelines. While there are various techniques – e.g. abstract interpretation or counterexample-guided abstraction refinement (CEGAR) – to deal with individual issues of this domain, there is none which can cover all of them.
The AVACS transfer project T1 aims at the combination of these techniques in order to provide accurate and efficient dead code detection. In this paper we present the ideas and goals of the project as well as the current status (including very promising experimental results) and future challenges. Pascal Raiola, Dominik Erb, Sudhakar Reddy, Bernd Becker: Accurate Diagnosis of Interconnect Open Defects based on the Robust Enhanced Aggressor Victim Model 2017 30th International Conference on VLSI Design Abstract: In addition, it allows distinguishing faults at the same interconnect, thus reducing the area that has to be considered for physical failure analysis. Experimental results show the high efficiency of the new method, handling circuits with up to 500,000 non-equivalent faults and considerably increasing the diagnostic resolution. Jan Burchard, Ange-Salomé Messeng Ekossono, Jan Horáček, Maël Gay, Bernd Becker, Tobias Schubert, Martin Kreuzer, Ilia Polian: Algebraic Fault Attacks on Ciphers 2017 RESCUE Workshop on Reliability, Security and Quality at ETS 2017 Jan Burchard, Ange-Salomé Messeng Ekossono, Jan Horáček, Maël Gay, Bernd Becker, Tobias Schubert, Martin Kreuzer, Ilia Polian: Algebraic Fault Attacks on Ciphers 2017 International Verification and Security Workshop (IVSW) 2017 Karsten Scheibler: Applying CDCL to Verification and Test: When Laziness Pays Off, Universität Freiburg / FreiDok Abstract: Condensed into three keywords, this thesis is about CDCL, laziness and iSAT3. The term CDCL is the abbreviation of conflict-driven clause learning, the current state-of-the-art algorithm for finding solutions to a given Boolean formula. For the moment the most important aspect is that CDCL does its job by operating on a set of so-called clauses.
While some of the aforementioned problems can be expressed with a small set of these clauses which fits into the memory of currently available computers, other problems do not allow such an encoding. This is the point at which the term lazy comes into play. Instead of trying to generate all clauses in advance, some form of clever laziness is applied and the creation of a clause is deferred until it is really needed. This makes it possible to tackle more complex problems which could not be solved before. In particular, the solver iSAT3 relies on this technique for solving formulas with Boolean, integer, floating-point and real-valued variables. This solver is the basis of all contributions presented in this thesis. Wolling F, Scholl P, Reindl L, Van Laerhoven K: Combining capacitive coupling with conductive clothes: Towards resource-efficient wearable communication 2017 Proceedings - International Symposium on Wearable Computers, ISWC, volume: Part F1305 Abstract: Traditional intra-body communication approaches mostly rely on either fixed cable joints embedded in clothing, or on wireless radio transmission that tends to reach beyond the body. Situated between these approaches is body-coupled communication, a promising yet less-explored method that transmits information across the user’s skin. We propose a novel body-coupled communication approach that simplifies the physical layer of data transmission via capacitive coupling between wearable systems with conductive fabrics: This layer provides a stable reference potential for the feedback path in proximity to the attached wearables on the human body, to cancel the erratic dependency on the environmental ground, and to increase the communications’ reliability. Evaluation of our prototype shows significant increases in signal quality, due to reduced attenuation and noise.
Requirements on hardware and, subsequently, energy consumption, cost, and implementation effort are reduced as well. Hassan Hatefi, Ralf Wimmer, Bettina Braitling, Luis Maria Ferrer Fioriti, Bernd Becker, Holger Hermanns: Cost vs. Time in Stochastic Games and Markov Automata 2017 Form Asp Comput, volume: 29, issue: 4, pages: 629 - 649 Abstract: Costs and rewards are important tools for analysing quantitative aspects of models like energy consumption and costs of maintenance and repair.
Under the assumption of transient costs, this paper considers the computation of expected cost-bounded rewards and cost-bounded reachability for Markov automata and Markov games. We provide a fixed point characterization of this class of properties under early schedulers. Additionally, we give a transformation to expected time-bounded rewards and time-bounded reachability, which can be computed by available algorithms. We prove the correctness of the transformation and show its effectiveness on a number of Markov automata case studies. Uwe Wagschal, Thomas Waldvogel, Thomas Metz, Bernd Becker, Linus Feiten, Samuel Weishaupt, Kamaljeet Singh: Das TV-Duell und die Landtagswahl in Schleswig-Holstein: Das Debat-O-Meter als neues Instrument der politischen Kommunikationsforschung 2017 Zeitschrift für Parlamentsfragen (ZParl), volume: 48, issue: 3, pages: 594 - 613 Abstract: Televised political debates are the major events of modern election campaigns and increasingly the subject of studies in political science and communication research. Previous research used physical input devices, which confined researchers to laboratory settings with small case numbers and negative effects on the external validity of the measurement. This article presents a study of the 2017 state election in Schleswig-Holstein using the Debat-O-Meter, a new internet-based application for smartphones, tablets and PCs that allows the perception and effects of televised political debates to be analysed with a large number of participants in natural reception situations. It is shown that real-time response (RTR) measurements with more than 850 participants are possible outside the laboratory through new ways of recruitment with the help of media partners.
The analysis shows that challenger Daniel Günther (CDU) won the debate and identifies party identification, the RTR rating, and the candidate assessment in the pre-debate survey as the variables with the greatest explanatory power for the debate victory. Böttcher S, Scholl P, Van Laerhoven K: Detecting process transitions from wearable sensors: An unsupervised labeling approach 2017 ACM International Conference Proceeding Series, volume: Part F1319 Abstract: Authoring protocols for manual tasks such as following recipes, manufacturing processes, or laboratory experiments requires a significant effort. This paper presents a system that estimates individual procedure transitions from the user’s physical movement and gestures recorded with inertial motion sensors. Combined with egocentric or external video recordings, this facilitates efficient review and annotation of video databases. We investigate different clustering algorithms on wearable inertial sensor data recorded on par with video data, to automatically create transition marks between task steps. The goal is to match these marks to the transitions given in a description of the workflow, thus creating navigation cues to browse video repositories of manual work. To evaluate the performance of unsupervised clustering algorithms, the automatically generated marks are compared to human-expert created labels on publicly available datasets. Additionally, we tested the approach on a novel data set in a manufacturing lab environment, describing an existing sequential manufacturing process. Pascal Raiola, Jan Burchard, Felix Neubauer, Dominik Erb, Bernd Becker: Evaluating the Effectiveness of D-chains in SAT-based ATPG and Diagnostic TPG 2017 J Electron Test, volume: 33, issue: 6, pages: 751 - 767 Abstract: The ever increasing size and complexity of today’s Very-Large-Scale-Integration (VLSI) designs requires a thorough investigation of new approaches for the generation of test patterns for both test and diagnosis of faults. SAT-based automatic test pattern generation (ATPG) is one of the most popular methods, where, in contrast to classical structural ATPG methods, first a mathematical representation of the problem in form of a Boolean formula is generated, which is then evaluated by a specialized solver.
If the considered fault is testable, the solver will return a satisfying assignment, from which a test pattern can be extracted; otherwise no such assignment can exist. In order to speed up test pattern generation, the concept of D-chains was introduced by several researchers: supplementary clauses are added to the Boolean formula, reducing the search space and guiding the solver toward the solution. In the past, different variants of D-chains have been developed, such as the backward D-chain or the indirect D-chain. In this work we perform a thorough analysis and evaluation of the D-chain variants for test pattern generation and also analyze the impact of different D-chain encodings on diagnostic test pattern generation. Our experimental results show that, depending on the incorporated D-chain, the runtime can be reduced tremendously.

Jan Burchard, Dominik Erb, Adit D. Singh, Sudhakar M. Reddy, Bernd Becker: Fast and Waveform-Accurate Hazard-Aware SAT-Based TSOF ATPG. 2017, Conference on Design, Automation and Test in Europe. Abstract: unaware tests will fail to detect the targeted opens because of test invalidation – these are reliably detected by our new test generation methodology. Importantly, our approach can also be applied to improve the effectiveness of commercial cell aware tests.

Ralf Wimmer, Andreas Karrenbauer, Ruben Becker, Christoph Scholl, Bernd Becker: From DQBF to QBF by Dependency Elimination. 2017, Melbourne, VIC, Australia, Int'l Conf. on Theory and Applications of Satisfiability Checking, Serge Gaspers and Toby Walsh (eds.), volume 10491, pages 326-343. Abstract: In this paper, we propose the elimination of dependencies to convert a given dependency quantified Boolean formula (DQBF) to an equisatisfiable QBF.
We show how to select a set of dependencies to eliminate such that we arrive at the smallest equisatisfiable QBF, in terms of existential variables, that is achievable using dependency elimination. This approach is improved by taking so-called don't-care dependencies into account, which result from the application of dependency schemes to the formula and can be added to or removed from the formula at no cost. We have implemented this new method in the state-of-the-art DQBF solver HQS. Experiments show that dependency elimination is clearly superior to the previous method using variable elimination.

Ralf Wimmer, Sven Reimer, Paolo Marin, Bernd Becker: HQSpre - An Effective Preprocessor for QBF and DQBF. 2017, Uppsala, Sweden, International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS), Part I, Axel Legay and Tiziana Margaria (eds.), volume 10205, pages 373-390. Abstract: We present our new preprocessor HQSpre, a state-of-the-art tool for simplifying quantified Boolean formulas (QBFs) and the first available preprocessor for dependency quantified Boolean formulas (DQBFs). The latter are a generalization of QBFs, resulting from adding so-called Henkin quantifiers to QBFs. HQSpre applies most of the preprocessing techniques that have been proposed in the literature. It can be used both as a standalone tool and as a library. It is possible to tailor it towards different solver back-ends, e.g., to preserve the circuit structure of the formula when a non-CNF solver back-end is used. Extensive experiments show that HQSpre allows QBF solvers to solve more benchmark instances and is able to decide more instances on its own than state-of-the-art tools. The same impact can be observed in the DQBF domain as well.
Linus Feiten, Matthias Sauer, Bernd Becker: Implementation of Delay-Based PUFs on Altera FPGAs. In: Hardware Security and Trust: Design and Deployment of Integrated Circuits in a Threatened Environment. 2017, Springer International Publishing, pages 211-235, ISBN 978-3-319-44318-8. Abstract: This chapter focuses on the implementation of delay-based PUFs on Altera FPGAs. While there has been a multitude of publications on how to evaluate and refine PUFs, a thorough description of the required "handicrafts" enabling a novice to enter this exciting research field has so far been missing. The methods shared in this chapter are not easily extractable from the available standard documentation, but have been compiled by the authors over a long period of trials and consultations with the Altera user community. Designing a delay-based PUF on FPGAs requires fine-tuning, which often amounts to diverting the automated design tools from their intended use. For example, the device-specific delays for the PUF response generation are generally gathered from circuitry that looks redundant to the bitstream compiler. Therefore, the automatic reduction of such seemingly redundant circuitry must be prevented. The way the circuitry is placed and routed also has a major impact on delay characteristics, so it is necessary to customise this instead of letting the compiler do it automatically. The reader is walked through all necessary steps by means of a running example, enabling them to embark on further experiments on their own. Along the way, the architecture of Altera Cyclone FPGAs is explained and results from the authors' own experimental studies are shared.
Scholl P, Van Laerhoven K: Lessons learned from designing an instrumented lighter for assessing smoking status. 2017, UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers. Abstract: Assessing smoking status from wearable sensors can potentially create novel cessation therapies for the global smoking epidemic, without requiring users to regularly fill in questionnaires to obtain their smoking data. In this paper we discuss several design iterations of an instrumented cigarette lighter that records when it is used, to provide the ground truth for detection from sensor signals measured at the human body, or to provide an alternative low-delay detection mechanism for smoking.

Yuliya Butkova, Ralf Wimmer, Holger Hermanns: Long-run Rewards for Markov Automata. 2017, Uppsala, Sweden, International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS), Part II, Axel Legay and Tiziana Margaria (eds.), volume 10206, pages 188-203. Abstract: Markov automata are a powerful formalism for modelling systems which exhibit nondeterminism, probabilistic choices and continuous stochastic timing. We consider the computation of long-run average rewards, the most classical problem in continuous-time Markov model analysis. We propose an algorithm based on value iteration.
It improves the state of the art by orders of magnitude. The contribution is rooted in a fresh look on Markov automata, namely treating them as an efficient encoding of CTMDPs with, in the worst case, exponentially more transitions.

Leonore Winterer, Sebastian Junges, Ralf Wimmer, Nils Jansen, Ufuk Topcu, Joost-Pieter Katoen, Bernd Becker: Motion Planning under Partial Observability using Game-Based Abstraction. 2017, Melbourne, VIC, Australia, 56th IEEE Conf. on Decision and Control (CDC), pages 2201-2208. Abstract: We study motion planning problems where agents move inside environments that are not fully observable and subject to uncertainties. The goal is to compute a strategy for an agent that is guaranteed to satisfy certain safety and performance specifications. Such problems are naturally modeled by partially observable Markov decision processes (POMDPs). Because of the potentially huge or even infinite belief space of POMDPs, verification and strategy synthesis is in general computationally intractable. We tackle this difficulty by exploiting typical structural properties of such scenarios; for instance, we assume that agents have the ability to observe their own positions inside an environment. Ambiguity in the state of the environment is abstracted into non-deterministic choices over the possible states of the environment. Technically, this abstraction transforms POMDPs into probabilistic two-player games (PGs). For these PGs, efficient verification tools are able to determine strategies that approximate certain measures on the POMDP. If an approximation is too coarse to provide guarantees, an abstraction refinement scheme further resolves the belief space of the POMDP. We demonstrate that our method improves the state of the art by orders of magnitude compared to a direct solution of the POMDP.
Scholl P, Van Laerhoven K: On the statistical properties of body-worn inertial motion sensor data for identifying sensor modality. 2017, Proceedings - International Symposium on Wearable Computers, ISWC, volume Part F1305. Abstract: Interpreting datasets containing inertial data (acceleration, rate-of-turn, magnetic flux) requires a description of the dataset itself. Often this description is unstructured, stored as a convention, or simply not available anymore. In this note, we argue that each modality exhibits particular statistical properties, which allows it to be reconstructed solely from the sensor's data. To investigate this, tri-axial inertial sensor data from five publicly available datasets were analysed. Three statistical properties (mode, kurtosis, and number of modes) are shown to be sufficient for classification, assuming the sampling rate and sample format are known and that both acceleration and magnetometer data are present. While those assumptions hold, 98% of all 1003 data points were correctly classified.

Sebastian Junges, Nils Jansen, Ralf Wimmer, Tim Quatmann, Leonore Winterer, Joost-Pieter Katoen, Bernd Becker: Permissive Finite-State Controllers of POMDPs using Parameter Synthesis. 2017. Abstract: We study finite-state controllers (FSCs) for partially observable Markov decision processes (POMDPs) that are provably correct with respect to given specifications. The key insight is that computing (randomised) FSCs on POMDPs is equivalent to, and computationally as hard as, synthesis for parametric Markov chains (pMCs).
This correspondence allows us to use tools for parameter synthesis in pMCs to compute correct-by-construction FSCs on POMDPs for a variety of specifications. Our experimental evaluation shows comparable performance to well-known POMDP solvers.

Mummadi C, Kasireddy S, Peter Leo F, Scholl P, Verma K, Van Laerhoven K: Real-time embedded recognition of sign language alphabet fingerspelling in an IMU-based glove. 2017, ACM International Conference Proceeding Series, volume Part F1319. Abstract: Data gloves have numerous applications, including enabling novel human-computer interaction and automated recognition of large sets of gestures, such as those used for sign language. For most of these applications, it is important to build mobile and self-contained applications that run without the need for frequent communication with additional services on a back-end server. We present in this paper a data glove prototype, based on multiple small Inertial Measurement Units (IMUs), with a glove-embedded classifier for French Sign Language. In an extensive set of experiments with 57 participants, our system was tested by repeatedly fingerspelling the French Sign Language (LSF) alphabet. Results show that our system is capable of detecting the LSF alphabet with a mean accuracy score of 92% and an F1 score of 91%, with all detections performed on the glove within 63 milliseconds.

Meurisch C, Naeem U, Scholl P, Azam M, Günther S, Baumann P, Réhman S, Mühlhäuser M: SmartGuidance'17: 2nd workshop on intelligent personal support of human behavior. 2017, UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers. Abstract: In today's fast-paced environment, humans are faced with various problems such as information overload, stress, and health and social issues. So-called anticipatory systems promise to approach these issues through personal guidance or support within a user's daily and professional life. The Second Workshop on Intelligent Personal Support of Human Behavior (SmartGuidance'17) aims to build on the success of the previous workshop (namely Smarticipation), organized in conjunction with UbiComp 2016, to continue discussing the latest research outcomes of anticipatory mobile systems. We invite the submission of papers within this emerging, interdisciplinary research field of anticipatory mobile computing that focuses on understanding, design, and development of such ubiquitous systems.
We also welcome contributions that investigate human behaviors and underlying recognition and prediction models, conduct field studies, or propose novel HCI techniques to provide personal support. All workshop contributions will be published in the supplemental proceedings of the UbiComp 2017 conference and included in the ACM Digital Library.
Michael Kochte, Sauer M, Laura Rodriguez Gomez, Raiola P, Becker B, Hans-Joachim Wunderlich: Specification and verification of security in reconfigurable scan networks. 2017, IEEE European Test Symposium.

Tobias Schubert, Benjamin Völker, Marc Pfeifer, Bernd Becker: The Smart MiniFab: An Industrial IoT Demonstrator Anywhere at Any Time. 2017, Smart Education and e-Learning 2017, Vilamoura/Portugal, KES International, Springer International Publishing, pages 253-262.

Bernd Becker, Christoph Scholl, Ralf Wimmer: Verification of Incomplete Designs. In: Formal System Verification - State of the Art and Future Trends. 2017, Springer, Rolf Drechsler (ed.), pages 37-72, ISBN 978-3-319-57683-1. Abstract: We consider the verification of digital systems which are incomplete in the sense that for some modules only their interfaces (i.e., the signals entering and leaving the module) are known, but not their implementations. For such designs, we study realizability ("Is it possible to implement the missing modules such that the complete design has certain properties?") and validity ("Does a property hold no matter how the missing parts are implemented?").

Marc Pfeifer, Tobias Schubert, Bernd Becker: PackSens: A Condition and Transport Monitoring System Based on an Embedded Sensor Platform. 2016, Proceedings of the 7th EAI International Conference on Sensor Systems and Software, pages 81-92. Abstract: As a consequence of growing globalization, transports that require safe handling are increasing. This paper therefore introduces an innovative transport and condition monitoring system based on a mobile embedded sensor platform. The platform is equipped with a variety of sensors needed to extensively monitor a transport and can be attached directly to the transported good.
The included microcontroller processes all relevant data delivered by the sensors in a very power-efficient manner. Furthermore, it reports violations of previously configured thresholds to the user through a standardized Near Field Communication (NFC) interface. Since falls are one major cause of damage during transportation, the presented system is the first that not only detects every fall but also analyses the fall height and other parameters related to the fall event in real time on the platform. The whole system was tested in different experiments in which all critical situations, and in particular all fall situations, were detected correctly.

Benjamin Völker, Tobias Schubert, Bernd Becker: iHouse: A Voice-Controlled, Centralized, Retrospective Smart Home. 2016, 7th EAI International Conference on Sensor Systems and Software. Abstract: Speech recognition in smart home systems has become popular in both research and consumer areas. This paper introduces an innovative concept for a modular, customizable, and voice-controlled smart home system. The system combines the advantages of distributed and centralized processing to enable a secure as well as highly modular platform, and allows existing non-smart components to be added retrospectively into the smart environment. To interact with the system in the most comfortable way, and in particular without additional devices like smartphones, voice control was added as the means of choice. The task of speech recognition is partitioned into decentral Wake-Up-Word (WUW) recognition and central continuous speech recognition to enable flexibility while maintaining security. This is achieved utilizing a novel WUW algorithm suitable for execution on small microcontrollers, which uses Mel Frequency Cepstral Coefficients as well as Dynamic Time Warping.
A high rejection rate of up to 99.93% was achieved, justifying the use of the algorithm as a voice trigger in the developed smart home system.

Matthias Sauer, Jie Jiang, Sven Reimer, Kohei Miyase, Xiaoqing Wen, Bernd Becker, Ilia Polian: On Optimal Power-aware Path Sensitization. 2016, 25th IEEE Asian Test Symposium (ATS).

Mathias Soeken, Pascal Raiola, Baruch Sterin, Bernd Becker, Giovanni De Micheli, Matthias Sauer: SAT-based Combinational and Sequential Dependency Computation. 2016, Haifa Verification Conference (HVC).

Sebastian Volkmann, Linus Feiten, Christian Zimmermann, Sebastian Sester, Laura Wehle, Bernd Becker: Digitale Tarnkappe: Anonymisierung in Videoaufnahmen. 2016, INFORMATIK 2016, Gesellschaft für Informatik (GI), Heinrich C. Mayr and Martin Pinzger (eds.), volume P-252, pages 413-426. Abstract: Video surveillance is ubiquitous today. It serves to solve offenses after the fact, for real-time monitoring, or as a deterrent. Beyond that, there are also commercial interests in video surveillance and the automatic detection of persons, e.g. for building customer profiles and thus analyzing purchasing behavior. These interests are opposed by legal claims as well as fundamental ethical and social norms, for instance that people must not be placed under general suspicion or be recorded without their consent. This paper presents a technical approach that allows video surveillance to be handled flexibly. In this context, economic, ethical, and legal questions are considered alongside the technical implementation. The approach consists of rendering persons in video recordings unrecognizable by means of a cryptographic procedure before the recordings ever leave the camera electronics. Only with a secret cryptographic key can individual time and image regions of a recording be de-anonymized, which addresses both legal and ethical concerns. In commercial scenarios, this digital invisibility cloak allows customers to voluntarily waive anonymization, e.g. as part of a rewards program. While the computer science literature has long described technologies for such systems, this paper shows ways in which their embedding into society could actually be realized.

Tobias Schubert, Isabel Dahlhausen: Bereich Industrie 4.0. 2016, DGWF-Jahrestagung 2016, pages 142-143.

Jan Burchard, Tobias Schubert, Bernd Becker: Distributed Parallel #SAT Solving. 2016, IEEE Cluster 2016. Abstract: of available cores dCountAntom solved problems on average 141 times faster than a single core implementation.

Mathias Soeken, Pascal Raiola, Baruch Sterin, Matthias Sauer: SAT-based Functional Dependency Computation. 2016, Synthesis.

Michael Kochte, Rafal Baranowski, Matthias Sauer, Bernd Becker, Hans-Joachim Wunderlich: Formal Verification of Secure Reconfigurable Scan Network Infrastructure. 2016, IEEE European Test Symposium.

Michael Kochte, Matthias Sauer, Pascal Raiola, Bernd Becker, Hans-Joachim Wunderlich: Formaler Nachweis komplexer Sicherheitseigenschaften in rekonfigurierbarer Infrastruktur. 2016, eda Workshop.

Andreas Riefert, Riccardo Cantoro, Matthias Sauer, Matteo Sonza Reorda, Bernd Becker: Effective Generation and Evaluation of Diagnostic SBST Programs. 2016, IEEE VLSI Test Symposium.

Linus Feiten, Matthias Sauer, Bernd Becker: On Metrics to Quantify the Inter-Device Uniqueness of PUFs. 2016, TRUDEVICE Workshop, Dresden. Abstract: presented in the literature. An important criterion is always the diversity of PUF responses for different devices, called inter-device uniqueness.
A very popular uniqueness metric consists of calculating the pairwise Hamming distance between the response bit-strings of all devices, assuming that all response bits are uncorrelated. Such correlations, however, should be taken into account when a statement about inter-device uniqueness is made. We therefore propose a novel correlation metric to fulfil this requirement. Furthermore, we show that the Hamming distance metric is actually redundant when the also popular bit-aliasing metric is applied at the same time.

Matthias Sauer, Sven Reimer, Daniel Tille, Karsten Scheibler, Dominik Erb, Ulrike Pfannkuchen, Bernd Becker: Clock Cycle Aware Encoding for SAT-based Circuit Initialization. 2016, GI/ITG Workshop "Testmethoden und Zuverlässigkeit von Schaltungen und Systemen".

Andreas Riefert, Riccardo Cantoro, Matthias Sauer, Matteo Sonza Reorda, Bernd Becker: A Flexible Framework for the Automatic Generation of SBST Programs. 2016, IEEE Transactions on Very Large Scale Integration (VLSI) Systems, volume 24, issue 10, pages 3055-3066.

Scholl Philipp, Van Laerhoven Kristof: A Multi-Media Exchange Format for Time-Series Dataset Curation. 2016, UbiComp 2016 Adjunct - Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. Abstract: Exchanging data as comma-separated values (CSV) is slow, cumbersome and error-prone. Especially for time-series data, which is common in Activity Recognition, synchronizing several independently recorded sensors is challenging. Adding second-level evidence, like video recordings from multiple angles and time-coded annotations, further complicates the matter of curating such data. A possible alternative is to make use of standardized multimedia formats. Sensor data can be encoded in audio format, and time-coded information, like annotations, as subtitles. Video data can be added easily.
All this media can be merged into a single container file, which makes the issue of synchronization explicit. The incurred performance overhead of this encoding is shown to be negligible, and compression can be applied to optimize storage and transmission overhead.
Felix Neubauer, Karsten Scheibler, Bernd Becker, Ahmed Mahdi, Martin Fränzle, Tino Teige, Tom Bienmüller, Detlef Fehrer: Accurate Dead Code Detection in Embedded C Code by Arithmetic Constraint Solving. 2016, First International Workshop on Satisfiability Checking and Symbolic Computation - FETOPEN-CSA SC2 Workshop - Affiliated with SYNASC 2016. Abstract: Unreachable code fragments in software, despite not having a negative impact on the functional behavior, can have a bad effect in other areas, such as code optimization or coverage-based code validation and certification. Especially the latter is important in industrial, safety-critical environments, where detecting such dead code is a major goal for adjusting the coverage of software tests. In the context of embedded systems we focus on C programs which are reactive, control-oriented, and floating-point dominated. Such programs are either automatically generated from Simulink-plus-Stateflow models or hand-written according to coding guidelines. While there are various techniques, e.g. abstract interpretation or counterexample-guided abstraction refinement (CEGAR), to deal with individual issues of this
domain, there is none which can cover all of them. The AVACS transfer project T1 aims at the combination of these techniques in order to provide an accurate and efficient dead code detection. In this paper we present the ideas and goals of the project as well as the current status (including very promising experimental results) and future challenges. Karsten Scheibler, Felix Neubauer, Ahmed Mahdi, Martin Fränzle, Tino Teige, Tom Bienmüller, Detlef Fehrer, Bernd BeckerAccurate ICP-based Floating-Point Reasoning 2016 Formal Methods in Computer-Aided Design , pages : 177 - 184» show abstract « hide abstract Abstract In scientific and technical software, floating-point arithmetic is often used to approximate arithmetic on physical quantities natively modeled as reals. Checking properties for such programs (e.g. proving unreachability of code fragments) requires accurate reasoning over floating-point arithmetic. Currently, most of the SMT-solvers addressing this problem class rely on bit-blasting. Recently, methods based on reasoning in interval lattices have been lifted from the reals (where they traditionally have been successful) to the floating-point numbers. The approach presented in this paper follows the latter line of interval-based reasoning, but extends it by including bitwise integer operations and cast operations between integer and floating-point arithmetic. Such operations have hitherto been omitted, as they tend to define sets not concisely representable in interval lattices, and were consequently considered the domain of bit-blasting approaches. By adding them to interval-based reasoning, the full range of basic data types and operations of C programs is supported. Furthermore, we propose techniques in order to mitigate the problem of aliasing during interval reasoning. The experimental results confirm the efficacy of the proposed techniques. Our approach outperforms solvers relying on bit-blasting as well as the existing interval-based SMT-solver. 
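As a toy illustration of the interval-based reasoning style these papers build on (not the authors' implementation, which works on floating-point and integer domains with outward rounding), an ICP contractor for the single constraint z = x + y narrows a box of intervals to hull consistency:

```python
def intersect(a, b):
    """Intersect two closed intervals (lo, hi); raise if the result is empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: the box admits no solution")
    return (lo, hi)

def contract_sum(x, y, z):
    """Contract the box (x, y, z) under the constraint z = x + y."""
    z = intersect(z, (x[0] + y[0], x[1] + y[1]))   # forward: z = x + y
    x = intersect(x, (z[0] - y[1], z[1] - y[0]))   # backward: x = z - y
    y = intersect(y, (z[0] - x[1], z[1] - x[0]))   # backward: y = z - x
    return x, y, z

# x, y in [0, 10]; asserting z = x + y in [15, 20] narrows x and y to [5, 10]
x, y, z = contract_sum((0, 10), (0, 10), (15, 20))
```

A real solver iterates such contractors over all constraints until a fixpoint is reached, and splits intervals when contraction alone no longer makes progress; a raised empty intersection proves the box contains no solution.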
Van Laerhoven K, Scholl P: Advances in Intelligent Systems and Computing, volume 421. Abstract: Wearable sensors have traditionally been designed around a microcontroller that periodically reads values from attached sensor chips, before analyzing and forwarding data. As many off-the-shelf sensor chips have become smaller and widespread in consumer appliances, the way they are interfaced has become digital and more potent. This paper investigates the impact of using such chips that are not only smaller and cheaper than their predecessors, but also come with an arsenal of extra processing and detection capabilities built into the sensor package. A case study with accompanying experiments using two MEMS accelerometers shows that using these capabilities can cause significant reductions in resources for data acquisition, and could even support basic recognition tasks.
Ahmed Mahdi, Karsten Scheibler, Felix Neubauer, Martin Fränzle, Bernd Becker: Advancing software model checking beyond linear arithmetic theories. 2016, Twelfth Haifa Verification Conference (HVC 2016), Haifa, Israel, November 14-17, 2016, Bloem, Roderick, Arbel, Eli (Eds.), volume 10028, pages 186-201. Abstract: Motivated by the practical need for verifying embedded control programs involving linear, polynomial, and transcendental arithmetics, we demonstrate in this paper a CEGAR technique addressing reachability checking over that rich fragment of arithmetics. In contrast to previous approaches, it is neither based on bit-blasting of floating-point implementations nor confined to decidable fragments of real arithmetic, namely linear or polynomial arithmetic. Its CEGAR loop is based on Craig interpolation within the iSAT3 SMT solver, which employs (abstract) conflict-driven clause learning (CDCL) over interval domains together with interval constraint propagation. As usual, the interpolants thus obtained on spurious counterexamples are used to subsequently refine the abstraction, yet in contrast to manipulating and refining the state set of a discrete-state abstraction, we propose a novel technique for refining the abstraction, where we annotate the abstract model's transitions with side-conditions summarizing their effect. We exploit this for implementing case-based reasoning based on assumption-commitment predicates extracted from the stepwise interpolants in a lazy abstraction mechanism. We implemented our approach within iSAT3 and demonstrate its effectiveness by verifying several benchmarks.
Schwabe Nils, Scholl Philipp, Van Laerhoven Kristof: An Ad-Hoc Capture System for Augmenting Non-Digital Water Meters. 2016, Proceedings of the 6th International Conference on the Internet of Things.
Karsten Scheibler, Dominik Erb, Bernd Becker: Applying Tailored Formal Methods to X-ATPG. 2016, GI/ITG/GMM Workshop "Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen". Abstract: However, usually one cannot assume full control of all sequential elements and primary inputs. Therefore, uninitialized flip-flops may occur in the design, leading to binary but unknown values (X-values). Classically, these X-values are handled by three-valued logic (01X), but this introduces pessimism when X-values reconverge in the circuit. On the other hand, the expressiveness of QBF allows encoding the unknown values accurately, but comes at the cost of much higher runtimes. Therefore, we presented in [1] an accurate SAT-based CEGAR approach for automatic test pattern generation (ATPG) in presence of X-values. This approach is tailored to and optimized for this specific problem class and could also be used for closely related topics such as equivalence checking in presence of X-values. Although the approach is similar to an existing CEGAR-based 2-QBF solving algorithm, we show that our method has up to 50x lower runtimes. This underlines that an efficient CNF-encoding of a problem together with the fastest available standard solver does not guarantee optimal performance; tailoring the underlying formal methods to the target problem class may give remarkable speedups on top.
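The 01X pessimism mentioned in the abstract above is easy to reproduce: when an unknown value reconverges, three-valued simulation reports X even where every binary assignment yields the same output. A minimal sketch (the circuit x AND NOT x is an illustrative example, always 0 in two-valued logic):

```python
X = "X"  # the unknown binary value of three-valued (01X) logic

def and3(a, b):
    """Three-valued AND: 0 dominates, otherwise X propagates."""
    if a == 0 or b == 0:
        return 0
    if a == X or b == X:
        return X
    return 1

def not3(a):
    """Three-valued NOT: X stays X."""
    return X if a == X else 1 - a

# Reconvergent X: x AND (NOT x) is 0 for every binary x, but three-valued
# simulation loses the correlation between the two fanout branches.
pessimistic = and3(X, not3(X))               # 01X simulation says: unknown
exact = {and3(v, not3(v)) for v in (0, 1)}   # exhaustive binary case split: {0}
```

Accurate SAT- or QBF-based encodings, as in the paper, recover the exact result by keeping both occurrences of x tied to the same variable instead of treating each X independently.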
Thomas Metz, Uwe Wagschal, Thomas Waldvogel, Marko Bachl, Linus Feiten, Bernd Becker: Das Debat-O-Meter: ein neues Instrument zur Analyse von TV-Duellen. 2016, ZSE Zeitschrift für Staats- und Europawissenschaften, volume 14, issue 1, pages 124-149.
Ralf Wimmer, Christoph Scholl, Karina Wimmer, Bernd Becker: Dependency Schemes for DQBF. 2016, Proceedings of the 19th International Conference on Theory and Applications of Satisfiability Testing (SAT), Springer, volume 9710, pages 473-489. Abstract: Dependency schemes allow to identify variable independencies in QBFs or DQBFs. For QBF, several dependency schemes have been proposed, which differ in the number of independencies they are able to identify. In this paper, we analyze the spectrum of dependency schemes that were proposed for QBF. It turns out that only some of them are sound for DQBF. For the sound ones, we provide a correctness proof; for the others, counterexamples. Experiments show that a significant number of dependencies can either be added to or removed from a formula without changing its truth value, but with significantly increasing the flexibility for modifying the representation.
Mashhadi Afra, Acer Utku, Boran Aidan, Scholl Philipp, Forlivesi Claudio, Vanderhulst Geert, Kawsar Fahim: Exploring space syntax on entrepreneurial opportunities with Wi-Fi analytics. 2016, Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing - UbiComp '16. Abstract: Industrial events and exhibitions play a powerful role in creating social relations amongst individuals and firms, enabling them to expand their social network so as to acquire resources. However, often these events impose a spatial structure which impacts encounter opportunities. In this paper, we study the impact that the spatial configuration has on the formation of network relations. We designed, developed and deployed a Wi-Fi analytics solution comprising wearable Wi-Fi badges and gateways in a large-scale industrial exhibition event to study the spatio-temporal trajectories of the 2.5K+ attendees, including two special groups: 34 investors and 27 entrepreneurs. Our results suggest that certain zones with designated functionalities play a key role in forming social ties across attendees, and the different behavioural properties of investors and entrepreneurs can be explained through a spatial lens. Based on our findings we offer three concrete recommendations for future organisers of networking events.
Karsten Scheibler, Felix Neubauer, Ahmed Mahdi, Martin Fränzle, Tino Teige, Tom Bienmüller, Detlef Fehrer, Bernd Becker: Extending iSAT3 with ICP-Contractors for Bitwise Integer Operations. AVACS Technical Report, SFB/TR 14 AVACS, Subproject T1, volume 116, 2016. Abstract: Up to now, SMT-solvers addressing floating-point arithmetic were based on bit-blasting. Quite recently, methods based on interval constraint propagation (ICP) for accurate reasoning over floating-point arithmetic were proposed. One prominent use-case for such methods is automatic dead-code detection in floating-point dominated embedded C programs. However, C programs usually contain a mix of floating-point arithmetic, integer arithmetic and bitwise integer operations. Thus, adding ICP-based support for floating-point arithmetic is not enough; bitwise integer operations have to be supported as well. Therefore, this report gives a detailed overview of how these operations can be handled with ICP.
Maël Gay, Jan Burchard, Jan Horáček, Ange-Salomé Messeng Ekossono, Tobias Schubert, Bernd Becker, Ilia Polian, Martin Kreuzer: Formulas, Circuit-Implementations and Fault Equations. 2016, FCTRU'16.
Van Laerhoven Kristof, Scholl Philipp: Interrupts Become Features: Using On-Sensor Intelligence for Recognition Tasks. In: Embedded Engineering Education, 2016, Cham, Springer International Publishing.
Dominik Erb, Karsten Scheibler, Michael A. Kochte, Matthias Sauer, Hans-Joachim Wunderlich, Bernd Becker: Mixed 01X-RSL-Encoding for Fast and Accurate ATPG with Unknowns. 2016, 21st Asia and South Pacific Design Automation Conference. Abstract: In this paper, we propose new ATPG techniques to overcome such limitations. An efficient hybrid encoding combines the accuracy of RSL-based modeling with the compactness of conventional three-valued encoding. A low-cost two-valued SAT-based untestability check is able to classify most untestable faults with low runtime. An incremental and event-based accurate fault simulator is introduced to reduce fault simulation effort. The experiments demonstrate the effectiveness of the proposed techniques. On average, over 99.3% of the faults are accurately classified.
Both the number of aborts and the total runtime are significantly reduced compared to the state-of-the-art pure RSL-based algorithm. For circuits up to a million gates, the fault coverage could be increased considerably compared to a state-of-the-art commercial tool with very competitive runtimes.
Dominik Erb: On the Handling of Uncertainty in Test Pattern Generation. Der Andere Verlag, ISBN 978-3-86247-571-1.
Theiss Marian, Scholl Philipp: Predicting Grasps with a Wearable Inertial and EMG Sensing Unit for Low-Power Detection of In-Hand Objects. 2016, Augmented Human. Abstract: Detecting the task at hand can often be improved when it is also known what object the user is holding. Several sensing modalities have been suggested to identify handheld objects, from wrist-worn RFID readers to cameras. A critical obstacle to using such sensors, however, is that they tend to be too power-hungry for continuous usage. This paper proposes a system that detects grasping using first inertial sensors and then electromyography (EMG) on the forearm, to then selectively activate the object identification sensors. This three-tiered approach would therefore only attempt to identify in-hand objects once it is known a grasping has occurred. Our experiments show that high recall can be obtained for grasp detection, 95% on average across participants, with the grasping of lighter and smaller objects clearly being more difficult.
Scholl Philipp, Van Laerhoven Kristof: Remind - Towards a Personal Remembrance Search Engine for Motion Augmented Multi-Media Recordings. 2016, MUM 2016 - Proceedings of the International 2016 Conference on Multimedia and Ubiquitous Systems.
Linus Feiten, Sebastian Sester, Christian Zimmermann, Sebastian Volkmann, Laura Wehle, Bernd Becker: Revocable Anonymisation in Video Surveillance: A "Digital Cloak of Invisibility". In: Technology and Intimacy: Choice or Coercion, 2016, Springer International Publishing, pages 314-327, ISBN 978-3-319-44804-6. Abstract: Video surveillance is an omnipresent phenomenon in today's metropolitan life. Mainly intended to solve crimes, to prevent them by realtime monitoring or simply as a deterrent, video surveillance has also become interesting in economical contexts, e.g. to create customer profiles and analyse patterns of their shopping behaviour. The extensive use of video surveillance is challenged by legal claims and societal norms like not putting everybody under generalised suspicion or not recording people without their consent.
In this work we propose a technological solution to balance the positive and negative effects of video surveillance. With automatic image recognition algorithms on the rise, we suggest using that technology to not just automatically identify people but to blacken their images. This blackening is done with a cryptographic procedure allowing it to be revoked with an appropriate key. Many of the legal and ethical objections to video surveillance could thereby be accommodated. In commercial scenarios, the operator of a customer profiling program could offer enticements for voluntarily renouncing one's anonymity; customers could e.g. wear a small infrared LED to signal their agreement to being tracked. After explaining the implementation details, this work outlines a multidisciplinary discussion incorporating an economic, ethical and legal viewpoint.
Bernd Becker, Katrin Weber, Linus Feiten: SMartphones In der LEhre (SMILE). In: Kreativ, Innovativ, Motivierend - Lehrkonzepte in der Praxis: Der Instructional Development Award (IDA) der Universität Freiburg, 2016, Universitäts Verlag Webler (UVW), pages 117-133, ISBN 3-946017-01-0.
Karina Wimmer, Ralf Wimmer, Christoph Scholl, Bernd Becker: Skolem Functions for DQBF. 2016, Proceedings of the 14th International Symposium on Automated Technology for Verification and Analysis (ATVA), Springer, volume 9938, pages 395-411. Abstract: We consider the problem of computing Skolem functions for satisfied dependency quantified Boolean formulas (DQBFs). We show how Skolem functions can be obtained from an elimination-based DQBF solver and how to take preprocessing steps into account. The size of the Skolem functions is optimized by don't-care minimization using Craig interpolants and rewriting techniques. Experiments with our DQBF solver HQS show that we are able to effectively compute Skolem functions with very little overhead compared to the mere solution of the formula.
Karina Wimmer, Ralf Wimmer, Christoph Scholl, Bernd Becker: Skolem Functions for DQBF (Extended Version). 2016.
Linus Feiten, Jonathan Oesterle, Tobias Martin, Matthias Sauer, Bernd Becker: Systematic Frequency Biases in Ring Oscillator PUFs on FPGAs. 2016, IEEE Transactions on Multi-Scale Computing Systems (TMSCS), volume PP, issue 99. Abstract: Physically unclonable functions (PUFs) are an emerging primitive in hardware security, enabling the identification of computer chips. A promising type, particularly for FPGA implementations, is the Ring Oscillator (RO) PUF, where signal delays stemming from uncontrollable variations in the manufacturing process are used as device-specific characteristics. Based on experimental results gathered with 38 identical Altera FPGAs, we show the existence of non-device-specific, i.e. systemic, RO frequency biases, traced back to (1) the internal routing within the RO's look-up tables, (2) the RO locations on the FPGAs, or (3) the non-PUF payload activity. As these biases are the same for all devices, the result is poor inter-device uniqueness and unreliable signatures under changing payloads. After characterising these biases with a newly developed set of metrics, we suggest a method to overcome them: using only a small sample of devices, the average bias over all devices for each RO is predicted, and the relative differences caused by systemic biases are nullified. We demonstrate the viability of this method by determining the sufficient random sample sizes and showing that the inter-device uniqueness is drastically increased and the PUF signatures become reliable even under changing payload activities.
Ralf Wimmer: Tagungsband des 19.
GI/ITG/GMM-Workshops "Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen" (MBMV). 2016, Universität Freiburg / FreiDok, 150 pages.
Paolo Marin, Massimo Narizzano, Luca Pulina, Armando Tacchella, Enrico Giunchiglia: Twelve Years of QBF Evaluations: QSAT Is PSpace-Hard and It Shows. 2016, Fundamenta Informaticae, volume 149, issue 1-2, pages 133-158.
Matthias Sauer, Linus Feiten, Bernd Becker, Ulrich Rührmair, Ilia Polian: Utilizing Intrinsic Delay Variability in Complex Digital Circuits for Defining PUF Behavior. 2016, TRUDEVICE Workshop, Dresden.
Karsten Scheibler, Dominik Erb, Bernd Becker: signs. 2016, Conf. on Design, Automation and Test in Europe. Abstract: Therefore, this paper focuses on a new highly incremental CEGAR-based algorithm that overcomes these limitations and hence is completely accurate in presence of unknown values. It relies on a modified SAT-solver tailored to this specific problem. The experimental results for circuits with up to 2,400,000 gates show that this combination allows high accuracy and high scalability at the same time. Compared to a state-of-the-art commercial tool, the fault coverage could be increased significantly. Furthermore, the runtime is reduced remarkably compared to a QBF-based encoding of the problem.
Jan Burchard, Tobias Schubert, Bernd Becker: Laissez-Faire Caching for Parallel #SAT Solving. 2015, International Conference on Theory and Applications of Satisfiability Testing, volume 9340, pages 46-61. Abstract: [...] caching only adds a small computational overhead.
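The Laissez-Faire Caching paper is about how parallel #SAT worker threads share a cache with relaxed synchronization; the underlying sequential idea, caching model counts of residual formulas that recur during splitting, can be sketched as follows. The DIMACS-style clause encoding and the naive branching choice are illustrative assumptions, not the paper's solver.

```python
def count_models(clauses, n_vars):
    """#SAT by variable splitting, caching counts of residual formulas.
    clauses: iterable of iterables of non-zero integer literals (DIMACS
    style), over variables 1..n_vars."""
    cache = {}

    def assign(cls, lit):
        """Residual formula after setting `lit` true; None on conflict."""
        out = set()
        for c in cls:
            if lit in c:
                continue                  # clause satisfied, drop it
            reduced = c - {-lit}
            if not reduced:
                return None               # empty clause: no models here
            out.add(frozenset(reduced))
        return frozenset(out)

    def count(cls, free):
        if cls is None:
            return 0
        if not cls:
            return 2 ** free              # remaining variables unconstrained
        key = (cls, free)
        if key not in cache:              # cache hit avoids re-counting
            var = abs(next(iter(next(iter(cls)))))   # naive branching choice
            cache[key] = (count(assign(cls, var), free - 1)
                          + count(assign(cls, -var), free - 1))
        return cache[key]

    cnf = frozenset(frozenset(c) for c in clauses)
    return count(cnf, n_vars)

# (x1 or x2) and (not x1 or x3) has 4 models over x1..x3
models = count_models([[1, 2], [-1, 3]], 3)
```

In a parallel solver, several workers hit the same residual formulas; the "laissez-faire" policy tolerates lost or duplicated cache entries rather than locking the shared table, since a missed entry only costs recomputation, never correctness.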
Katrin Weber, Corinna Bertuzzi, Tobias Schubert: Kooperation in der virtuellen Projektarbeit – Vom Pilotkurs zum erfolgreichen Weiterbildungsangebot in den Technikwissenschaften. 2015, DGWF-Jahrestagung 2015, pages 144-145.
Paolo Marin, Massimo Narizzano, Luca Pulina, Armando Tacchella, Enrico Giunchiglia: An Empirical Perspective on Ten Years of QBF Solving. 2015, Proceedings of the 22nd RCRA International Workshop on Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion (RCRA 2015), CEUR-WS.org, volume 1451, pages 62-75.
Linus Feiten, Tobias Martin, Matthias Sauer, Bernd Becker: Improving RO-PUF Quality on FPGAs by Incorporating Design-Dependent Frequency Biases. 2015, IEEE European Test Symposium. Abstract: [...] the mean frequency of each RO, derived using only a small training set of devices, from the sampled frequencies overcomes this disadvantage; i.e. the uniqueness is increased drastically while maintaining reliability.
Kohei Miyase, Matthias Sauer, Bernd Becker, Xiaoqing Wen, Seiji Kajihara: Identification of High Power Consuming Areas with Gate Type and Logic Level Information. 2015, IEEE European Test Symposium.
Andreas Riefert, Riccardo Cantoro, Matthias Sauer, Matteo Sonza Reorda, Bernd Becker: On the Automatic Generation of SBST Test Programs for In-Field Test. 2015, GI/ITG Workshop "Testmethoden und Zuverlässigkeit von Schaltungen und Systemen".
Bettina Braitling, Luis María Ferrer Fioriti, Hassan Hatefi, Ralf Wimmer, Bernd Becker, Holger Hermanns: Abstraction-based Computation of Reward Measures for Markov Automata. 2015, Int'l Conf. Verification, Model Checking, and Abstract Interpretation (VMCAI), Mumbai, India, volume 8931, pages 172-189.
Dominik Erb, Michael A. Kochte, Sven Reimer, Matthias Sauer, Hans-Joachim Wunderlich, Bernd Becker: Accurate QBF-based Test Pattern Generation in Presence of Unknown Values. 2015, Computer-Aided Design of Integrated Circuits and Systems (TCAD). Abstract: [...] circuits investigate the increase in fault coverage for conventional deterministic and potential detection requirements for both randomized and clustered X-sources.
Hartwig Matthias, Scholl Philipp, Budde Vanessa, Windel Armin: Adaptive Reminders for Safe Work. 2015, Persuasive Technology: 10th International Conference, PERSUASIVE 2015, Cham, Springer International Publishing.
Linus Feiten, Tobias Martin, Matthias Sauer, Bernd Becker: Analysis and utilisation of deviations in RO-PUFs under altered FPGA designs. 2015, TRUDEVICE Workshop, Grenoble. Abstract: Here, we are sharing our experiences from analyses on 70 FPGAs about how ROs implemented on Altera Cyclone IV FPGAs are affected in non-uniform ways depending on the non-PUF circuitry of the design. As specific ROs are affected towards faster or slower frequencies on all devices, this leads to bad inter-device uniqueness. We suggest that subtracting the mean frequency, derived with a training set of devices, from the sampled frequencies per RO on all devices will overcome this disadvantage.
Sven Reimer: Bekannte Unbekannte - Formale Methoden in Anwesenheit unbekannter Werte. Der Andere Verlag, 174 pages, ISBN 978-3-86247-569-8.
Hassan Hatefi, Bettina Braitling, Ralf Wimmer, Luis Maria Ferrer Fioriti, Holger Hermanns, Bernd Becker: Cost vs. Time in Stochastic Games and Markov Automata. 2015, Proc. of SETTA (International Symposium on Dependable Software Engineering: Theory, Tools and Applications), Springer-Verlag, volume 9409, pages 19-34. Abstract: Costs and rewards are important tools for analysing quantitative aspects of models like energy consumption and costs of maintenance and repair. Under the assumption of transient costs, this paper considers the computation of expected cost-bounded rewards and cost-bounded reachability for Markov automata and stochastic games. We give a transformation of this class of properties to expected time-bounded rewards and time-bounded reachability, which can be computed by available algorithms. We prove the correctness of the transformation and show its effectiveness on a number of case studies.
Tim Quatmann, Nils Jansen, Christian Dehnert, Ralf Wimmer, Erika Ábrahám, Joost-Pieter Katoen: Counterexamples for Expected Rewards. 2015, Proceedings of the 20th International Symposium on Formal Methods (FM), Springer, volume 9109, pages 435-452. Abstract: The computation of counterexamples for probabilistic systems has gained a lot of attention during the last few years. All of the proposed methods focus on the situation when the probabilities of certain events are too high. In this paper we investigate how counterexamples for properties concerning expected costs (or, equivalently, expected rewards) of events can be computed. We propose methods to extract a minimal subsystem which already leads to costs beyond the allowed bound. Besides these exact methods, we present heuristic approaches based on path search and on best-first search, which are applicable to very large systems when deriving a minimum subsystem becomes infeasible due to the system size. Experiments show that we can compute counterexamples for systems with millions of states.
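One ingredient of such heuristic counterexample generation, best-first path search, can be illustrated on a discrete-time Markov chain: running Dijkstra's algorithm on -log(p) edge weights yields the single most probable path into a target set, a natural seed for a critical subsystem. The chain below is a made-up example, not from the paper.

```python
import heapq
from math import exp, log

def most_probable_path(transitions, start, targets):
    """Best-first search for the most probable path from `start` into
    `targets`. transitions: dict state -> list of (successor, probability).
    Returns (path, probability), or (None, 0.0) if no target is reachable."""
    # Dijkstra on -log(p): minimal additive cost == maximal product of probs
    dist = {start: 0.0}
    parent = {start: None}
    heap = [(0.0, start)]
    while heap:
        d, s = heapq.heappop(heap)
        if d > dist[s]:
            continue                      # stale queue entry
        if s in targets:
            path = []
            while s is not None:          # reconstruct via parent pointers
                path.append(s)
                s = parent[s]
            return path[::-1], exp(-d)
        for succ, p in transitions.get(s, []):
            nd = d - log(p)
            if nd < dist.get(succ, float("inf")):
                dist[succ] = nd
                parent[succ] = s
                heapq.heappush(heap, (nd, succ))
    return None, 0.0

chain = {
    "s0": [("s1", 0.5), ("s2", 0.5)],
    "s1": [("bad", 0.1), ("s0", 0.9)],
    "s2": [("bad", 0.4), ("s0", 0.6)],
}
path, prob = most_probable_path(chain, "s0", {"bad"})  # s0 -> s2 -> bad
```

A subsystem-extraction heuristic would keep collecting such high-probability paths (or expand the search best-first over path prefixes) until the accumulated states already violate the probability or expected-reward bound.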
Linus Feiten, Matthias Sauer: Extracting the RC4 secret key of the Open Smart Grid Protocol (OSGP). 2015, Industrial Control System Security (ICSS) Workshop. Abstract: The Open Smart Grid Protocol (OSGP) is a widely used industry standard for exchanging sensitive data between devices inside of smart grids. For message confidentiality, OSGP implements a customised form of the RC4 stream cipher. In this work, we show how already known weaknesses of RC4 can be exploited to successfully attack the OSGP implementation as well. The attack modification is able to effectively derive the secret OSGP encryption and decryption key, given that an attacker can accumulate the cipher streams of approximately 90,000 messages. The possession of this key allows the attacker to decrypt all data intercepted on the OSGP smart grid and thereby obtain privacy-critical information about its participants.
Linus Feiten, Matthias Sauer, Tobias Schubert, Victor Tomashevich, Ilia Polian, Bernd Becker: Formal Vulnerability Analysis of Security Components. 2015, IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems (TCAD), volume 34, issue 8, pages 1358-1369. Abstract: Vulnerability to malicious fault attacks is an emerging concern for hardware circuits that are employed in mobile and embedded systems and process sensitive data. We describe a new methodology to assess the vulnerability of a circuit to such attacks, taking into account built-in protection mechanisms. Our method is based on accurate modeling of fault effects and detection status expressed by Boolean satisfiability (SAT) formulas. Vulnerability is quantified based on the number of solutions of these formulas, which are determined by an efficient #SAT solver. We demonstrate the applicability of this method for design space exploration of a pseudo-random number generator and for calculating the attack success rate in a multiplier circuit protected by robust error-detecting codes.
Ralf Wimmer, Nils Jansen, Erika Ábrahám, Joost-Pieter Katoen: High-level Counterexamples for Probabilistic Automata. 2015, Logical Methods in Computer Science, volume 11, issue 1:15, pages 1-23. Abstract: Providing compact and understandable counterexamples for violated system properties is an essential task in model checking. Existing works on counterexamples for probabilistic systems so far computed either a large set of system runs or a subset of the system's states, both of which are of limited use in manual debugging. Many probabilistic systems are described in a guarded command language like the one used by the popular model checker PRISM. In this paper we describe how a smallest possible subset of the commands can be identified which together make the system erroneous. We additionally show how the selected commands can be further simplified to obtain a well-understandable counterexample.
Andreas Riefert, Matthias Sauer, Sudhakar Reddy, Bernd Becker: Improving Diagnosis Resolution of a Fault Detection Test Set. 2015, VLSI Test Symposium. Abstract: In this work we present a method to improve the diagnosis resolution of a compact fault detection test set without increasing pattern count or decreasing fault coverage. The basic idea of the approach is to generate a SAT formula which enforces diagnosis and is solved by a MAX-SAT solver, a SAT-based maximization tool. We believe this is the first time a method to improve diagnosis resolution of a test set of given size has been reported. Experimental results on ISCAS 89 circuits demonstrate the effectiveness of the proposed method.
Karsten Scheibler, Dominik Erb, Bernd Becker: Improving Test Pattern Generation in Presence of Unknown Values beyond Restricted Symbolic Logic. 2015, to be published at the European Test Symposium (ETS). Abstract: In this paper, we improve test generation based on RSL in two directions in order to further reduce the gap to QBF. First, we present techniques that go beyond the accuracy of RSL when generating test patterns. Second, we include a check which is able to accurately identify untestable faults. Experimental results show the high efficiency of the proposed method, classifying almost all faults as testable or proving untestability. Hartwig M, Scholl P, Budde V, Windel A. 2015 Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), volume 9072, ISBN 9783319203058. Abstract: In working contexts, new information and communication systems like head-mounted displays are often established to increase productivity. However, new technology can also potentially assist users in increasing their health and well-being by encouraging safe behaviour, as they offer rich potential for persuasive elements and are highly adaptive to the respective situation. A setup is presented that evaluates the persuasive effect of a head-mounted-display-based assistance system on safe behaviour during a task in a laboratory simulation. In a randomised experimental design, the results show that the persuasive assistance system led to a significant and substantial reduction of safety violations compared to a traditionally designed warning sign. The findings suggest designing reminders not only according to ergonomic aspects, but also considering persuasive effects on behaviour. Furthermore, the results highlight the opportunities of mobile devices for assisting healthy behaviour in working environments.
Bernd Becker, Matthias Sauer, Christoph Scholl, Ralf Wimmer: Modeling Unknown Values in Test and Verification. In: Formal Modeling and Verification of Cyber-Physical Systems (Proceedings of the 1st International Summer School on Methods and Tools for the Design of Digital Systems), 2015, Springer, Rolf Drechsler, Ulrich Kühne (eds.), pages 122-150, ISBN 978-3-658-09993-0. Abstract: …combination with algorithms balancing exactness of representation and efficiency of calculation.
In the following, state-of-the-art approaches are described that enable efficient and successful handling of unknown values using formal techniques in the areas of test and verification. Dominik Erb, Karsten Scheibler, Matthias Sauer, Sudhakar M. Reddy, Bernd Becker: Multi-Cycle Circuit Parameter Independent ATPG for Interconnect Open Defects. 2015 33rd VLSI Test Symposium (VTS). Abstract: In this work we extend the validity of CPI tests by introducing so-called multi-cycle CPI tests and single-value CPI tests. By doing so, we significantly improve the coverage of open defects and ensure their detection whilst including the additional effect of leakage currents on opens. Experimental results for circuits with over 500k non-equivalent faults and several thousand aggressors show the effectiveness of the newly proposed CPI tests as well as the high efficiency of a new ATPG algorithm to generate these new classes of CPI tests. Andreas Riefert, Riccardo Cantoro, Matthias Sauer, Matteo Sonza Reorda, Bernd Becker: On the Automatic Generation of SBST Test Programs for In-Field Test. 2015 Conf. on Design, Automation and Test in Europe. Sachidananda V, Noack D, Khelil A, Van Laerhoven K, Scholl P: Optimized multi-attribute co-design for maximizing efficiency in Wireless Sensor Networks. 2015 IEEE 10th International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2015). Abstract: A key task in Wireless Sensor Networks (WSNs) is to deliver specific information about a spatial phenomenon of interest. However, in WSNs the operating conditions and/or user requirements are often desired to be evolvable, whether driven by changes of the monitored parameters or of WSN properties. To this end, a few sensor nodes sample the phenomenon and transmit the acquired samples, typically multihop, to the application through a gateway called the sink.
Accurately representing the physical phenomenon and reliably and timely delivering the user-required information comes at the cost of higher energy consumption, as additional messages are required. This work proposes a tunable co-design for network optimization that avoids under- or over-provisioning of information and accounts for the interaction of the attributes and their effects on each other. We validate the viability of the approach through analytical modeling and simulations for a range of requirements.
Matthias Sauer, Bernd Becker, Ilia Polian: PHAETON: A SAT-based Framework for Timing-aware Path Sensitization. 2015 IEEE Transactions on Computers, volume PP, issue 99. Abstract: Knowledge about sensitizable paths through combinational logic is essential for numerous design tasks. We present the framework PHAETON which identifies sensitizable paths and generates test pairs to exercise these paths using Boolean satisfiability (SAT). PHAETON supports a large number of models and sensitization conditions and provides a generic interface that can be used by applications. It incorporates a novel application-specific unary representation of integer numbers to integrate timing information with logical conditions within the same monolithic SAT formula. Due to a number of further elaborate speedup techniques, PHAETON scales to industrial circuits. Experimental results show the performance of PHAETON in classical K longest path generation tasks and in new post-silicon validation and characterization scenarios. Ralf Wimmer, Karina Gitina, Jennifer Nist, Christoph Scholl, Bernd Becker: Preprocessing for DQBF. 2015 Proceedings of the 18th International Conference on Theory and Applications of Satisfiability Testing (SAT), Springer, volume 9340, pages 173-190. Abstract: For SAT and QBF formulas many techniques are applied in order to reduce/modify the number of variables and clauses of the formula before the formula is passed to the actual solving algorithm. It is well known that these preprocessing techniques often reduce the computation time of the solver by orders of magnitude. In this paper we generalize different preprocessing techniques for SAT and QBF problems to dependency quantified Boolean formulas (DQBF) and describe how they need to be adapted to work with a DQBF solver core. We demonstrate their effectiveness both for CNF- and non-CNF-based DQBF algorithms.
Scholl P, Schultes T, Van Laerhoven K: RFID-based compound identification in wet laboratories with Google Glass. 2015 ACM International Conference Proceeding Series, volume 25-26-June. Abstract: Experimentation in wet laboratories requires tracking and identification of small containers like test tubes, flasks, and bottles. The current practice involves colored adhesive markers, waterproof hand-writing, QR- and barcodes, or RFID tags. These markers are often not self-descriptive and require a lookup table on paper or some digitally stored counterpart. Furthermore, they are subject to harsh environmental conditions (e.g. samples are kept in a freezer), and can be hard to share with other lab workers for lack of a consistent annotation system. Increasing their durability, as well as providing a central tracking system for these containers, is therefore of great interest. In this paper we present a system for the implicit tracking of RFID-augmented containers with a wrist-worn reader unit, and a voice-interaction scheme based on a head-mounted display.
Karina Gitina, Ralf Wimmer, Sven Reimer, Matthias Sauer, Christoph Scholl, Bernd Becker: Solving DQBF Through Quantifier Elimination. 2015 Conf. on Design, Automation and Test in Europe. Karsten Scheibler, Leonore Winterer, Ralf Wimmer, Bernd Becker: Towards Verification of Artificial Neural Networks. 2015 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”. Abstract: In this paper we take a typical control problem, namely the Cart Pole System (a.k.a. inverted pendulum), and a model of its physical environment and study safety verification of this system. To do so, we use bounded model checking (BMC). The created formulas are solved with the SMT solver iSAT3. We examine the problems that occur while solving these formulas and show that extending the solver by special deduction routines can significantly reduce both memory consumption and computation time on such instances. This constitutes a first step towards verification of machine-learned controllers, but many challenges remain. Ernst Moritz Hahn, Holger Hermanns, Ralf Wimmer, Bernd Becker: Transient Reward Approximation for Continuous-Time Markov Chains. 2015 IEEE Transactions on Reliability, volume 64, issue 4. Abstract: We are interested in the analysis of very large continuous-time Markov chains (CTMCs) with many distinct rates. Such models arise naturally in the context of reliability analysis, e.g., of computer networks, performability analysis of power grids, computer virus vulnerability, and the study of crowd dynamics.
We use abstraction techniques together with novel algorithms for the computation of bounds on the expected final and accumulated rewards in continuous-time Markov decision processes (CTMDPs). These ingredients are combined in a partly symbolic and partly explicit (symblicit) analysis approach. In particular, we circumvent the use of multi-terminal decision diagrams, because the latter do not work well when facing a large number of different rates. We demonstrate the practical applicability and efficiency of the approach on two case studies. Christian Miller, Paolo Marin, Bernd Becker: Verification of Partial Designs Using Incremental QBF. 2015 AI Communications, volume 28, issue 2, pages 283-307. Abstract: SAT solving is an indispensable core component of numerous formal verification tools and has found widespread use in industry, in particular when used in an incremental fashion, e.g. in Bounded Model Checking (BMC). On the other hand, for some applications SAT formulas are not expressive enough, whereas a description via Quantified Boolean Formulas (QBF) is much more adequate, for instance when dealing with partial designs. Motivated by the success of incremental SAT, in this paper we explore various approaches to solve QBF problems in an incremental fashion and thereby make this technology usable as a core component of BMC. Firstly, we realized an incremental QBF solver based on the state-of-the-art QBF solver QuBE: profiting from the reuse of some information from previous iterations, the search space can be pruned, in some cases, to even less than a quarter. However, the need for preprocessing QBF formulas prior to the solving phase, which in general cannot be paired with incremental solving because of the non-predictable elimination of variables in future incremental steps, posed the question of incremental QBF preprocessing.
In this context we present an approach for retaining the QBF formula being preprocessed while extending its clauses and prefix incrementally. This procedure results in a significant size reduction of the QBF formulas, hence leading to reduced solving time. As this may come together with a high preprocessing time, we analyze various heuristics to dynamically disable incremental preprocessing when its overhead rises above a certain threshold and is no longer compensated by the reduced solving time. To prove the efficacy of our methods experimentally, as an application we consider BMC for partial designs (i.e. designs containing so-called blackboxes which represent unknown parts). Here, we disprove realizability, that is, we prove that an unsafe state is reachable no matter how the blackboxes are implemented. We examine all these incremental approaches from both the point of view of the effectiveness of the single procedure and the benefits that a range of QBF solvers can take from it. On a domain of partial design benchmarks, engaging incremental QBF methods achieves significant performance gains over non-incremental BMC. Scholl Philipp, Wille Matthias, Laerhoven Kristof: Wearables in the Wet Lab: A Laboratory System for Capturing and Guiding Experiments. 2015 Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing. Abstract: Wet laboratories are highly dynamic, shared environments full of tubes, racks, compounds, and dedicated machinery. The recording of experiments, despite the fact that several ubiquitous computing systems have been suggested in the past decades, still relies predominantly on hand-written notes. Similarly, the information retrieval capabilities inside a laboratory are limited to traditional computing interfaces, which due to safety regulations are sometimes not usable at all.
In this paper, Google Glass is combined with a wrist-worn gesture sensor to support wet-lab experimenters. Taking "in-situ" documentation while an experiment is performed, as well as contextualizing the protocol at hand, can be implemented on top of the proposed system. After an analysis of current practices and needs through a series of explorative deployments in wet labs, we motivate the need for a wearable hands-free system, and introduce our specific design to guide experimenters. Finally, using a study with 22 participants evaluating the system on a benchmark DNA extraction experiment, we explore the use of gesture recognition for enabling the system to track where the user might be in the experiment.
back to the year overview Sven Reimer, Matthias Sauer, Tobias Schubert, Bernd Becker: Incremental Encoding and Solving of Cardinality Constraints. 2014 International Symposium on Automated Technology for Verification and Analysis, Springer International Publishing, volume 8837, pages 297-313. Abstract: Traditional SAT-based MaxSAT solvers encode cardinality constraints directly as part of the CNF and solve the entire optimization problem by a sequence of iterative calls of the underlying SAT solver. The main drawback of such approaches is their dependence on the number of soft clauses: the more soft clauses the MaxSAT instance contains, the larger is the CNF part encoding the cardinality constraints. To counter this drawback, we introduce an innovative encoding of cardinality constraints: instead of translating the entire and probably bloated constraint network into CNF, a divide-and-conquer approach is used to encode partial constraint networks successively. The resulting subproblems are solved and merged incrementally, reusing not only intermediate local optima, but also additional constraints which are derived from solving the individual subproblems by the back-end SAT solver. Extensive experimental results for the last MaxSAT evaluation benchmark suite demonstrate that our encoding is in general smaller compared to existing methods using a monolithic encoding of the constraints and converges faster to the global optimum. Dominik Erb, Karsten Scheibler, Matthias Sauer, Sudhakar M. Reddy, Bernd Becker: …for Interconnect Open Defects. 2014 23rd IEEE Asian Test Symposium (ATS). Abstract: …faults and several thousand aggressors.
Sven Reimer, Matthias Sauer, Paolo Marin, Bernd Becker: QBF with Soft Variables. 2014 International Workshop on Automated Verification of Critical Systems (AVOCS). Bernd Becker, Rolf Drechsler, Stephan Eggersglüß, Matthias Sauer: Recent advances in SAT-based ATPG: Non-standard fault models, multi constraints and optimization. 2014 International Conference on Design Technology of Integrated Systems in Nanoscale Era (DTIS). Matthias Sauer, Ilia Polian, Michael E. Imhof, Abdullah Mumtaz, Eric Schneider, Alexander Czutro, Hans-Joachim Wunderlich, Bernd Becker: Variation-Aware Deterministic ATPG. 2014 IEEE European Test Symposium, pages 1-6. Abstract: In technologies affected by variability, the detection status of a small-delay fault may vary among manufactured circuit instances. The same fault may be detected, missed, or provably undetectable in different circuit instances. We introduce the first complete flow to accurately evaluate and systematically maximize test quality under variability. As the number of possible circuit instances is infinite, we employ statistical analysis to obtain a test set that achieves a fault-efficiency target with a user-defined confidence level. The algorithm combines a classical path-oriented test-generation procedure with a novel waveform-accurate engine that can formally prove that a small-delay fault is not detectable and does not count towards fault efficiency. Extensive simulation results demonstrate the performance of the generated test sets for industrial circuits affected by uncorrelated and correlated variations. Matthias Sauer, Sven Reimer, Sudhakar M. Reddy, Bernd Becker: Efficient SAT-based Circuit Initialization for Large Designs. 2014 Int'l Conf. on VLSI Design. Abstract: We present a procedure to determine initialization sequences for a sequential circuit, optimizing sequence length and unknown values (Xes) in the flip-flops.
Specifically, we consider the following two problems: (1) determine a sequence that initializes a maximal set of flip-flops starting in a completely unknown state; (2) determine a minimal subset of flip-flops that need to be controllable such that the circuit can be completely initialized in a limited number of time frames. The underlying principle of our methods is a maximization formalism using formal optimization techniques based on satisfiability solvers (MaxSAT). We introduce several heuristics which increase the scalability of our approach significantly. Experimental results demonstrate the applicability of the method for large academic and industrial benchmark circuits with up to a few hundred thousand gates. Nils Jansen, Florian Corzilius, Matthias Volk, Ralf Wimmer, Erika Ábrahám, Joost-Pieter Katoen, Bernd Becker: Accelerating Parametric Probabilistic Verification. 2014 Int'l Conf. on Quantitative Evaluation of Systems (QEST), Springer-Verlag, volume 8657, pages 404-420. Abstract: We present a novel method for computing reachability probabilities of parametric discrete-time Markov chains whose transition probabilities are fractions of polynomials over a set of parameters. Our algorithm is based on two key ingredients: a graph decomposition into strongly connected subgraphs combined with a novel factorization strategy for polynomials. Experimental evaluations show that these approaches can lead to a speed-up of up to several orders of magnitude in comparison to existing approaches. Tobias Schubert, Marc Pfeifer, Bernd Becker: Accurate Controlling of Velocity on a Mobile Robot. 2014 29th International Conference on Computers and Their Applications. Andreas Riefert, Lyl Ciganda, Matthias Sauer, Paolo Bernadi, Matteo Sonza Reorda, Bernd Becker: An Effective Approach to Automatic Functional Processor Test Generation for Small-Delay Faults. 2014 Conf.
on Design, Automation and Test in Europe. Abstract: Functional microprocessor test methods provide several advantages compared to DFT approaches, like reduced chip cost and at-speed execution. However, the automatic generation of functional test patterns is an open issue. In this work we present an approach for the automatic generation of functional microprocessor test sequences for small-delay faults based on Bounded Model Checking. We utilize an ATPG framework for small-delay faults in sequential, non-scan circuits and propose a method for constraining the input space for generating functional test sequences (i.e., test programs). We verify our approach by evaluating the miniMIPS microprocessor. In our experiments we were able to reach over 97% fault efficiency. To the best of our knowledge, this is the first fully automated approach to functional microprocessor test for small-delay faults. Sreedhar Saseendran Kumar, Jan Wülfing, Joschka Boedecker, Ralf Wimmer, Martin Riedmiller, Bernd Becker, Ulrich Egert: Autonomous Control of Network Activity. 2014 9th International Meeting on Substrate-Integrated Microelectrode Arrays (MEA). Abstract: Electrical stimulation of the brain is used to treat neurological disorders. Yet it is unknown how to find stimulation patterns that produce desired results with the least interference. Towards this goal, we tested a generic closed-loop paradigm that autonomously optimizes stimulation settings. We used neuronal networks coupled to a reinforcement-learning-based controller to maximize response lengths. Dominik Erb, Karsten Scheibler, Michael Kochte, Matthias Sauer, Hans-Joachim Wunderlich, Bernd Becker: …Based on Restricted Symbolic Logic. 2014 Int'l Test Conf. Abstract: …more scalable and more robust.
Wille Matthias, Scholl Philipp, Wischniewski Sascha, Van Laerhoven Kristof: Comparing Google Glass with Tablet-PC as guidance system for assembling tasks. 2014 Proceedings of the 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops (BSN Workshops 2014). Abstract: Head-mounted displays (HMDs) can be used as a guidance system for manual assembling tasks: contrary to using a Tablet-PC, instructions are always shown in the field of view while the hands are kept free for the task. This is believed to be one of the major advantages of using HMDs. In the study reported here, performance, visual fatigue, and subjective strain were measured in a dual-task paradigm. Participants were asked to follow toy car assembly instructions while monitoring a virtual gauge. Both tasks had to be executed in parallel, either while wearing Google Glass or using a Tablet-PC. Results show slower performance on the HMD but no difference in subjective strain.
Erika Ábrahám, Bernd Becker, Christian Dehnert, Nils Jansen, Joost-Pieter Katoen, Ralf Wimmer: Counterexample Generation for Discrete-Time Markov Models: An Introductory Survey. In: International School on Formal Methods for the Design of Computer, Communication, and Software Systems (SFM), Advanced Lectures, 2014, Springer-Verlag, pages 65-121. Abstract: This paper is an introductory survey of available methods for the computation and representation of probabilistic counterexamples for discrete-time Markov chains and probabilistic automata. In contrast to traditional model checking, probabilistic counterexamples are sets of finite paths with a critical probability mass. Such counterexamples are not obtained as a by-product of model checking, but by dedicated algorithms. We define what probabilistic counterexamples are and present approaches for generating them. We discuss methods based on path enumeration, the computation of critical subsystems, and the generation of critical command sets, using both explicit and symbolic techniques. Scholl Philipp, Borazio Marko, Martin J: Diary-Like Long-Term Activity Recognition: Touch or Voice Interaction? 2014 Eyewear Computers (GEC), IEEE Press. Scholl P, Borazio M, Jansch M, Van Laerhoven K: Diary-like long-term activity recognition: Touch or voice interaction? 2014 Proceedings of the 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops (BSN Workshops 2014). Abstract: The experience sampling methodology is a well-known tool in psychology to assess a subject's condition.
Regularly, or whenever an important event happens, the subject stops whatever he is currently involved in and jots down his current perceptions, experiences, and activities, which in turn form the basis of these diary studies. Such methods are also widely in use for gathering labelled data for wearable long-term activity recognition, where subjects are asked to note conducted activities. We present the design of a personal electronic diary for daily activities, including user interfaces on a PC, smartphone, and Google Glass. A 23-participant structured in-field study covering seven different activities highlights the difference between mobile touch interaction and ubiquitous voice recognition for tracking activities. Dominik Erb, Karsten Scheibler, Matthias Sauer, Bernd Becker: Efficient SMT-based ATPG for Interconnect Open Defects. 2014 Conf.
on Design, Automation and Test in Europe. Abstract: … magnitude.

Dominik Erb, Michael Kochte, Matthias Sauer, Stefan Hillebrecht, Tobias Schubert, Hans-Joachim Wunderlich, Bernd Becker. Exact Logic and Fault Simulation in Presence of Unknowns. 2014 ACM Transactions on Design Automation of Electronic Systems (TODAES), volume 19, pages 28:1-28:17. Abstract: … significantly increased without increasing the test set size.

Christian Dehnert, Nils Jansen, Ralf Wimmer, Erika Ábrahám, Joost-Pieter Katoen. Fast Debugging of PRISM Models. 2014 Int'l Symp. on Automated Technology for Verification and Analysis, Springer-Verlag. Abstract: In addition to rigorously checking whether a system conforms to a specification, model checking can provide valuable feedback in the form of succinct and understandable counterexamples. In the context of probabilistic systems, path- and subsystem-based counterexamples at the state-space level can be of limited use in debugging. As many probabilistic systems are described in a guarded command language like the one used by the popular model checker Prism, a technique identifying a subset of critical commands has recently been proposed. Based on repeatedly solving MaxSAT instances, our novel approach to computing a minimal critical command set achieves a speed-up of up to five orders of magnitude over the previously existing technique.

Linus Feiten, Andreas Spilla, Matthias Sauer, Tobias Schubert, Bernd Becker. Implementation and Analysis of Ring Oscillator PUFs on 60 nm Altera Cyclone FPGAs. 2014 Information Security Journal: A Global Perspective, volume 22, issue 5-6, pages 265-273. Abstract: Ring Oscillator (RO) physically unclonable functions (PUFs) on field programmable gate arrays (FPGAs) have drawn much attention in recent years.
Making each FPGA uniquely identifiable, they allow for protection of intellectual property (IP) or generation of secret encryption keys. Their implementation has been widely discussed, but most experiments have been conducted on Xilinx platforms. In this paper, we report the statistical results from an analysis spanning 20 Cyclone IV FPGAs with 60 nm technology. We parameterize the RO length, placement, ambient temperature, and non-PUF switching activity and discuss the observed effects on PUF quality.

Karsten Scheibler, Bernd Becker. Implication Graph Compression inside the SMT Solver iSAT3. 2014 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”. Abstract: …combinations of linear and non-linear arithmetic constraint formulas (including transcendental functions), and thus is suitable to verify safety properties of systems consisting of both linear and non-linear behaviour. The iSAT algorithm tightly integrates interval constraint propagation into the conflict-driven clause-learning framework. During the solving process, this may result in a huge implication graph. This paper presents a method to compress the implication graph on the fly. Experiments demonstrate that this method is able to reduce the overall memory footprint by up to an order of magnitude.

Ralf Wimmer, Erika Ábrahám. Maybe or Maybe Not - Contributions to Stochastic Verification. In: Aspekte der Technischen Informatik: Festschrift zum 60. Geburtstag von Bernd Becker, 2014, Monsenstein und Vannerdat, Rolf Drechsler, pages 119-127.

Rolf Drechsler, Bettina Braitling, Luis María Ferrer Fioriti, Hassan Hatefi, Ralf Wimmer, Bernd Becker, Holger Hermanns. MeGARA: Menu-based Game Abstraction and Abstraction Refinement of Markov Automata. 2014 International Workshop on Quantitative Aspects of Programming Languages and Systems, volume EPTCS. Abstract: … abstraction.
Ralf Wimmer, Nils Jansen, Erika Ábrahám, Joost-Pieter Katoen, Bernd Becker. Minimal Counterexamples for Linear-Time Probabilistic Verification. 2014 Theor Comput Sci, volume 554. Abstract: Counterexamples for property violations have a number of important applications like supporting the debugging of erroneous systems and verifying large systems via counterexample-guided abstraction refinement. In this paper, we propose the usage of minimal critical subsystems of discrete-time Markov chains and Markov decision processes as counterexamples for violated omega-regular properties. Minimality can thereby be defined in terms of the number of states or transitions. This problem is known to be NP-complete for Markov decision processes. We show how to compute such subsystems using mixed integer linear programming and evaluate the practical applicability in a number of experiments. They show that our method yields substantially smaller counterexamples than existing techniques.

Scholl P, Brachmann M, Santini S, Van Laerhoven K. Studies in Computational Intelligence. Abstract: © Springer-Verlag Berlin Heidelberg 2014. The Robot Operating System (ROS) is a popular middleware that eases the design, implementation, and maintenance of robot systems. In particular, ROS enables the integration of a large number of heterogeneous devices in a single system. To allow these devices to communicate and cooperate, ROS requires device-specific interfaces to be available. This restricts the number of devices that can effectively be integrated in a ROS-based system. In this work we present the design, implementation, and evaluation of a ROS middleware client that allows constrained devices like wireless sensor nodes to be integrated in a ROS-based system. Wireless sensor nodes embedded in the environment in which a robot system is operating can indeed help robots in navigating and interacting with the environment.
The client has been implemented for devices running the Contiki operating system, but its design can readily be extended to other systems such as TinyOS. Our evaluation shows that in-buffer processing of ROS messages without relying on dynamic memory allocation is possible; that message contents can be accessed conveniently using well-known concepts of the C language (structs) with negligible processing overhead with respect to a C++-based client; and that ROS' message-passing abstraction facilitates the integration of devices running event-based systems like Contiki.

Nils Jansen, Ralf Wimmer, Erika Ábrahám, Barna Zajzon, Joost-Pieter Katoen, Bernd Becker, Johann Schuster. Symbolic Counterexample Generation for Large Discrete-Time Markov Chains. 2014 Sci Comput Program, volume 91, issue A, pages 90-114. Abstract: This paper presents several symbolic counterexample generation algorithms for discrete-time Markov chains (DTMCs) violating a PCTL formula. A counterexample is (a symbolic representation of) a sub-DTMC that is incrementally generated. The crux to this incremental approach is the symbolic generation of paths that belong to the counterexample. We consider two approaches. First, we extend bounded model checking and develop a simple heuristic to generate highly probable paths first. We then complement the SAT-based approach by a fully (multi-terminal) BDD-based technique. All symbolic approaches are implemented, and our experimental results show a substantially better scalability than existing explicit techniques. In particular, our BDD-based approach using a method called fragment search allows for counterexample generation for DTMCs with billions of states (up to 10^15).

Der Andere Verlag, page 184. Testing Time - Time to Test?
-- Using Formal Methods for the Timing Analysis of Digital Circuits --. ISBN: 978-3-86247-451-6. Matthias Sauer.

Borazio Marko, Berlin Eugen, Kücükyildiz Nagihan, Scholl Philipp, Van Laerhoven Kristof. Towards Benchmarked Sleep Detection with Inertial Wrist-worn Sensing Units. 2014 Healthcare Informatics (ICHI).

Borazio M, Berlin E, Kucukyildiz N, Scholl P, Van Laerhoven K. Towards benchmarked sleep detection with wrist-worn sensing units. 2014 Proceedings - 2014 IEEE International Conference on Healthcare Informatics, ICHI 2014. Abstract: © 2014 IEEE. The monitoring of sleep by quantifying sleeping time and quality is pivotal in many preventive health care scenarios. A substantial number of wearable sensing products have been introduced to the market for just this reason, detecting whether the user is either sleeping or awake. Assessing these devices for their accuracy in estimating sleep is a daunting task, as their hardware design tends to be different and many are closed-source systems that have not been clinically tested. In this paper, we present a challenging benchmark dataset from an open-source wrist-worn data logger that contains relatively high-frequency (100 Hz) 3D inertial data from 42 sleep lab patients, along with their data from clinical polysomnography. We analyse this dataset with two traditional approaches for detecting sleep and wake states and propose a new algorithm specifically for 3D acceleration data, which operates on a principle of Estimation of Stationary Sleep-segments (ESS). Results show that all three methods generally over-estimate sleep, with our method performing slightly better (almost 79% overall median accuracy) than the traditional activity count-based methods.

Karsten Scheibler, Bernd Becker. Using Interval Constraint Propagation for Pseudo-Boolean Constraint Solving. 2014 Formal Methods in Computer-Aided Design. Abstract: Derived from the SMT solver iSAT3, we present the solver iSAT3p that on the one hand allows the efficient handling of huge pseudo-Boolean constraints with several thousand summands and large integer coefficients.
On the other hand, experimental results demonstrate that at the same time iSAT3p is competitive with or even superior to other solvers on standard pseudo-Boolean benchmark families.

Sven Reimer, Matthias Sauer, Tobias Schubert, Bernd Becker. Using MaxBMC for Pareto-Optimal Circuit Initialization. 2014 Conf. on Design, Automation and Test in Europe. Abstract: In this paper we present MaxBMC, a novel formalism for solving optimization problems in sequential systems. Our approach combines techniques from symbolic SAT-based Bounded Model Checking (BMC) and incremental MaxSAT, leading to the first MaxBMC solver. In traditional BMC, safety and liveness properties are validated. We extend this formalism: in case the required property is satisfied, an optimization problem is defined to maximize the quality of the reached witnesses. Further, we compare the qualities at different depths of the system, leading to Pareto-optimal solutions. We state a sound and complete algorithm that not only tackles the optimization problem but also verifies whether a global optimum has been identified by using a complete BMC solver as back-end. As a first reference application we present the problem of circuit initialization. Additionally, we give pointers to other tasks which can be covered by our formalism quite naturally and further demonstrate the efficiency and effectiveness of our approach.
Scholl Philipp, Van Laerhoven Kristof. Wearable digitization of life science experiments. 2014 New York, New York, USA, Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct Publication - UbiComp '14 Adjunct, ACM Press.

Linus Feiten, Katrin Weber, Bernd Becker. SMILE: Smartphones in der Lehre – ein Rück- und Überblick. 2013 INFORMATIK 2013, Gesellschaft für Informatik (GI), Matthias Horbach, volume P-220, pages 255-269. Abstract: SMILE is a project that was started in the winter semester of 2010. Its ongoing goal is to explore ways of enriching university teaching, in particular lectures with more than a hundred students, through the use of IT. To this end, apps for students and lecturers have been developed that run on all common devices. The conceptual planning lies with an interdisciplinary team of computer scientists and instructional designers; the implementation is carried out in close cooperation with a student developer team. Two awards attest to the positive response SMILE has received so far. In the summer semester of 2013, SMILE is being used for the fourth time in a larger lecture. In this workshop paper, we describe the course of the project so far and the core functionalities of the software.
Tobias Schubert, Jan Burchard, Matthias Sauer, Bernd Becker. S-Trike: A Mobile Robot Platform for Higher Education. 2013 International Conference on Computer Applications in Industry and Engineering, pages 243-248.

Linus Feiten, Andreas Spilla, Matthias Sauer, Tobias Schubert, Bernd Becker. Analysis of Ring Oscillator PUFs on 60nm FPGAs. 2013 TRUDEVICE Workshop, Avignon. Abstract: The implementation of RO PUFs on FPGAs has been widely discussed, but most experiments have been conducted on Xilinx FPGAs. In this paper we report statistical results from an analysis spanning 20 equivalent Altera FPGAs. The presented results include the dependency of PUF quality on different parameters like RO length and placement on the FPGA. We identify the optimal RO length of 16 Logic Elements (LE) and show some specific placement cases for which the otherwise very good PUF quality decreases drastically.

Linus Feiten, Matthias Sauer, Tobias Schubert, Alexander Czutro, Victor Tomashevich, Eberhard Böhl, Ilia Polian, Bernd Becker. #SAT for Vulnerability Analysis of Security Components. 2013 (Workshop-Paper, Informal Proceedings) IEEE European Test Symposium. Abstract: Vulnerability to malicious fault attacks is an emerging concern for hardware circuits that process sensitive data. We describe a new methodology to assess the vulnerability to such attacks, taking into account built-in protection mechanisms. Our method is based on accurate modeling of fault effects and their detection status expressed as Boolean satisfiability (SAT) formulae. Vulnerability is quantified based on the number of solutions of such formulae, which are computed by an efficient #SAT solver. We demonstrate the applicability of this method by analyzing a sequential pseudo random number generator and a combinatorial multiplier circuit, both protected by robust error-detecting codes.
Ulrich Loup, Karsten Scheibler, Florian Corzilius, Erika Ábrahám, Bernd Becker. A Symbiosis of Interval Constraint Propagation and Cylindrical Algebraic Decomposition. 2013 CADE, Springer, volume 7898, pages 193-207. Abstract: A symbiosis of iSAT and CAD combines the advantages of both methods, resulting in a fast and complete solver. In particular, the interval box determined by iSAT provides precious extra information to guide the CAD-method search routine: We use the interval box to prune the CAD search space in both phases, the projection and the construction phase, forming a search 'tube' rather than a search tree. This proves to be particularly beneficial for a CAD implementation designed to search for a satisfying assignment pointedly, as opposed to searching for and excluding conflicting regions.

Nils Jansen, Florian Corzilius, Matthias Volk, Ralf Wimmer, Erika Ábrahám, Joost-Pieter Katoen, Bernd Becker. Accelerating Parametric Probabilistic Verification. arXiv, volume arXiv:1312.3979, 2013.

Benjamin Andres, Matthias Sauer, Martin Gebser, Tobias Schubert, Bernd Becker, Torsten Schaub. Accurate Computation of Sensitizable Paths using Answer Set Programming. 2013 Int. Conf. on Logic Programming and Nonmonotonic Reasoning, pages 92-101. Abstract: Precise knowledge of the longest sensitizable paths in a circuit is crucial for various tasks in computer-aided design, including timing analysis, performance optimization, delay testing, and speed binning. As delays in today's nanoscale technologies are increasingly affected by statistical parameter variations, there is significant interest in obtaining sets of paths that are within a length range. For instance, such path sets can be used in the emerging areas of post-silicon validation and characterization and adaptive test. We present an ASP-based method for computing well-defined sets of sensitizable paths within a length range.
Unlike previous approaches, the method is accurate and does not rely on a priori relaxations. Experimental results demonstrate the applicability and scalability of our method.

Dominik Erb, Michael A. Kochte, Matthias Sauer, Hans-Joachim Wunderlich, Bernd Becker. Accurate Multi-Cycle ATPG in Presence of X-Values. 2013 22nd IEEE Asian Test Symposium (ATS). Abstract: Unknown (X) values in a circuit impair test quality and increase test costs. Classical n-valued algorithms for fault simulation and ATPG, which typically use a three- or four-valued logic for the good and faulty circuit, are in principle pessimistic in presence of X-values and cannot accurately compute the achievable fault coverage. In partial scan or pipelined circuits, X-values originate in non-scan flip-flops. These circuits are tested using multi-cycle tests. Here we present multi-cycle test generation techniques for circuits with X-values due to partial scan or other X-sources. The proposed techniques have been integrated into a multi-cycle ATPG framework which employs formal Boolean and quantified Boolean (QBF) satisfiability techniques to compute the possible signal states in the circuit accurately. Efficient encoding of the problem instance ensures reasonable runtimes. We show that in presence of X-values, the detection of stuck-at faults requires not only exact formal reasoning in a single cycle, but especially the consideration of multiple cycles for excitation of the fault site as well as propagation and controlled reconvergence of fault effects. For the first time, accurate deterministic ATPG for multi-cycle test application is supported for stuck-at faults. Experiments on ISCAS'89 and industrial circuits with X-sources show that this new approach increases the fault coverage considerably.

Stefan Hillebrecht, Michael A. Kochte, Dominik Erb, Hans-Joachim Wunderlich, Bernd Becker. Accurate QBF-based test pattern generation in presence of unknown values. 2013 Conf.
on Design, Automation and Test in Europe. Abstract: … circuits.

Matthias Sauer, Sven Reimer, Stefan Kupferschmid, Tobias Schubert, Paolo Marin, Bernd Becker. Applying BMC, Craig Interpolation and MAX-SAT to Functional Justification in Sequential Circuits. 2013 RCRA International Workshop on Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion. Abstract: … such sequences depending on the number of controllable flip-flops.

Renatus Derbidge, Linus Feiten, Oliver Conradt, Peter Heusser, Stefan Baumgartner. Assessment of Shape Changes of Mistletoe Berries: A New Software Approach to Automatize the Parameterization of Path Curve Shaped Contours. 2013 PLoS ONE, volume 8, issue 4. Abstract: Photographs of mistletoe (Viscum album L.) berries taken by a permanently fixed camera during their development in autumn were subjected to an outline shape analysis by fitting path curves using a mathematical algorithm from projective geometry. During growth and maturation processes the shape of mistletoe berries can be described by a set of such path curves, making it possible to extract changes of shape using one parameter called Lambda. Lambda describes the outline shape of a path curve. Here we present methods and software to capture and measure these changes of form over time. The present paper describes the software used to automatize a number of tasks including contour recognition, optimization of fitting the contour via hill-climbing, derivation of the path curves, computation of Lambda, and blinding the pictures for the operator. The validity of the program is demonstrated by results from three independent measurements showing circadian rhythm in mistletoe berries. The program is available as open source and will be applied in a project to analyze the chronobiology of shape in mistletoe berries and the buds of their host trees.
Scholl P, Kücükyildiz N, Van Laerhoven K. Bridging the last gap: LedTX - optical data transmission of sensor data for web-services. 2013 UbiComp 2013 Adjunct - Adjunct Publication of the 2013 ACM Conference on Ubiquitous Computing. Abstract: Data transmission from small-scale data loggers such as human activity recognition sensors is an inherent systems-design challenge. Interfaces based on USB or Bluetooth still require platform-dependent code on the retrieving computer system and therefore require a large maintenance effort. In this paper, we present LedTX, a system that is able to transmit wirelessly through LEDs and the camera included in most users' hardware. The system runs completely in modern browsers and presents a unidirectional, platform-independent communication channel. We illustrate this system on the UbiLighter, an instrumented lighter that tracks one's smoking behaviour.
Scholl P, El Majoub B, Santini S, Van Laerhoven K. Connecting wireless sensor networks to the robot operating system. 2013 Procedia Computer Science, volume 19. Abstract: Robot systems largely depend on embedded systems to operate. The interfaces of those embedded systems, e.g. motor controllers or laser scanners, are often vendor-specific and therefore require a component that translates from/to the Robot Operating System (ROS) Middleware interface. In this work we present an implementation and evaluation of a ROS Middleware client based on the Contiki operating system, which is suitable for constrained embedded devices like wireless sensor nodes. We show that in-buffer processing of ROS messages without relying on dynamic memory allocation is possible; that message contents can be accessed conveniently via well-known concepts of the C language (structs) with negligible processing overhead compared to a C++-based client; and that the message-passing middleware concept of ROS fits nicely into Contiki's event-based nature. Furthermore, for an environment enriched with a wireless sensor network to help robots in navigating, understanding, and manipulating environments, a direct integration is mandatory. © 2013 The Authors. Published by Elsevier B.V.

Karsten Scheibler, Matthias Sauer, Kohei Miyase, Bernd Becker. Controlling Small-Delay Test Power Consumption using Satisfiability Modulo Theory Solving. 2013 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”. Abstract: … demonstrate the applicability of the method.

Young Moon Kim, Jun Seomun, Hyung-Ock Kim, Kyung Tae Do, Jung Yun Choi, Kee Sup Kim, Matthias Sauer, Bernd Becker, Subhasish Mitra. Detection of early-life failures in high-K metal-gate transistors and ultra low-K inter-metal dielectrics. 2013 Custom Integrated Circuits Conference, pages 1-4. Abstract: … associated with stress tests such as burn-in.

Matthias Sauer, Young Moon Kim, Jun Seomun, Hyung-Ock Kim, Kyung-Tae Do, Jung Yun Choi, Kee Sup Kim, Subhasish Mitra, Bernd Becker. Early-Life-Failure Detection using SAT-based ATPG. 2013 Int'l Test Conf., pages 1-10. Abstract: Early-life failures (ELF) result from weak chips that may pass manufacturing tests but fail early in the field, much earlier than the expected product lifetime. Recent experimental studies over a range of technologies have demonstrated that ELF defects result in changes in delays over time inside internal nodes of a logic circuit before functional failure occurs.
Such changes in delays are distinct from delay degradation caused by circuit aging mechanisms such as Bias Temperature Instability. Traditional transition fault or robust path delay fault test patterns are inadequate for detecting such ELF-induced changes in delays because they do not model the demanding detection conditions precisely. In this paper, we present an automatic test pattern generation (ATPG) technique based on Boolean Satisfiability (SAT) for detecting ELF-induced delay changes at all gates in a given circuit. Our simulation results, using various circuit blocks from the industrial OpenSPARC T2 design as well as standard benchmarks, demonstrate the effectiveness and practicality of our approach in achieving high coverage of ELF-induced delay change detection. We also demonstrate the robustness of our approach to manufacturing process variations.

Matthias Sauer, Sven Reimer, Tobias Schubert, Ilia Polian, Bernd Becker. Efficient SAT-Based Dynamic Compaction and Relaxation for Longest Sensitizable Paths. 2013 Conf. on Design, Automation and Test in Europe, pages 448-453. Abstract: Comprehensive coverage of small-delay faults under massive process variations is achieved when multiple paths through the fault locations are sensitized by the test pair set. Using one test pair per path may lead to impractical test set sizes and test application times due to the large number of near-critical paths in state-of-the-art circuits.
Karina Gitina, Sven Reimer, Matthias Sauer, Ralf Wimmer, Christoph Scholl, Bernd Becker. Equivalence Checking for Partial Implementations Revisited. 2013 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, Universität Rostock ITMZ, pages 61-70. Abstract: In this paper we consider the problem of checking whether a partial implementation can (still) be extended to a complete design which is equivalent to a given full specification. In particular, we investigate the relationship between the equivalence checking problem for partial implementations (PEC) and the validity problem for quantified Boolean formulae (QBF) with so-called Henkin quantifiers. Our analysis leads us to a sound and complete algorithmic solution to the PEC problem as well as to an exact complexity-theoretical classification of the problem.

Karina Gitina, Sven Reimer, Matthias Sauer, Ralf Wimmer, Christoph Scholl, Bernd Becker. Equivalence Checking of Partial Designs using Dependency Quantified Boolean Formulae. 2013 Int'l Conf. on Computer Design, IEEE Computer Society, pages 396-403. Abstract: Our novel algorithm to solve DQBF based on quantifier elimination can therefore be applied to solve PEC. We also present first experimental results showing the feasibility of our approach and the inaccuracy of the QBF approximations that have usually been used for deciding the PEC so far.

Katrin Weber, Bernd Becker. Formative Evaluation des mobilen Classroom-Response-Systems SMILE. 2013 GMW 2013 eLearning, pages 277-289. Abstract: … Answer – were used with varying frequency by the students of the pilot lecture. In addition, the students' acceptance of the tool and the effect of the quiz questions were examined.
It turns out that weaknesses of the technical equipment in the lecture hall and the technical reliability of the SMILE app may be major factors influencing the frequency of use. In the spirit of a formative evaluation, not only were technical problem areas identified, but conclusions could also be drawn for the didactic concept of SMILE-supported lectures.

Ralf Wimmer, Nils Jansen, Andreas Vorpahl, Erika Ábrahám, Joost-Pieter Katoen, Bernd Becker: High-Level Counterexamples for Probabilistic Automata. 2013, Springer-Verlag, volume 8054, pages 18-33. Abstract: [...] simplified to obtain a well-understandable counterexample.

Ralf Wimmer, Nils Jansen, Andreas Vorpahl, Erika Ábrahám, Joost-Pieter Katoen, Bernd Becker: High-Level Counterexamples for Probabilistic Automata. Volume arxiv:1305.5055, 2013.

Andreas Riefert, Joerg Mueller, Matthias Sauer, Wolfram Burgard, Bernd Becker: Identification of Critical Variables using an FPGA-based Fault Injection Framework. 2013, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Andreas Riefert, Joerg Mueller, Matthias Sauer, Wolfram Burgard, Bernd Becker: Identification of Critical Variables using an FPGA-based Fault Injection Framework. 2013, VLSI Test Symp., pages 1-6. Abstract: The shrinking nanometer technologies of modern microprocessors and the aggressive supply-voltage down-scaling drastically increase the risk of soft errors. In order to cope with this risk efficiently, selective hardware and software protection schemes are applied. In this paper, we propose an FPGA-based fault injection framework which is able to identify the most critical registers of an entire microprocessor. Furthermore, our framework identifies critical variables in the source code of an arbitrary application running in its native environment.
We verify the feasibility and relevance of our approach by implementing a lightweight and efficient error-correction mechanism protecting only the most critical parts of the system. Experimental results with state-estimation applications demonstrate a significantly reduced number of critical calculation errors caused by faults injected into the processor.

Nikitaki Sofia, Scholl Philipp, Laerhoven Kristof, Tsakalides Panagiotis: Localization in Wireless Networks via Laser Scanning and Bayesian Compressed Sensing. 2013, Signal Processing Advances in Wireless Communications (SPAWC), 2013 IEEE 14th Workshop on.

Matthias Sauer, Sven Reimer, Ilia Polian, Tobias Schubert, Bernd Becker: Provably Optimal Test Cube Generation Using Quantified Boolean Formula Solving. 2013, ASP Design Automation Conf. Abstract: Circuits that employ test pattern compression rely on test cubes to achieve high compression ratios. The fewer inputs of a test pattern are specified, the better it can be compacted and hence the lower the test application time. Although there exist previous approaches to generate such test cubes, none of them is optimal. We present for the first time a framework that yields provably optimal test cubes by using the theory of quantified Boolean formulas (QBF). Extensive comparisons with previous methods demonstrate the quality gain of the proposed method.

Karsten Scheibler, Stefan Kupferschmid, Bernd Becker: Recent Improvements in the SMT Solver iSAT. 2013, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”. Abstract: Over the past decades embedded systems have become more and more complex.
Furthermore, besides digital components they now often contain additional analog parts, making them embedded hybrid systems. If such systems are used in safety-critical environments, a formal correctness proof is highly desirable. The SMT solver iSAT aims at solving Boolean combinations of linear and non-linear constraint formulas (including transcendental functions), and is thus suitable for verifying safety properties of systems exhibiting both linear and non-linear behaviour. To keep up with the ever increasing complexity of these systems, we present recent improvements in various parts of iSAT. Experiments demonstrate that iSAT with all the presented improvements integrated typically gains a speedup of between one and two orders of magnitude.

Matthias Sauer, Alexander Czutro, Tobias Schubert, Stefan Hillebrecht, Ilia Polian, Bernd Becker: SAT-based Analysis of Sensitisable Paths. 2013, Test of Computers, volume 30, issue 4, pages 81-88. Abstract: A common trend in the past has been to detect delay defects in nanoscale technologies through the longest sensitisable paths. This approach does not hold up for non-trivial defects due to modeling inaccuracies. This article supports tests through all paths of customized length, using current SAT-solving advances.

Kohei Miyase, Matthias Sauer, Bernd Becker, Xiaoqing Wen, Seiji Kajihara: Search Space Reduction for Low-Power Test Generation. 2013, 22nd IEEE Asian Test Symposium (ATS). Abstract: Ongoing research to shrink the feature sizes of LSI circuits leads to an ever increasing number of logic gates in a circuit. In general, the complexity of test generation depends on the size of a circuit. Furthermore, modern test generation methods have to consider power reduction in addition to fault detection, since excessive power caused by testing may result in over-testing. In this work, we propose a method to reduce the computation time of low-power test generation.
The proposed method specifies the gates which will cause power issues, consequently reducing the search space for the X-filling technique. The reduction of the search space for X-filling also further minimizes the amount of switching activity. Experimental results for circuits of Open Cores provided by the IWLS2005 benchmarks show that the proposed method achieves both a reduced computation time and, at the same time, increased power reduction compared to previous methods.

Bettina Braitling, Ralf Wimmer, Bernd Becker, Erika Ábrahám: Stochastic Bounded Model Checking: Bounded Rewards and Compositionality. 2013, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages 243-254. Abstract: [...] cases.

Matthias Sauer, Jan Burchard, Tobias Schubert, Ilia Polian, Bernd Becker: Waveform-Guided Fault Injection by Clock Manipulation. 2013, TRUDEVICE Workshop.

Scholl Philipp, Kücükyildiz Nagihan, Van Laerhoven Kristof: When Do You Light a Fire? Capturing Tobacco Use with Situated, Wearable Sensors. 2013, Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication (UbiComp 2013 Adjunct). Abstract: An important step towards assessing smoking behavior is to detect and log smoking episodes in an unobtrusive way. Detailed information on an individual's consumption can then be used to highlight potential health risks and behavioral statistics to increase the smoker's awareness, and might be applied in smoking cessation programs. In this paper, we present an evaluation of two different monitoring prototypes which detect a user's smoking behavior, based on augmenting a lighter. Both prototypes capture and record instances when the user smokes, and are sufficiently robust and power efficient to allow deployments of several weeks. A real-world feasibility study with 11 frequently-smoking participants investigates the deployment and adoption of the system, hinting that smokers are generally unaware of their daily smoking patterns and tend to overestimate their consumption. Copyright © 2013 ACM.
Stefan Kupferschmid: Über Craigsche Interpolation und deren Anwendung in der formalen Modellprüfung. Der Andere Verlag, 266 pages, ISBN 978-3-86247-411-0.

Alexander Czutro, Michael Imhof, Jie Jiang, Abdullah Mumtaz, Matthias Sauer, Bernd Becker, Ilia Polian, Hans-Joachim Wunderlich: Variation-Aware Fault Grading. 2012, IEEE Asian Test Symp., pages 344-349.

Christian Miller, Paolo Marin, Bernd Becker: A Dynamic QBF Preprocessing Approach for the Verification of Incomplete Designs. 2012, Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion, volume 19. Abstract: [...] preprocessing effectiveness does not compensate for the time spent. We present multiple switching heuristics, and evaluate the overall performance by running several state-of-the-art QBF solvers as back-end. Comparing these results to those obtained by standard and incremental BMC procedures, it turns out that dynamic preprocessing leads to the best overall performance of the whole verification process.

Sven Reimer, Florian Pigorsch, Christoph Scholl, Bernd Becker: Enhanced Integration of QBF Solving Techniques. 2012, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages 133-143. Abstract: In this paper we present a novel QBF solving technique which is based on the integration of a search-based (DPLL) approach and a rewriting-based approach: while traversing the search space in a DPLL manner, we iteratively generate many sub-problems, which are handed over to the rewriting method one by one. Instead of just communicating back the satisfiability results of the individual sub-problems, we collect as many constraints derived by the rewriting-based solver as possible and transfer them back to the search-based solver. This allows us not only to prune the current branch, but also to avoid the unnecessary traversal of search paths in different regions of the search tree. We also discuss heuristics to determine suitable switching points between the two methods. We present first promising results that underline the potential of our approach.

Linus Feiten, Matthias Sauer, Tobias Schubert, Alexander Czutro, Eberhard Böhl, Ilia Polian, Bernd Becker: #SAT-Based Vulnerability Analysis of Security Components -- A Case Study. 2012, IEEE International Symposium on Defect and Fault Tolerance (DFT), pages 49-54. Abstract: In this paper we describe a new approach to assess a circuit's vulnerability to fault attacks. This is achieved through analysis of the circuit's design specification, making use of modern SAT solving techniques. For each injectable fault, a corresponding SAT instance is generated. Every satisfying solution for such an instance is equivalent to a circuit state and an input assignment for which the fault affects the circuit's outputs such that the error is not detected by the embedded fault detection. The number of solutions is precisely calculated by a #SAT solver and can be translated into an exact vulnerability measure. We demonstrate the applicability of this method for design space exploration by giving detailed results for various implementations of a deterministic random bit generator.
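The counting step described in the #SAT abstract above can be mimicked on a toy scale. In the sketch below (the circuit, the injected fault, and the parity detector are all invented for illustration; the paper analyses real security components with a #SAT solver), brute-force enumeration stands in for the solver: it counts the input assignments for which the fault corrupts the outputs while the embedded parity check still matches.

```python
from itertools import product

# Hypothetical 4-input, 2-output toy circuit. The "embedded fault
# detection" is a parity bit over the fault-free outputs.
def golden(a, b, c, d):
    return (a & b) ^ c, (c | d) & a

def faulty(a, b, c, d):
    c = 1                      # injected fault: internal signal c stuck-at-1
    return (a & b) ^ c, (c | d) & a

def parity(o1, o2):
    return o1 ^ o2

# Brute-force stand-in for the #SAT call: count assignments where the
# fault changes the outputs but the parity check fails to notice.
undetected = 0
for a, b, c, d in product((0, 1), repeat=4):
    g, f = golden(a, b, c, d), faulty(a, b, c, d)
    if g != f and parity(*g) == parity(*f):
        undetected += 1

vulnerability = undetected / 2 ** 4   # fraction of undetected-error assignments
print(undetected, vulnerability)
```

For this toy circuit the fault slips past the parity check in 2 of the 16 assignments, giving a vulnerability measure of 0.125; a #SAT solver computes the same count without enumerating the exponential assignment space explicitly.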
Scholl Philipp, Laerhoven Kristof: A Feasibility Study of Wrist-Worn Accelerometer Based Detection of Smoking Habits. 2012, International Workshop on Extending Seamlessly to the Internet of Things.

Scholl P, Van Laerhoven K: A feasibility study of wrist-worn accelerometer based detection of smoking habits. 2012, Proceedings - 6th International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing, IMIS 2012. Abstract: Cigarette smoking is one of the major causes of lung cancer and has been linked to a large number of other cancer types and diseases. Smoking cessation, the only means to avoid these serious risks, is hindered by the ease of ignoring these risks in day-to-day life. In this paper we present a feasibility study with smokers wearing an accelerometer device on their wrist over the course of a week to detect their smoking habits, based on detecting typical gestures carried out while smoking a cigarette. We provide a basic detection method that identifies when the user is smoking, with the goal of building a system that provides an individualized risk estimation to increase awareness and motivate smoking cessation. Our basic method detects typical smoking gestures with a precision of 51.2% and shows a user-specific recall of over 70%, creating evidence that an unobtrusive wrist-watch-like sensor can detect smoking. © 2012 IEEE.
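The gesture-spotting idea behind such a detector can be sketched in a few lines. The snippet below is a deliberately simplistic, hypothetical stand-in (threshold plus minimum duration on a single pitch-like channel; the paper's actual classifier is not described here): a hand-to-mouth gesture is flagged when the signal stays above a threshold for a minimum number of consecutive samples.

```python
# Toy threshold-based gesture spotting on a synthetic wrist signal.
# All values and parameters are invented for illustration.

def spot_gestures(signal, threshold=0.8, min_len=3):
    """Return start indices of runs that stay above `threshold`
    for at least `min_len` consecutive samples."""
    events, run = [], 0
    for i, v in enumerate(signal):
        run = run + 1 if v > threshold else 0
        if run == min_len:                  # report once per sustained run
            events.append(i - min_len + 1)  # start index of the run
    return events

# Two sustained peaks and one short spike; only the sustained ones count.
sig = [0.1, 0.9, 0.95, 0.9, 0.2, 0.85, 0.1, 0.9, 0.9, 0.92, 0.1]
print(spot_gestures(sig))
```

Brief spikes (a single sample above the threshold) are rejected by the duration requirement, which is the same trade-off between precision and recall that the reported 51.2%/70% figures reflect.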
Bernd Becker, Ruediger Ehlers, Matthew Lewis, Paolo Marin: ALLQBF Solving by Computational Learning. 2012, Automated Technology for Verification and Analysis, Springer, volume 7561, pages 370-384. Abstract: In the last years, search-based QBF solvers have become essential for many applications in the formal methods domain. The exploitation of their reasoning efficiency has, however, been restricted to applications in which a “satisfiable/unsatisfiable” answer or one model of an open quantified Boolean formula suffices as an outcome, whereas applications in which a compact representation of all models is required could not be tackled with QBF solvers so far. In this paper, we describe how computational learning provides a remedy to this problem.
Our algorithms employ a search-based QBF solver and learn the set of all models of a given open QBF problem in a CNF (conjunctive normal form), DNF (disjunctive normal form), or CDNF (conjunction of DNFs) representation. We evaluate our approach experimentally using benchmarks from the synthesis of finite-state systems from temporal logic and from monitor computation.

Benjamin Andres, Matthias Sauer, Martin Gebser, Tobias Schubert, Bernd Becker, Torsten Schaub: Accurate Computation of Longest Sensitizable Paths using Answer Set Programming. 2012, GMM/ITG-Fachtagung “Zuverlässigkeit und Entwurf”.

Matthias Sauer, Stefan Kupferschmid, Alexander Czutro, Sudhakar M. Reddy, Bernd Becker: Analysis of Reachable Sensitisable Paths in Sequential Circuits with SAT and Craig Interpolation. 2012, Int'l Conf. on VLSI Design. Abstract: Test pattern generation for sequential circuits benefits from scanning strategies, as these allow the justification of arbitrary circuit states. However, some of these states may be unreachable during normal operation. This results in non-functional operation which may lead to abnormal circuit behaviour and result in over-testing. In this work, we present a versatile approach that combines a highly adaptable SAT-based path-enumeration algorithm with a model-checking solver for invariant properties that relies on the theory of Craig interpolants to prove the unreachability of circuit states. The method enumerates a set of longest sensitisable paths and yields test sequences of minimal length able to sensitise the found paths starting from a given circuit state. We present detailed experimental results on the reachability of sensitisable paths in ITC 99 circuits.

Stefan Hillebrecht, Michael Kochte, Hans-Joachim Wunderlich, Bernd Becker: Exact Stuck-at Fault Classification in Presence of Unknowns. 2012, IEEE European Test Symp. Abstract: [...] results on ISCAS benchmark and industrial circuits.
The applicability of our algorithm to large industrial circuits is demonstrated.

Scholl P, Kohlbrecher S, Sachidananda V, Van Laerhoven K: Fast indoor radio-map building for RSSI-based localization systems. 2012, 9th International Conference on Networked Sensing Systems, INSS 2012 - Conference Proceedings. Abstract: Wireless indoor localization systems based on RSSI values typically consist of an offline training phase and an online position-determination phase. During the offline phase, geo-referenced RSSI measurements, called fingerprints, are recorded to build a radio-map of the building. This radio-map is then searched during the position-determination phase to estimate another node's location. Usually the radio-map is built manually, either by users pin-pointing their location on a ready-made floorplan or by moving in pre-specified patterns while scanning the network for RSSI values. This cumbersome process leads to inaccuracies in the radio-map. Here, we propose a system to build the floorplan and radio-map simultaneously by employing a handheld laser mapping system in an IEEE 802.15.4-compatible network. This makes indoor- and radio-mapping for wireless localization less cumbersome, faster and more reliable, and delivers a new way to evaluate wireless localization systems. © 2012 IEEE.

Matthias Sauer, Stefan Kupferschmid, Alexander Czutro, Ilia Polian, Sudhakar M. Reddy, Bernd Becker: Functional Justification in Sequential Circuits using SAT and Craig Interpolation. 2012, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Matthias Sauer, Stefan Kupferschmid, Alexander Czutro, Ilia Polian, Sudhakar M. Reddy, Bernd Becker: Functional Test of Small-Delay Faults using SAT and Craig Interpolation. 2012, Int'l Test Conf., pages 1-8. Abstract: We present SATSEQ, a timing-aware ATPG system for small-delay faults in non-scan circuits. The tool identifies the longest paths suitable for functional fault propagation and generates the shortest possible sub-sequences per fault. Based on advanced model-checking techniques, SATSEQ provides detection of small-delay faults through the longest functional paths. All test sequences start at the circuit's initial state; therefore, overtesting is avoided. Moreover, potential invalidation of the fault detection is taken into account. Experimental results show high detection and better performance than scan testing in terms of test application time and overtesting avoidance.

Paolo Marin, Christian Miller, Bernd Becker: Incremental QBF Preprocessing for Partial Design Verification - (Poster Presentation). 2012, Int'l Conf.
on Theory and Applications of Satisfiability Testing, Springer, volume 7317, pages 473-474. Abstract: Bounded Model Checking (BMC) is a major verification method for finding errors in sequential circuits. BMC accomplishes this by iteratively unfolding a circuit k times, adding the negated property, and finally converting the BMC instance into a sequence of satisfiability (SAT) problems. When considering incomplete designs (i.e. those containing so-called blackboxes), we rather need the logic of Quantified Boolean Formulas (QBF) to obtain a more precise modeling of the unknown behavior of the blackbox. Here, we answer the question of unrealizability of a property, where finding a path of length k proves that the property is violated regardless of the implementation of the blackbox. To speed up this task, solving blackbox BMC problems incrementally has been shown to be feasible [3], although the restrictions required in the preprocessing phase reduce its effectiveness. In this paper we enhance the verification procedure when using an off-the-shelf QBF solver, through a stronger preprocessing of the QBF formulas applied in an incremental fashion.

Ralf Wimmer, Nils Jansen, Erika Ábrahám, Joost-Pieter Katoen, Bernd Becker: Minimal Counterexamples for Refuting omega-Regular Properties of Markov Decision Processes. AVACS Technical Report, issue 88, 2012.

Ralf Wimmer, Bernd Becker, Nils Jansen, Erika Ábrahám, Joost-Pieter Katoen: Minimal Critical Subsystems as Counterexamples for omega-Regular DTMC Properties. 2012, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, Verlag Dr. Kovac, pages 169-180.

Ralf Wimmer, Bernd Becker, Nils Jansen, Erika Ábrahám, Joost-Pieter Katoen: Minimal Critical Subsystems for Discrete-Time Markov Models. 2012, Int'l Conf.
on Tools and Algorithms for the Construction and Analysis of Systems, Springer-Verlag, volume 7214, pages 299-314. Abstract: We propose a new approach to compute counterexamples for violated ω-regular properties of discrete-time Markov chains and Markov decision processes. Whereas most approaches compute a set of system paths as a counterexample, we determine a critical subsystem that already violates the given property. In earlier work we introduced methods to compute such subsystems based on a search for shortest paths. In this paper we use SMT solvers and mixed integer linear programming to determine minimal critical subsystems.

Alexander Czutro, Matthias Sauer, Ilia Polian, Bernd Becker: Multi-Conditional ATPG using SAT with Preferences. 2012, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Alexander Czutro, Matthias Sauer, Ilia Polian, Bernd Becker: Multi-Conditional SAT-ATPG for Power-Droop Testing. 2012, IEEE European Test Symp. Abstract: Power droop is a non-trivial signal-integrity-related effect triggered by specific power-supply conditions. High-frequency and low-frequency power droop may lead to failure of an IC during application time, but they usually remain undetected by state-of-the-art manufacturing test methods, as the fault excitation imposes particular conditions on global switching activity over several time frames. Hence, ATPG for power-droop test (PD-ATPG) is an extremely hard problem that has not yet been solved optimally. In this paper, we use a SAT-based ATPG engine that employs a mechanism known as SAT-solving with qualitative preferences to generate a solution guaranteed to be optimal for a given set of optimisation criteria, however at the expense of high SAT-solving times.
Therefore, a well-balanced set of criteria has to be chosen for the SAT formulation in order to obtain solutions that are as good as possible without rendering the SAT instances impracticably hard. We explore several strategies and evaluate them experimentally.

Jie Jiang, Matthias Sauer, Alexander Czutro, Bernd Becker, Ilia Polian: On the Optimality of K Longest Path Generation Algorithm Under Memory Constraints. 2012, Conf. on Design, Automation and Test in Europe, pages 418-423. Abstract: Adequate coverage of small-delay defects in circuits affected by statistical process variations requires the identification and sensitization of multiple paths through potential defect sites. Existing K longest path generation (KLPG) algorithms use a data structure called a path store to prune the search space by restricting the number of sub-paths considered at the same time. While this restriction speeds up the KLPG process, the algorithms lose their optimality and do not guarantee that the K longest sensitizable paths are indeed found. We investigate, for the first time, the effects of missing some of the longest paths on the defect coverage. We systematically quantify how setting different limits on the path-store size affects the numbers and relative lengths of identified paths, as well as the run-times of the algorithm. We also introduce a new optimal KLPG algorithm that works iteratively and specifically addresses defect locations for which the path-store size limit has been exceeded in previous iterations. We compare this algorithm with a naïve KLPG approach that achieves optimality by setting the path-store size limit to a very large value. Extensive experiments are reported for 45nm-technology data.

Matthias Sauer, Alexander Czutro, Bernd Becker, Ilia Polian: On the Quality of Test Vectors for Post-Silicon Characterization. 2012, IEEE European Test Symp.
Abstract: Post-silicon validation, i.e., physical characterization of a small number of fabricated circuit instances before the start of high-volume manufacturing, has become an essential step in integrated circuit production. Post-silicon validation is required to identify intricate logic or electrical bugs which could not be found during pre-silicon verification. In addition, physical characterization is useful to determine the performance distribution of the manufactured circuit instances and to derive performance yield. Test vectors used for this step are subject to different requirements compared to vectors for simulation-based verification or for manufacturing test. In particular, they must sensitize a very comprehensive set of paths in the circuit, assuming massive variations and possible modeling deficiencies. An inadequate test vector set may result in overly optimistic yield estimates and wrong manufacturing decisions. On the other hand, the size of the test vector set is less important than in verification or manufacturing test. In this paper, we systematically investigate the relationship between the quality of the employed test vectors and the accuracy of yield-performance predictions. We use a highly efficient SAT-based algorithm to generate comprehensive test vector sets based on simple model assumptions and validate these test sets using simulated circuit instances which incorporate the effects of process variations. The obtained vector sets can also serve as a basis for adaptive manufacturing test.

Alexander Czutro, Matthias Sauer, Tobias Schubert, Ilia Polian, Bernd Becker: SAT-ATPG Using Preferences for Improved Detection of Complex Defect Mechanisms. 2012, VLSI Test Symp. Abstract: Failures caused by phenomena such as crosstalk or power-supply noise are gaining in importance in advanced nanoscale technologies.
The detection of such complex defects benefits from the satisfaction of certain constraints, for instance justifying specific transitions on neighbouring lines of the defect location. We present a SAT-based ATPG tool that supports the enhanced conditional multiple-stuck-at fault model (ECMS@). This model can specify multiple fault locations along with a set of hard conditions imposed on arbitrary lines; the hard conditions must hold in order for the fault effect to become active. Additionally, optimisation constraints that may be required for best coverage can be specified via a set of soft conditions. The introduced tool justifies as many of these conditions as possible, using a mechanism known as SAT with preferences. Several applications are discussed and evaluated with extensive experimental data. Furthermore, a novel fault-clustering technique is introduced, thanks to which the time required to classify all stuck-at faults in a suite of industrial benchmarks was reduced by up to 65%.

Linus Feiten, Manuel Bührer, Sebastian Sester, Bernd Becker: SMILE - SMARTPHONES IN LECTURES - Initiating a Smartphone-based Audience Response System as a Student Project. 2012, 4th International Conference on Computer Supported Education (CSEDU), pages 288-293. Abstract: Final results of the evaluation will be the subject of a later publication. Here we share first tendencies already observable as well as details of the conception and the ongoing development phases.

Matthias Sauer, Alexander Czutro, Ilia Polian, Bernd Becker: Small-Delay-Fault ATPG with Waveform Accuracy. 2012, Int'l Conf. on CAD, pages 30-36. Abstract: The detection of small-delay faults is traditionally performed by sensitizing transitions on a path of sufficient length from an input to an output of the circuit going through the fault site.
While this approach allows efficient test generation algorithms, it may result in both false positives and false negatives, i.e. undetected faults are classified as detected or detectable faults are classified as undetectable. We present an automatic test pattern generation algorithm which considers waveforms and their propagation on each relevant line of the circuit. The model incorporates individual delays for each gate and the filtering of small glitches. The algorithm is based on an optimized encoding of the test generation problem as a Boolean satisfiability (SAT) instance and is implemented in the tool WaveSAT. Experimental results for ISCAS-85, ITC-99 and industrial circuits show that no known definition of path sensitization can eliminate false positives and false negatives at the same time, thus resulting in inadequate small-delay fault detection. WaveSAT generates a test if the fault is testable and is also capable of automatically generating a formal redundancy proof for undetectable small-delay faults; to the best of our knowledge this is the first such algorithm that is both scalable and complete.

Nils Jansen, Erika Ábrahám, Barna Zajzon, Ralf Wimmer, Johann Schuster, Joost-Pieter Katoen, Bernd Becker: Symbolic Counterexample Generation for Discrete-time Markov Chains. 2012, Int'l Symp. on Formal Aspects of Component Software, Springer-Verlag, volume 7684, pages 134-151. Abstract: In this paper we investigate the generation of counterexamples for discrete-time Markov chains (DTMCs) and PCTL properties. Whereas most available methods use explicit representations for at least some intermediate results, our aim is to develop fully symbolic algorithms. As in most related work, our counterexample computations are based on path search. We first adapt bounded model checking as a path-search algorithm and extend it with a novel SAT-solving heuristic to prefer paths with higher probabilities.
As a second approach, we use symbolic graph algorithms to find counterexamples. Experiments show that our approaches, in contrast to other existing techniques, are applicable to very large systems with millions of states. Ralf WimmerSymbolische Methoden für die probabilistische Verifikation In : Ausgezeichnete Informatik-Dissertationen 2012, Gesellschaft für Informatik , Stefan Hölldobler et al., pages : 271 - 280, ISBN : 978-3-88579-416-5 Abstract … are clearly superior with respect to runtime and the size of the systems that can be handled. Nils Jansen, Erika Ábrahám, Matthias Volk, Ralf Wimmer, Joost-Pieter Katoen, Bernd BeckerThe COMICS Tool - Computing Minimal Counterexamples for DTMCs 2012 Automated Technology for Verification and Analysis , Springer-Verlag, volume : 7561, pages : 349 - 353 Abstract This paper presents the tool COMICS 1.0, which performs model checking and generates counterexamples for DTMCs. For an input DTMC, COMICS computes an abstract system that carries the model checking information and uses this result to compute a critical subsystem, which induces a counterexample. This abstract subsystem can be refined and concretized hierarchically. The tool comes with a command line version as well as a graphical user interface that allows the user to interactively influence the refinement process of the counterexample. Nils Jansen, Erika Ábrahám, Maik Scheffler, Matthias Volk, Andreas Vorpahl, Ralf Wimmer, Joost-Pieter Katoen, Bernd BeckerThe COMICS Tool - Computing Minimal Counterexamples for DTMCs , volume : arxiv:1206.0603, 2012 Ernst Moritz Hahn, Holger Hermanns, Ralf Wimmer, Bernd BeckerTransient Reward Approximation for Grids, Crowds, and Viruses arXiv , volume : arxiv:1212.1251, 2012 Paolo Marin, Christian Miller, Matthew Lewis, Bernd BeckerVerification of Partial Designs Using Incremental QBF Solving 2012 Conf.
on Design, Automation and Test in Europe , pages : 623 - 628 Abstract SAT solving is an indispensable core component of numerous formal verification tools and has found widespread use in industry, in particular when using it in an incremental fashion, e.g. in Bounded Model Checking (BMC). On the other hand, there are applications, in particular in the area of partial design verification, where SAT formulas are not expressive enough and a description via Quantified Boolean Formulas (QBF) is much more adequate. In this paper we introduce incremental QBF solving and thereby make it usable as a core component of BMC. To do so, we realized an incremental version of the state-of-the-art QBF solver QuBE, allowing for the reuse of learnt information, e.g. in the form of conflict clauses and solution cubes. As an application we consider BMC for partial designs (i.e. designs containing so-called blackboxes) and thereby disprove realizability, that is, we prove that an unsafe state is reachable no matter how the blackboxes are implemented. In our experimental analysis, we compare different incremental approaches implemented in our BMC tool. BMC with incremental QBF turns out to be feasible for designs with more than 21,000 gates and 2,700 latches. Significant performance gains over non-incremental QBF-based BMC can be obtained on many benchmark circuits, in particular when using the so-called backward-incremental approach combined with incremental preprocessing.
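Several of the abstracts above rest on the same BMC idea: unfold the transition relation k steps and ask a solver whether some path reaches an unsafe state. A minimal illustrative sketch, using a hypothetical toy counter system and plain enumeration standing in for the SAT/QBF solver (all names are invented for illustration):

```python
# Toy transition system: a 3-bit counter that wraps around; state 6 is
# "unsafe". A real BMC tool would encode the k-step unfolding as a
# SAT/QBF formula; here brute-force simulation stands in for the solver.
def step(s):
    return (s + 1) % 8

def bmc_unsafe_reachable(init_states, unsafe, k):
    """Unfold up to k steps; return the counterexample length, or None."""
    for s0 in init_states:
        s = s0
        for depth in range(k + 1):
            if s == unsafe:
                return depth  # a path of this length violates safety
            s = step(s)
    return None

print(bmc_unsafe_reachable([0], 6, 10))  # counterexample of length 6
print(bmc_unsafe_reachable([0], 6, 3))   # no counterexample within bound 3
```

Increasing the bound k until a counterexample appears (or a completeness argument applies) mirrors the incremental solving the papers describe.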
Scholl P, Van Laerhoven K, Gordon D, Scholz M, Berning MjNode: A sensor network platform that supports distributed inertial kinematic monitoring 2012 9th International Conference on Networked Sensing Systems, INSS 2012 - Conference Proceedings Abstract Because of the intrinsic advantages of wireless inertial motion tracking, standalone devices that integrate inertial motion units with wireless networking capabilities have gained much interest in recent years. Several platforms, both commercially available and academic, have been proposed to balance the challenges of a small form-factor, power consumption, accuracy and processing speed. Applications include ambulatory monitoring to support healthcare, sport activity analysis, recognizing human group behaviour, navigation support for humans, robots and unmanned vehicles, but also structural monitoring of large buildings. This paper provides an analysis of the current state-of-the-art platforms in wireless inertial motion tracking and presents a novel open-source and open-hardware hybrid tracking platform that is extensible, low-power, flexible enough to be used for both short- and long-term monitoring, and based on a firmware that allows it to be easily adapted after being deployed. © 2012 IEEE. Celia Kändler, Linus Feiten, Katrin Weber, Michael Wiedmann, Manuel Bührer, Sebastian Sester, Bernd Beckeruniversity learning environment: a classroom response system 2012 International Conference of the Learning Sciences (ICLS) , International Society of the Learning Sciences, volume : 2, pages : 515 - 516 Abstract We describe and evaluate a tool using smartphones in a university learning environment. Mass lectures often lack interactivity. Due to missing feedback, the lecture’s content may not be adapted to the students' understanding. The tool consists of two modules, a comprehension level visualization and multiple-choice questions, that are hypothesized to increase interactivity and adaptation to students' understanding. We report data from a formative evaluation of the tool's first implementation in a university lecture. back to the year overview Jie Jiang, Matthias Sauer, Alexander Czutro, Bernd Becker, Ilia PolianOn the Optimality of K Longest Path Generation 2011 Workshop on RTL and High Level Testing Matthias Sauer, Alexander Czutro, Ilia Polian, Bernd BeckerEstimation of Component Criticality in Early Design Steps 2011 IEEE Int'l Online Testing Symp.
, pages : 104 - 110 Christian Miller, Christoph Scholl, Bernd BeckerVerifying Incomplete Networks of Timed Automata 2011 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” , volume : 14 Abstract Verification of real-time systems - e.g. communication protocols or embedded controllers - is an important task. One method to detect errors is called bounded model checking (BMC). In BMC a system is iteratively unfolded and then transformed into a satisfiability problem. If an appropriate solver finds the k-th instance to be satisfiable, a counterexample for a given safety property has been found. In this paper, we present a novel approach to apply BMC to networks of timed automata (i.e. a system of several interacting subautomata) where parts of the network are unspecified (so-called blackboxes). Here, we would like to answer the question of unrealizability, that is, is there a path of a certain length violating a safety property regardless of the implementation of the blackboxes. Focusing on two prevalent communication models for networks of interacting timed automata, we introduce two methods to solve these types of BMC problems. The first is based on fixed transitions, which in some cases is too coarse an abstraction. In the second approach we prove unrealizability by introducing universal quantification, yielding more advanced quantified SAT-Modulo-Theory formulas. Matthias Sauer, Victor Tomashevich, Jörg Müller, Matthew Lewis, Ilia Polian, Bernd Becker, Wolfram BurgardAn FPGA-Based Framework for Run-time Injection and Analysis of Soft Errors in Microprocessors 2011 IEEE Int'l Online Testing Symp. , pages : 182 - 185 Abstract State-of-the-art cyber-physical systems are increasingly deployed in harsh environments with non-negligible soft error rates, such as aviation or search-and-rescue missions.
State-of-the-art nanoscale manufacturing technologies are more vulnerable to soft errors. In this paper, we present an FPGA-based framework for injecting soft errors into user-specified memory elements of an entire microprocessor (MIPS32) running application software. While the framework is applicable to arbitrary software, we demonstrate its usage by characterizing soft-error effects on several software filters used in aviation for probabilistic sensor data fusion. Pepijn Crouzen, Ernst Moritz Hahn, Holger Hermanns, Abhishek Dhama, Oliver Theel, Ralf Wimmer, Bettina Braitling, Bernd BeckerBounded Fairness for Probabilistic Distributed Algorithms 2011 Int'l Conf. on Application of Concurrency to System Design , IEEE Computer Society, pages : 89 - 97 Abstract This paper investigates quantitative dependability metrics for distributed algorithms operating in the presence of sporadic or frequently occurring faults. In particular, we investigate necessary revisions of traditional fairness assumptions in order to arrive at useful metrics, without adding hidden assumptions that may obfuscate their validity. We formulate faulty distributed algorithms as Markov decision processes to incorporate both probabilistic faults and non-determinism arising from concurrent execution. We lift the notion of bounded fairness to the setting of Markov decision processes. Bounded fairness is particularly suited for distributed algorithms running on nearly symmetric infrastructure, as it is common for sensor network applications. Finally, we apply this fairness notion in the quantitative model-checking of several case studies. Christian Miller, Karina Gitina, Bernd BeckerBounded Model Checking of Incomplete Real-time Systems Using Quantified SMT Formulas 2011 Int'l Workshop on Microprocessor Test and Verification , pages : 22 - 27 Abstract Verification of real-time systems - e.g.
communication protocols or embedded controllers - is an important task. One method to detect errors is bounded model checking (BMC) where the system is iteratively unfolded and transformed into a satisfiability problem. In this paper, we apply BMC to incomplete networks of timed automata, i.e. systems containing so-called black boxes. In this context, we answer the question of unrealizability, that is, is a safety property violated regardless of the implementation of the black boxes. We focus on problems where the error path depends on the behavior of the black boxes such that an encoding using universal quantification is needed. This yields more advanced quantified SMT formulas, which can no longer be solved by conventional SMT solvers. Taking several prevalent communication models into account, we show how to encode the unrealizability problem and present several procedures to solve the resulting quantified SMT formulas efficiently using And-Inverter-Graphs extended by linear constraints over the reals. Finally, we give experimental results to show the applicability of our verification methods. Bettina Braitling, Ralf Wimmer, Bernd Becker, Nils Jansen, Erika ÁbrahámCounterexample Generation for Markov Chains using SMT-based Bounded Model Checking 2011 IFIP Int'l Conf. on Formal Methods for Open Object-based Distributed Systems , Springer-Verlag, volume : 6722, pages : 75 - 89 Abstract Generation of counterexamples is a highly important task in the model checking process. In contrast to, e.g., digital circuits where counterexamples typically consist of a single path leading to a critical state of the system, in the probabilistic setting counterexamples may consist of a large number of paths.
In order to be able to handle large systems and to use the capabilities of modern SAT-solvers, bounded model checking (BMC) for discrete-time Markov chains was established. In this paper we introduce the usage of SMT-solving over linear real arithmetic for the BMC procedure. SMT-solving, which extends SAT with theories, in this context on the one hand provides a convenient way to express conditions on the probability of certain paths and on the other hand allows us to handle Markov reward models. We use the former to find paths with high probability first. This leads to more compact counterexamples. We report on some experiments, which show promising results. Stefan Kupferschmid, Bernd BeckerCraig interpolation in the presence of non-linear constraints 2011 Formal Modeling and Analysis of Timed Systems , Springer, pages : 240 - 255 Abstract An increasing number of applications, in particular in the verification area, leverages Craig interpolation. Craig interpolants (CIs) can be computed for many different theories such as propositional logic, linear inequalities over the reals, and the combination of the preceding theories with uninterpreted function symbols. To the best of our knowledge all previous tools that provide CIs address decidable theories. With this paper we make Craig interpolation available for an in general undecidable theory that contains Boolean combinations of linear and non-linear constraints, including transcendental functions like sin(·) and cos(·). Such formulae arise e.g. during the verification of hybrid systems. We show how the construction rules for CIs can be extended to handle non-linear constraints. To do so, an existing SMT solver based on a close integration of SAT and Interval Constraint Propagation is enhanced to construct CIs on the basis of proof trees.
We provide first experimental results demonstrating the usefulness of our approach: with the help of Craig interpolation we succeed in proving safety in cases where the basic solver could not provide a complete answer. Furthermore, we point out the (heuristic) decisions we made to obtain suitable CIs and discuss further possibilities to increase the flexibility of the CI construction. Stefan Kupferschmid, Bernd BeckerCraigsche Interpolation für Boolesche Kombinationen linearer und nichtlinearer Ungleichungen 2011 OFFIS-Institut für Informatik, pages : 279 - 288 Abstract This work addresses the computation of Craig interpolants for arbitrary Boolean combinations of linear and non-linear equations, including transcendental functions such as sin(x) and cos(x). Craig interpolants are widely used in computer-aided verification and can be computed for many different theories, for example propositional logic, linear arithmetic over the rationals, and the theory of uninterpreted function symbols. To the best of our knowledge, all known interpolation tools compute Craig interpolants for decidable theories. In this work we present an approach that can compute Craig interpolants for a theory that is in general undecidable. Besides linear and non-linear equations, we also allow transcendental functions. Such formulae arise in the analysis and verification of hybrid systems. The approach presented here was implemented using the SAT-Modulo-Theories solver iSAT. First results show that Craig interpolants can be employed successfully in the verification of such systems.
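The defining properties of a Craig interpolant used in the two interpolation entries above can be checked by brute force on a tiny propositional example. This is purely illustrative; the papers construct interpolants from solver proof trees, and the formulas here are invented:

```python
from itertools import product

# For an unsatisfiable conjunction A ∧ B, a Craig interpolant I satisfies:
#   (1) A implies I,
#   (2) I ∧ B is unsatisfiable,
#   (3) I mentions only variables shared by A and B.
def holds_for_all(vars_, pred):
    """Check a predicate over every Boolean assignment to vars_."""
    return all(pred(dict(zip(vars_, bits)))
               for bits in product([False, True], repeat=len(vars_)))

A = lambda v: v["a"] and v["x"]        # over variables {a, x}
B = lambda v: (not v["x"]) and v["b"]  # over variables {x, b}
I = lambda v: v["x"]                   # uses only the shared variable x

implication = holds_for_all(["a", "x"], lambda v: (not A(v)) or I(v))
conflict = holds_for_all(["x", "b"], lambda v: not (I(v) and B(v)))
print(implication and conflict)  # True: I = x is a valid interpolant
```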
Matthias Sauer, Jie Jiang, Alexander Czutro, Ilia Polian, Bernd BeckerEfficient SAT-Based Search for Longest Sensitisable Paths 2011 20th Asian Test Symposium (ATS) , pages : 108 - 113 Abstract We present a versatile method that enumerates all or a user-specified number of longest sensitisable paths in the whole circuit or through specific components. The path information can be used for design and test of circuits affected by statistical process variations. The algorithm encodes all aspects of the path search as an instance of the Boolean Satisfiability Problem (SAT), which allows the method not only to benefit from recent advances in SAT-solving technology, but also to avoid some of the drawbacks of previous structural approaches. Experimental results for academic and industrial benchmark circuits demonstrate the method's accuracy and scalability. Nils Jansen, Erika Ábrahám, Jens Katelaan, Ralf Wimmer, Joost-Pieter Katoen, Bernd BeckerHierarchical Counterexamples for Discrete-Time Markov Chains 2011 Int'l Symp. on Automated Technology for Verification and Analysis , Springer-Verlag, volume : 6996, pages : 443 - 452 Abstract This paper introduces a novel counterexample generation approach for the verification of discrete-time Markov chains (DTMCs) with two main advantages: (1) We generate abstract counterexamples which can be refined in a hierarchical manner. (2) We aim at minimizing the number of states involved in the counterexamples, and compute a critical subsystem of the DTMC whose paths form a counterexample. Experiments show that with our approach we can reduce the size of counterexamples and the number of computation steps by several orders of magnitude.
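The DTMC counterexample work above builds on reachability probabilities, which satisfy a simple fixed-point equation: p(s) = Σ_t P(s,t)·p(t) with p(target) = 1. A toy sketch with a hypothetical five-state chain, using plain fixed-point iteration in place of a model checker:

```python
# Hypothetical DTMC: transition probabilities per state. State 3 is the
# target (absorbing), state 4 an absorbing sink. We compute, for each
# state, the probability of eventually reaching the target.
P = {
    0: {1: 0.5, 2: 0.5},
    1: {3: 1.0},
    2: {0: 0.2, 4: 0.8},
    3: {3: 1.0},  # target
    4: {4: 1.0},  # sink
}
target = 3

# Fixed-point iteration on p(s) = sum_t P(s,t) * p(t), p(target) = 1.
p = {s: 1.0 if s == target else 0.0 for s in P}
for _ in range(200):
    p = {s: 1.0 if s == target else
         sum(pr * p[t] for t, pr in P[s].items()) for s in P}

print(round(p[0], 4))  # 0.5556, i.e. 5/9
```

A counterexample for a property like P < 0.5 [F target] would then bundle concrete paths into state 3 whose probability mass exceeds 0.5; the papers' critical subsystems keep exactly the states those paths need.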
Nils Jansen, Erika Ábrahám, Jens Katelaan, Ralf Wimmer, Joost-Pieter Katoen, Bernd BeckerHierarchical Counterexamples for Discrete-Time Markov Chains , volume : AIB-2011-11, 2011 Stefan Kupferschmid, Matthew Lewis, Tobias Schubert, Bernd BeckerIncremental preprocessing methods for use in BMC 2011 Formal Methods in System Design , volume : 39, pages : 185 - 204 Abstract Traditional incremental SAT solvers have achieved great success in the domain of Bounded Model Checking (BMC). However, modern solvers depend on advanced preprocessing procedures to obtain high levels of performance. Unfortunately, many preprocessing techniques such as variable and (blocked) clause elimination cannot be directly used in an incremental manner. This work focuses on extending these techniques and Craig interpolation so that they can be used effectively together in incremental SAT solving (in the context of BMC). The techniques introduced here doubled the performance of our BMC solver on both SAT and UNSAT problems. For UNSAT problems, preprocessing had the added advantage that Craig interpolation was able to find the fixed point sooner, reducing the number of incremental SAT iterations. Furthermore, our ideas seem to perform better as the benchmarks become larger, and/or deeper, which is exactly when they are needed. Lastly, our methods can be integrated into other SAT based BMC tools to achieve similar speedups. Sven Reimer, Florian Pigorsch, Christoph Scholl, Bernd BeckerIntegration of Orthogonal QBF Solving Techniques 2011 Conf. on Design, Automation and Test in Europe , pages : 149 - 154 Abstract In this paper we present a method for integrating two complementary solving techniques for QBF formulas, i.e. variable elimination based on an AIG framework and DPLL-based search. We develop a sophisticated mechanism for coupling these techniques, enabling the transfer of partial results from the variable elimination part to the search part.
This includes the definition of heuristics to (1) determine appropriate points in time to snapshot the current partial result during variable elimination (by estimating its quality) and (2) switch from variable elimination to search-based methods (applied to the best known snapshot) when the progress of variable elimination appears too slow or when representation sizes grow too fast. We will show in the experimental section that our combined approach is clearly superior to both individual methods run in a stand-alone manner. Moreover, our combined approach significantly outperforms all other state-of-the-art solvers. Ernst Althaus, Bernd Becker, Daniel Dumitriu, Stefan KupferschmidIntegration of an LP solver into interval constraint propagation 2011 Int'l Conf. on Combinatorial optimization and applications , Springer, pages : 343 - 356 Abstract This paper describes the integration of an LP solver into iSAT, a Satisfiability Modulo Theories solver that can solve Boolean combinations of linear and nonlinear constraints. iSAT is a tight integration of the well-known DPLL algorithm and interval constraint propagation allowing it to reason about linear and nonlinear constraints. As interval arithmetic is known to be less efficient at solving linear programs, we will demonstrate how the integration of an LP solver can improve the overall solving performance of iSAT.
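The interval constraint propagation that iSAT couples with DPLL can be illustrated on a single constraint: each variable's interval is narrowed using interval arithmetic until a fixed point is reached. A minimal sketch for the constraint z = x + y, using invented intervals and not iSAT's actual implementation:

```python
# Interval constraint propagation (ICP) sketch for the constraint
# z = x + y: contract each interval using the other two, repeat to a
# fixed point. Intervals are (lo, hi) pairs.
def contract_sum(x, y, z):
    z = (max(z[0], x[0] + y[0]), min(z[1], x[1] + y[1]))  # z ⊆ x + y
    x = (max(x[0], z[0] - y[1]), min(x[1], z[1] - y[0]))  # x ⊆ z - y
    y = (max(y[0], z[0] - x[1]), min(y[1], z[1] - x[0]))  # y ⊆ z - x
    return x, y, z

x, y, z = (0.0, 10.0), (2.0, 3.0), (4.0, 5.0)
for _ in range(10):  # iterate until nothing shrinks any further
    x, y, z = contract_sum(x, y, z)

print(x, y, z)  # x narrows from [0, 10] to [1, 3]
```

An empty interval (lo > hi) would signal a conflict, which is what lets the DPLL layer prune its search.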
Kortuem Gerd, Kawsar Fahim, Scholl Philipp, Beigl Michael, Simeone Adalberto, Smith KevinMiniaturized Display Network for Situated Glyphs 2011 Demo Paper at The Ninth International Conference on Pervasive Computing Matthew Lewis, Paolo Marin, Tobias Schubert, Massimo Narizzano, Bernd Becker, Enrico GiunchigliaParallel QBF Solving with Advanced Knowledge Sharing 2011 Fundamenta Informaticae , volume : 107, issue : 2-3, pages : 139 - 166 Abstract In this paper we present the parallel QBF Solver PaQuBE. This new solver leverages the additional computational power that can be exploited from modern computer architectures, from pervasive multi-core boxes to clusters and grids, to solve more relevant instances faster than previous generation solvers. Furthermore, PaQuBE's progressive MPI based parallel framework is the first to support advanced knowledge sharing in which solution cubes as well as conflict clauses can be exchanged between solvers. Knowledge sharing plays a critical role in the performance of PaQuBE. However, due to the overhead associated with sending and receiving MPI messages, and the restricted communication/network bandwidth available between solvers, it is essential to optimize not only what information is shared, but the way in which it is shared. In this context, we compare multiple conflict clause and solution cube sharing strategies, and finally show that an adaptive method provides the best overall results.
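The QBF semantics that solvers like PaQuBE explore can be stated in a few lines of brute-force evaluation. This sketch is illustrative only and bears no relation to the solver's actual search, learning, or sharing machinery:

```python
# Minimal recursive QBF evaluation: the prefix is a list of
# ('forall' | 'exists', variable) pairs, the matrix a predicate over an
# assignment dict. Brute force stands in for a real QBF solver.
def eval_qbf(prefix, matrix, env=None):
    env = env or {}
    if not prefix:
        return matrix(env)
    quant, var = prefix[0]
    results = (eval_qbf(prefix[1:], matrix, {**env, var: val})
               for val in (False, True))
    return all(results) if quant == 'forall' else any(results)

# Quantifier order matters: ∀x ∃y. (x ↔ y) holds, ∃y ∀x. (x ↔ y) does not.
matrix = lambda env: env['x'] == env['y']
print(eval_qbf([('forall', 'x'), ('exists', 'y')], matrix))  # True
print(eval_qbf([('exists', 'y'), ('forall', 'x')], matrix))  # False
```

The exponential blow-up of this naive recursion is exactly what conflict clauses and solution cubes, shared between parallel solvers in PaQuBE, help to prune.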
Erika Ábrahám, Tobias Schubert, Bernd Becker, Martin Fränzle, Christian HerdeParallel SAT Solving in Bounded Model Checking 2011 Journal of Logic and Computation , volume : 21, issue : 1, pages : 5 - 21 Abstract Bounded model checking (BMC) is an incremental refutation technique to search for counterexamples of increasing length. The existence of a counterexample of a fixed length is expressed by a first-order logic formula that is checked for satisfiability using a suitable solver. We apply communicating parallel solvers to check satisfiability of the BMC formulae. In contrast to other parallel solving techniques, our method does not parallelize the satisfiability check of a single formula, but the parallel solvers work on formulae for different counterexample lengths. We adapt the method of constraint sharing and replication of Shtrichman, originally developed for sequential BMC, to the parallel setting. Since the learning mechanism is now parallelized, it is not obvious whether there is a benefit from the concepts of Shtrichman in the parallel setting. We demonstrate on a number of benchmarks that adequate communication between the parallel solvers yields the desired results. Stefan Kupferschmid, Bernd Becker, Tino Teige, Martin FränzleProof Certificates and Non-linear Arithmetic Constraints 2011 IEEE Design and Diagnostics of Electronic Circuits and Systems , pages : 429 - 434 Abstract Symbolic methods in computer-aided verification rely heavily on constraint solvers. The correctness and reliability of these solvers are of vital importance in the analysis of safety-critical systems, e.g., in the automotive context. Satisfiability results of a solver can usually be checked by probing the computed solution. This is in general not the case for unsatisfiability results.
In this paper, we propose a certification method for unsatisfiability results for mixed Boolean and non-linear arithmetic constraint formulae. Such formulae arise in the analysis of hybrid discrete/continuous systems. Furthermore, we test our approach by enhancing the iSAT constraint solver to generate unsatisfiability proofs, and implement a tool that can efficiently validate such proofs. Finally, some experimental results showing the effectiveness of our techniques are given. Ralf Wimmer, Ernst Moritz Hahn, Holger Hermanns, Bernd BeckerReachability Analysis for Incomplete Networks of Markov Decision Processes 2011 Int'l Conf. on Formal Methods and Models for Co-Design , IEEE Computer Society Press, pages : 151 - 160 Abstract Assume we have a network of discrete-time Markov decision processes (MDPs) which synchronize via common actions. We investigate how to compute probability measures in case the structure of some of the component MDPs (so-called blackbox MDPs) is not known. We then extend this computation to work on networks of MDPs that share integer data variables of finite domain. We use a protocol which spreads information within a network as a case study to show the feasibility and effectiveness of our approach. Matthew LewisSAT, QBF, and Multicore Processors Der Andere Verlag, 2011 Matthias Sauer, Alexander Czutro, Tobias Schubert, Stefan Hillebrecht, Ilia Polian, Bernd BeckerSAT-Based Analysis of Sensitisable Paths 2011 IEEE Design and Diagnostics of Electronic Circuits and Systems , pages : 93 - 98 Abstract Manufacturing defects in nanoscale technologies have highly complex timing behaviour that is also affected by process variations.
While conventional wisdom suggests that it is optimal to detect a delay defect through the longest sensitisable path, non-trivial defect behaviour along with modelling inaccuracies necessitates consideration of paths of well-controlled length during test generation. We present a generic methodology that yields tests through all sensitisable paths of user-specified length. The resulting tests can be employed within the framework of adaptive testing. The methodology is based on encoding the problem as a Boolean-satisfiability (SAT) instance and thereby leverages recent advances in SAT-solving technology. Bettina Braitling, Ralf Wimmer, Bernd Becker, Nils Jansen, Erika ÁbrahámSMT-based Counterexample Generation for Markov Chains 2011 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” , Offis Oldenburg, volume : 14, pages : 19 - 28 Abstract Counterexamples are a highly important feature of the model checking process. In contrast to, e.g., digital circuits where counterexamples typically consist of a single path leading to a critical state of the system, in the probabilistic setting counterexamples may consist of a large number of paths. In order to be able to handle large systems and to use the capabilities of modern SAT-solvers, bounded model checking (BMC) for discrete-time Markov chains was established. In this paper we introduce the usage of SMT-solving over the reals for the BMC procedure. SMT-solving, which extends SAT with theories, in this context on the one hand provides a convenient way to express conditions on the probability of certain paths and on the other hand allows us to handle Markov reward models. The former can be used to find paths with high probability first. This leads to more compact counterexamples. We report on some experiments, which show promising results.
Andreas Eggers, Evgeny Kruglov, Stefan Kupferschmid, Karsten Scheibler, Tino Teige, Christoph WeidenbachSuperposition modulo non-linear arithmetic 2011 Int'l Symp. on Frontiers of Combining Systems , Springer, pages : 119 - 134 Abstract The first-order theory over non-linear arithmetic including transcendental functions (NLA) is undecidable. Nevertheless, in this paper we show that a particular combination with superposition leads to a sound and complete calculus that is useful in practice. We follow basically the ideas of the SUP(LA) combination, but have to take care of undecidability, resulting in "unknown" answers by the NLA reasoning procedure. A pipeline of NLA constraint simplification techniques related to the SUP(NLA) framework significantly decreases the number of "unknown" answers. The resulting approach is implemented as SUP(NLA) by a system combination of SPASS and iSAT. Applied to various scenarios of traffic collision avoidance protocols, we show by experiments that SPASS(iSAT) can fully automatically prove and disprove safety properties of such protocols using the very same formalization. back to the year overview Christian Miller, Karina Gitina, Christoph Scholl, Bernd BeckerBounded Model Checking of Incomplete Networks of Timed Automata 2010 Int'l Workshop on Microprocessor Test and Verification , IEEE Computer Society, volume : 11, pages : 61 - 66 Abstract Verification of real-time systems - e.g. communication protocols or embedded controllers - is an important task. One method to detect errors is called bounded model checking (BMC). In BMC the system is iteratively unfolded and then transformed into a satisfiability problem. If an appropriate solver finds the k-th instance to be satisfiable, a counterexample for a given safety property has been found.
In this paper we present a first approach to apply BMC to networks of timed automata (that is, a system of several interacting subautomata) where parts of the network are unspecified (so-called blackboxes). Here, we would like to answer the question of unrealizability, that is, is there a path of a certain length violating a safety property regardless of the implementation of the blackboxes. We provide solutions to this problem for two timed automata communication models. For the simple synchronization model, a BMC approach based on fixed transitions is introduced, resulting in a SAT-Modulo-Theory formula. With respect to the use of bounded integer variables for communication, we prove unrealizability by introducing universal quantification, yielding more advanced quantified SAT-Modulo-Theory formulas. Stefan Kupferschmid, Matthew Lewis, Tobias Schubert, Bernd BeckerIncremental Preprocessing Methods for use in BMC 2010 Int'l Workshop on Hardware Verification Abstract Traditional incremental SAT solvers have achieved great success in the domain of Bounded Model Checking (BMC). However, modern solvers depend on advanced preprocessing procedures to obtain high levels of performance. Unfortunately, many preprocessing techniques such as variable and (blocked) clause elimination cannot be directly used in an incremental manner. This work focuses on extending these techniques and Craig interpolation so that they can be used effectively together in incremental SAT solving (in the context of BMC). The techniques introduced here doubled the performance of our BMC solver on both SAT and UNSAT problems. For UNSAT problems, preprocessing had the added advantage that Craig interpolation was able to find the fixed point sooner, reducing the number of incremental SAT iterations. Furthermore, our ideas seem to perform better as the benchmarks become larger, and/or deeper, which is exactly when they are needed.
Lastly, our methods can be extended to other SAT-based BMC tools to achieve similar speedups.
Christian Miller, Stefan Kupferschmid, Bernd Becker: Exploiting Craig Interpolants in Bounded Model Checking for Incomplete Designs. 2010, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, volume 13, pages 77-86.
Tobias Nopper, Christian Miller, Matthew Lewis, Bernd Becker, Christoph Scholl: SAT modulo BDD - A Combined Verification Approach for Incomplete Designs. 2010, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, volume 13, pages 107-116.
Marco Bozzano, Alessandro Cimatti, Joost-Pieter Katoen, Viet Yen Nguyen, Thomas Noll, Marco Roveri, Ralf Wimmer: A Model Checker for AADL. 2010, Int'l Conf. on CAV, Springer-Verlag, volume 6174, pages 562-565. Abstract: We present a graphical toolset for verifying AADL models, which are gaining widespread acceptance in the aerospace, automobile and avionics industries for comprehensively specifying safety-critical systems by capturing functional, probabilistic and hybrid aspects. Analyses are implemented on top of mature model checking tools and range from requirements validation to functional verification, safety assessment via automatic derivation of FMEA tables and dynamic fault trees, to performability evaluation and diagnosability analysis. The toolset is currently being applied to several case studies by a major industrial developer of aerospace systems.
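Several of the entries above share the same BMC core: unfold the system step by step and check each bounded instance for a counterexample. As a minimal illustration (not code from any of the cited tools), the loop can be sketched in Python, with explicit state enumeration standing in for the SAT/SMT solver and a hypothetical toy counter system:

```python
# Sketch of the bounded model checking (BMC) loop described above. A real
# BMC tool encodes the k-step unfolding as a SAT/SMT formula; here explicit
# state enumeration stands in for the solver, purely for illustration.

def bmc(init, transitions, is_bad, max_bound):
    """Search for a path of length <= max_bound that reaches a bad state."""
    frontier = {init}
    for k in range(max_bound + 1):
        if any(is_bad(s) for s in frontier):
            return k  # counterexample of length k found
        # unfold the system by one more step
        frontier = {t for s in frontier for t in transitions(s)}
    return None  # property holds up to the given bound

# Toy system (an assumption for this sketch): a counter modulo 8;
# the "bad" states are those where the counter equals 5.
result = bmc(0, lambda s: {(s + 1) % 8}, lambda s: s == 5, 10)
print(result)  # 5
```

The returned bound plays the role of the satisfiable k-th instance in the abstracts above: it is the length of the shortest counterexample found within the bound.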
Pepijn Crouzen, Ernst Moritz Hahn, Holger Hermanns, Abhishek Dhama, Oliver Theel, Ralf Wimmer, Bettina Braitling, Bernd Becker: Bounded Fairness for Probabilistic Distributed Algorithms. AVACS Technical Report, volume 57, 2010.
Paolo Marin, Enrico Giunchiglia, Massimo Narizzano: Conflict and Solution Driven Constraint Learning in QBF. 2010, Doctoral Programme CP 2010.
Ralf Wimmer, Bernd Becker: Correctness Issues of Symbolic Bisimulation Computation for Markov Chains. 2010, Int'l GI/ITG Conference on “Measurement, Modelling and Evaluation of Computing Systems”, Springer-Verlag, volume 5987, pages 287-301. Abstract: Bisimulation reduction is a classical means to fight the infamous state space explosion problem, which limits the applicability of automated methods for verification like model checking. A signature-based method, originally developed by Blom and Orzan for labeled transition systems and adapted for Markov chains by Derisavi, has proved to be very efficient. It is possible to implement it symbolically using binary decision diagrams such that it is able to handle very large state spaces efficiently. We show, however, that for Markov chains this algorithm suffers from numerical instabilities, which often result in too large quotient systems. We present and experimentally evaluate two different approaches to avoid these problems: first, the use of rational arithmetic, and second, an approach that represents not only the system structure but also the transition rates symbolically. In addition, this allows us to modify their actual values after the quotient computation.
Erika Ábrahám, Nils Jansen, Ralf Wimmer, Joost-Pieter Katoen, Bernd Becker: DTMC Model Checking by SCC Reduction. 2010, Int'l Conf.
on Quantitative Evaluation of Systems, IEEE Computer Society, pages 37-46. Abstract: Discrete-Time Markov Chains (DTMCs) are a widely used formalism to model probabilistic systems. On the one hand, available tools like PRISM or MRMC offer efficient model checking algorithms and thus support the verification of DTMCs. However, these algorithms do not provide any diagnostic information in the form of counterexamples, which are highly important for the correction of erroneous systems. On the other hand, there exist several approaches to generate counterexamples for DTMCs, but all these approaches require the model checking result for completeness. In this paper we introduce a model checking algorithm for DTMCs that also supports the generation of counterexamples. Our algorithm, based on the detection and abstraction of strongly connected components, offers abstract counterexamples, which can be interactively refined by the user.
Christian Miller, Stefan Kupferschmid, Matthew Lewis, Bernd Becker: Encoding Techniques, Craig Interpolants and Bounded Model Checking for Incomplete Designs. 2010, Theory and Applications of Satisfiability Testing, Springer, pages 194-208. Abstract: This paper focuses on bounded invariant checking for partially specified circuits (designs containing so-called blackboxes) using the well-known 01X- and QBF-encoding techniques. For detecting counterexamples, modeling the behavior of a blackbox using 01X-encoding is fast, but rather coarse, as it limits what problems can be verified. We introduce the idea of 01X-hardness, namely the classification of problems for which this encoding technique does not provide any useful information about the existence of a counterexample.
Furthermore, we provide a proof of 01X-hardness based on Craig interpolation, and show how the information contained within the Craig interpolant or unsat core can be used to heuristically determine which blackbox outputs to model in a more precise way. We then compare 01X, QBF and multiple hybrid modeling methods. Finally, our complete workflow, combined with multiple state-of-the-art QBF solvers, is shown to perform well on a range of industrial blackbox circuit problems.
Natalia Kalinnik, Erika Ábrahám, Tobias Schubert, Ralf Wimmer, Bernd Becker: Exploiting Different Strategies for the Parallelization of an SMT Solver. 2010, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, Fraunhofer Verlag, pages 97-106. Abstract: In this paper we present two different parallelization schemes for the SMT solver iSAT, based on (1) the distribution of work by dividing the search space into disjoint parts and exploring them in parallel, thereby exchanging learnt information, and (2) a portfolio approach, where the entire benchmark instance is explored in parallel by several copies of the same solver, each using different heuristics to guide the search. We also combine both approaches such that solvers examine disjoint parts of the search space using different heuristics. The main contribution of the paper is to study the performance of different techniques for parallelizing iSAT.
Ilia Polian, Bernd Becker: Fault Models and Test Algorithms for Nanoscale Technologies. 2010, it - Information Technology, volume 52, issue 4, pages 189-194. Abstract: In the age of Nanoscale Integration (NSI), state-of-the-art integrated circuits with gate lengths under 100 nm consist of hundreds of millions of transistors. This implies new challenges for their reliability. Novel NSI defect mechanisms require special test methods to sort out faulty chips.
We present modeling approaches and efficient test algorithms for fundamental NSI defect mechanisms, enabling the handling of industrial multi-million-gate circuits.
Enrico Giunchiglia, Paolo Marin, Massimo Narizzano: QuBE7.0, System Description. 2010, Journal of Satisfiability, volume 7, issue 8, pages 83-88. Abstract: In this paper we outline QuBE7's main features, describing first the options of the preprocessors, and then giving some details about how the core search-based solver (i) performs unit and pure literal propagation, and (ii) performs the "Conflict Analysis procedure" for non-chronological backtracking, generalized from the SAT to the QBF case. We conclude with the experimental evaluation, showing that QuBE7.0 is a state-of-the-art single-engine QBF solver.
Ralf Wimmer, Bettina Braitling, Bernd Becker, Ernst Moritz Hahn, Pepijn Crouzen, Holger Hermanns, Abhishek Dhama, Oliver Theel: Symblicit Calculation of Long-Run Averages for Concurrent Probabilistic Systems. 2010, Int'l Conf. on Quantitative Evaluation of Systems, IEEE Computer Society, pages 27-36. Abstract: Model checkers for concurrent probabilistic systems have become very popular within the last decade. The study of long-run average behavior has, however, received only scant attention in this area, at least from the implementation perspective. This paper studies the problem of how to efficiently realize an algorithm for computing optimal long-run average reward values for concurrent probabilistic systems. At its core is a variation of Howard and Veinott's algorithm for Markov decision processes, where symbolic and non-symbolic representations are intertwined in an effective manner: the state space is represented using binary decision diagrams, while the linear equation systems which have to be solved for the induced Markov chains to improve the current scheduler are solved using an explicit state representation.
In order to keep the latter small, we apply a symbolic bisimulation minimization algorithm to the induced Markov chain. The scheduler improvement step itself is again performed on symbolic data structures. Practical evidence shows that the implementation is effective, and sometimes uses considerably less memory than a fully explicit implementation.
Ralf Wimmer, Salem Derisavi, Holger Hermanns: Symbolic Partition Refinement with Automatic Balancing of Time and Space. 2010, Performance Evaluation, volume 67, issue 9, pages 815-835. Abstract: State space lumping is one of the classical means to fight the state space explosion problem in state-based performance evaluation and verification. Particularly when numerical algorithms are applied to analyze a Markov model, one often observes that those algorithms do not scale beyond systems of moderate size. To alleviate this problem, symbolic lumping algorithms have been devised to effectively reduce very large, but symbolically represented, Markov models to explicit representations of moderate size. This lumping step partitions the Markov model in such a way that any numerical analysis carried out on the lumped model is guaranteed to produce exact results for the original system. But even this lumping preprocessing may fail due to time or memory limitations. This paper discusses the two main approaches to symbolic lumping and combines them to improve on their respective limitations. The algorithm automatically converts between known symbolic partition representations in order to provide a trade-off between memory consumption and runtime. We show how to apply this algorithm to the lumping of Markov chains, but the same techniques can be adapted in a straightforward way to other models like Markov reward models, labeled transition systems, or interactive Markov chains.
Alexander Czutro, Ilia Polian, Matthew Lewis, Piet Engelke, Sudhakar M.
Reddy, Bernd Becker: Thread-Parallel Integrated Test Pattern Generator Utilizing Satisfiability Analysis. 2010, International Journal of Parallel Programming, volume 38, issue 3-4, pages 185-202. Abstract: Efficient utilization of the inherent parallelism of multi-core architectures is a grand challenge in the field of electronic design automation (EDA). One EDA algorithm associated with a high computational cost is automatic test pattern generation (ATPG). We present the ATPG tool TIGUAN based on a thread-parallel SAT solver. Due to a tight integration of the SAT engine into the ATPG algorithm and a carefully chosen mix of various optimization techniques, multi-million-gate industrial circuits are handled without aborts. TIGUAN supports both conventional single stuck-at faults and sophisticated conditional multiple stuck-at faults, which allows patterns to be generated for non-standard fault models. We demonstrate how TIGUAN can be combined with conventional structural ATPG to extract the full benefit of the intrinsic strengths of both approaches.
Fabian Hopsch, Bernd Becker, Sybille Hellebrand, Ilia Polian, Bernd Straube, Wolfgang Vermeiren, Hans-Joachim Wunderlich: Variation-Aware Fault Modeling. 2010, IEEE Asian Test Symp., pages 87-93. Abstract: To achieve high product quality for nano-scale systems, both realistic defect mechanisms and process variations must be taken into account. While existing approaches for variation-aware digital testing either restrict themselves to special classes of defects or assume given probability distributions to model variabilities, the proposed approach combines defect-oriented testing with statistical library characterization. It uses Monte Carlo simulations at the electrical level to extract delay distributions of cells in the presence of defects and for the defect-free case.
This allows distinguishing the effects of process variations on the cell delay from defect-induced cell delays under process variations. To provide a suitable interface for test algorithms at higher levels of abstraction, the distributions are represented as histograms and stored in a histogram database (HDB). Thus, the computationally expensive defect analysis needs to be performed only once as a preprocessing step for library characterization, and statistical test algorithms do not require any low-level information beyond the HDB. The generation of the HDB is demonstrated for primitive cells in 45nm technology.
Enrico Giunchiglia, Paolo Marin, Massimo Narizzano: sQueezeBF: An Effective Preprocessor for QBFs. 2010, Theory and Applications of Satisfiability Testing, Springer Verlag, volume 6175, pages 85-98. Abstract: In this paper we present sQueezeBF, an effective preprocessor for QBFs that combines various techniques for eliminating variables and/or redundant clauses. In particular, sQueezeBF combines (i) variable elimination via Q-resolution, (ii) variable elimination via equivalence substitution and (iii) equivalence breaking via equivalence rewriting. The experimental analysis shows that sQueezeBF can produce significant reductions in the number of clauses and/or variables (up to the point that some instances are solved directly by sQueezeBF) and that it can significantly improve the efficiency of a range of state-of-the-art QBF solvers (up to the point that some instances cannot be solved without sQueezeBF preprocessing).
Alexander Czutro, Ilia Polian, Piet Engelke, Sudhakar M. Reddy, Bernd Becker: Dynamic Compaction in SAT-Based ATPG. 2009, IEEE Asian Test Symp.
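The variable elimination that preprocessors like sQueezeBF perform has a simple propositional core: all clauses mentioning a variable are replaced by their resolvents on that variable (Q-resolution additionally respects the quantifier prefix, which is omitted here). A minimal sketch, with an illustrative clause representation that is not sQueezeBF's actual code:

```python
# Sketch of variable elimination by resolution, the propositional core of
# QBF/SAT preprocessing step (i) above. Clauses are frozensets of integer
# literals; a positive literal v resolves against its negation -v.

def eliminate(clauses, v):
    """Replace all clauses mentioning variable v by their resolvents on v."""
    pos = [c for c in clauses if v in c]
    neg = [c for c in clauses if -v in c]
    rest = [c for c in clauses if v not in c and -v not in c]
    resolvents = []
    for p in pos:
        for n in neg:
            r = (p - {v}) | (n - {-v})
            if not any(-lit in r for lit in r):  # drop tautological resolvents
                resolvents.append(r)
    return rest + resolvents

# Eliminating variable 1 from {(1 or 2), (-1 or 3)} yields {(2 or 3)}.
cnf = [frozenset({1, 2}), frozenset({-1, 3})]
print(eliminate(cnf, 1))  # [frozenset({2, 3})]
```

The resulting formula is equisatisfiable with the original; preprocessors apply such steps only when they shrink the clause set.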
Alexander Czutro, Bernd Becker, Ilia Polian: Performance Evaluation of SAT-Based ATPG on Multi-Core Architectures. 2009, Test Symposium.
Marc Hunger, Sybille Hellebrand, Alexander Czutro, Ilia Polian, Bernd Becker: Robustheitsanalyse stark fehlersicherer Schaltungen mit SAT-basierter Testmustererzeugung [robustness analysis of strongly fault-secure circuits using SAT-based test pattern generation]. 2009, GMM/ITG-Fachtagung “Zuverlässigkeit und Entwurf”.
Piet Engelke, Bernd Becker, Michel Renovell, Jürgen Schlöffel, Bettina Braitling, Ilia Polian: SUPERB: Simulator Utilizing Parallel Evaluation of Resistive Bridges. 2009, ACM Trans. on Design Automation of Electronic Systems, volume 14, issue 4, pages 56:1-56:21.
Natalia Kalinnik, Tobias Schubert, Erika Ábrahám, Ralf Wimmer, Bernd Becker: Picoso - A Parallel Interval Constraint Solver. 2009, Int'l Conf. on Parallel and Distributed Processing Techniques and Applications, CSREA Press, pages 473-479. Abstract: This paper describes the parallel interval constraint solver Picoso, which can decide (a subclass of) Boolean combinations of linear and non-linear constraints. Picoso follows a master/client model based on message passing, making it suitable for any kind of workstation cluster as well as for multi-processor machines. To run several clients in parallel, an efficient work stealing mechanism has been integrated, dividing the overall search space into disjoint parts. Additionally, to prevent the clients from running into identical conflicts, information about conflicts in the form of conflict clauses is exchanged among the clients. Performance measurements, using four clients to solve a number of benchmark problems, show that Picoso yields linear speedup compared to the sequential interval constraint solver iSAT, on which the clients of Picoso are based.
Marc Hunger, Sybille Hellebrand, Alejandro Czutro, Ilia Polian, Bernd Becker: ATPG-Based Grading of Strong Fault-Secureness. 2009, IEEE Int'l Online Testing Symp.
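The search-space splitting described in the Picoso abstract divides the space of assignments into disjoint parts by fixing "guiding" decisions, so that each client explores its own part. A toy sketch of the idea, with brute-force Boolean evaluation standing in for the real interval constraint solving (the formula and function names are illustrative only):

```python
# Sketch of disjoint search-space splitting as used by parallel solvers
# such as Picoso: fixing the leading variables partitions the assignment
# space, and each part can be handed to a separate client/worker.
from itertools import product

def solve_part(formula, n_vars, fixed):
    """Search only the sub-space where the leading variables are fixed."""
    free = n_vars - len(fixed)
    for tail in product([False, True], repeat=free):
        assignment = tuple(fixed) + tail
        if formula(assignment):
            return assignment  # a satisfying assignment in this part
    return None  # this part of the search space contains no solution

# Toy formula over 3 variables (an assumption for this sketch):
# x0 and (not x1) and x2.
f = lambda a: a[0] and not a[1] and a[2]

# Two disjoint parts: x0 fixed to False and x0 fixed to True.
parts = [solve_part(f, 3, [False]), solve_part(f, 3, [True])]
print(parts)  # [None, (True, False, True)]
```

In the real system the parts are explored concurrently and learnt conflict clauses are exchanged between clients, which this sequential sketch omits.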
Nicolas Houarche, Alejandro Czutro, Mariane Comte, Piet Engelke, Ilia Polian, Bernd Becker, Michel Renovell: An Electrical Model for the Fault Simulation of Small-Delay Faults Caused by Crosstalk Aggravated Resistive Short Defects. 2009, VLSI Test Symp.
Christoph Scholl, Stefan Disch, Florian Pigorsch, Stefan Kupferschmid: Computing Optimized Representations for Non-convex Polyhedra by Detection and Removal of Redundant Linear Constraints. 2009, Tools and Algorithms for the Construction and Analysis of Systems, Springer, volume 5505, pages 383-397. Abstract: We present a method which computes optimized representations for non-convex polyhedra. Our method detects so-called redundant linear constraints in these representations by using an incremental SMT (Satisfiability Modulo Theories) solver and then removes the redundant constraints based on Craig interpolation. The approach is motivated by applications in the context of model checking for Linear Hybrid Automata. Basically, it can be seen as an optimization method for formulas including arbitrary Boolean combinations of linear constraints and Boolean variables. We show that our method provides an essential step making quantifier elimination for linear arithmetic much more efficient. Experimental results clearly show the advantages of our approach in comparison to state-of-the-art solvers.
Nicolas Houarche, Alejandro Czutro, Mariane Comte, Piet Engelke, Ilia Polian, Bernd Becker, Michel Renovell: Deriving an Electrical Model for Delay Faults Caused by Crosstalk Aggravated Resistive Short Defects. 2009, Latin-American Test Workshop.
Alejandro Czutro, Bernd Becker, Ilia Polian: Performance Evaluation of SAT-Based Automatic Test Pattern Generation on Multi-Core Architectures. 2009, GI/ITG Int'l Conf.
on Architecture of Computing Systems, Many-Cores Workshop.
Christian Miller, Tobias Nopper, Christoph Scholl: Symbolic CTL Model Checking for Incomplete Designs by Selecting Property-Specific Subsets of Local Component Assumptions. 2009, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”. Abstract: Black Box symbolic model checking is a method to check whether an incompletely specified circuit, in which some parts of the design have been replaced by 'Black Boxes', satisfies a CTL property regardless of the actual replacement of the Black Boxes. One possible application is model checking with abstraction, where complex parts of the design (which are not really relevant for the property at hand) are put into Black Boxes in order to speed up the model checking run. However, due to its approximative nature, symbolic model checking for incomplete designs may be unable to provide proofs when the approximations are too coarse. In this paper we show how Black Box model checking can profit from information about the Black-Boxed parts given in the form of so-called local component assumptions. The assumptions we use in this context are safety properties arguing about fixed time windows. Furthermore, we present a set of heuristic-based approaches to automatically select a subset of local component assumptions that is sufficient to prove or disprove a given CTL property.
Alejandro Czutro, Bernd Becker, Ilia Polian: Performance Evaluation of SAT-Based ATPG on Multi-Core Architectures. 2009, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.
Alejandro Czutro, Ilia Polian, Matthew Lewis, Piet Engelke, Sudhakar M. Reddy, Bernd Becker: TIGUAN: Thread-parallel Integrated test pattern Generator Utilizing satisfiability ANalysis. 2009, Int'l Conf. on VLSI Design, pages 227-232.
V. Izosimov, Ilia Polian, P. Pop, P. Eles, Z.
Peng: Analysis and Optimization of Fault-Tolerant Embedded Systems with Hardened Processors. 2009, Conf. on Design, Automation and Test in Europe.
Andreas Eggers, Natalia Kalinnik, Stefan Kupferschmid, Tino Teige: Challenges in Constraint-Based Analysis of Hybrid Systems. 2009, Recent Advances in Constraints, Springer, volume 5655, pages 51-65. Abstract: In the analysis of hybrid discrete-continuous systems, rich arithmetic constraint formulae with complex Boolean structure arise naturally. The iSAT algorithm, a solver for such formulae, is aimed at bounded model checking of hybrid systems. In this paper, we identify challenges emerging from planned and ongoing work to enhance the iSAT algorithm. First, we propose an extension of iSAT to directly handle ordinary differential equations as constraints. Second, we outline the recently introduced generalization of the iSAT algorithm to deal with probabilistic hybrid systems and some open research issues in that context. Third, we present ideas on how to move from bounded to unbounded model checking by using the concept of interpolation. Finally, we discuss the adaptation of some parallelization techniques to the iSAT case, which will hopefully lead to performance gains in the future. By presenting these open research questions, this paper aims at fostering discussions on these extensions of constraint solving.
Paolo Marin, Matthew Lewis, Massimo Narizzano, Tobias Schubert, Enrico Giunchiglia, Bernd Becker: Comparison of Knowledge Sharing Strategies in a Parallel QBF Solver. 2009, High-Performance Computing and Simulation Conference, pages 161-167.
Eckard Böde, Marc Herbstritt, Holger Hermanns, Sven Johr, Thomas Peikenkamp, Reza Pulungan, Jan Rakow, Ralf Wimmer, Bernd Becker: Compositional Dependability Evaluation for Statemate. 2009, IEEE Trans.
on Software Engineering, volume 35, issue 2, pages 274-292. Abstract: Software and system dependability is getting ever more important in embedded system design. Current industrial practice of model-based analysis is supported by state-transition diagrammatic notations such as Statecharts. State-of-the-art modelling tools like Statemate support safety and failure-effect analysis at design time, but are restricted to qualitative properties. This paper reports on a (plug-in) extension of Statemate enabling the evaluation of quantitative dependability properties at design time. The extension is compositional in the way the model is augmented with probabilistic timing information. This fact is exploited in the construction of the underlying mathematical model, a uniform continuous-time Markov decision process, on which we are able to check requirements of the form: "The probability to hit a safety-critical system configuration within a mission time of 3 hours is at most 0.01." We give a detailed explanation of the construction and evaluation steps making this possible, and report on a nontrivial case study of a high-speed train signalling system where the tool has been applied successfully.
Ralf Wimmer, Bettina Braitling, Bernd Becker: Counterexample Generation for Discrete-Time Markov Chains Using Bounded Model Checking. 2009, Int'l Conf. on Verification, Model Checking and Abstract Interpretation, Springer-Verlag, volume 5403, pages 366-380.
Matthew Lewis, Tobias Schubert, Bernd Becker: DPLL-based Reasoning in a Multi-Core Environment. 2009, Int'l Workshop on Microprocessor Test and Verification.
Abhishek Dhama, Oliver Theel, Pepijn Crouzen, Holger Hermanns, Ralf Wimmer, Bernd Becker: Dependability Engineering of Silent Self-Stabilizing Systems. 2009, Int'l Symp.
on Stabilization, Safety, and Security of Distributed Systems, Springer-Verlag, volume 5873, pages 238-253. Abstract: Self-stabilization is an elegant way of realizing non-masking fault-tolerant systems. Sustained research over the last decades has produced multiple self-stabilizing algorithms for many problems in distributed computing. In this paper, we present a framework to evaluate multiple self-stabilizing solutions under a fault model that allows intermittent transient faults. To that end, metrics to quantify the dependability of self-stabilizing systems are defined. It is also shown how to derive models that are suitable for probabilistic model checking in order to determine those dependability metrics. A heuristics-based method is presented to analyze counterexamples returned by a probabilistic model checker in case the system under investigation does not exhibit the desired degree of dependability. Based on the analysis, the self-stabilizing algorithm is subsequently refined.
Paolo Marin, Matthew Lewis, Tobias Schubert, Massimo Narizzano, Bernd Becker, Enrico Giunchiglia: Evaluation of Knowledge Sharing Strategies in a Parallel QBF Solver. 2009, RCRA International Workshop on Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion.
Tobias Schubert, Matthew Lewis, Bernd Becker: PaMiraXT: Parallel SAT Solving with Threads and Message Passing. 2009, Journal on Satisfiability, Boolean Modeling, and Computation, volume 6, pages 203-222.
Matthew Lewis, Paolo Marin, Tobias Schubert, Massimo Narizzano, Bernd Becker, Enrico Giunchiglia: PaQuBE: Distributed QBF Solving with Advanced Knowledge Sharing. 2009, Int'l Conf.
on Theory and Applications of Satisfiability Testing, volume 5584, pages 509-523.
Stefan Kupferschmid, Tino Teige, Bernd Becker, Martin Fränzle: Proofs of Unsatisfiability for Mixed Boolean and Non-linear Arithmetic Constraint Formulae. 2009, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages 27-36. Abstract: Symbolic methods in computer-aided verification rely on appropriate constraint solvers. Correctness and reliability of solvers are a vital requirement in the analysis of safety-critical systems, e.g., in the automotive context. Satisfiability results of a solver can usually be checked by probing the computed solution. However, efficient validation of an uncertified unsatisfiability result for some constraint formula is nearly impossible. In this paper, we propose a certification method for unsatisfiability results for mixed Boolean and non-linear arithmetic constraint formulae. Such formulae arise in the analysis of hybrid discrete-continuous systems.
Matthew Lewis, Tobias Schubert, Bernd Becker: QMiraXT - A Multithreaded QBF Solver. 2009, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”.
Stefan Hillebrecht, Ilia Polian, P. Ruther, S. Herwik, Bernd Becker, Oliver Paul: Reliability Characterization of Interconnects in CMOS Integrated Circuits Under Mechanical Stress. 2009, Int'l Reliability Physics Symp.
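The asymmetry noted in the unsatisfiability-proofs abstract above (satisfiability is easy to check by probing a solution, unsatisfiability needs a certificate) is easiest to see in the purely propositional case: an unsat certificate can be a resolution refutation, which a simple independent checker can replay. A minimal sketch, not the cited paper's method for mixed Boolean/non-linear formulae:

```python
# Sketch of replaying a resolution refutation as an unsatisfiability
# certificate. Clauses are frozensets of integer literals; deriving the
# empty clause proves the formula unsatisfiable.

def resolve(c1, c2, v):
    """Resolve two clauses on variable v (v in c1, -v in c2)."""
    assert v in c1 and -v in c2
    return (c1 - {v}) | (c2 - {-v})

def check_refutation(clauses, steps):
    """steps: list of (i, j, v) meaning 'resolve clause i and clause j on v'.
    Returns True iff the replayed proof derives the empty clause."""
    derived = list(clauses)
    for i, j, v in steps:
        derived.append(resolve(derived[i], derived[j], v))
    return frozenset() in derived

# Toy UNSAT formula {x}, {-x or y}, {-y} with a two-step refutation.
cnf = [frozenset({1}), frozenset({-1, 2}), frozenset({-2})]
proof = [(0, 1, 1),  # resolve {x} with {-x, y} -> {y} (index 3)
         (3, 2, 2)]  # resolve {y} with {-y} -> empty clause
print(check_refutation(cnf, proof))  # True
```

The checker only replays the steps; it never searches, which is why such certificates can be validated far more cheaply than the original solving run.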
Christoph Scholl, Stefan Disch, Florian Pigorsch, Stefan Kupferschmid: Using an SMT Solver and Craig Interpolation to Detect and Remove Redundant Linear Constraints in Representations of Non-Convex Polyhedra. 2008, Int'l Workshop on Satisfiability Modulo Theories, pages 18-26.
Andreas Eggers, Natalia Kalinnik, Stefan Kupferschmid, Tino Teige: Challenges in Constraint-based Analysis of Hybrid Systems. 2008, ERCIM Workshop on Constraint Solving and Constraint Logic Programming. Abstract: In the analysis of hybrid discrete-continuous systems, rich arithmetic constraint formulae with complex Boolean structure arise naturally. The iSAT algorithm, a solver for such formulae, is aimed at bounded model checking of hybrid systems. In this paper, we identify challenges emerging from planned and ongoing work to enhance the iSAT algorithm. First, we propose an extension of iSAT to directly handle ordinary differential equations as constraints. Second, we outline the recently introduced generalization of the iSAT algorithm to deal with probabilistic hybrid systems and some open research issues in that context. Third, we present ideas on how to move from bounded to unbounded model checking by using the concept of interpolation. Finally, we discuss the adaptation of some parallelization techniques to the iSAT case, which will hopefully lead to performance gains in the future. By presenting these open research questions, this paper aims at fostering discussions on these extensions of constraint solving.
Alejandro Czutro, Nicolas Houarche, Piet Engelke, Ilia Polian, Mariane Comte, Michel Renovell, Bernd Becker: A Simulator of Small-Delay Faults Caused by Resistive-Open Defects. 2008, IEEE European Test Symp., pages 113-118.
Alejandro Czutro, Ilia Polian, Matthew Lewis, Piet Engelke, Sudhakar M.
Reddy, Bernd Becker: TIGUAN: Thread-parallel Integrated test pattern Generator Utilizing satisfiability ANalysis. 2008, edaWorkshop.
Piet Engelke, Ilia Polian, Michel Renovell, Sandip Kundu, Bharath Seshadri, Bernd Becker: On Detection of Resistive Bridging Defects by Low-Temperature and Low-Voltage Testing. 2008, IEEE Trans. on CAD, volume 27, issue 2, pages 327-338. Abstract: Test application at reduced power supply voltage (low-voltage testing) or reduced temperature (low-temperature testing) can improve the defect coverage of a test set, in particular for resistive short defects. Using a probabilistic model of these defects, we quantify the coverage impact of low-voltage and low-temperature testing for different voltages and temperatures. When considering the coverage increase, we differentiate between defects missed by the test set at nominal conditions and undetectable defects (flaws) detected at non-nominal conditions. In our analysis, the performance degradation of the device caused by lower power supply voltage is accounted for. Furthermore, we describe a situation in which defects detected by conventional testing are missed by low-voltage testing and quantify the resulting coverage loss. Experimental results suggest that test quality is improved even if no cost increase is allowed. If multiple test applications are acceptable, a combination of low-voltage and low-temperature testing turns out to provide the best coverage of both hard defects and flaws.
Damian Nowroth, Ilia Polian, Bernd Becker: A Study of Cognitive Resilience in a JPEG Compressor. 2008, Int'l Conf. on Dependable Systems and Networks, pages 32-41. Abstract: Many classes of applications are inherently tolerant to errors. One such class are applications designed for a human end user, where the capabilities of the human cognitive system (cognitive resilience) may compensate for some of the errors produced by the application.
We present a methodology to automatically distinguish between tolerable errors in imaging applications, which can be handled by the human cognitive system, and severe errors, which are perceptible to a human end user. We also introduce an approach to identify non-critical spots in a hardware circuit which need not be hardened against soft errors, because errors that occur at these spots are tolerable. We demonstrate that over 50% of the flip-flops in a JPEG compressor chip are non-critical and require no hardening.
Enrico Giunchiglia, Paolo Marin, Massimo Narizzano: An Effective Preprocessor for QBF Pre-reasoning. 2008, Int'l Workshop on Quantification in Constraint Programming.
Stefan Spinner, Ilia Polian, Piet Engelke, Bernd Becker, Martin Keim, Wu-Tung Cheng: Automatic Test Pattern Generation for Interconnect Open Defects. 2008, VLSI Test Symp., pages 181-186.
Stefan Spinner, Ilia Polian, Piet Engelke, Bernd Becker, Martin Keim, Wu-Tung Cheng: Automatic Test Pattern Generation for Interconnect Open Defects. 2008, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.
Ilia Polian, Yusuke Nakamura, Piet Engelke, Stefan Hillebrecht, Kohei Miyase, Seiji Kajihara, Bernd Becker, Xiaoqing Wen: Diagnose realistischer Defekte mit Hilfe des X-Fehlermodells [diagnosis of realistic defects using the X-fault model]. 2008, GMM/GI/ITG Reliability and Design Conf., pages 155-156.
Ilia Polian, Yusuke Nakamura, Piet Engelke, Stefan Spinner, Kohei Miyase, Seiji Kajihara, Bernd Becker, Xiaoqing Wen: Diagnosis of Realistic Defects Based on the X-Fault Model. 2008, IEEE Design and Diagnostics of Electronic Circuits and Systems, pages 263-268.
Stefan Hillebrecht, Ilia Polian, Piet Engelke, Bernd Becker, Martin Keim, Wu-Tung Cheng: Extraction, Simulation and Test Generation for Interconnect Open Defects Based on Enhanced Aggressor-Victim Model. 2008, Int'l Test Conf.
, pages : 1 - 10 Thomas Eschbach, Bernd BeckerInteractive Circuit Diagram Visualization 2008 Computer Graphics and Imaging , ACTA Press, pages : 218 - 223 Ilia Polian, Sudhakar M. Reddy, Irith Pomeranz, X. Tang, Bernd BeckerNo Free Lunch in Error Protection? 2008 Workshop on Dependable and Secure Nanocomputing Ilia Polian, Sudhakar M. Reddy, Irith Pomeranz, X. Tang, Bernd BeckerOn Reducing Circuit Malfunctions Caused by Soft Errors 2008 Int'l Symp. on Defect and Fault Tolerance , pages : 245 - 253 Riedel T, Scholl P, Decker C, Berehtold M, Beigl MPluggable real world interfaces physically enabled code deployment for networked sensors 2008 Proceedings of INSS 2008 - 5th International Conference on Networked Sensing Systems » show abstract « hide abstract Abstract In this paper we present a novel abstraction and deployment process using real world interfaces, which reflect the realities of pervasive software development. Pluggable real world interfaces support "plug’n’play" deployment for sensoraugmented hardware and provide an object-oriented encapsulation of high-level contextual interfaces. The architecture adds an additional object based abstraction layer between the sensor subsystem (delivering e.g. cues) and the application (delivering the situation context). Component abstraction layers are implemented as code that comes with physical components, e.g. a chair, and provides the functionality for detecting context bundled with the sensory hardware. The approach will lead to pluggable real world interfaces: The functionality of an appliance will be composed from the functionality of its components - just like a meeting room will be composed from many chairs. This paper will present concept, architecture and a first implementation based on a Java run-time system for very tiny, very low-power embedded sensor nodes. 
Ralf Wimmer, Alexander Kortus, Marc Herbstritt, Bernd Becker: Probabilistic Model Checking and Reliability of Results. 2008, IEEE Design and Diagnostics of Electronic Circuits and Systems, IEEE Computer Science Press, pages 207-212.
Abstract: In formal verification, reliable results are of utmost importance. In model checking of digital systems, mainly incorrect implementations of the model checking algorithms due to logical errors are the source of wrong results.
In probabilistic model checking, however, numerical instabilities are an additional source of inconsistent results. We motivate our investigations with an example for which several state-of-the-art probabilistic model checking tools give completely wrong results due to inexact computations. We then analyze at which points inaccuracies are introduced during the model checking process. We discuss first ideas how, in spite of these inaccuracies, reliable results can be obtained or at least the user can be warned about potential correctness problems: (1) usage of exact (rational) arithmetic, (2) usage of interval arithmetic to obtain safe approximations of the actual probabilities, (3) provision of certificates which testify that the result is correct, and (4) integration of a "degree of belief" for each sub-formula into existing model checking tools.

Bernd Becker, Marc Herbstritt, Natalia Kalinnik, Matthew Lewis, Juri Lichtner, Tobias Nopper, Ralf Wimmer: Propositional Approximations for Bounded Model Checking of Partial Circuit Designs. 2008, IEEE Int'l Conf. on Computer Design, IEEE Computer Society Press, pages 52-59.
Abstract: Bounded model checking of partial circuit designs enables the detection of errors even when the implementation of the design is not finished. The behavior of the missing parts can be modeled by a conservative extension of propositional logic, called 01X-logic. Then the transitions of the underlying (incomplete) sequential circuit under verification have to be represented adequately. In this work, we investigate the difference between a relation-oriented and a function-oriented approach for this issue. Experimental results on a large set of examples show that the function-oriented representation is most often superior w.r.t.
(1) CPU runtime and (2) accuracy regarding the ability to find a counterexample, such that by using the function-oriented approach an increase in accuracy of up to 210% and a speed-up of the CPU runtime of up to 390% compared to the relation-oriented approach are achieved. But there are also relevant examples, e.g. a VLIW ALU, for which the relation-oriented approach outperforms the function-oriented one by 300% in terms of CPU time, showing that both approaches are efficient for different scenarios.

Piet Engelke, Ilia Polian, Jürgen Schlöffel, Bernd Becker: Resistive Bridging Fault Simulation of Industrial Circuits. 2008, Conf. on Design, Automation and Test in Europe, pages 628-633.
Abstract: We report the successful application of a resistive bridging fault (RBF) simulator to industrial benchmark circuits. Despite the slowdown due to the consideration of the sophisticated RBF model, the run times of the simulator were within an order of magnitude of the run times for pattern-parallel complete-circuit stuck-at fault simulation. Industrial-size circuits, including a multi-million-gates design, could be simulated in reasonable time despite a significantly higher number of faults to be simulated compared with stuck-at fault simulation.

Piet Engelke, Ilia Polian, Jürgen Schlöffel, Bernd Becker: Resistive Bridging Fault Simulation of Industrial Circuits. 2008, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Ilia Polian, Sudhakar M. Reddy, Bernd Becker: Scalable Calculation of Logical Masking Effects for Selective Hardening Against Soft Errors. 2008, IEEE Int'l Symp. on VLSI, pages 257-262.

Christian G. Zoellin, Hans-Joachim Wunderlich, Ilia Polian, Bernd Becker: Selective Hardening in Early Design Steps. 2008, IEEE European Test Symp., pages 185-190.

Ilia Polian, W. Rao: Selective Hardening of NanoPLA Circuits. 2008, Int'l Symp. on Defect and Fault Tolerance, pages 263-271.

Ralf Wimmer, Salem Derisavi, Holger Hermanns: Symbolic Partition Refinement with Dynamic Balancing of Time and Space. 2008, Int'l Conf. on Quantitative Evaluation of Systems, IEEE Computer Science Press, pages 65-74.
Abstract: Bisimulation minimization is one of the classical means to fight the infamous state space explosion problem in verification. Particularly in stochastic verification, numerical algorithms are applied, which do not scale beyond systems of moderate size. To alleviate this problem, symbolic bisimulation minimization has been used effectively to reduce very large symbolically represented state spaces to moderate size explicit representations. But even this minimization may fail due to time or memory limitations. This paper presents a symbolic algorithm which relies on a hybrid symbolic partition representation. It dynamically converts between two known representations in order to provide a trade-off between memory consumption and runtime. The conversion itself is logarithmic in the partition size. We show how to apply it for the minimization of Markov chains, but the same techniques can be adapted in a straightforward way to other models like labeled transition systems or interactive Markov chains.

Bernd Becker, Paul Molitor: Technische Informatik: Eine einführende Darstellung. Oldenbourg Wissenschaftsverlag, 2008.

Ralf Wimmer, Alexander Kortus, Marc Herbstritt, Bernd Becker: The Demand for Reliability in Probabilistic Verification. 2008, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, Shaker Verlag, pages 99-108.
Abstract: For formal verification, reliable results are of utmost importance. In model checking of digital systems, mainly incorrect implementations due to logical errors are the source of wrong results.
In probabilistic model checking, however, numerical instabilities are an additional source of inconsistent results. First we present an example for which several state-of-the-art probabilistic model checking tools give completely wrong results due to inexact computations. This motivates the investigation at which points inaccuracies are introduced during the model checking process. We then give ideas how, in spite of these inaccuracies, reliable results can be obtained or at least the user can be warned about potential problems: (1) to introduce a "degree of belief" for each sub-formula, (2) to use exact (rational) arithmetic, (3) to use interval arithmetic to obtain safe approximations of the actual probabilities, and (4) to provide certificates which testify that the result is correct.

Jochen Eisinger: Upper Bounds on the Automata Size for Integer and Mixed Real and Integer Linear Arithmetic. 2008, EACSL Annual Conf. on Computer Science Logic, Springer-Verlag, pages 430-444.

Ilia Polian, Alejandro Czutro, Sandip Kundu, Bernd Becker: Power Droop Testing. 2007, Test of Computers, volume 24, issue 3, pages 276-284.

Marc Herbstritt, Bernd Becker, Erika Ábrahám, Christian Herde: On Variable Selection in SAT-LP-based Bounded Model Checking of Linear Hybrid Automata. 2007, IEEE Design and Diagnostics of Electronic Circuits and Systems, IEEE Computer Society, pages 391-396.
Abstract: … (1) the algorithmic structure of the combined SAT-LP-solver, and (2) efficiency issues in form of a counter-based design that adapts the popular concept of the VSIDS heuristic of Chaff. An empirical analysis of our framework shows (1) the importance of guard variables related to real-valued constraints and (2) reveals an ambiguity between the number of calls of the LP-solver and the time used by the LP-solver.
As a result, we propose a simple but efficient variable selection heuristic that takes our observations into account and experimentally overcomes the limitations of existing ones.

Matthew Lewis, Tobias Schubert, Bernd Becker: Multithreaded SAT Solving. 2007, ASP Design Automation Conf., pages 926-921.

Bernd Becker, Andreas Podelski, Werner Damm, Martin Fränzle, Ernst-Rüdiger Olderog, Reinhard Wilhelm: SFB/TR 14 AVACS – Automatic Verification and Analysis of Complex Systems (Der Sonderforschungsbereich/Transregio 14 AVACS – Automatische Verifikation und Analyse komplexer Systeme). 2007, it - Information Technology, Oldenbourg Wissenschaftsverlag GmbH, volume 49, issue 2, pages 118-126.

Stefan Spinner, Ilia Polian, Bernd Becker, P. Ruther, Oliver Paul: A System for the Calibration and Reliability Testing of MEMS Devices Under Mechanical Stress. 2007, VDE Microsystem Technology Congress, pages 861-864.

John P. Hayes, Ilia Polian, Bernd Becker: An Analysis Framework for Transient-Error Tolerance. 2007, VLSI Test Symp., pages 249-255.

Marc Herbstritt, Vanessa Struve, Bernd Becker: Application of Lifting in Partial Design Analysis. 2007, Int'l Workshop on Microprocessor Test and Verification, IEEE Computer Society, pages 33-38.
Abstract: In the past, we have investigated satisfiability-based combinational equivalence checking and bounded model checking of partial circuit designs, i.e., circuit designs where one or more components of the design are not implemented yet. Especially for satisfiability-based bounded model checking, typically a counterexample is generated in case the specification property is violated. Minimizing the number of assigned variables in the satisfying assignment that corresponds to such a counterexample is the objective of lifting. In this work we show that lifting is also feasible and profitable for counterexamples obtained via satisfiability-based bounded model checking of partial designs.
We provide first experimental results on this issue that show its feasibility. Furthermore, we present a novel application scenario for lifting in the context of automated blackbox synthesis. This is a useful concept that can be applied during combinational equivalence checking of partial circuit designs, where realizability of the missing components was already proven, but the functionality of the missing components is still unknown. In summary, this work provides first experimental results as well as a novel concept regarding the application of lifting for the analysis of partial designs.

Tobias Nopper, Christoph Scholl, Bernd Becker: Computation of Minimal Counterexamples by Using Black Box Techniques and Symbolic Methods. 2007, IEEE Int'l Conf. on Computer-Aided Design, IEEE Computer Society Press, pages 273-280.
Abstract: Computing counterexamples is a crucial task for error diagnosis and debugging of sequential systems. If an implementation does not fulfill its specification, counterexamples are used to explain the error effect to the designer. In order to be understood by the designer, counterexamples should be simple, i.e. they should be as general as possible and assign values to a minimal number of input signals. Here we use the concept of Black Boxes (parts of the design with unknown behaviour) to mask out components for counterexample computation. By doing so, the resulting counterexample will argue about a reduced number of components in the system to facilitate the task of understanding and correcting the error. We introduce the notion of "uniform counterexamples" to provide an exact formalization of simplified counterexamples arguing only about components which were not masked out. Our computation of counterexamples is based on symbolic methods using AIGs (And-Inverter Graphs). Experimental results using a VLIW processor as a case study clearly demonstrate our capability of providing simplified counterexamples.
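The two abstracts above both revolve around lifting: shrinking a counterexample by unassigning as many variables as possible while every completion of the remaining partial assignment still violates the property. The following is only an illustrative sketch of that general idea (a toy greedy formulation with exhaustive completion checking, not the SAT-based procedures of these papers); the `violates` predicate and all variable names are hypothetical.

```python
from itertools import product

def lift(assignment, violates):
    """Greedily drop variables from a full counterexample while every
    0/1 completion of the remaining partial assignment still violates
    the property (i.e. still triggers the bug)."""
    lifted = dict(assignment)
    for var in list(assignment):
        trial = {v: b for v, b in lifted.items() if v != var}
        free = [v for v in assignment if v not in trial]
        # drop var only if all completions of the free variables still fail
        if all(violates({**trial, **dict(zip(free, bits))})
               for bits in product([0, 1], repeat=len(free))):
            lifted = trial
    return lifted

# Toy property: the design is "buggy" whenever input a is 1, regardless
# of b and c -- so a minimal counterexample needs to fix only a.
violates = lambda env: env["a"] == 1
full_cex = {"a": 1, "b": 0, "c": 1}
print(lift(full_cex, violates))   # -> {'a': 1}
```

A real implementation would ask a SAT solver whether some completion escapes the violation, instead of enumerating all completions.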
Ilia Polian, John P. Hayes, Bernd Becker: Cost-Efficient Circuit Hardening Based on Critical Soft Error Rate. 2007, IEEE Workshop on RTL ATPG and DfT.

Martin Fränzle, Christian Herde, Tino Teige, S. Ratschan, Tobias Schubert: Efficient Solving of Large Non-Linear Arithmetic Constraint Systems with Complex Boolean Structure. 2007, Journal on Satisfiability, Boolean Modeling, and Computation, volume 1, issue 3-4, pages 209-236.

Ilia Polian, John P. Hayes, Damian Nowroth, Bernd Becker: Ein kostenbegrenzter Ansatz zur Reduktion der transienten Fehlerrate. 2007, GMM/GI/ITG Reliability and Design Conf., pages 183-184.

Ralf Wimmer, Marc Herbstritt, Bernd Becker: Forwarding, Splitting, and Block Ordering to Optimize BDD-based Bisimulation Computation. 2007, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, Shaker Verlag, pages 203-212.
Abstract: In this paper we present optimizations for a BDD-based algorithm for the computation of several types of bisimulations, which play an important role for the minimization of large systems, thus enabling their verification. The basic principle of the algorithm is partition refinement. Our proposed optimizations take this refinement structure as well as the usage of BDDs for the representation of the system into account: (1) block forwarding updates newly refined blocks of the partition in-situ, (2) split-driven refinement approximates the blocks that may be refined, and (3) block ordering heuristically suggests a good order in which the blocks will be refined. We provide substantial experimental results on examples from different applications and compare them to alternative approaches. The experiments clearly show that the proposed optimization techniques result in a significant performance speed-up compared to the basic algorithm as well as to alternative approaches.
Ilia Polian, Hideo Fujiwara: Functional Constraints vs. Test Compression in Scan-Based Delay Testing. 2007, Jour. Electronic Testing, pages 445-455.

Ilia Polian, Hideo Fujiwara: Functional Constraints vs. Test Compression in Scan-Based Delay Testing. 2007, Conf. on Design, Automation and Test in Europe, volume 23, issue 5, pages 445-455.
Abstract: We present an approach to prevent overtesting in scan-based delay test. The test data is transformed with respect to functional constraints while simultaneously keeping as many positions as possible unspecified in order to facilitate test compression. The method is independent of the employed delay fault model, ATPG algorithm and test compression technique, and it is easy to integrate into an existing flow. Experimental results emphasize the severity of overtesting in scan-based delay test. The influence of different functional constraints on the amount of the required test data and the compression efficiency is investigated. To the best of our knowledge, this is the first systematic study on the relationship between overtesting prevention and test compression.

Decker C, Riedel T, Berchtold M, Scholl P, Isomura M, Beigl M, Röhr D, Beuster M: Geocoding sensor networks. 2007, 4th International Conference on Networked Sensing Systems, INSS.
Abstract: This demonstration will show how to geocode information originated from sensor networks. Geocoding is a technique to graphically represent information in a "map to be seen" [Bertin 1967]. The demonstration will present: 1.) a toolset to overlay sensor network information, including sensor and context data, with geographic maps; 2.) a sensor network, the transformation of sensor network information, technology options for presenting such information on maps, and exemplary display format options.
Decker C, Riedel T, Scholl P, Krohn A, Beigl M: Graphically geo-coding of sensor system information. 2007, 4th International Conference on Networked Sensing Systems, INSS.
Abstract: Sensor network information should be presented to the user in a convenient way. For administration, inspection and access, this motivates a location-based interface. We implemented a GoogleMaps interface to wireless sensor networks, which we use in the CoBIs research project, and explore the capabilities of such an interface. CoBIs focuses on the detection of hazardous situations in industrial areas, such as chemical plants. Hazards may occur due to misplacing of reactive chemicals. As a consequence, location and proximity information is a highly valuable source for administration of such areas. This paper reports on the architecture and technology used and gives insight into the operation of the system and initial experiences within the CoBIs project. The graphical user interface made use of web technologies, esp. RSS, to provide the activation of sensor information by using logic on the map. This novel contribution allows the user to bi-directionally interact with the sensor network deployments.
The effort seeks to close the gap between the list-based interfaces commonly known from management consoles and data access interfaces, and more real-world-coupled and thus, because of the strong association, more intuitive interfaces.

Ilia Polian, Damian Nowroth, Bernd Becker: Identification of Critical Errors in Imaging Applications. 2007, Int'l On-Line Test Symp., pages 201-202.

Bernd Becker, Christian Dax, Jochen Eisinger, Felix Klaedtke: LIRA: Handling Constraints of Linear Arithmetics over the Integers and the Reals. 2007, Int'l Conf. on CAV, Springer-Verlag, pages 312-315.

Riedel T, Decker C, Scholl P, Krohn A, Beigl M. 2007, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), volume 4415 LNCS, ISBN 9783540712671.
Abstract: Sensor network technology is pushing towards integration into the business world. By using sensor node hardware to augment real-life business items it is possible to capture the world and support processes where they actually happen. Many problems of the business logic running our world can be efficiently implemented "on the item". In order for these smart items to couple back to the virtualized world of business processes it is necessary to design a uniform system abstraction for enterprise systems. Service-oriented architectures are the tool to describe functionality apart from its concrete implementation. This paper describes a system and the experiences made integrating wirelessly networked smart business items into high-level business processes. © Springer-Verlag Berlin Heidelberg 2007.

Christian Dax, Jochen Eisinger, Felix Klaedtke: Mechanizing the Powerset Construction for Restricted Classes of omega-Automata. 2007, Int'l Symp. on Automated Technology for Verification and Analysis, Springer-Verlag, volume 4762, pages 223-236.

Christian Dax, Jochen Eisinger, Felix Klaedtke: Mechanizing the Powerset Construction for Restricted Classes of omega-Automata. Issue 228, 2007.

Marc Herbstritt, Bernd Becker: On Combining 01X-Logic and QBF. 2007, Int'l Conf. on Computer Aided Systems Theory, Springer Verlag, pages 531-538.
Abstract: We discuss how to combine 01X-logic and quantified Boolean formulas (QBF) within a homogeneous SAT/QBF framework in the context of bounded model checking of blackbox designs. The proposed combination allows a flexible handling of blackboxes w.r.t. computational resources. Preliminary results show the scalability of the approach.

Ralf Wimmer, Marc Herbstritt, Bernd Becker: Optimization Techniques for BDD-based Bisimulation Minimization. 2007, Great Lakes Symp. on VLSI, ACM Press, pages 405-410.
Abstract: In this paper we report on optimizations for a BDD-based algorithm for the computation of bisimulations. The underlying algorithmic principle is an iterative refinement of a partition of the state space. The proposed optimizations demonstrate that both taking into account the algorithmic structure of the problem and exploiting the BDD-based representation are essential to finally obtain an efficient symbolic algorithm for real-world problems.
The contributions of this paper are (1) block forwarding to update block refinement as soon as possible, (2) split-driven refinement that over-approximates the set of blocks that must definitely be refined, and (3) block ordering to fix the order of the blocks for the refinement in a clever way. We provide substantial experimental results on examples from different applications and compare them to alternative approaches when possible. The experiments clearly show that the proposed optimization techniques result in a significant performance speed-up compared to the basic algorithm as well as to alternative approaches.

Bernd Becker, Jochen Eisinger, Felix Klaedtke: Parallelization of Decision Procedures for Automatic Structures. 2007, Workshop on Omega-Automata.

Piet Engelke, Bettina Braitling, Ilia Polian, Michel Renovell, Bernd Becker: SUPERB: Simulator Utilizing Parallel Evaluation of Resistive Bridges. 2007, IEEE Asian Test Symp., pages 433-438.
Abstract: A high-performance resistive bridging fault simulator, SUPERB (Simulator Utilizing Parallel Evaluation of Resistive Bridges), is proposed. It is based on fault sectioning in combination with parallel-pattern or parallel-fault multiple-stuck-at simulation. It outperforms a conventional interval-based resistive bridging fault simulator by 60X to 120X while delivering identical results. Further competing tools are outperformed by several orders of magnitude.

Stefan Spinner, Jie Jiang, Ilia Polian, Piet Engelke, Bernd Becker: Simulating Open-Via Defects. 2007, IEEE Asian Test Symp., pages 265-270.
Abstract: Open-via defects are a major systematic failure mechanism in nanoscale manufacturing processes. We present a flow for simulating open-via defects. Electrical parameters are extracted from the layout and technology data and represented in a way which allows efficient simulation on gate level.
The simulator takes oscillation caused by open-via defects into account and quantifies its impact on defect coverage. The flow can be employed for manufacturing test as well as for defect diagnosis.

Ralf Wimmer, Alexander Kortus, Marc Herbstritt, Bernd Becker: Symbolic Model Checking for DTMCs with Exact and Inexact Arithmetic. AVACS Technical Report, issue 30, 2007.

Bernd Becker, Ilia Polian, Sybille Hellebrand, Bernd Straube, Hans-Joachim Wunderlich: Test und Zuverlässigkeit Nanoelektronischer Systeme. 2007, GMM/GI/ITG Reliability and Design Conf., pages 139-140.

Ralf Wimmer, Holger Hermanns, Marc Herbstritt, Bernd Becker: Towards Symbolic Stochastic Aggregation. AVACS Technical Report, issue 16, 2007.

Ralf Wimmer, Tobias Nopper, Marc Herbstritt, Christoph Löffler, Bernd Becker: Collaborative Exercise Management. 2006, World Conf. on E-Learning in Corporate, Government, Healthcare, and Higher Education, Association for the Advancement of Computing in Education (AACE), pages 3127-3134.
Abstract: In this work, we report on a novel tool for managing exercises used in a collaborative course organization environment at university. Our approach focuses on the separation of the content of the exercises and its layout. The latter is based on the typesetting system LaTeX. Content management of the exercises is established by a web-based interface. As a whole, our tool provides a time-saving and unambiguous workflow for collaborative, time- and place-independent exercise organization.

Ilia Polian, Alejandro Czutro, Sandip Kundu, Bernd Becker: Power Droop Testing. 2006, Int'l Conf. on Computer Design, pages 243-250.

Piet Engelke, Ilia Polian, Michel Renovell, Bernd Becker: Simulating Resistive Bridging and Stuck-At Faults. 2006, IEEE Trans. on CAD, volume 25, issue 10, pages 2181-2192.
Abstract: We present a simulator for resistive bridging and stuck-at faults.
In contrast to earlier work, it is based on electrical equations rather than table look-up, thus providing more flexibility. For the first time, simulation of sequential circuits is dealt with; the interaction of fault effects in the current time frame and earlier time frames is elaborated on for different bridge resistances. Experimental results are given for resistive bridging and stuck-at faults in combinational and sequential circuits. Different definitions of fault coverage are listed, and quantitative results with respect to all these definitions are given for the first time.

Michel Renovell, Mariane Comte, Ilia Polian, Piet Engelke, Bernd Becker: Analyzing the memory effect of resistive open in CMOS random logic. 2006, Int'l Conf. on Design and Test of Integrated Systems in Nanoscale Technology, pages 251-256.
Abstract: This paper analyzes the electrical behaviour of resistive opens as a function of their unpredictable resistance. It is demonstrated that the electrical behaviour depends on the value of the open resistance. It is also shown that, due to the memory effect, detection of the open by a given vector Ti depends on all the vectors that have been applied to the circuit before Ti. An electrical analysis of this memory effect is presented.

Piet Engelke, Ilia Polian, Michel Renovell, Bernd Becker: Automatic Test Pattern Generation for Resistive Bridging Faults. 2006, Jour. Electronic Testing, volume 22, issue 1, pages 61-69.
Abstract: An ATPG for resistive bridging faults in combinational or full-scan circuits is proposed. It combines the advantages of section-based generation and interval-based simulation. In contrast to the solutions introduced so far, it can handle static effects of arbitrary non-feedback bridges between two nodes, including ones detectable at higher bridge resistance and undetectable at lower resistance, and faults requiring more than one vector for detection.
The developed tool is applied to ISCAS circuits, and a higher efficiency compared with other resistive bridging fault ATPGs as well as stuck-at ATPGs is reported. Information required for accurate resistive bridging fault simulation is obtained as a by-product. Erika Ábrahám, Marc Herbstritt, Bernd Becker, Martin Steffen: Memory-aware Bounded Model Checking for Linear Hybrid Systems 2006 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” , Fraunhofer IIS/EAS, pages : 153 - 162 Abstract: Bounded Model Checking (BMC) is a successful method for refuting properties of erroneous systems. Initially applied to discrete systems only, BMC could be extended to more complex domains like linear hybrid automata. The increasing complexity coming along with these complex models, but also recent optimizations of SAT-based BMC, like excessive conflict learning, reveal a memory explosion problem, especially for deep counterexamples. In this paper we introduce parametric data types for the internal solver structure that, taking advantage of the symmetry of BMC problems, remarkably reduce the memory requirements of the solver. Yuyi Tang, Hans-Joachim Wunderlich, Piet Engelke, Ilia Polian, Bernd Becker, Jürgen Schlöffel, Friedrich Hapke, Michael Wittke: X-Masking During Logic BIST and Its Impact on Defect Coverage 2006 IEEE Trans. on VLSI Systems , volume : 14, issue : 2, pages : 193 - 202 Abstract: We present a technique for making a circuit ready for Logic BIST by masking unknown values at its outputs. In order to keep the silicon area cost low, some known bits in output responses are also allowed to be masked. These bits are selected based on a stuck-at n-detection based metric, such that the impact of masking on the defect coverage is minimal.
An analysis based on a probabilistic model for resistive short defects indicates that the coverage loss for unmodeled defects is negligible for relatively low values of n. Jan Reineke, Björn Wachter, Stephan Thesing, Reinhard Wilhelm, Jochen Eisinger, Ilia Polian, Bernd Becker: A Definition and Classification of Timing Anomalies 2006 Int'l Workshop on Worst-Case Execution Time John P. Hayes, Ilia Polian, Bernd Becker: A Model for Transient Faults in Logic Circuits 2006 Int'l Design and Test Workshop Michel Renovell, Mariane Comte, Ilia Polian, Piet Engelke, Bernd Becker: A Specific ATPG technique for Resistive Open with Sequence Recursive Dependency 2006 IEEE Asian Test Symp. , pages : 273 - 278 Abstract: This paper analyzes the electrical behavior of resistive opens as a function of their unpredictable resistance. It is demonstrated that the electrical behavior depends on the value of the open resistance. It is also shown that detection of the open by a given vector Ti recursively depends on all the vectors that have been applied to the circuit before Ti. An electrical analysis of this recursive effect is presented, and a specific ATPG strategy is proposed. Stefan Spinner, M. Doelle, P. Ruther, Ilia Polian, Oliver Paul, Bernd Becker: A System for Electro-Mechanical Reliability Testing of MEMS Devices 2006 Int'l Symp. for Testing and Failure Analysis , pages : 147 - 152 Erika Ábrahám, Andreas Grüner, Martin Steffen: Abstract Interface Behavior of Object-Oriented Languages with Monitors 2006 IFIP Int'l Conf.
on Formal Methods for Open Object-based Distributed Systems , Springer-Verlag, volume : 4036, pages : 218 - 232 Marc Herbstritt, Bernd Becker, Christoph Scholl: Advanced SAT-Techniques for Bounded Model Checking of Blackbox Designs 2006 Int'l Workshop on Microprocessor Test and Verification , IEEE Computer Society, pages : 37 - 44 Sandip Kundu, Ilia Polian: An Improved Technique for Reducing False Alarms Due to Soft Errors 2006 Int'l On-Line Test Symp. , pages : 105 - 110 Abstract: A significant fraction of soft errors in modern microprocessors has been reported to never lead to a system failure. Any concurrent error detection scheme that raises an alarm every time a soft error is detected will not be well heeded, because most of these alarms are false and responding to them affects system performance negatively. This paper improves the state of the art in detecting and preventing false alarms. Existing techniques are enhanced by a methodology to handle soft errors on address bits. Furthermore, we demonstrate the benefit of false alarm identification in implementing a roll-back recovery system by first calculating the optimal checkpointing interval for such a system and then showing that the optimal number of checkpoints decreases by orders of magnitude when exclusion techniques are used, even if the implementation of the exclusion technique is not perfect.
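The checkpointing trade-off mentioned in the abstract above can be illustrated with Young's classic first-order approximation for the optimal checkpoint interval. This is a generic sketch under that standard formula, not the paper's actual model, and all numeric parameters are hypothetical:

```python
import math

def optimal_checkpoint_interval(checkpoint_cost: float, mtbf: float) -> float:
    """Young's first-order approximation: T_opt = sqrt(2 * C * MTBF),
    where C is the time to write one checkpoint and MTBF is the mean
    time between failures that actually require a roll-back."""
    return math.sqrt(2.0 * checkpoint_cost * mtbf)

def checkpoints_per_day(checkpoint_cost: float, mtbf: float) -> int:
    """Number of checkpoints taken in 24 h at the optimal interval."""
    return round(24 * 3600.0 / optimal_checkpoint_interval(checkpoint_cost, mtbf))

# Treating every alarm as a failure (MTBF = 1 h) versus filtering false
# alarms so only real failures remain (MTBF = 1000 h); C = 10 s:
print(checkpoints_per_day(10.0, 3600.0))         # 322
print(checkpoints_per_day(10.0, 3600.0 * 1000))  # 10
```

Because the optimal interval grows with the square root of the MTBF, excluding false alarms (which raises the effective MTBF) shrinks the required number of checkpoints roughly by sqrt of the exclusion factor, matching the abstract's "orders of magnitude" claim in spirit.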
Marc Herbstritt, Ralf Wimmer, Thomas Peikenkamp, Eckard Böde, Michael Adelaide, Sven Johr, Holger Hermanns, Bernd Becker: Analysis of Large Safety-Critical Systems: A Quantitative Approach AVACS Technical Report , volume : 8, 2006 Jochen Eisinger, Ilia Polian, Bernd Becker, Alexander Metzner, Stephan Thesing, Reinhard Wilhelm: Automatic Identification of Timing Anomalies for Cycle-Accurate Worst-Case Execution Time Analysis 2006 IEEE Design and Diagnostics of Electronic Circuits and Systems , IEEE Computer Society, pages : 15 - 20 Erika Ábrahám, Marc Herbstritt, Bernd Becker, Martin Steffen: Bounded Model Checking with Parametric Data Structures 2006 BMC , volume : 174, issue : 3, pages : 3 - 16 Eckard Böde, Marc Herbstritt, Holger Hermanns, Sven Johr, Thomas Peikenkamp, Reza Pulungan, Ralf Wimmer, Bernd Becker: Compositional Performability Evaluation for STATEMATE 2006 Int'l Conf. on Quantitative Evaluation of Systems , IEEE Computer Society, pages : 167 - 178 Abstract: le timed reachability analysis of uniform continuous-time Markov decision processes, which are generated from Statemate models. We give a detailed explanation of several construction, transformation, reduction, and analysis steps required to make this possible. The entire tool flow has been implemented, and it is applied to a nontrivial example. Bernd Becker, Ilia Polian, Sybille Hellebrand, Bernd Straube, Hans-Joachim Wunderlich: DFG-Projekt RealTest - Test und Zuverlässigkeit nanoelektronischer Systeme (DFG Project: Test and Reliability of Nano-Electronic Systems) 2006 it - Information Technology , volume : 48, issue : 5, page : 304 Piet Engelke, Ilia Polian, Hans Manhaeve, Michel Renovell, Bernd Becker: Delta-IddQ Testing of Resistive Short Defects 2006 IEEE Asian Test Symp.
, pages : 63 - 68 Abstract: This paper addresses the efficiency of IDDQ and, more particularly, Delta-IDDQ testing when using a realistic short defect model that properly considers the relation between the resistance of the short and its detectability. The results clearly show that the Delta-IDDQ approach covers a large number of resistive shorts missed by conventional logic testing, requiring only a relatively small vector set. In addition, a significant number of defects which are proven to be undetectable by logic testing but may deteriorate and result in reliability failures are detected. The Delta-IDDQ threshold, and thus the equipment sensitivity, is shown to be critical for the test quality. Furthermore, the validity of the traditional IDDQ fault models when considering resistive short defects is found to be limited. For instance, the use of the fault-free next-state function for sequential IDDQ fault simulation is shown to result in a wrong classification of some resistive short defects. This is the first systematic study of IDDQ testing of resistive short defects. The impact of the threshold on the defect coverage is quantified for the first time. Although the simulation results are based upon an older technology, the results and methodology are equally valid for state-of-the-art and nanotechnologies. Jochen Eisinger, Felix Klaedtke: Don't Care Words with an Application to the Automata-based Approach for Real Addition 2006 Conf. on Computer Aided Verification , volume : 4144, pages : 67 - 80 Abstract: In this paper, we generalize the notion of "don't cares" for BDDs to word languages as a means to reduce the automata sizes. A general result in this paper shows that the minimal weak deterministic Büchi automaton (WDBA) with respect to a given don't care set with certain restrictions is uniquely determined and can be efficiently constructed.
We apply don't cares to mechanize a more effective decision procedure for the first-order logic over the mixed linear arithmetic over the integers and the reals based on WDBAs. Jochen Eisinger, Felix Klaedtke: Don't Care Words with an Application to the Automata-based Approach for Real Addition , issue : 223, 2006 Abstract: Automata are a useful tool in infinite state model checking, since they can represent infinite sets of integers and reals. However, analogously to BDDs representing finite sets, an obstacle of an automata-based set representation is the size of the automata. In this paper, we generalize the notion of "don't cares" for BDDs to word languages as a means to reduce the automata sizes. A general result in this paper shows that the minimal weak deterministic Büchi automaton (WDBA) with respect to a given don't care set with certain restrictions is uniquely determined and can be efficiently constructed. We apply don't cares to mechanize a more effective decision procedure for the first-order logic over the mixed linear arithmetic over the integers and the reals based on WDBAs. Erika Ábrahám, Andreas Grüner, Martin Steffen: Dynamic Heap-Abstraction for Open Object-Oriented Systems with Thread Classes , issue : TR-0601, 2006 Stefan Spinner, J. Bartholomeyczik, Bernd Becker, M. Doelle, Oliver Paul, Ilia Polian, P. Roth, K. Seitz, P. Ruther: Electromechanical Reliability Testing of Three-Axial Force Sensors 2006 Design, Test, Integration and Packaging of MEMS/MOEMS , pages : 77 - 82 Piet Engelke, Ilia Polian, Hans Manhaeve, Bernd Becker: IddQ Testing of Resistive Bridging Defects 2006 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 123 - 124 Erika Ábrahám, Frank S.
de Boer, Willem-Paul de Roever, Martin Steffen: Inductive Proof Outlines for Exceptions in Multithreaded Java 2006 IPM Int'l Workshop on Foundations of Software Engineering , volume : 159, pages : 281 - 279 Martin Fränzle, Christian Herde, S. Ratschan, Tobias Schubert, Tino Teige: Interval Constraint Solving Using Propositional SAT Solving Techniques 2006 Int'l Workshop on the Integration of SAT and CP Techniques , pages : 81 - 95 Ilia Polian, Bernd Becker, M. Nakasato, S. Ohtake, Hideo Fujiwara: Low-Cost Hardening of Image Processing Applications Against Soft Errors 2006 Int'l Symp. on Defect and Fault Tolerance , pages : 274 - 279 Abstract: s functionality. We call a soft error uncritical if its impact is provably limited to image perturbations during a very short period of time (a number of cycles) and the system is guaranteed to recover thereafter. Uncritical errors do not require hardening, as their effects are imperceptible to the human user of the system. We focus on soft errors in the motion estimation subsystem of MPEG-2 and introduce different definitions of uncritical soft errors in that subsystem. We propose a method to automatically determine uncritical errors and provide experimental results for various parameters. The concept can be adapted to further systems and enhance existing methods. Ralf Wimmer, Marc Herbstritt, Bernd Becker: Minimization of Large State Spaces using Symbolic Branching Bisimulation 2006 IEEE Design and Diagnostics of Electronic Circuits and Systems , IEEE Computer Society Press, pages : 9 - 14 Abstract: Bisimulations in general are a powerful concept to minimize large finite state systems with respect to some well-defined observational behavior. In contrast to strong bisimulation, for branching bisimulation there are only tools available that work on an explicit state space representation.
In this work we present for the first time a symbolic approach for branching bisimulation that uses BDDs as the basic data structure and that is based on the concept of signature refinement. First experimental results for problem instances derived from process-algebraic system descriptions show the feasibility and robustness of our approach. Thomas Eschbach, Wolfgang Günther, Bernd Becker: Orthogonal Hypergraph Drawing for Improved Visibility 2006 Journal of Graph Algorithms and Applications , volume : 10, issue : 2, pages : 141 - 157 Abstract: Visualization of circuits is an important research area in electronic design automation. One commonly accepted method to visualize a circuit aligns the gates to layers and uses orthogonal lines to connect the gates. In our model we assume that between two consecutive layers every net is allowed to occupy only one track. This avoids unnecessary bends in the wires and helps to improve the clarity of the drawing. Then a crossing reduction step is applied to further improve the readability of the circuit schematics. First, we assume that the nodes have already been fixed on a layered hypergraph structure. We consider the problem of assigning the hyperedges between two layers to tracks, with the goal of minimizing the total number of hyperedge crossings. We prove that finding the best solution is NP-hard. Then, in contrast to many other approaches, which route all the wiring after placing all nodes, we focus on a new approach which dynamically reorders the nodes within the layers to further reduce the number of hyperedge crossings. An efficient algorithm is presented that minimizes the hyperedge crossings. Experimental results are provided which show that the drawings can be improved significantly while the run time remains moderate.
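The two-layer crossing reduction described in the abstract above can be sketched for the simpler case of ordinary two-pin edges (the paper itself handles hyperedges and proves the track-assignment problem NP-hard). The following is a standard textbook-style sketch, counting pairwise crossings and applying one barycenter reordering sweep; it is not the authors' algorithm:

```python
from itertools import combinations

def count_crossings(edges):
    """Count pairwise crossings of edges between two layers.
    Each edge is (top_pos, bottom_pos); two edges cross iff the order
    of their endpoints differs between the two layers."""
    return sum(
        1
        for (a1, b1), (a2, b2) in combinations(edges, 2)
        if (a1 - a2) * (b1 - b2) < 0
    )

def barycenter_order(edges, n_bottom):
    """Reorder bottom-layer nodes by the average position of their
    top-layer neighbours (a single barycenter sweep)."""
    neigh = {v: [] for v in range(n_bottom)}
    for top, bottom in edges:
        neigh[bottom].append(top)
    key = lambda v: sum(neigh[v]) / len(neigh[v]) if neigh[v] else v
    order = sorted(range(n_bottom), key=key)
    pos = {v: i for i, v in enumerate(order)}
    return [(top, pos[bottom]) for top, bottom in edges]

edges = [(0, 2), (1, 1), (2, 0)]                     # a full "twist"
print(count_crossings(edges))                        # 3
print(count_crossings(barycenter_order(edges, 3)))   # 0
```

For real hypergraphs each net connects more than two pins, so the crossing count and the reordering criterion become more involved, which is exactly where the paper's contribution lies.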
Erika Ábrahám, Tobias Schubert, Bernd Becker, Martin Fränzle, Christian Herde: Parallel SAT-Solving in Bounded Model Checking 2006 Int'l Workshop on Parallel and Distributed Methods in Verification , Springer-Verlag, volume : 4346, pages : 301 - 315 Abstract: Bounded Model Checking (BMC) is an incremental refutation technique that searches for counterexamples of increasing length. The existence of a counterexample of a fixed length is expressed by a first-order logic formula that is checked for satisfiability using a suitable solver. We apply communicating parallel solvers to check satisfiability of the BMC formulae. In contrast to other parallel solving techniques, our method does not parallelize the satisfiability check of a single formula; instead, the parallel solvers work on formulae for different counterexample lengths. We adapt the method of constraint sharing and replication of Shtrichman, originally developed for sequential BMC, to the parallel setting. Since the learning mechanism is now parallelized, it is not obvious whether there is a benefit from the concepts of Shtrichman in the parallel setting. We demonstrate on a number of benchmarks that adequate communication between the parallel solvers yields the desired results. Ilia Polian, Bernd Becker, M. Nakasato, S. Ohtake, Hideo Fujiwara: Period of Grace: A New Paradigm for Efficient Soft Error Hardening 2006 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Stefan Spinner, J. Bartholomeyczik, Bernd Becker, M. Doelle, Oliver Paul, Ilia Polian, R. Roth, K. Seitz, P. Ruther: Reliability Testing of Three-Dimensional Silicon Force Sensors 2006 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Ralf Wimmer, Marc Herbstritt, Holger Hermanns, Kelley Strampp, Bernd Becker: Sigref - A Symbolic Bisimulation Tool Box 2006 Int'l Symp.
on Automated Technology for Verification and Analysis , Springer-Verlag, volume : 4218, pages : 477 - 492 Abstract: We present a uniform signature-based approach to compute the most popular bisimulations. Our approach is implemented symbolically using BDDs, which enables the handling of very large transition systems. Signatures for the bisimulations are built up from a few generic building blocks, which naturally correspond to efficient BDD operations. Thus, the definition of an appropriate signature is the key to a rapid development of algorithms for other types of bisimulation. We provide experimental evidence of the viability of this approach by presenting computational results for many bisimulations on real-world instances. The experiments show cases where our framework efficiently handles state spaces that are far too large for any tool that requires an explicit state space description. Marc Herbstritt, Bernd Becker: On SAT-based Bounded Invariant Checking of Blackbox Designs 2005 Int'l Workshop on Microprocessor Test and Verification , IEEE Computer Society, pages : 23 - 28 Abstract: Design verification by property checking is a mandatory task during circuit design. In this context, Bounded Model Checking (BMC) has become popular for falsifying erroneous designs. Additionally, the analysis of partial designs, i.e., circuits that are not fully specified, has recently gained attention. In this work we analyze how BMC can be applied to such partial designs. Our experiments show that even with the most simple modelling scheme, namely 01X-simulation, a significant number of errors can be detected. Additionally, we propose a SAT solver that can directly handle 01X-logic.
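The 01X-simulation mentioned in the abstract above is standard three-valued logic, where unspecified (blackbox) parts of a partial design produce the value X. A minimal sketch of the gate semantics (generic three-valued logic, not the paper's SAT solver):

```python
# Three-valued (01X) gate evaluation: unknown/unspecified signals are 'X'.
def and3(a, b):
    if a == '0' or b == '0':
        return '0'           # a controlling 0 decides the output
    if a == '1' and b == '1':
        return '1'
    return 'X'               # otherwise the output is unknown

def or3(a, b):
    if a == '1' or b == '1':
        return '1'           # a controlling 1 decides the output
    if a == '0' and b == '0':
        return '0'
    return 'X'

def not3(a):
    return {'0': '1', '1': '0', 'X': 'X'}[a]

# A blackbox output is X, but a controlling value still forces a
# definite result, which is why 01X-simulation can refute properties
# of partial designs:
print(and3('0', 'X'))   # '0'
print(or3('X', '0'))    # 'X'  (pessimism: the result stays unknown)
```

The pessimism of 01X (an X input can mask information even when the exact blackbox behavior would yield a definite value) is the reason only some, not all, errors of a partial design are detectable this way.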
Martina Welte, Thomas Eschbach, Bernd Becker: Automated Text Extraction And Indexing Of Video Presentation Recordings For Keyword Search Via A Web Interface 2005 Workshop eLectures - Einsatzmöglichkeiten, Herausforderungen und Forschungsperspektiven , Logos Verlag Berlin Matthew Lewis, Tobias Schubert, Bernd Becker: Speedup Techniques Utilized in Modern SAT Solvers - An Analysis in the MIRA Environment 2005 Theory and Applications of Satisfiability Testing , Springer, volume : 3569, pages : 437 - 443 Ilia Polian, Alejandro Czutro, Bernd Becker: Evolutionary Optimization in Code-Based Test Compression 2005 Conf. on Design, Automation and Test in Europe , pages : 1124 - 1129 Ilia Polian, Piet Engelke, Michel Renovell, Bernd Becker: Modeling feedback bridging faults with non-zero resistance 2005 Jour. Electronic Testing , volume : 21, issue : 1, pages : 57 - 69 Abstract: We study the behavior of feedback bridging faults with non-zero bridge resistance in both combinational and sequential circuits. We demonstrate that a test vector may detect the fault, not detect the fault, or lead to oscillation, depending on the bridge resistance. Moreover, the resistance intervals in which a particular behavior is observed are not necessarily contiguous. Even loops going through a gate with controlling values on its side inputs (which we call disabled loops) expose non-trivial behavior. We outline the multiple strengths problem, which arises because the critical bridge resistance depends on the strengths of the signals driving the bridge, which in turn are functions of the number of on-transistors, these again depending on the bridge resistance, making such a fault very hard to resolve. For sequential circuits, we describe additional difficulties caused by the need to account for implications on bridge behavior which have originated in previous time frames.
We conclude that the complexity of resistive feedback bridging fault simulation accurate enough to resolve such situations will probably be prohibitively high, and we propose possible simplifying assumptions. We present simulation results for ISCAS benchmarks using these assumptions, with and without taking oscillation into account. John P. Hayes, Ilia Polian, Thomas Fiehn, Bernd Becker: A Family of Logical Fault Models for Reversible Circuits 2005 IEEE European Test Symp. , pages : 65 - 70 Abstract: Reversibility is of interest in achieving extremely low power dissipation; it is also an inherent design requirement of quantum computation. Logical fault models for conventional circuits, such as stuck-at models, are not well-suited to quantum circuits. We derive a family of logical fault models for reversible circuits composed of k-CNOT (k-input controlled-NOT) gates and implementable by many technologies. The models are extensions of the previously proposed single missing-gate fault (MGF) model, and include multiple and partial MGFs. We study the basic detection requirements of the new fault types and derive bounds on the size of their test sets. We also present optimal test sets computed via integer linear programming for various benchmark circuits. These results indicate that, although the test sets are generally very small, partial MGFs may need significantly larger test sets than single MGFs. Ilia Polian, John P. Hayes, Thomas Fiehn, Bernd Becker: A Family of Logical Fault Models for Reversible Circuits 2005 IEEE Asian Test Symp. , pages : 422 - 427 Abstract: Reversibility is of interest in achieving extremely low power dissipation; it is also an inherent design requirement of quantum computation. Logical fault models for conventional circuits, such as stuck-at models, are not well-suited to quantum circuits.
We derive a family of logical fault models for reversible circuits composed of k-CNOT (k-input controlled-NOT) gates and implementable by many technologies. The models are extensions of the previously proposed single missing-gate fault (MGF) model, and include multiple and partial MGFs. We study the basic detection requirements of the new fault types and derive bounds on the size of their test sets. We also present optimal test sets computed via integer linear programming for various benchmark circuits. These results indicate that, although the test sets are generally very small, partial MGFs may need significantly larger test sets than single MGFs. Tobias Schubert, Bernd Becker: A Hardware Lab Anywhere at Any Time 2005 Journal of Systemics, Cybernetics, and Informatics: JSCI , volume : 2, issue : 6 Sandip Kundu, Matthew Lewis, Ilia Polian, Bernd Becker: A Soft Error Emulation System for Logic Circuits 2005 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 10 - 14 Abstract: In nanometer technologies, soft errors in logic circuits are increasingly important. Since the failure-in-time (FIT) rates for these circuits are very low, millions of test vectors are required for a realistic analysis of soft errors. This exceeds the capabilities of software simulation tools. We propose an FPGA emulation architecture that can apply millions of vectors within seconds. Comprehensive soft error profiling was done for ISCAS 89 circuits. Soft errors were assigned to four different classes, and their latency and recovery time were obtained. This information is useful for understanding the vulnerability of the system to soft errors and hardening it against such errors. Sandip Kundu, Matthew Lewis, Ilia Polian, Bernd Becker: A Soft Error Emulation System for Logic Circuits 2005 Conf.
on Design of Circuits and Integrated Systems , page : 137 Abstract: In nanometer technologies, soft errors in logic circuits are increasingly important. Since the failure-in-time (FIT) rates for these circuits are very low, millions of test vectors are required for a realistic analysis of soft errors. This exceeds the capabilities of software simulation tools. We propose an FPGA emulation architecture that can apply millions of vectors within seconds. Comprehensive soft error profiling was done for ISCAS 89 circuits. Soft errors were assigned to four different classes, and their latency and recovery time were obtained. This information is useful for understanding the vulnerability of the system to soft errors and hardening it against such errors. M. Doelle, Stefan Spinner, P. Ruther, Ilia Polian, Oliver Paul, Bernd Becker: A System for Determining the Impact of Mechanical Stress on the Reliability of MEMS 2005 IEEE European Test Symp. , pages : 57 - 61 Abstract: This paper reports on an experimental test system which enables the automated analysis of the impact of mechanical stress on the reliability of microelectromechanical systems (MEMS). With this system, in-situ electrical characterization and optical inspection are performed while subjecting MEMS devices to defined mechanical loads. Impact objects of various geometries, e.g., contact probes or wafer prober needles, can be applied; they are aligned relative to the MEMS device using an xyz-nanopositioning stage with an accuracy of 25 nm. This positioning stage enables programmable static forces up to 5 N and dynamic loads at frequencies up to 800 Hz. With this highly flexible system, reliability tests, post-manufacturing tests, and stress screens can be performed on single chips as well as on whole wafers with diameters up to 6 inches.
Preliminary results on long-term reliability tests using CMOS-based stress sensors exploiting the piezoresistive effect in field effect transistors are presented. M. Doelle, Stefan Spinner, P. Ruther, Ilia Polian, Oliver Paul, Bernd Becker: A System for Determining the Impact of Mechanical Stress on the Reliability of MEMS 2005 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 88 - 89 Abstract: This paper reports on an experimental test system which enables the automated analysis of the impact of mechanical stress on the reliability of microelectromechanical systems (MEMS). With this system, in-situ electrical characterization and optical inspection are performed while subjecting MEMS devices to defined mechanical loads. Impact objects of various geometries, e.g., contact probes or wafer prober needles, can be applied; they are aligned relative to the MEMS device using an xyz-nanopositioning stage with an accuracy of 25 nm. This positioning stage enables programmable static forces up to 5 N and dynamic loads at frequencies up to 800 Hz. With this highly flexible system, reliability tests, post-manufacturing tests, and stress screens can be performed on single chips as well as on whole wafers with diameters up to 6 inches. Preliminary results on long-term reliability tests using CMOS-based stress sensors exploiting the piezoresistive effect in field effect transistors are presented. Tobias Schubert, Bernd Becker: Accelerating Boolean SAT Engines Using Hyper-Threading Technology 2005 Asian Applied Computing Conf. Erika Ábrahám, Andreas Grüner, Martin Steffen: An Open Structural Operational Semantics for an Object-Oriented Calculus with Thread Classes , issue : TR-0505, 2005 Gang Chen, Sudhakar M. Reddy, Irith Pomeranz, Janusz Rajski, Piet Engelke, Bernd Becker: A Unified Fault Model and Test Generation Procedure for Interconnect Opens and Bridges 2005 IEEE European Test Symp.
, pages : 22 - 27 Martina Welte, Thomas Eschbach, Bernd Becker: Automated Text Extraction And Indexing Of Video Presentation Recordings For Keyword Search Via A Web Interface 2005 AACE World Conf. on E-Learning in Corporate, Government, Healthcare, and Higher Education , AACE Press, pages : 3200 - 3205 Bernd Becker, Markus Behle, Friedrich Eisenbrand, Ralf Wimmer: Cut Framework 2005 Int'l Workshop on Efficient and Experimental Algorithms , Springer Verlag, volume : 3503, pages : 452 - 463 Abstract: Branch-and-cut is today's state-of-the-art method to solve 0/1-integer linear programs. Important for the success of this method is the generation of strong valid inequalities, which tighten the linear programming relaxation of 0/1-IPs and thus allow for early pruning of parts of the search tree. In this paper we present a novel approach to generate valid inequalities for 0/1-IPs which is based on Binary Decision Diagrams (BDDs). BDDs are a data structure that represents 0/1-vectors as paths of a certain acyclic graph. They have been successfully applied in computational logic, hardware verification, and synthesis. We implemented our BDD cutting plane generator in a branch-and-cut framework and tested it on several instances of the MAX-ONES problem and randomly generated 0/1-IPs. Our computational results show that we have developed competitive code for these problems, on which state-of-the-art MIP solvers fall short. Ilia Polian, Hideo Fujiwara: Functional Constraints vs. Test Compression in Scan-Based Delay Testing 2005 IEEE Int'l GHz/Gbps Test Workshop , pages : 91 - 100 Erika Ábrahám, Andreas Grüner, Martin Steffen: Heap-Abstraction for an Object-Oriented Calculus with Thread Classes , issue : TR-0505, 2005 Tobias Schubert, Bernd Becker: Knowledge Sharing in a Microcontroller based Parallel SAT Solver 2005 Int'l Conf.
on Parallel and Distributed Processing Techniques and Applications Tobias Schubert, Bernd Becker: Lemma Exchange in a Microcontroller based Parallel SAT Solver 2005 IEEE Int'l Symp. on VLSI
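Several entries above (the resistive bridging fault simulator, ATPG, and the feedback bridging fault study) model a bridge defect's detectability as intervals of the unknown bridge resistance, with coverage measured against a resistance distribution. A toy sketch of this interval-based coverage view, with entirely hypothetical resistance values and a uniform distribution assumed for illustration:

```python
# A test vector detects a resistive bridge only if the (unknown) bridge
# resistance lies inside certain intervals; coverage is the probability
# mass of the union of all detectable intervals.

def merge(intervals):
    """Union of resistance intervals [lo, hi), given as (lo, hi) pairs."""
    merged = []
    for lo, hi in sorted(intervals):
        if merged and lo <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
        else:
            merged.append((lo, hi))
    return merged

def coverage(intervals, mass):
    """Fraction of defects detected, where mass(lo, hi) is the
    probability that the bridge resistance falls into [lo, hi)."""
    return sum(mass(lo, hi) for lo, hi in merge(intervals))

# Assumed uniform resistance distribution on [0, 1000) ohms:
uniform = lambda lo, hi: (min(hi, 1000) - min(lo, 1000)) / 1000
# Vector A detects the bridge for R in [0, 300), vector B for [250, 500):
print(coverage([(0, 300), (250, 500)], uniform))  # 0.5
```

This also shows why coverage definitions matter, as the abstracts note: the same interval union yields different coverage numbers under different assumed resistance distributions.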
Ilia Polian: Nichtstandardfehlermodelle für digitale Logikschaltkreise: Simulation, prüfgerechter Entwurf, industrielle Anwendungen 2005 it - Information Technology , volume : 47, issue : 3, pages : 172 - 174 Abstract: a leading design automation software vendor is currently being carried out. The second part of the dissertation deals with design methods that increase a circuit's testability with respect to dynamic defects. An approach for establishing multiple scan paths and a built-in self-test architecture are presented. Two appendices describe the relationship between the non-standard fault models and the conventional stuck-at model, as well as their use in formal verification. Sandip Kundu, Piet Engelke, Ilia Polian, Bernd Becker: On Detection of Resistive Bridging Defects by Low-Temperature and Low-Voltage Testing 2005 IEEE Asian Test Symp. , pages : 266 - 269 Abstract: Resistive defects are gaining importance in very-deep-submicron technologies, but their detection conditions are not trivial. Test application can be performed at reduced temperature and/or voltage in order to improve detection of these defects. This is the first analytical study of resistive bridge defect coverage of CMOS ICs under low-temperature and mixed low-temperature, low-voltage conditions. We extend a resistive bridging fault model in order to account for temperature-induced changes in detection conditions. We account for changes in both the parameters of the transistors involved in the bridge and the resistance of the short defect itself. Using a resistive bridging fault simulator, we determine fault coverage for low-temperature testing and compare it to the numbers obtained at nominal conditions. We also quantify the coverage of flaws, i.e., defects that are redundant at nominal conditions but could deteriorate and become early-life failures.
Finally, we compare our results to the case of low-voltage testing and comment on combination of these two techniques. Erika Ábrahám, Bernd Becker, Felix Klaedtke, Martin SteffenOptimizing bounded model checking for linear hybrid systems 2005 Int'l Conf. on Verification, Model Checking and Abstract Interpretation , Springer-Verlag, volume : 3385, pages : 396 - 412» show abstract « hide abstract Abstract Bounded model checking (BMC) is an automatic verification method that is based on finitely unfolding the system's transition relation. BMC has been successfully applied, in particular, for discovering bugs in digital system design. Its success is based on the effectiveness of satisfiability solvers that are used to check for a finite unfolding whether a violating state is reachable. In this paper we improve the BMC approach for linear hybrid systems. Our improvements are tailored to lazy satisfiability solving and follow two complementary directions. First, we optimize the formula representation of the finite unfoldings of the transition relations of linear hybrid systems, and second, we accelerate the satisfiability checks by accumulating and generalizing data that is generated during earlier satisfiability checks. Experimental results show that the presented techniques accelerate the satisfiability checks significantly. Thomas Eschbach, Wolfgang Günther, Bernd BeckerOrthogonal Circuit Visualization Improved by Merging the Placement and Routing Phases 2005 Int'l Conf. on VLSI Design , pages : 433 - 438 Tobias Schubert, Bernd Becker, Matthew LewisPaMira - A Parallel SAT Solver with Knowledge Sharing 2005 Int'l Workshop on Microprocessor Test and Verification , IEEE Computer Society, volume : 00, pages : 29 - 36» show abstract « hide abstract Abstract In this paper we describe PaMira, a powerful distributed SAT solver. 
PaMira is based on the highly optimized, sequential SAT engine Mira, incorporating all essential optimization techniques that modern algorithms utilize to maximize performance. For distributed execution, an efficient work-stealing method has been implemented. PaMira also employs the exchange of conflict clauses between the processes to guide the search more efficiently. We provide experimental results showing linear speedup in a multiprocessor environment with four AMD Opteron processors. Ilia Polian, Sandip Kundu, Jean-Marc Galliere, Piet Engelke, Michel Renovell, Bernd BeckerResistive Bridge Fault Model Evolution From Conventional to Ultra Deep Submicron Technologies 2005 VLSI Test Symp. , pages : 343 - 348» show abstract « hide abstract Abstract We present three resistive bridging fault models valid for different CMOS technologies. The models are partitioned into a general framework (which is shared by all three models) and a technology-specific part. The first model is based on Shockley equations and is valid for conventional but not deep submicron CMOS. The second model is obtained by fitting SPICE data. The third resistive bridging fault model uses the Berkeley Predictive Technology Model and BSIM4; it is valid for CMOS technologies with feature sizes of 90nm and below, accurately describing non-trivial electrical behavior in those technologies. Experimental results for ISCAS circuits show that the test patterns obtained for the Shockley model are still valid for the Fitted model, but lead to coverage loss under the Predictive model. Jochen Eisinger, Peter Winterer, Bernd BeckerSecuring Wireless Networks in a University Environment 2005 IEEE Int'l Conf.
on Pervasive Computing and Communications Workshops , IEEE Computer Society, pages : 312 - 316 Piet Engelke, Valentin Gherman, Ilia Polian, Yuyi Tang, Hans-Joachim Wunderlich, Bernd BeckerSequence Length, Area Cost and Non-Target Defect Coverage Tradeoffs in Deterministic Logic BIST 2005 IEEE Design and Diagnostics of Electronic Circuits and Systems , pages : 43 - 48 Piet Engelke, Valentin Gherman, Ilia Polian, Yuyi Tang, Hans-Joachim Wunderlich, Bernd BeckerSequence Length, Area Cost and Non-Target Defect Coverage Tradeoffs in Deterministic Logic BIST 2005 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 16 - 20 Piet Engelke, Valentin Gherman, Ilia Polian, Yuyi Tang, Hans-Joachim Wunderlich, Bernd BeckerSequence Length, Area Cost and Non-Target Defect Coverage Tradeoffs in Deterministic Logic BIST 2005 IEEE Int'l Workshop on Current and Defect-Based Testing , pages : 43 - 48 Ilia Polian, John P. Hayes, Sandip Kundu, Bernd BeckerTransient Fault Characterization in Dynamic Noisy Environments 2005 Int'l Test Conf. , pages : 10 pp. - 1048 back to the year overview Erika Ábrahám, Bernd Becker, Felix Klaedtke, Martin SteffenOptimizing Bounded Model Checking for Linear Hybrid Systems , issue : 214, 2004» show abstract « hide abstract Abstract Bounded model checking (BMC) is an automatic verification method that is based on finitely unfolding the system's transition relation. BMC has been successfully applied, in particular, for discovering bugs in digital system design. Its success is based on the effectiveness of satisfiability solvers that are used to check for a finite unfolding whether a violating state is reachable. In this report we improve the BMC approach for linear hybrid systems. Our improvements are tailored to lazy satisfiability solving and follow two complementary directions. 
First, we optimize the formula representation of the finite unfoldings of the transition relations of linear hybrid systems, and second, we accelerate the satisfiability checks by accumulating and generalizing data that is generated during earlier satisfiability checks. Experimental results show that the presented techniques accelerate the satisfiability checks significantly. Marc Herbstritt, Thomas Kmieciak, Bernd BeckerOn the Impact of Structural Circuit Partitioning on SAT-based Combinational Circuit Verification 2004 Int'l Workshop on Microprocessor Test and Verification , IEEE Computer Society, pages : 50 - 55» show abstract « hide abstract Abstract In this work we present an approach for SAT-based combinational circuit verification using partitionings of the set of primary outputs. We formally analyze the applied partitioning heuristics for the first time and present a closed verification framework incorporating traditional techniques. We report on experiments using our partitioning-based verification procedure that result in speedups of 276% on average compared to traditional techniques. Marc Herbstritt, Thomas Kmieciak, Bernd BeckerCircuit Partitioning for SAT-based Combinational Circuit Verification -- A Case Study , issue : 206, 2004» show abstract « hide abstract Abstract Hardware verification is nowadays one of the most time-consuming tasks during chip design. In the last few years, SAT-based methods have become a core technology in hardware design, especially for the verification of the combinational parts of circuits. Verifying the equivalence of a specification and a corresponding implementation is typically done by building a so-called miter. In practice, this miter is often built for each pair of primary outputs or for all primary outputs at once. In this work, we take a closer look at partitioning the primary outputs of the circuit and at how structural partitionings can speed up the verification process when SAT-based methods are used.
Ilia Polian, Bernd Becker, Alejandro CzutroCompression Methods for Path Delay Fault Test Pair Sets: A Comparative Study 2004 IEEE European Test Symp. , pages : 263 - 264 Matthew Lewis, Tobias Schubert, Bernd BeckerEarly Conflict Detection Based BCP for SAT Solving 2004 Int'l Conf. on Theory and Applications of Satisfiability Testing , pages : 29 - 36» show abstract « hide abstract Abstract This paper describes a new BCP algorithm that improves the performance of Chaff class solvers by reducing the total number of clauses the BCP procedure must evaluate. This is done by detecting conflicts earlier, evaluating clauses better, and increasing the controllability of the conflicts which the BCP procedure finds. Solvers like Limmat [10] include a simple Early Conflict Detection BCP (ECDB); however, we introduce a new aggressive ECDB procedure and the MIRA solver that efficiently incorporates it while easily facilitating comparisons between ECDB modes. With the full ECDB procedure enabled, MIRA was able to reduce the number of evaluated clauses by 59% on average compared to the disabled ECDB version. This new procedure and other speedup techniques discussed here allow MIRA to solve problems 3.7 times faster on average than zChaff. Lastly, this paper shows how significant speedup can be attained relatively easily by incorporating a certain level of ECDB into other solvers. Ilia Polian, Irith Pomeranz, Sudhakar M. Reddy, Bernd BeckerOn the use of maximally dominating faults in n-detection test generation 2004 IEE Proceedings Computers and Digital Techniques , volume : 151, issue : 3, pages : 235 - 244» show abstract « hide abstract Abstract The size of an n-detection test set increases approximately linearly with n. This increase in size may be too fast when an upper bound on test set size must be satisfied.
We propose a test generation method for obtaining a more gradual increase in the sizes of n-detection test sets, while still ensuring that every additional test would be useful in improving the test set quality. The method is based on the use of fault dominance relations to identify a small subset of faults (called maximally dominating faults) whose numbers of detections are likely to have a high impact on the defect coverage of the test set. We use structural analysis to obtain a superset of the maximally dominating fault set. We also propose a method for determining exact sets of maximally dominating faults. We define new types of n-detection test sets based on the approximate and exact sets of maximally dominating faults. The test sets are called (n,n2)-detection test sets and (n,n2,n3)-detection test sets. We present experimental results to demonstrate the usefulness of these test sets in producing high-quality n-detection test sets for the combinational logic of ISCAS-89 benchmark circuits. Ilia Polian, Bernd BeckerScalable Delay Fault BIST For Use With Low-Cost ATE 2004 Jour. Electronic Testing , volume : 20, issue : 2, pages : 181 - 197» show abstract « hide abstract Abstract We present a BIST architecture based on Multi-Input Signature Register (MISR) expanding single input vectors into sequences, which are used for testing of delay faults. Input vectors can be stored on-chip or in the ATE; in the latter case, a low speed tester can be employed though the sequences are applied at-speed to block-under-test. The number of input vectors (and thus the area demand on-chip or ATE memory requirements) can be traded for the test application time. The block-under-test can be switched off for some amount of time between application of consecutive input vectors. We provide arguments why this approach may be the only way to meet thermal and power constraints. Furthermore, we demonstrate how the BIST scheme can use these cool-down breaks for re-configuration. 
We propose two methods for generating input vectors, a straight-forward one and one using BDDs and symbolic state traversals. As both require only a two-pattern test as input, IP cores can be handled by these methods. Yuyi Tang, Hans-Joachim Wunderlich, Harald Vranken, Friedrich Hapke, Michael Wittke, Piet Engelke, Ilia Polian, Bernd BeckerX-masking during logic BIST and its impact on defect coverage 2004 IEEE Int'l Workshop on Test Resource Partitioning , pages : 442 - 451 Ilia PolianOn Non-standard Fault Models for Logic Digital Circuits: Simulation, Design for Testability, Industrial Applications VDI-Verlag, 2004 Tobias Schubert, Bernd BeckerA Distributed SAT Solver for Microchip Microcontroller 2004 Workshop on Parallel Systems and Algorithms Piet Engelke, Ilia Polian, Michel Renovell, Bernd BeckerAutomatic test pattern generation for resistive bridging faults 2004 IEEE European Test Symp. , pages : 160 - 165 Piet Engelke, Ilia Polian, Michel Renovell, Bernd BeckerAutomatic test pattern generation for resistive bridging faults 2004 IEEE Int'l Workshop on Current and Defect-Based Testing , pages : 89 - 94 Bernd Becker, Markus Behle, Friedrich Eisenbrand, Martin Fränzle, Marc Herbstritt, Christian Herde, Jörg Hoffmann, Daniel Kröning, Bernhard Nebel, Ilia Polian, Ralf WimmerBounded Model Checking and Inductive Verification of Hybrid Discrete-continuous Systems 2004 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” , Shaker Verlag, pages : 65 - 75» show abstract « hide abstract Abstract We present a concept to significantly advance the state of the art for bounded model checking (BMC) and inductive verification (IV) of hybrid discrete-continuous systems. Our approach combines the expertise of partners coming from different domains, like hybrid systems modeling and digital circuit verification, bounded planning and heuristic search, combinatorial optimization and integer programming. 
After sketching the overall verification flow, we present first results indicating that the combination and tight integration of different verification engines is a first step toward fully automated BMC and IV of medium- to large-scale networks of hybrid automata. Matthew Lewis, Tobias Schubert, Bernd BeckerEarly Conflict Detection Based SAT Solving 2004 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” Ilia PolianOn Non-standard Fault Models for Logic Digital Circuits: Simulation, Design for Testability, Industrial Applications. GI, volume : D-4, pages : 169 - 178, 2004» show abstract « hide abstract Abstract Conventional test methods for integrated circuits are increasingly unable to guarantee acceptable product quality. One possible way out is the use of improved models for manufacturing defects (non-standard fault models). The first part of the dissertation therefore deals with the modeling of so-called bridging faults, which model particle-induced shorts in particular and are thus closer to realistic defects than the commonly considered stuck-at faults. In particular, taking the influence of the short resistance into account reflects the realities of modern deep-submicron technologies. Although a continuum of defects is modeled, taking non-trivial electrical relationships into account, efficient discrete simulation algorithms are possible. The simplest of the presented models have been optimized for industrial use; the integration of the more complex resistive models into the tools of a leading design automation software vendor is currently underway. A further, increasingly important defect class are delay defects, which require so-called two-pattern testing.
The second part of the dissertation addresses design methods that increase the circuit's testability with respect to dynamic defects. An approach for establishing multiple scan paths and a built-in self-test architecture are presented. Two appendices describe the relationship between the non-standard fault models and the conventional stuck-at model, as well as their use in formal verification. Thomas Eschbach, Wolfgang Günther, Bernd BeckerOrthogonal Hypergraph Routing for Improved Visibility 2004 Great Lakes Symp. on VLSI , pages : 385 - 388 Tobias Schubert, Bernd BeckerPICHAFF2 - A Hierarchical Parallel SAT Solver 2004 Int'l Workshop on Microprocessor Test and Verification Tobias Schubert, Bernd BeckerParallel SAT Solving with Microcontrollers 2004 Asian Applied Computing Conf. Thomas Eschbach, Rolf Drechsler, Bernd BeckerPlacement and Routing Optimization for Circuits Derived from BDDs 2004 IEEE Int'l Symp. on Circuits and Systems , pages : V229 - V232 John P. Hayes, Ilia Polian, Bernd BeckerTesting for Missing-Gate Faults in Reversible Circuits 2004 IEEE Asian Test Symp. , pages : 100 - 105» show abstract « hide abstract Abstract Logical reversibility occurs in low-power applications and is an essential feature of quantum circuits. Of special interest are reversible circuits constructed from a class of reversible elements called k-CNOT (controlled-NOT) gates. We review the characteristics of k-CNOT circuits and observe that traditional fault models like the stuck-at model may not accurately represent their faulty behavior or test requirements. A new fault model, the missing gate fault (MGF) model, is proposed to better represent the physical failure modes of quantum technologies. It is shown that MGFs are highly testable, and that all MGFs in an N-gate k-CNOT circuit can be detected with between one and ⌈N/2⌉ test vectors. A design-for-test (DFT) method to make an arbitrary circuit fully testable for MGFs using a single test vector is described.
Finally, we present simulation results to determine (near) optimal test sets and DFT configurations for some benchmark circuits. Piet Engelke, Ilia Polian, Michel Renovell, Bharath Seshadri, Bernd BeckerThe Pros and Cons of Very-Low-Voltage Testing: An Analysis Based on Resistive Short Defects 2004 VLSI Test Symp. , pages : 171 - 178 Piet Engelke, Ilia Polian, Michel Renovell, Bharath Seshadri, Bernd BeckerThe Pros and Cons of Very-Low-Voltage Testing: An Analytical View 2004 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 149 - 153 Yuyi Tang, Hans-Joachim Wunderlich, Harald Vranken, Friedrich Hapke, Michael Wittke, Piet Engelke, Ilia Polian, Bernd BeckerX-masking during logic BIST and its impact on defect coverage 2004 Int'l Test Conf. , pages : 442 - 451» show abstract « hide abstract Abstract We present a technique for making a circuit ready for Logic BIST by masking unknown values at its outputs. In order to keep the silicon area cost low, some known bits in output responses are also allowed to be masked. These bits are selected based on a stuck-at n-detection based metric, such that the impact of masking on the defect coverage is minimal. An analysis based on a probabilistic model for resistive short defects indicates that the coverage loss for unmodeled defects is negligible for relatively low values of n. back to the year overview Tobias Schubert, Bernd BeckerPICHAFF: A Distributed SAT Solver for Microcontrollers 2003 Euromicro Conf. Thomas Eschbach, Wolfgang Günther, Bernd BeckerCrossing Reduction for Orthogonal Circuit Visualization 2003 Int'l Conf. on VLSI , CSREA Press, pages : 107 - 113 Matthew Lewis, S. SimmonsA VLSI Implementation of a Cryptographic Processor 2003 CCECE , volume : 2, pages : 821 - 826 Marc Herbstritt, Bernd BeckerConflict-based Selection of Branching Rules 2003 Int'l Conf. 
on Theory and Applications of Satisfiability Testing , Springer, volume : 2919, pages : 441 - 451 Marc Herbstritt, Bernd BeckerConflict-based Selection of Branching Rules in SAT-Algorithms 2003 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” , pages : 189 - 198 Tobias Schubert, Bernd BeckerA Hardware Lab Anywhere At Anytime 2003 Int'l Conf. on Education and Information Systems: Technologies and Applications , pages : 130 - 135 Andreas Hett, Tobias SchubertA Hardware Lab in a Pocket 2003 World Conf. on E-Learning in Corporate, Government, Healthcare, and Higher Education Tobias Schubert, Andreas HettA Hardware Lab in a Pocket 2003 Int'l Conf. on Multimedia and Information and Communication Technologies in Education Ilia Polian, Bernd BeckerConfiguring MISR-Based Two-Pattern BIST Using Boolean Satisfiability 2003 IEEE Design and Diagnostics of Electronic Circuits and Systems , pages : 73 - 80» show abstract « hide abstract Abstract Most known two-pattern BIST architectures require prohibitive test application times in order to obtain high delay fault coverage. In an earlier work, a BIST TPG block based on a MISR expanding input vectors into test sequences of length 2^n has been refined such that shorter sequences could be used. By doing so, spectacular reductions of test application time were achieved. However, the procedures for computing input vectors were based on BDD-backed state traversal or extensive simulation, and their run-time requirements were tremendous. In this work, we propose a SAT-based input vector generation method, which leads to test application times that are similar to the earlier method, but with significantly reduced computational effort. For the first time, we present experimental results for larger ISCAS benchmarks, which previous methods could not treat. 
Furthermore, we discuss the application of our method when the requirement to apply the complete test pair set is relaxed. Tobias Schubert, Bernd BeckerDas Mobile Hardware-Praktikum 2003 European Conf. on Media in Higher Education Ilia Polian, Bernd Becker, Sudhakar M. ReddyEvolutionary Optimization of Markov Sources for Pseudo Random Scan BIST 2003 Conf. on Design, Automation and Test in Europe , pages : 1184 - 1185 Frank Schmiedle, Rolf Drechsler, Bernd BeckerExact Routing with Search Space Reduction 2003 IEEE Trans. on Computers , volume : 52, issue : 6, pages : 815 - 825 Frank SchmiedleExakte Verdrahtung mit symbolischen Methoden Logos Verlag, Berlin, 2003» show abstract « hide abstract Abstract Routing is an important task in the physical synthesis stage of computer-aided circuit design. Although heuristic approaches often deliver usable results, problem instances keep arising in practice for which the solutions found by heuristic methods are inadequate. This book treats an exact routing method. It employs symbolic techniques, which here means that decision diagrams are used to represent routing regions. The first part of the book primarily covers the variants of decision diagrams relevant here; in this context, selected methods for their minimization are also discussed. The second part deals with local routing. In addition to a detailed description of the exact routing approach, the possibilities for practical application of the method are examined.
Ilia Polian, Piet Engelke, Michel Renovell, Bernd BeckerModelling Feedback Bridging Faults With Non-Zero Resistance 2003 European Test Workshop , pages : 91 - 96» show abstract « hide abstract Abstract We study the behavior of feedback bridging faults with non-zero bridge resistance in both combinational and sequential circuits. We demonstrate that a test vector may detect the fault, not detect the fault or lead to oscillation, depending on bridge resistance. Moreover, the resistance intervals in which a particular behavior is observed are not necessarily contiguous. Even loops going through a gate with controlling values on its side inputs (which we call disabled loops) expose non-trivial behavior. We outline the multiple strengths problem which arises due to the fact that a critical bridge resistance depends on the strengths of the signals driving the bridge, which in turn are functions of the number of the on-transistors, these again depending on the bridge resistance, making such a fault very hard to resolve. For sequential circuits, we describe additional difficulties caused by the need to account for implications on bridge behavior, which have originated in the previous time frames. We conclude that the complexity of resistive feedback bridging fault simulation accurate enough to resolve such situations will probably be prohibitively high and propose possible simplifying assumptions. We present simulation results for ISCAS benchmarks using these assumptions with and without taking oscillation into account. Ilia Polian, Bernd BeckerMultiple Scan Chain Design for Two-Pattern Testing 2003 Jour. Electronic Testing , volume : 19, issue : 1, pages : 37 - 48» show abstract « hide abstract Abstract Non-standard fault models often require the application of two-pattern testing. 
A fully-automated approach for generating a multiple scan chain-based architecture is presented so that two-pattern test sets generated for the combinational core can be applied to the sequential circuit. Test time and area overhead constraints are considered. Ilia Polian, Wolfgang Günther, Bernd BeckerPattern-Based Verification of Connections to Intellectual Property Cores 2003 INTEGRATION, the VLSI Jour. , volume : 35, issue : 1, pages : 25 - 44» show abstract « hide abstract Abstract Verification of designs containing pre-designed cores is a challenging topic in modern IC design, since traditional approaches generally do not use the information that parts of the design (like IP cores) are already verified. The Port Order Fault Model (POF) has recently been introduced for detecting design errors occurring during integration of a core into a System-on-Chip or during test logic insertion. In this work, we generate verification patterns with 100 percent coverage of a sub-class of POF, called 2-POF. We provide theoretical arguments and experimental results backing the efficiency of these patterns also for detecting higher-order POFs. Moreover, verification pattern sets generated by our approach are more compact compared to the results published before. Martin Keim, M. Martin, Bernd Becker, Rolf Drechsler, Paul MolitorPolynomial Formal Verification of Multipliers 2003 Formal Methods in System Design , volume : 22, issue : 1, pages : 39 - 58 Rolf Drechsler, Wolfgang Günther, Thomas Eschbach, L. Linhard, G. 
AngstRecursive Bi-Partitioning of Netlists for Large Number of Partitions 2003 Journal of Systems Architecture , volume : 49, issue : 12-15, pages : 521 - 528 Ilia Polian, Bernd BeckerReducing ATE Cost in System-on-Chip 2003 IFIP VLSI-SoC , pages : 337 - 342 Ilia Polian, Bernd BeckerReducing ATE Cost in System-on-Chip Test 2003 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 34 - 37 Ilia Polian, Bernd BeckerReducing ATE Cost in System-on-Chip Test 2003 IEEE Int'l Workshop on Test Resource Partitioning » show abstract « hide abstract Abstract Traditional SoC test scheduling approaches minimize test time under additional constraints. We argue that test costs are not determined by test time alone. Indeed, the speed of used ATE channels influences both cost and test time. We present a case for using a mixture of high-speed and low-cost ATE channels. Two heuristics and an exact algorithm are used. Experimental results show that such a mixture scenario can reduce the cost with no impact on test time. J. Bradford, H. Delong, Ilia Polian, Bernd BeckerSimulating Realistic Bridging and Crosstalk Faults in an Industrial Setting 2003 Jour. Electronic Testing , volume : 19, issue : 4, pages : 387 - 395» show abstract « hide abstract Abstract Three different techniques for simulating realistic faults generated from IC layout are discussed. Two of them deal with bridging faults, and the third one handles crosstalk faults. The simulation is performed on top of a commercial simulator and thus is very well applicable in an industrial environment. No change of the design database and only minimal changes of the test shell are required. Experimental results are reported for a library cell and a block from a full-custom design. 
Piet Engelke, Ilia Polian, Michel Renovell, Bernd BeckerSimulating Resistive Bridging Faults 2003 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” , pages : 92 - 97» show abstract « hide abstract Abstract We present a simulator for resistive bridging and stuck-at faults. In contrast to earlier work, it is based on electrical equations rather than table look-up, thus exposing more flexibility. For the first time, simulation of sequential circuits is dealt with; reciprocal action of fault effects in current time frame and earlier time frames is elaborated on for different bridge resistances. Experimental results are given for resistive bridging and stuck-at faults in combinational and sequential circuits. Different definitions of fault coverage are listed and quantitative results with respect to all these definitions are given for the first time. Piet Engelke, Ilia Polian, Michel Renovell, Bernd BeckerSimulating Resistive Bridging and Stuck-At Faults 2003 Int'l Test Conf. , pages : 1051 - 1059 Piet Engelke, Ilia Polian, Michel Renovell, Bernd BeckerSimulating Resistive Bridging and Stuck-at Faults 2003 IEEE Int'l Workshop on Current and Defect-Based Testing , pages : 49 - 56» show abstract « hide abstract Abstract We present a simulator for resistive bridging and stuck-at faults. In contrast to earlier work, it is based on electrical equations rather than table look-up, thus exposing more flexibility. For the first time, simulation of sequential circuits is dealt with; reciprocal action of fault effects in current time frame and earlier time frames is elaborated on for different bridge resistances. Experimental results are given for resistive bridging and stuck-at faults in combinational and sequential circuits. Different definitions of fault coverage are listed and quantitative results with respect to all these definitions are given for the first time. 
Ilia Polian, Wolfgang Günther, Bernd BeckerThe Case For 2-POF 2003 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” , pages : 164 - 173» show abstract « hide abstract Abstract The Port Order Fault Model (POF) has recently been introduced for detecting design errors occurring during integration of a core into a System-on-Chip or during test logic insertion. In an earlier work, we generated verification vectors with 100 percent coverage of a sub-class of POF, called 2-POF. Demonstrating empirically that these sets of verification vectors also cover another sub-class of POF, 3-POF, almost completely, we concluded that they are efficient for higher-order POFs as well. In this paper, we provide theoretical arguments and more empirical results supporting the claim that verification vectors with high coverage of lower-order POFs are very likely to also be efficient in detecting higher-order POFs. Ilia Polian, Wolfgang Günther, Bernd BeckerThe Case For 2-POF 2003 IEEE Design and Diagnostics of Electronic Circuits and Systems , pages : 291 - 292 back to the year overview Christoph Scholl, Bernd BeckerChecking Equivalence for Circuits Containing Incompletely Specified Boxes 2002 Int'l Conf. on Computer Design , page : 56» show abstract « hide abstract Abstract We consider the problem of checking whether an implementation which contains parts with incomplete information is equivalent to a given full specification. We study implementations which are not completely specified, but contain boxes which are associated with incompletely specified functions (called Incompletely Specified Boxes or IS-Boxes). After motivating the use of implementations with Incompletely Specified Boxes, we define our notion of equivalence for this kind of implementation and present a method to solve the problem. A series of experimental results demonstrates the effectiveness and feasibility of the methods presented.
Thomas Eschbach, Wolfgang Günther, Rolf Drechsler, Bernd BeckerCrossing Reduction by Windows Optimization 2002 Int'l Symp. on Graph Drawing , volume : 2528, pages : 285 - 294 Ilia Polian, Piet Engelke, Bernd BeckerEfficient Bridging Fault Simulation of Sequential Circuits Based on Multi-Valued Logics 2002 Int'l Symp. on Multi-Valued Logic , pages : 216 - 222» show abstract « hide abstract Abstract We present the concept of a multi-valued logic simulator for bridging faults in sequential circuits. Different models for the handling of intermediate values in flip-flops at the digital design level can be integrated and result in an expected realistic behavior for bridging faults. Several experimental results are given to underline the properties and advantages of the simulation technique. Christoph Scholl, Bernd BeckerEquivalence Checking in the Presence of Incompletely Specified Boxes 2002 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen” » show abstract « hide abstract Abstract We consider the problem of checking whether an implementation which contains parts with incomplete information is equivalent to a given full specification. We study implementations which are not completely specified, but contain boxes which are associated with incompletely specified functions (called Incompletely Specified Boxes or IS-Boxes). After motivating the use of implementations with Incompletely Specified Boxes, we define our notion of equivalence for this kind of implementation and present a method to solve the problem. A series of experimental results demonstrates the effectiveness and feasibility of the methods presented. Ilia Polian, Irith Pomeranz, Bernd BeckerExact Computation of Maximally Dominating Faults and Its Application to n-Detection Tests 2002 IEEE Asian Test Symp.
, pages 9-14.
Abstract: n-detection test sets for stuck-at faults have been shown to be useful in detecting unmodeled defects. It was also shown that a set of faults, called maximally dominating faults, can play an important role in controlling the increase in the size of an n-detection test set as n is increased. In an earlier work, a superset of the maximally dominating fault set was used. In this work, we propose a method to determine exact sets of maximally dominating faults. We also define a new type of n-detection test sets based on the exact set of maximally dominating faults. We present experimental results to demonstrate the usefulness of this exact set in producing high-quality n-detection test sets.

Ilia Polian, Irith Pomeranz, Bernd Becker: Exact Computation of Maximally Dominating Faults and Its Application to n-Detection Tests. 2002 European Test Workshop.

Ilia Polian, Bernd Becker: Go BIST. 2002 IEEE Int'l Online Testing Workshop, pages 147-151.
Abstract: A BIST method is proposed that enables at-speed two-pattern testing without violating thermal constraints by introducing cool-down periods. The application of the method is demonstrated based on a scalable BIST architecture. The method is applicable to IP cores, since only a two-pattern test set is required as input.

Ilia Polian, Bernd Becker: Optimal Bandwidth Allocation in Concurrent SoC Test Under Pin Number Constraints. 2002 Workshop on RTL and High Level Testing, pages 12-17.
Abstract: A method for assigning pins to cores in concurrent System-on-Chip testing is proposed. For each core, the number of assigned pins is traded off against test application time using multiple scan chains. Given a pin number constraint, the overall test time is minimized. The algorithm used is formally proven to be optimal, although it has low polynomial computational complexity.
The approach is applicable to any kind of testing which requires transport of test data from the test equipment to the cores, including scan testing and scan-based BIST.

J. Bradford, H. Delong, Ilia Polian, Bernd Becker: Realistic Fault Simulation in an Industrial Setting. 2002 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Rolf Drechsler, Wolfgang Günther, Thomas Eschbach, L. Linhard, G. Angst: Recursive Bi-Partitioning of Netlists for Large Number of Partitions. 2002 Euromicro Conf., pages 38-44.

Ilia Polian, Martin Keim, Nicolai Mallig, Bernd Becker: Sequential n-Detection Criteria: Keep It Simple! 2002 IEEE Int'l Online Testing Workshop, pages 189-190.
Abstract: The idea of n-detection is to expose a target fault in n different ways. To denote several detections as 'sufficiently different', formal criteria seem to be necessary. In this article, we study the properties of test sequences with respect to four new criteria, which we call difference types. We try to guide the n-detection test sequence generation process for stuck-at faults by difference types and evaluate the resulting coverage. Surprisingly, we have found that the effectiveness in detecting non-target faults is not affected beyond statistical noise. In our opinion this result underlines the effectiveness of n-detection without sophisticated differentiation criteria.

J. Bradford, H. Delong, Ilia Polian, Bernd Becker: Simulating Realistic Bridging and Crosstalk Faults in an Industrial Setting. 2002 European Test Workshop, pages 75-80.

Klaus-Jürgen Englert, Rolf Drechsler, Bernd Becker: Verification of HDLs using Symbolic Set Representation. 2002 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”.

Rolf Drechsler, J.
Römmler: Implementation and Visualization of a BDD Package in JAVA, 2001.

Marc Herbstritt: zChaff: Modifications and Extensions, issue 188, 2001.

Wolfgang Günther, Andreas Hett, Bernd Becker: Application of Linearly Transformed BDDs in Sequential Verification. 2001 ASP Design Automation Conf., pages 91-96.
Abstract: The computation of the set of reachable states is the key problem of many applications in sequential verification. Binary Decision Diagrams (BDDs) are extensively used in this domain, but tend to blow up for larger instances. To increase the computational power of BDDs, linearly transformed BDDs (LTBDDs) have been proposed. In this paper we show how this concept can be incorporated into the sequential verification domain by restricting dynamic reordering in a way that the relational product can still be carried out efficiently. Experimental results are given to show the efficiency of our approach.

Christoph Scholl, Bernd Becker: Checking Equivalence for Partial Implementations. 2001 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages 31-43.

Christoph Scholl, Bernd Becker: Checking Equivalence for Partial Implementations. 2001 IEEE Design Automation Conference, IEEE Computer Society, pages 238-243.

Bernd Becker, Christoph Meinel, Masahiro Fujita, Fabio Somenzi: Computer Aided Design and Test, BDDs versus SAT, Dagstuhl-Seminar-Report 297, 28.01.-02.02.01. 2001 Saarbrücken: Geschäftsstelle Schloss Dagstuhl.

Christoph Scholl, Marc Herbstritt, Bernd Becker: Don't Care Minimization of *BMDs: Complexity and Algorithms. 2001 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages 45-57.

Rolf Drechsler: Educational Software for Computer Engineering: A Case Study of an Interactive BDD Tool. 2001 IEEE Computer Society Learning Technology Task Force, volume 3, issue 3.

Ilia Polian, Wolfgang
Günther, Bernd Becker: Efficient Pattern-Based Verification of Connections to Intellectual Property Cores. 2001 IEEE Asian Test Symp., pages 443-448.
Abstract: Verification of designs containing pre-designed cores is a challenging topic in modern IC design, since traditional approaches generally do not use the information that parts of the design (like IP cores) are already verified. To verify the connectivity between the surrounding design and IP cores, we propose a method that is based on test patterns. Using only those patterns for simulation, in almost all cases 100 percent of the errors can be detected. Existing test access logic is employed for the application of the patterns. A large set of experimental results is given to demonstrate the efficiency of the approach.

Ilia Polian, Wolfgang Günther, Bernd Becker: Efficient Pattern-Based Verification of Connections to Intellectual Property Cores. 2001 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages I:111-120.

Christoph Scholl, Marc Herbstritt, Bernd Becker: Exploiting Don't Cares to Minimize *BMDs. 2001 IEEE Int'l Symp. on Circuits and Systems, volume 5, pages 191-194.

Christoph Scholl: Functional Decomposition with Application to FPGA Synthesis. Kluwer Academic Publishers, 2001.

Bernd Becker, Rolf Drechsler, Thomas Eschbach, Wolfgang Günther: GREEDY IIP: Partitioning Large Graphs by Greedy Iterative Improvement. 2001 Euromicro Conf., pages 54-60.

Nicole Drechsler, Frank Schmiedle, Daniel Große, Rolf Drechsler: Heuristic Learning based on Genetic Programming. 2001 EuroGP, Springer Verlag, volume 2038, pages 1-10.

Rolf Drechsler, Wolfgang Günther, L. Linhard, G. Angst: Level Assignment for Displaying Combinational Logic. 2001 Euromicro Conf., page 148.

Ilia Polian, Bernd Becker: Multiple Scan Chain Design for Two-Pattern Testing. 2001 VLSI Test Symp.
, pages 88-93.

Ilia Polian, Bernd Becker: Multiple Scan Chain Design for Two-Pattern Testing. 2001 Latin-American Test Workshop, pages 156-161.

Ilia Polian, Bernd Becker: Multiple Scan Chain Design for Two-Pattern Testing. 2001 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Wolfgang Günther, Rolf Drechsler: Performance Driven Optimization for MUX based FPGAs. 2001 Int'l Conf. on VLSI Design, pages 311-316.
Abstract: Logic synthesis and technology mapping for Multiplexor (MUX) based Field Programmable Gate Arrays (FPGAs) have widely been considered. In this paper, an algorithm is proposed that applies techniques from logic synthesis during mapping to minimize the delay. By this, the target technology is considered during the minimization process. Binary Decision Diagrams (BDDs) are used as an underlying data structure for netlists, combining both structural and functional properties. To evaluate a netlist, a fast technology mapper is used. Since most of the changes to a netlist are local, re-mapping can also be done locally, allowing a fast but reliable evaluation after each modification. We give experimental results that show large improvements for most of the instances we considered.

Frank Schmiedle, Nicole Drechsler, Daniel Große, Rolf Drechsler: Priorities in Multi-Objective Optimization for Genetic Programming. 2001 Conf. on Genetic and Evolutionary Computation, pages 129-136.
Abstract: A new technique for multi-objective optimization is presented that allows priorities to be included. In contrast to previous techniques, they can be included very easily and do not require much user interaction. The new approach is studied from a theoretical and practical point of view. The main differences to existing methods, like the relations dominate and favor, are discussed.
An experimental study of applying priorities in heuristics learning based on Genetic Programming (GP) is described. The experiments confirm the advantages of the presented approach in comparison to several other techniques.

Frank Schmiedle, Wolfgang Günther, Rolf Drechsler: Selection of Efficient Re-Ordering Heuristics for MDD Construction. 2001 Int'l Symp. on Multi-Valued Logic, pages 299-304.
Abstract: Multi-valued decision diagrams (MDDs) are a generalization of binary decision diagrams (BDDs). They are suitable for several applications in synthesis and verification of integrated circuits since often, functions with multi-valued input variables can be represented efficiently by MDDs. Their sizes, counted in number of nodes, vary from linear to exponential depending on the variable ordering used. Therefore sifting, i.e. dynamic variable re-ordering, has to be applied frequently while an MDD is built in order to keep the number of nodes needed during the process small. Often most of the runtime for MDD construction is spent on sifting. We present a new method that speeds up MDD construction and also reduces memory consumption. It is based on the selection of re-ordering heuristics depending on the history of the construction process. The success of previous re-ordering steps as well as the frequency of sifting calls in the past are used to determine the variation of sifting that is applied next. Experimental results are given to demonstrate that runtimes and memory consumption can be reduced by 30 percent on average when the proposed selection methods are used during MDD construction.

M. Thornton, Rolf Drechsler, D. M. Miller: Spectral Techniques in VLSI CAD. Kluwer Academic Publisher, 2001.

Andreas Hett, Bernd Becker: Supervised Dynamic Reordering in Model Checking. 2001 GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, pages 21-30.

Christoph Scholl, Bernd Becker, A.
Brogle: The Multiple Variable Order Problem for Binary Decision Diagrams: Theory and Practical Application. 2001 ASP Design Automation Conf., pages 85-90.
Abstract: Reduced Ordered Binary Decision Diagrams (ROBDDs) gained widespread use in logic design verification, test generation, fault simulation, and logic synthesis [MWBS:88, Bry:92]. Since the size of an ROBDD heavily depends on the variable order used, there is a strong need to find variable orders that minimize the number of nodes in an ROBDD. In certain applications we have to cope with ROBDDs with different variable orders, whereas further manipulation of these ROBDDs requires common variable orders. In this paper we give a theoretical background for this "Multiple Variable Order problem". Moreover, we solve the problem of transforming ROBDDs with different variable orders into a good common variable order using dynamic variable ordering techniques.

Frank Schmiedle, Daniel Große, Rolf Drechsler, Bernd Becker: Too Much Knowledge Hurts: Acceleration of Genetic Programs for Learning Heuristics. 2001 Int'l Conf. on Computational Intelligence, volume 2206, pages 479-491.
Abstract: Among many other applications, evolutionary methods have been used in recent years to develop heuristics for several optimization problems in VLSI CAD. Although learning is performed according to a set of training benchmarks, it is most important to generate heuristics that have a good generalization behaviour and hence are well suited to be applied to unknown examples. Besides large runtimes for learning, the major drawback of these approaches is that they are very sensitive to a variety of parameters of the learning process. In this paper, we study the impact of different parameters, like e.g. stopping conditions, on the quality of the results for learning heuristics for BDD minimization.
If learning takes too long, the developed heuristics become too specific to the set of training examples, and in that case the results of applying them to unknown problem instances deteriorate. It will be demonstrated here that runtime can be saved while even improving the generalization behaviour of the heuristics.

Rolf Drechsler, Wolfgang Günther, Fabio Somenzi: Using Lower Bounds during Dynamic BDD Minimization. 2001 IEEE Trans. on CAD, volume 20, issue 1, pages 51-57.

P. Johannsen, Rolf Drechsler: Utilizing High-Level Information for Formal Hardware Verification (Invited Talk). 2001 Advanced Computer Systems, pages 419-431.

Frank Schmiedle, A. Markert, Bernd Becker: XMaDRE: A Routing Environment with Visualization based on Ray-Tracing. 2001 Conf. on Design, Automation and Test in Europe.

Wolfgang Günther, Rolf Drechsler: Implementation of Read-k-times BDDs on top of standard BDD packages. 2001 Int'l Conf. on VLSI Design, pages 173-178.
Abstract: Ordered Binary Decision Diagrams (OBDDs) are the state-of-the-art data structure in VLSI CAD for the representation and manipulation of Boolean functions. But due to the ordering restriction, many Boolean functions cannot be represented efficiently. As one alternative, Read-k-times BDDs have been proposed. They are a generalization of OBDDs in the sense that variables may occur up to k times on each path, while they may occur only once in OBDDs. More functions can be represented by Read-k-times BDDs in polynomial space than by OBDDs, while many operations, like synthesis and satisfiability, still have polynomial worst-case behavior. In this paper, we present a new technique for implementing Read-k-times BDD packages on top of standard OBDD implementations. Thus, highly optimized OBDD packages can be used and only few changes in the code are needed, while the new type of decision diagram allows much smaller representations.
Experimental results are given to demonstrate the efficiency of the approach.

Christoph Scholl, Bernd Becker: Checking Equivalence for Partial Implementations, issue 145, 2000.
Abstract: We consider the problem of checking whether a partial implementation can (still) be extended to a complete design which is equivalent to a given full specification. Several algorithms trading off accuracy and computational resources are presented: Starting with a simple 0,1,X-based simulation, which allows approximate solutions but is not able to find all errors in the partial implementation, we consider more and more exact methods, finally covering all errors detectable in the partial implementation. The exact algorithm reports no error if and only if the current partial implementation conforms to the specification, i.e. it can be extended to a full implementation which is equivalent to the specification. We give a series of experimental results demonstrating the effectiveness and feasibility of the methods presented.

Christoph Scholl, Marc Herbstritt, Bernd Becker: Exploiting Don't Cares to Minimize *BMDs, issue 141, 2000.
Abstract: We present for the first time methods to minimize *BMDs exploiting don't care conditions. These minimization methods can be used during the verification of circuits by *BMDs. By changing function values for input vectors which are in the don't care set, smaller *BMDs can be computed to keep peak memory consumption during *BMD construction as low as possible. We determine the complexity of the problem of don't care minimization for *BMDs and thus justify the use of heuristics to approximate the solution. Preliminary experimental results prove our heuristics to be very effective in minimizing *BMD sizes.

M. A. Thornton, Rolf Drechsler, Wolfgang Günther: A Method for Approximate Equivalence Checking. 2000 Int'l Symp.
on Multi-Valued Logic, pages 447-452.
Abstract: A probabilistic equivalence checking method is developed based on the use of partial Haar Spectral Diagrams (HSDs). Partial HSDs are defined and used to represent a subset of Haar spectral coefficients for two Boolean functions. The resulting coefficients are then used to compute and iteratively refine the probability that the two functions are equivalent. This problem has applications in both logic synthesis and verification. The method described here can be useful for the case where two candidate functions require extreme amounts of memory for a complete BDD representation. Experimental results are provided to validate the effectiveness of this approach.

Piet Engelke, Bernd Becker, Martin Keim: A Parameterizable Fault Simulator for Bridging Faults. 2000 European Test Workshop, pages 63-68.

Martin Keim, Piet Engelke, Bernd Becker: A Parameterizable Fault Simulator for Bridging Faults. 2000 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Wolfgang Günther, Rolf Drechsler: ACTion: Combining Logic Synthesis and Technology Mapping for MUX based FPGAs. 2000 Journal of Systems Architecture, volume 46, issue 14, pages 1321-1334.
Abstract: Technology mapping for Multiplexor (MUX) based Field Programmable Gate Arrays (FPGAs) has widely been considered. Here, a new algorithm is proposed that applies techniques from logic synthesis during technology mapping, i.e. the target technology is considered in the minimization process. Binary Decision Diagrams (BDDs) are used as an underlying data structure combining both structural and functional properties. The algorithm uses local don't cares obtained by a greedy algorithm. To evaluate a netlist, a fast technology mapper is used. Since most of the changes to a netlist are local, re-mapping can also be done locally, allowing a fast but reliable evaluation after each modification.
Both area and delay minimization are addressed in this paper. We compare the approach to several previously published algorithms. In most cases these results can be further improved. Compared to SIS, an improvement of 23 percent for area and 18 percent for delay can be observed on average.

Wolfgang Günther, Rolf Drechsler: ACTion: Combining Logic Synthesis and Technology Mapping for MUX based FPGAs. 2000 Int'l Workshop on Logic Synth.

Wolfgang Günther, Rolf Drechsler: ACTion: Combining Technology Mapping and Logic Synthesis for MUX based FPGAs. 2000 Euromicro Conf., pages 130-137.
Abstract: Technology mapping for Multiplexor (MUX) based Field Programmable Gate Arrays (FPGAs) has widely been considered. Here, a new algorithm is proposed that applies techniques from logic synthesis during mapping. By this, the target technology is considered in the minimization process. Binary Decision Diagrams (BDDs) are used as an underlying data structure due to the close relation between BDDs and MUX netlists. The algorithm uses local don't cares obtained by a greedy algorithm. The mapping is sped up by computing signatures. A trade-off of quality versus runtime can be specified by the user by setting different parameters. Experimental results comparing the approach to the best known results show improvements of more than 30 percent for area and 40 percent for delay for many instances.

Rolf Drechsler, Nicole Drechsler, Elke Mackensen, Tobias Schubert, Bernd Becker: Design Reuse by Modularity: A Scalable Dynamical (Re)Configurable Multiprocessor System. 2000 Euromicro Conf., pages 1:425-431.

Andreas Hett, Christoph Scholl, Bernd Becker: Distance Driven Finite State Machine Traversal. 2000 IEEE Design Automation Conference, pages 39-42.
Abstract: Symbolic techniques have revolutionized reachability analysis in the last years.
Extending their applicability to handle large, industrial designs is a key issue, involving the need to focus on memory consumption for BDD representation as well as time consumption to perform symbolic traversals of Finite State Machines (FSMs). We address the problem of reachability analysis for large FSMs, introducing a novel technique that performs reachability analysis using a sequence of "distance driven" partial traversals based on dynamically chosen prunings of the transition relation. Experiments are given to demonstrate the efficiency and robustness of our approach: we succeed in completing reachability problems with significantly smaller memory requirements and improved time performance.

Frank Schmiedle, Wolfgang Günther, Rolf Drechsler: Dynamic Re-Encoding During MDD Minimization. 2000 Int'l Symp. on Multi-Valued Logic, pages 239-244.
Abstract: Multi-valued decision diagrams (MDDs) are a generalization of binary decision diagrams (BDDs). They often allow efficient representation of functions with multi-valued input variables, similar to BDDs in the binary case. Therefore they are suitable for several applications in synthesis and verification of integrated circuits. MDD sizes, counted in number of nodes, vary from linear to exponential depending on the variable ordering used. In all these applications, minimization of MDDs is crucial. In many cases, multi-valued variables are composed of a certain number of binary variables, so the multi-valued inputs arise by grouping binary variables. The selection of these groups, that is, the decision which variables to merge, has an enormous impact on MDD sizes. Techniques for finding variable groupings before starting MDD minimization have been proposed recently. In this paper, we present a new method that uses re-encoding, i.e. dynamic variable grouping.
We do not choose one fixed variable grouping before minimizing MDDs, but allow the binary variables that are considered together to change during the minimization process. This is possible since MDDs are simulated on top of BDDs. By this, the underlying binary variables remain accessible throughout the minimization process. This technique is described in detail, and we also show experimental results that demonstrate the efficiency of our approach.

Wolfgang Günther, Rolf Drechsler, S. Höreth: Efficient Dynamic Minimization of Word-Level DDs based on Lower Bound Computation. 2000 Int'l Workshop on Logic Synth.

Wolfgang Günther, Rolf Drechsler, S. Höreth: Efficient Dynamic Minimization of Word-Level DDs based on Lower Bound Computation. 2000 Int'l Conf. on Computer Design, pages 383-388.
Abstract: Word-Level Decision Diagrams (WLDDs), like *BMDs or K*BMDs, have been introduced to overcome the limitations of Binary Decision Diagrams (BDDs), which are the state-of-the-art data structure for representing and manipulating Boolean functions. However, the size of these graph types largely depends on the variable ordering, i.e. it may vary from linear to exponential. In the meantime, dynamic approaches to find a good variable ordering are also known for WLDDs. In this paper we show how these approaches can be accelerated significantly using a combination of lower bound computation and synthesis operations. In the experiments it turned out that with this technique, the runtime for dynamic minimization can be reduced by more than 40 percent on average without loss of quality.

Rolf Drechsler, Wolfgang Günther: Evolutionary Synthesis of Multiplexor Circuits under Hardware Constraints. 2000 Conf. on Genetic and Evolutionary Computation, pages 513-518.
Abstract: Multiplexor based Field Programmable Gate Arrays (FPGAs) are a very popular design style. However, during circuit design only limited hardware resources are available.
In this paper we present an approach to multiplexor (MUX) based circuit evolution under hardware constraints. Binary Decision Diagrams (BDDs) are used as the underlying data structure. An evolutionary technique is proposed that allows staying within given hardware resources while minimizing the error in parallel. Starting from a correct circuit, i.e. a BDD representing the complete function, some parts are removed, based on evolutionary principles, until all constraints are met. The experimental results presented demonstrate the possibility to handle, for the first time, instances composed of several thousand gates.

Frank Schmiedle, D. Unruh, Bernd Becker: Exact Switchbox Routing with Search Space Reduction. 2000 Int'l Symp. on Physical Design, pages 26-32.
Abstract: We present an approach for exact switchbox routing that complements traditional routing techniques. It is particularly well suited for application to dense problem instances and for the completion of routing in subregions which turn out to be difficult for routing tools based on heuristic methods. The exact router proposed uses symbolic methods, i.e. MDDs (Multi-valued Decision Diagrams), for the representation of the routing space. All possible solutions to the routing problem are represented by one single MDD, and once this MDD is given, routability can be decided within constant time. To reduce the search space of possible routing solutions, so-called forced cells are computed. Finally, experimental results are given. They show the feasibility and practicability of the approach.

Rolf Drechsler, Nicole Drechsler, Wolfgang Günther: Fast Exact Minimization of BDDs. 2000 IEEE Trans. on CAD, volume 19, issue 3, pages 384-389.
Abstract: We present a new exact algorithm for finding the optimal variable ordering for reduced ordered Binary Decision Diagrams (BDDs). The algorithm makes use of a lower bound technique known from VLSI design.
Up to now this technique has been used only for theoretical considerations, and it is adapted here for our purpose. Furthermore, the algorithm supports symmetry aspects and uses a hashing-based data structure. Experimental results are given to demonstrate the efficiency of our approach. We succeeded in minimizing several functions, including adders with up to 64 variables, for which all other previously presented approaches fail.

Rolf Drechsler: Formal Verification of Circuits. Kluwer Academic Publishers, 2000.

Wolfgang Günther, Rolf Drechsler: Improving EAs for Sequencing Problems. 2000 Conf. on Genetic and Evolutionary Computation, pages 175-180.
Abstract: Sequencing problems have to be solved very often in VLSI CAD. To obtain results of high quality, Evolutionary Algorithms (EAs) have been successfully applied in many cases. However, they often suffer from the high CPU time which is necessary for the computation. In this paper we propose three techniques to speed up EAs without loss of quality. We give a case study for the problem of optimizing the variable ordering of Binary Decision Diagrams (BDDs). Experimental results are given to demonstrate the efficiency of the approach.

Dragan Janković, Wolfgang Günther, Rolf Drechsler: Lower Bound Sifting for MDDs. 2000 Int'l Symp. on Multi-Valued Logic, pages 193-198.
Abstract: Decision Diagrams (DDs) are a data structure for the representation and manipulation of discrete logic functions often applied in VLSI CAD. Common DDs to represent Boolean functions are Binary Decision Diagrams (BDDs). Multiple-valued logic functions can be represented by Multiple-valued Decision Diagrams (MDDs). The efficiency of a DD representation strongly depends on the variable ordering; the size may vary from linear to exponential. Finding a good ordering is an NP-hard problem that has received considerable attention, resulting in many minimization methods.
Sifting is the most popular heuristic for dynamic DD minimization. In this paper we give lower bounds for sifting of MDDs. Based on them, both lower bound sifting for MDD minimization and lower bound group sifting for BDD minimization are proposed. By the computation of good lower bounds, large parts of the search space can be pruned, resulting in very fast computations. This is demonstrated by experimental results.

Wolfgang Günther: Minimization of Free BDDs using Evolutionary Techniques. 2000 Int'l Workshop on Logic Synth., pages 167-172.

Wolfgang Günther, Rolf Drechsler: On the Computational Power of Linearly Transformed BDDs. 2000 Information Processing Letters, volume 75, issue 3, pages 119-125.
Abstract: Binary Decision Diagrams (BDDs) are a powerful tool and are frequently used in many applications in VLSI CAD, like synthesis and verification. However, they are very sensitive to the variable ordering and their size often becomes infeasible. Recently, a new approach for BDD minimization based on linear transformations, i.e. a special type of spectral techniques, has been proposed. In this work we study linear transformations from a theoretical point of view. We prove for a family of Boolean functions that the concept of linear transformations is really more powerful than "pure" BDDs, i.e. there is a class of Boolean functions for which an exponential difference in size exists. Therefore, linear transformations can prevent an exponential blow-up of the BDD.

Rolf Drechsler, Wolfgang Günther: Optimization of Sequential Verification by History-Based Dynamic Minimization of BDDs. 2000 IEEE Int'l Symp. on Circuits and Systems, pages IV:737-IV:740.
Abstract: Binary Decision Diagrams (BDDs) are the state-of-the-art data structure in VLSI CAD, especially in sequential verification tasks, like state-space exploration and image computation.
Since their size largely depends on the chosen variable ordering, dynamic variable reordering methods, like sifting, often have to be applied while the BDD is constructed. Usually sifting is called each time a given node limit is reached, and it is therefore called frequently during the construction of large BDDs. Often most of the runtime is spent on sifting while the BDD is built. Recently an approach to reduce runtime during BDD construction for combinational circuits by using history-based decision procedures has been proposed. In this paper we show that for sequential circuits, different criteria should be used to select the type of sifting, criteria that involve knowledge from the sequential domain. The algorithm has been included in VIS. Experimental results show that the overall runtime can be reduced significantly and is also clearly superior to the combinational approach.

Tobias Schubert, Elke Mackensen, Nicole Drechsler, Rolf Drechsler, Bernd Becker: Specialized Hardware for Implementation of Evolutionary Algorithms. 2000 Int'l Workshop on Boolean Problems, pages 175-182.

Tobias Schubert, Elke Mackensen, Nicole Drechsler, Rolf Drechsler, Bernd Becker: Specialized Hardware for Implementation of Evolutionary Algorithms. 2000 Conf. on Genetic and Evolutionary Computation, page 369.

Wolfgang Günther: Speeding up Dynamic Minimization of Linearly Transformed BDDs, issue 144, 2000.
Abstract: Binary Decision Diagrams (BDDs) are frequently used in many applications in VLSI CAD. However, they are very sensitive to the variable ordering and their size often becomes infeasible. To reduce BDD sizes, Linear Transformations (LTs), a special type of spectral techniques, have been successfully applied in many cases. The most commonly used heuristic to find a good LT, i.e. an LT which leads to a small BDD representation, is called linear sifting.
It can be seen as an extension of the widely used dynamic BDD minimization algorithm called sifting by a linear operator. In this paper, we prove lower bounds for Linearly Transformed BDDs (LTBDDs) under certain conditions and show that it is possible to speed up linear sifting significantly using these bounds. Experimental results are given to show the efficiency of our approach.

Andreas Hett, Christoph Scholl, Bernd Becker: State Traversal guided by Hamming Distance Profiles. 2000, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, VDE Verlag, pages 57-66.

Rolf Drechsler, Wolfgang Günther, Bernd Becker: Testability of Circuits Derived from Lattice Diagrams. 2000, Euromicro Conf., pages 188-192.
Abstract: In this paper the testability of circuits derived from BDDs representing totally symmetric functions is analyzed. A test pattern generation technique is presented that has runtime linear in the size of the BDD. The result is directly applicable to circuits derived from lattice diagrams, a new design style that combines the synthesis and the layout step. Experimental results show that complete test generation for functions with more than 500 variables can be done in less than 1 CPU second.

Rolf Drechsler, Wolfgang Günther, Bernd Becker: Testability of Circuits Derived from Lattice Diagrams. 2000, Latin-American Test Workshop.

Wolfgang Günther, Nicole Drechsler, Rolf Drechsler, Bernd Becker: Verification of Designs Containing Black Boxes. 2000, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”.
Abstract: Modern designs often contain regions where the implementation of certain components is not (fully) known. These regions are called black boxes in the following. They occur, e.g., if different designers work on a project in parallel or if IP cores are used.
In this paper an approach based on a symbolic representation of characteristic functions for verifying circuits with black boxes is presented. We show that this method detects more faults than pure binary simulation or symbolic simulation using BDDs alone. This results from the formulation of our algorithm, which allows implications across the black box. Experimental results are given to show which parts of a design can be proven correct if black boxes are assumed. Of course, the probability of detecting a fault in general depends on the size of the unknown regions. But fault injection experiments on benchmarks show that for many circuits up to 90 percent of the faults are detected, even though large parts of the design are unspecified.

Wolfgang Günther, Nicole Drechsler, Rolf Drechsler, Bernd Becker: Verification of Designs Containing Black Boxes. 2000, Euromicro Conf., pages 100-105.
Abstract: Modern designs often contain regions where the implementation of certain components is not (fully) known. These regions are called black boxes in the following. They occur, e.g., if different designers work on a project in parallel or if IP cores are used. In this paper an approach based on a symbolic representation of characteristic functions for verifying circuits with black boxes is presented. We show that this method detects more faults than pure binary simulation or symbolic simulation using BDDs alone. This results from the formulation of our algorithm, which allows implications across the black box. Experimental results are given to show which parts of a design can be proven correct if black boxes are assumed. Of course, the probability of detecting a fault in general depends on the size of the unknown regions.
But fault injection experiments on benchmarks show that for many circuits up to 90 percent of the faults are detected, even though large parts of the design are unspecified.

Wolfgang Günther, R. Schönfeld, Bernd Becker, Paul Molitor: k-Layer Straightline Crossing Minimization by Speeding up Sifting. 2000, Graph Drawing Conf., Springer Verlag, volume 1984, pages 253-258.
Abstract: In this paper, we present two methods to speed up sifting. First, it is shown how the crossing matrix can be computed and updated efficiently. Then, we study lower bounds which can be incorporated in the sifting algorithm, making it possible to prune large parts of the search space. Experimental results show that it is possible to speed up sifting by more than a factor of 20 using the new methods during the minimization process.

Christoph Scholl, Bernd Becker, A. Brogle: Solving the Multiple Variable Order Problem for Binary Decision Diagrams by Use of Dynamic Reordering Techniques. Issue 130, 1999.
Abstract: Reduced Ordered Binary Decision Diagrams (ROBDDs) gained widespread use in logic design verification, test generation, fault simulation, and logic synthesis [MWBS:88,Bry:92]. Since the size of an ROBDD heavily depends on the variable order used, there is a strong need to find variable orders that minimize the number of nodes in an ROBDD. In certain applications we have to cope with ROBDDs with different variable orders, whereas further manipulations of these ROBDDs require common variable orders. In this paper we solve the problem of transforming ROBDDs with different variable orders into a good common variable order. To do so, we make use of dynamic variable ordering techniques.

Rolf Drechsler, Marc Herbstritt, Bernd Becker: Grouping Heuristics for Word-level Decision Diagrams. 1999, IEEE Int'l Symp.
on Circuits and Systems, pages 411-415.

Rolf Drechsler, Bernd Becker: Probabilistic IP Verification. 1999, Int'l Workshop on Testing Embedded Core-based System-Chips.

Nicole Drechsler, Rolf Drechsler, Bernd Becker: A New Model for Multi-Objective Optimization in Evolutionary Algorithms. 1999, Int'l Conf. on Computational Intelligence, Springer Verlag, volume 1625, pages 108-117.
Abstract: Many optimization problems consist of several mutually dependent subproblems, where the resulting solutions must satisfy all requirements. We propose a new model for Multi-Objective Optimization (MOO) in Evolutionary Algorithms (EAs). The search space is partitioned into so-called Satisfiability Classes (SCs), where each region represents the quality of the optimization criteria. By applying the SCs to individuals in a population, a fitness can be assigned during the EA run. The model also allows the handling of infeasible regions and restrictions in the search space. Additionally, different priorities for optimization objectives can be modeled. Advantages of the model over previous approaches are discussed, and an application is given that shows the superiority of the method for modeling MOO problems.

Martin Keim, Ilia Polian, Harry Hengster, Bernd Becker: A Scalable BIST Architecture for Delay Faults. 1999, European Test Workshop, pages 98-103.
Abstract: In this paper we present a scalable BIST (Built-In Self Test) architecture that provides a tunable trade-off between on-chip area demand and test execution time for delay fault testing. Thus, the architecture can meet test execution time requirements, available area requirements, or any target in between. Experiments show the scalability of our approach, e.g. that considerably shorter test execution time can be achieved by storing only a few additional input vectors of the BIST architecture.
The gain in test execution time possible with the proposed method ranges from a factor of 2 up to a factor of more than 800000.

Andreas Hett, Christoph Scholl, Bernd Becker: A.MORE - A Multi-Operand BDD Package. University of Freiburg, 1999.
Abstract: Decision Diagrams (DDs) are often used in CAD systems for the efficient representation and manipulation of Boolean functions. The most popular data structure in this context are Ordered Binary Decision Diagrams (OBDDs), which are used in many applications. This package provides a novel approach for synthesizing BDDs using operator nodes instead of procedure calls. Thus, transformations on this data structure are possible, as well as the flexibility to adapt the synthesis process in order to execute the 'most promising paths' first, resulting in fewer superfluous calculations and lower memory usage due to the skipping of unneeded branches.

Christoph Scholl, D. Möller, Paul Molitor, Rolf Drechsler: BDD Minimization Using Symmetries. 1999, IEEE Trans. on CAD, volume 18, issue 2, pages 81-100.

Martin Keim, Nicole Drechsler, Bernd Becker: Combining GAs and Symbolic Methods for High Quality Tests of Sequential Circuits. 1999, ASP Design Automation Conf., pages 315-318.

Bernd Becker, Christoph Meinel, Shin-Ichi Minato, Fabio Somenzi: Computer Aided Design and Test, Decision Diagrams - Concepts and Applications (99041), Dagstuhl-Seminar-Report 229, 24.01.-29.01.1999. 1999, Saarbrücken: Geschäftsstelle Schloss Dagstuhl.

Wolfgang Günther, Rolf Drechsler: Creating Hard Problem Instances in Logic Synthesis using Exact Minimization. 1999, IEEE Int'l Symp. on Circuits and Systems, pages VI:436-VI:439.
Abstract: To evaluate synthesis algorithms, usually benchmark circuits are used. Since no exact synthesis results are known for these circuits, we propose the use of exact minimization to generate hard problem instances.
By this we evaluate a standard synthesis tool on different classes of circuits.

Paul Molitor, Christoph Scholl: Datenstrukturen und effiziente Algorithmen fuer die Logiksynthese kombinatorischer Schaltungen. B.G. Teubner, 1999.

Wolfgang Günther, Rolf Drechsler: Efficient Manipulation Algorithms for Linearly Transformed BDDs. 1999, Int'l Conf. on CAD, pages 50-53.
Abstract: Binary Decision Diagrams (BDDs) are the state-of-the-art data structure in VLSI CAD. However, due to their ordering restriction, only exponential-sized BDDs exist for many functions of practical relevance. Linear Transformations (LTs) have been proposed as a new concept to minimize the size of BDDs, and it is known that in some cases even an exponential reduction can be obtained. In addition to a small representation, the efficient manipulation of a data structure is also important. In this paper we present polynomial-time manipulation algorithms that can be used for Linearly Transformed BDDs (LT-BDDs) analogously to BDDs. For some operations, like synthesis algorithms based on ITE, it turns out that the techniques known from BDDs can be directly transferred, while for other operations, like quantification and cofactor computation, completely different algorithms have to be used. Experimental results are given to show the efficiency of the approach.

Nicole Drechsler, Rolf Drechsler: Exploiting Don't Cares During Data Sequencing using Genetic Algorithms. 1999, ASP Design Automation Conf., pages 303-306.

S. Höreth, Rolf Drechsler: Formal Verification of Word-Level Specifications. 1999, Conf.
on Design, Automation and Test in Europe, pages 52-58.

Rolf Drechsler, Marc Herbstritt, Bernd Becker: Grouping Heuristics for Word-level Decision Diagrams. 1999, GI/ITG/GMM Workshop “Methoden und Beschreibungssprachen zur Modellierung und Verifikation von Schaltungen und Systemen”, Shaker Verlag, pages 41-50.

Rolf Drechsler, Wolfgang Günther: History-based Dynamic Minimization during BDD Construction. 1999, IFIP Int'l Conf. on VLSI, pages 334-345.
Abstract: Binary Decision Diagrams (BDDs) are the state-of-the-art data structure in VLSI CAD. Since their size largely depends on the chosen variable ordering, dynamic variable reordering methods, like sifting, often have to be applied while the BDD for a given circuit is constructed. Usually sifting is called each time a given node limit is reached, and it is therefore called frequently during the construction of large BDDs. Often most of the runtime is spent on sifting while the BDD is built. In this paper we propose an approach to reduce runtime (and space requirements) during BDD construction by using history-based decision procedures. Depending on the history of the construction process, different types of sifting are called. We propose two methods that consider the quality of the hash table and the size reduction of previous sifting runs, respectively. Experimental results show that both approaches reduce the runtime significantly, i.e. by more than 40 percent on average.

Bernd Becker, Martin Keim, R. Krieger: Hybrid Fault Simulation for Synchronous Sequential Circuits. 1999, Jour. Electronic Testing, volume 3, pages 219-238.

Andreas Hett: MORE good BDD Ideas. 1999, Dagstuhl Seminar on Computer Aided Design and Test, Decision Diagrams: Concepts and Applications.

Wolfgang Günther, Rolf Drechsler: Minimization of BDDs using Linear Transformations based on Evolutionary Techniques. 1999, IEEE Int'l Symp.
on Circuits and Systems, pages I:387-I:390.
Abstract: Binary Decision Diagrams (BDDs) are frequently used in many applications in VLSI CAD. However, they are very sensitive to the variable ordering, and their size often becomes infeasible. Recently, a new approach to BDD minimization based on Linear Transformations (LTs), i.e. a special type of spectral technique, has been proposed. We present an Evolutionary Algorithm (EA) to find an LT for which the BDD becomes small. The genetic operators make use of problem-specific knowledge. Experimental results are given to show the efficiency of our approach.

Wolfgang Günther, Rolf Drechsler: Minimization of Free BDDs. 1999, ASP Design Automation Conf., pages 323-326.
Abstract: Free BDDs (FBDDs) are an extension of ordered BDDs (OBDDs) that allow different orderings along each path. FBDDs allow a more efficient representation, while keeping (nearly) all of the properties of OBDDs. In some cases even an exponential reduction can be observed. In this paper we present for the first time an exact algorithm for finding a minimal FBDD representation for a given Boolean function. To reduce the huge search space, it makes use of a pruning technique. The algorithm also considers symmetries of the function. Since the algorithm is only applicable to small functions, we also present a heuristic for FBDD minimization starting from an OBDD. Our experiments show that in many cases significant improvements can be obtained.

Christoph Scholl, Bernd Becker: On the Generation of Multiplexer Circuits for Pass Transistor Logic. 1999, Int'l Workshop on Logic Synth.
Abstract: … used instead and should provide a much higher potential for optimization compared to a pure BDD approach.
Nevertheless, to the best of our knowledge, an optimization of general Multiplexer Circuits (MCs) for PTL synthesis has not been tried so far due to a lack of suitable optimization approaches. In this paper we present such an algorithm, which is based on efficient BDD optimization techniques. Our experiments prove that there is indeed a high optimization potential in the use of general MCs, both concerning area and depth of the resulting PTL networks.

Y. Ye, K. Roy, Rolf Drechsler: Power Consumption in XOR-Based Circuits. 1999, ASP Design Automation Conf., pages 299-302.

Christoph Scholl, Bernd Becker, A. Brogle: Solving the Multiple Variable Order Problem for Binary Decision Diagrams by Use of Dynamic Reordering Techniques. 1999, Int'l Workshop on Logic Synth.
Abstract: Reduced Ordered Binary Decision Diagrams (ROBDDs) gained widespread use in logic design verification, test generation, fault simulation, and logic synthesis [MWBS:88,Bry:92]. Since the size of an ROBDD heavily depends on the variable order used, there is a strong need to find variable orders that minimize the number of nodes in an ROBDD. In certain applications we have to cope with ROBDDs with different variable orders, whereas further manipulations of these ROBDDs require common variable orders. In this paper we solve the problem of transforming ROBDDs with different variable orders into a good common variable order. To do so, we make use of dynamic variable ordering techniques.

Harry Hengster, Bernd Becker: Synthesis of Circuits Derived from Decision Diagrams - Combining Small Delay and Testability. 1999, Int'l Symp. on Defect and Fault Tolerance, pages 268-275.

Per Lindgren, Rolf Drechsler, Bernd Becker: Synthesis of Pseudo Kronecker Lattice Diagrams. 1999, Int'l Conf. on Computer Design, pages 307-310.

Rolf Drechsler, Wolfgang Günther: Using Lower Bounds during Dynamic BDD Minimization. 1999, IEEE Design Automation Conference, pages 29-32.
Abstract: Ordered Binary Decision Diagrams (BDDs) are a data structure for representation and manipulation of Boolean functions often applied in VLSI CAD. The choice of the variable ordering largely influences the size of the BDD; its size may vary from linear to exponential. The most successful methods for finding good orderings are based on dynamic variable reordering, i.e. exchanging neighboring variables. This basic operation has been used in various variants, like sifting and window permutation. In this paper we show that lower bounds computed during the minimization process can speed up the computation significantly. First, lower bounds are studied from a theoretical point of view. Then these techniques are incorporated into dynamic minimization algorithms. By computing good lower bounds, large parts of the search space can be pruned, resulting in very fast computations. Experimental results are given to demonstrate the efficiency of our approach.
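The lower-bound pruning idea recurring in the sifting papers above can be illustrated with a minimal sketch. This is not the authors' implementation: `size` stands in for the actual BDD node count (here a toy cost against a hypothetical target order), and `lower_bound` for any cheap, admissible bound; a position is skipped whenever the bound shows it cannot beat the best size found so far. Real sifting moves a variable by adjacent swaps in the shared diagram and derives its bounds from the node counts of the untouched levels.

```python
def sift(order, size, lower_bound=None):
    """Generic sifting: move each variable through every position and
    keep the position that minimizes size(). If lower_bound(trial) is
    already >= the best size seen, the (expensive) size() call for that
    position is pruned, since size(trial) >= lower_bound(trial)."""
    order = list(order)
    best = size(order)
    for var in list(order):
        best_order = order
        for pos in range(len(order)):
            trial = [v for v in order if v != var]
            trial.insert(pos, var)
            if lower_bound is not None and lower_bound(trial) >= best:
                continue  # prune: this position cannot improve on best
            s = size(trial)
            if s < best:
                best, best_order = s, trial
        order = best_order  # fix var at its best position, continue
    return order, best

# Toy cost standing in for BDD size (assumption for the demo): distance
# of each variable from its position in a hypothetical optimal order.
TARGET = {'a': 0, 'b': 1, 'c': 2, 'd': 3}
cost = lambda o: sum(abs(i - TARGET[v]) for i, v in enumerate(o))
# Any single term of the nonnegative sum is a valid lower bound.
lb = lambda o: abs(o.index('a') - TARGET['a'])

best_order, best_cost = sift(['d', 'c', 'b', 'a'], cost, lb)
```

With the admissible bound, the pruned search reaches the same result as exhaustive sifting; the papers' point is that good bounds make such skips frequent enough to cut runtime substantially.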
Rolf Drechsler, Nicole Drechsler: VLSI CAD and the Integration of Evolutionary Techniques. CRC Press International Series on Computational Intelligence, 1999.

Martin Keim, Nicole Göckel, Rolf Drechsler, Bernd Becker: Combining GAs and Symbolic Methods for High Quality Tests of Sequential Circuits. Issue 105/98, 1998.
Abstract: A symbolic fault simulator is integrated into a Genetic Algorithm (GA) environment to perform Automatic Test Pattern Generation (ATPG) for synchronous sequential circuits. In a two-phase algorithm, both test length and fault coverage are optimized. However, there are circuits that are hard to test using random pattern sequences, even if these sequences are genetically optimized. Thus, deterministic aspects are included in the GA environment to improve fault coverage. Experiments demonstrate that both a priori time-consuming strategies can be combined at reasonable cost: tests with higher fault coverage and considerably shorter test sequences than previously presented approaches are obtained.

Wolfgang Günther, Rolf Drechsler: BDD Minimization by Linear Transformations. 1998, Advanced Computer Systems, pages 525-532.
Abstract: Binary Decision Diagrams (BDDs) are a powerful tool and are frequently used in many applications in VLSI CAD, like synthesis and verification. Unfortunately, BDDs are very sensitive to the variable ordering, and their size often becomes infeasible. Recently, a new approach to BDD minimization based on linear transformations, i.e. a special type of spectral technique, has been proposed. In this paper we study this minimization method in more detail. While so far only experimental results are known, we prove for a family of Boolean functions that linear transformations can prevent an exponential blow-up of the BDD size. Furthermore, we present a heuristic method that allows the BDD sizes to be reduced significantly.
By experiments we give a comparison to the state-of-the-art method to demonstrate the advantages of our approach.

Rolf Drechsler, Bernd Becker: Binary Decision Diagrams - Theory and Implementation. Kluwer Academic Publishers, 1998.

Martin Keim, Nicole Drechsler, Rolf Drechsler, Bernd Becker: Combining GAs and Symbolic Methods for High Quality Tests of Sequential Circuits. 1998, European Test Workshop, pages 141-142.

S. Höreth, Rolf Drechsler: Dynamic Minimization of Word-Level Decision Diagrams. 1998, Conf. on Design, Automation and Test in Europe, pages 612-617.

G. Lee, Rolf Drechsler: ETDD-based Generation of Complex Terms for Incompletely Specified Boolean Functions. 1998, ASP Design Automation Conf., pages 75-80.

Rolf Drechsler: Evolutionary Algorithms for VLSI CAD. Kluwer Academic Publishers, 1998.

Rolf Drechsler, Wolfgang Günther: Exact Circuit Synthesis. 1998, Int'l Workshop on Logic Synth., pages 177-184.

Rolf Drechsler, Wolfgang Günther: Exact Circuit Synthesis. 1998, Advanced Computer Systems, pages 517-524.
Abstract: In this paper we present an exact algorithm for finding a minimal circuit for a given Boolean function. The algorithm is based on graph enumeration and makes use of several pruning techniques to reduce the huge search space. It can handle not only single-output but also multiple-output functions. The exact netlists obtained are compared to circuits determined by SIS, a state-of-the-art synthesis tool. It turns out that the results can be improved by up to 40%, even on small examples.

Rolf Drechsler, Bernd Becker: Graphenbasierte Funktionsdarstellung. B.G. Teubner, 1998.

M. Miller, Rolf Drechsler: Implementing a Multiple-Valued Decision Diagram Package. 1998, Int'l Symp. on Multi-Valued Logic, pages 52-57.

Wolfgang Günther, Rolf Drechsler: Linear Transformations and Exact Minimization of BDDs. 1998, Great Lakes Symp.
on VLSI, pages 325-330.
Abstract: We present an exact algorithm to find an optimal linear transformation for the variables of a Boolean function to minimize its corresponding ordered Binary Decision Diagram (BDD). To prune the huge search space, techniques known from algorithms for finding the optimal variable ordering are used. This BDD minimization finds direct application in FPGA design. We give experimental results for a large variety of circuits to show the efficiency of our approach.

Christoph Scholl: Multi-output Functional Decomposition with Exploitation of Don't Cares. 1998, Conf. on Design, Automation and Test in Europe, pages 743-748.
Abstract: Functional decomposition is an important technique in logic synthesis, especially for the design of lookup-table-based FPGA architectures. We present a method for functional decomposition with a novel concept for the exploitation of don't cares, thereby combining two essential goals: the minimization of the number of decomposition functions in the current decomposition step, and the extraction of common subfunctions for multi-output Boolean functions. The exploitation of symmetries of Boolean functions plays an important role in our algorithm as a means to minimize the number of decomposition functions, not only for the current decomposition step but also for the (recursive) decomposition algorithm as a whole. Experimental results prove the effectiveness of our approach.

Martin Keim, Bernd Becker: Nearly Exact Signal Probabilities for Synchronous Sequential Circuits - An Experimental Analysis. Issue 106/98, 1998.
Abstract: In this article we present a strategy for estimating the signal probabilities of synchronous sequential circuits. For each line in the circuit, an OBDD is constructed only once. Using this OBDD, the signal probabilities in every succeeding time frame are computed.
We investigated a large set of the ISCAS'89 benchmark circuits, showing that the proposed strategy estimates 75 percent of all signals considered with a relative error of only 10 percent or less.

M. Thornton, Rolf Drechsler: Output Probability Using AND/OR Graphs. 1998, Research Report, University of Arkansas.

M. Thornton, Rolf Drechsler: Spectral Methods for Digital Logic Verification. 1998, Research Report, University of Arkansas.

Harry Hengster, Bernd Becker: Synthesis of Fully Testable High Speed Circuits Derived from Decision Diagrams. 1998, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.
Abstract: We present an approach to synthesizing fully stuck-at fault testable circuits along with complete test sets. Starting from functional descriptions given by Kronecker Functional Decision Diagrams (KFDDs), circuits with inherently small delay are constructed by a composition method based on Boolean matrix multiplication. During the construction, efficient algorithms working on the KFDD are used to avoid the creation of constant lines. Furthermore, in each composition step, tests for the faults at the newly generated lines are computed on the fly. Experimental results are given to underline the efficiency of the methods.

Harry Hengster, Bernd Becker: Synthesis of Fully Testable High Speed Circuits Derived from Decision Diagrams. 1998, Int'l Workshop on Logic Synth., pages 341-345.
Abstract: We present an EXOR-based synthesis approach to obtain fully stuck-at fault testable circuits along with complete test sets. Starting from functional descriptions given by Kronecker Functional Decision Diagrams (KFDDs), circuits with inherently small delay are constructed by a composition method based on Boolean matrix multiplication. During the construction, efficient algorithms working on the KFDD are used to avoid the creation of constant lines.
Furthermore, in each composition step, tests for the faults at the newly generated lines are computed on the fly. Experimental results are given to underline the efficiency of the methods.

Martin Keim, Nicole Drechsler, Rolf Drechsler, Bernd Becker: Test Generation for (Sequential) Multi-Valued Logic Networks based on Genetic Algorithm. 1998, Int'l Symp. on Multi-Valued Logic, pages 215-220.

Martin Keim, Nicole Göckel, Rolf Drechsler, Bernd Becker: Test Generation for (Sequential) Multi-Valued Logic Networks based on Genetic Algorithm. 1998, Int'l Symp. on Multi-Valued Logic.

Rolf Drechsler, Martin Sauerhoff, Detlef Sieling: The Complexity of the Inclusion Operation on OFDDs. 1998, IEEE Trans. on CAD, volume 17, issue 5, pages 457-459.

Rolf Drechsler: Verifying Integrity of Decision Diagrams. 1998, SAFECOMP, Springer Verlag, volume 1516, pages 380-389.

Christoph Scholl, Bernd Becker, T. M. Weis: Word-Level Decision Diagrams, WLCDs and Division. 1998, Int'l Conf. on CAD, pages 672-677.
Abstract: Several types of Decision Diagrams (DDs) have been proposed for the verification of integrated circuits. Recently, word-level DDs like BMDs, *BMDs, HDDs, K*BMDs and *PHDDs have been attracting more and more interest; e.g., by using *BMDs and *PHDDs it was for the first time possible to formally verify integer multipliers and floating point multipliers of "significant" bit lengths, respectively. On the other hand, it has been unknown whether division, the operation inverse to multiplication, can be efficiently represented by some type of word-level DD. In this paper we show that the representational power of any word-level DD is too weak to efficiently represent integer division. Thus, neither a clever choice of the variable ordering, the decomposition type, nor the edge weights can lead to a polynomial DD size for division.
For the proof we introduce Word-Level Linear Combination Diagrams (WLCDs), a DD which may be viewed as a "generic" word-level DD. We derive an exponential lower bound on the WLCD representation size for integer dividers and show how this bound transfers to all other word-level DDs.

Rolf Drechsler, Nicole Göckel: A Genetic Algorithm for Data Sequencing. 1997, Electronics Letters, volume 33, issue 10, pages 843-845.

Nicole Göckel, Martin Keim, Rolf Drechsler, Bernd Becker: A Genetic Algorithm for Sequential Circuit Test Generation based on Symbolic Fault Simulation. 1997, GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.
Abstract: We present a Genetic Algorithm (GA) for sequential circuit test generation. A Binary Decision Diagram (BDD) based fault simulator, i.e. a fault simulator that uses symbolic methods, is integrated into the GA environment. The fault simulator is used as a more accurate evaluation method. Experiments show that we can generate test patterns for circuits where all previously presented GA approaches fail. Additionally, we reduce the length of the test sequences. A comparison to the pure GA-based approach shows the efficiency of this method.

Nicole Göckel, Martin Keim, Rolf Drechsler, Bernd Becker: A Genetic Algorithm for Sequential Circuit Test Generation based on Symbolic Fault Simulation. 1997, European Test Workshop.

Nicole Göckel, Rolf Drechsler, Bernd Becker: A Multi-Layer Detailed Routing Approach based on Evolutionary Algorithms. 1997, Int'l Conf. on Evolutionary Computation, pages 557-562.
Abstract: In this paper we present an Evolutionary Algorithm (EA) for Detailed Routing Problems (DRPs), like the Channel Routing Problem and the Switchbox Routing Problem. We combine EAs with domain-specific knowledge, i.e. the genetic operators make use of the rip-up and reroute technique.
The algorithm can work with two-layer and multi-layer problem instances. The efficiency of our algorithm is demonstrated by application to multi-layer channel routing benchmarks. Instances with up to five layers, 130 columns, and more than 60 nets are considered.

Nicole Göckel, Rolf Drechsler: An Evolutionary Algorithm for Minimizing BDDs of Incompletely Specified Functions. 1997, Advanced Computer Systems, pages 224-231.

Christoph Scholl, D. Möller, Paul Molitor, Rolf Drechsler: BDD Minimization Using Symmetries. 1997.

Rolf Drechsler, Nicole Göckel, Elke Mackensen, Bernd Becker: BEA: Specialized Hardware for Implementation of Evolutionary Algorithms. 1997, Genetic Programming Conf., page 491.

S. Höreth, Rolf Drechsler: Compilation of Fast Manipulation Algorithms for K*BMDs. 1997, IFIP WG 10.5 Workshop on Applications of the Reed-Muller Expansion in Circuit Design, pages 187-196.

Bernd Becker, Randy Bryant, Masahiro Fujita, Christoph Meinel: Computer Aided Design and Test, Decision Diagrams - Concepts and Applications (9705), Dagstuhl-Seminar-Report 166, 27.01.-31.01.1997. 1997, Saarbrücken: Geschäftsstelle Schloss Dagstuhl.

Bernd Becker, Rolf Drechsler: Decision Diagrams in Synthesis - Algorithms, Applications and Extensions. 1997, Int'l Conf. on VLSI Design, pages 46-50.
Abstract: An overview of Decision Diagrams (DDs), the state-of-the-art data structure in computer aided design of integrated circuits, is given. The overview is not complete in the sense of covering all DDs; rather, we mention the most important ones with practical relevance. DDs are widely used and have in the meantime also been integrated into commercial tools. Special interest is devoted to the use of DDs in the area of synthesis. We also discuss the influence of DDs in other fields and outline future trends.

Rolf Drechsler, A. Zuzek: Efficient Functional Diagnosis for Synchronous Sequential Circuits Based on AND/OR Graphs. 1997, Int'l Symp.
on IC Technologies, Systems and Applications, pages: 312-315 Rolf Drechsler: Evolutionary Algorithms for Computer Aided Design of Integrated Circuits 1997 Int'l Symp. on IC Technologies, Systems and Applications, pages: 302-311 S. Höreth, Rolf Drechsler: Fast Construction of Kronecker Decision Diagrams from Library Based Circuits 1997 SASIMI, pages: 39-44 Rolf Drechsler, Nicole Göckel, Wolfgang Günther: Fast Exact Minimization of BDDs, 1997 Andreas Hett, Rolf Drechsler, Bernd Becker: Fast and Efficient Construction of BDDs by Reordering Based Synthesis 1997 European Design and Test Conf., pages: 168-175 Abstract: We present a new approach to symbolic simulation with BDDs. Our method uses Reordering Based Synthesis (RBS), which allows the integration of dynamic variable ordering (even) within a single synthesis operation (e.g. an AND operation). Thus huge peak sizes during the construction can often be avoided, and we obtain a method that, with no penalty in runtime, is more memory efficient than traditional ITE-operator based symbolic simulation. The results are confirmed by experiments on a large set of benchmarks: we give a comparison to several previously published approaches and also consider some industrial benchmarks which are known to be hard to handle. Rolf Drechsler, Martin Keim, Bernd Becker: Fault Simulation in Sequential Multi-Valued Logic Networks 1997 Int'l Symp. on Multi-Valued Logic, pages: 145-150 Abstract: In this paper we present a fault simulator for Sequential Multi-Valued Logic Networks (SMVLNs). With this tool we investigate their Random Pattern Testability (RPT). We discuss a unified approach for fault models in SMVLNs and show that it is possible to describe all static fault models with a global formalism. A large set of experimental results is given that demonstrates the efficiency of our approach.
For the first time, fault coverages for the Stuck-At Fault Model (SAFM) and the Skew Fault Model (SKFM) are reported for large sequential circuits. Christoph Scholl: Functional Decomposition with Integrated Test Generation 1997 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Abstract: Functional decomposition is an important technique in logic synthesis, especially for the design of lookup-table based FPGA architectures. We present an approach to compute test information, e.g. a complete test set (according to the stuck-at fault model or the cellular fault model), for functionally decomposed circuits. We are able to compute this test information in parallel with logic synthesis by (recursive) functional decomposition. Christoph Scholl, Rolf Drechsler, Bernd Becker: Functional Simulation using Binary Decision Diagrams 1997 GI/ITG/GME Workshop “Methoden des Entwurfs und der Verifikation digitaler Systeme” Abstract: In many verification techniques fast functional evaluation of a Boolean network is needed. We investigate the idea of using Binary Decision Diagrams (BDDs) for functional simulation. The area-time trade-off that results from different minimization techniques of the BDD is discussed. We propose new minimization methods based on dynamic reordering that allow smaller representations with (nearly) no runtime penalty. Christoph Scholl, Rolf Drechsler, Bernd Becker: Functional Simulation using Binary Decision Diagrams 1997 Int'l Workshop on Logic Synth. Abstract: In many verification techniques fast functional evaluation of a Boolean network is needed. We investigate the idea of using Binary Decision Diagrams (BDDs) for functional simulation. The area-time trade-off that results from different minimization techniques of the BDD is discussed.
We propose new minimization methods based on dynamic reordering that allow smaller representations with (nearly) no runtime penalty. Rolf Drechsler, Bernd Becker: Graphenbasierte Funktionsdarstellung B.G. Teubner, 1997 Nicole Göckel, Rolf Drechsler: Influencing Parameters of Evolutionary Algorithms for Sequencing Problems 1997 Int'l Conf. on Evolutionary Computation, pages: 575-580 Abstract: Several problems in Computer Aided Design (CAD) of Integrated Circuits (ICs) require solving sequencing problems. For this reason, many algorithms for solving these problems have been proposed. In particular, Evolutionary Algorithms (EAs) have been applied successfully in these areas in the past. In this paper we study the influence of different parameters on runtime and quality for one specific problem of large practical importance for CAD of ICs, i.e. finding the optimal variable ordering of Ordered Binary Decision Diagrams (OBDDs). We consider different genetic operators and a problem-specific heuristic. Our study shows that the influence of problem-specific knowledge is much more significant than fine-tuning the EA, especially if runtime is also considered as an optimization criterion. Our results directly transfer to other sequencing problems. Nicole Göckel, Rolf Drechsler, Bernd Becker: Learning Heuristics for OKFDD Minimization by Evolutionary Algorithms 1997 ASP Design Automation Conf., pages: 469-472 Abstract: Ordered Kronecker Functional Decision Diagrams (OKFDDs) are a data structure for efficient representation and manipulation of Boolean functions. OKFDDs are very sensitive to the chosen variable ordering and the decomposition type list, i.e. the size may vary from linear to exponential. In this paper we present an Evolutionary Algorithm (EA) that learns good heuristics for OKFDD minimization starting from a given set of basic operations.
The difference from previous approaches to OKFDD minimization is that the EA does not solve the problem directly. Rather, it develops strategies for solving the problem. To demonstrate the efficiency of our approach, experimental results are given. The newly developed heuristics combine high-quality results with reasonable time overhead. Rolf Drechsler, Nicole Göckel: Minimization of BDDs by Evolutionary Algorithms 1997 Int'l Workshop on Logic Synth. Christoph Scholl: Multi-output Functional Decomposition with Exploitation of Don't Cares 1997 Int'l Workshop on Logic Synth. Abstract: …results prove the effectiveness of our approach. C. Ökmen, Martin Keim, R. Krieger, Bernd Becker: On Optimizing BIST Architecture by Using OBDD-based Approaches and Genetic Algorithms 1997 VLSI Test Symp., pages: 426-431 Martin Keim, M. Martin, Bernd Becker, Rolf Drechsler, Paul Molitor: Polynomial Formal Verification of Multipliers 1997 VLSI Test Symp., pages: 150-155 Abstract: Until recently, verifying multipliers with formal methods was not feasible, even for small input word sizes. About two years ago, a new data structure, called Multiplicative Binary Moment Diagram (*BMD), was introduced for representing arithmetic functions over Boolean variables. Based on this data structure, methods were proposed by which verification of multipliers with input word sizes of up to 256 bits has become feasible. Until now, only experimental data has been provided for these verification methods. In this paper we give a formal proof that logic verification using *BMDs is polynomially bounded in both space and time when applied to the class of Wallace-tree-like multipliers. Rolf Drechsler: Pseudo Kronecker Expressions for Symmetric Functions 1997 Int'l Conf.
on VLSI Design, pages: 511-513 Andreas Hett, Rolf Drechsler, Bernd Becker: Reordering Based Synthesis 1997 iwrm Abstract: Reordering Based Synthesis (RBS) is presented as an alternative approach to manipulate Decision Diagrams (DDs). Based on the concept of operation nodes, a single "core" operation, i.e. an extended Level Exchange (LE), is sufficient to perform the usual synthesis operations on several types of DDs. RBS allows the integration of dynamic variable ordering (even) within a single synthesis operation (e.g. an AND operation) and thus provides the possibility of avoiding huge peak sizes during the construction. The optimization potential is significantly enlarged by allowing LEs even between operation nodes. A large number of experimental results for the … Rolf Drechsler: Secure Implementation of Decision Diagrams 1997 Int'l Workshop on Logic Synth. Rolf Drechsler, R. S. Stankovic, T. Sasao: Spectral Transforms and Word-Level Decision Diagrams 1997 IFIP WG 10.5 Workshop on Applications of the Reed-Muller Expansion in Circuit Design Rolf Drechsler, R. S. Stankovic, T. Sasao: Spectral Transforms and Word-Level Decision Diagrams 1997 SASIMI, pages: 33-38 Harry Hengster, Bernd Becker: Synthesis of Fully Testable High Speed Circuits Derived from Decision Diagrams, 1997 Abstract: We present an approach to synthesize fully stuck-at fault testable circuits along with complete test sets. Starting from functional descriptions given by Kronecker Functional Decision Diagrams (KFDDs), circuits with inherently small delay are constructed by a composition method based on Boolean Matrix Multiplication. During the construction, efficient algorithms working on the KFDD are used to avoid the creation of constant lines. Furthermore, in each composition step tests for the faults at the newly generated lines are computed on the fly. Experimental results are given to underline the efficiency of the methods.
Rolf Drechsler, Harry Hengster, H. Schäfer, J. Hartmann, Bernd Becker: Testability of 2-Level AND/EXOR Expressions 1997 European Design and Test Conf., pages: 548-553 Abstract: It is often stated that AND/EXOR circuits are much easier to test than AND/OR circuits. This statement only holds for restricted classes of AND/EXOR expressions, like positive polarity Reed-Muller expressions and fixed polarity Reed-Muller expressions. For these two classes of circuits, good deterministic testability properties are known. In this paper we show that good random pattern testability can also be proven for these circuits. An input probability distribution is given which yields a short expected test length for biased random patterns. This is the first time that theoretical results on random pattern testability are presented for 2-level AND/EXOR circuit realizations of arbitrary Boolean functions. For more general classes of 2-level AND/EXOR circuits, analogous results have not been proven. We present experimental results that show that in general minimized 2-level AND/OR circuits are as well (or as badly) testable as minimized 2-level AND/EXOR circuits. Rolf Drechsler, Nicole Göckel: A Genetic Algorithm for Data Sequencing 1996 Online Workshop on Evolutionary Computation Abstract: In this paper we present a Genetic Algorithm (GA) for the Data Ordering Problem (DOP). The DOP has broad application in the area of low-power design. With more and more battery-driven computers in use, low power is becoming one of the main issues in IC design. Since the power consumption in computers is directly proportional to the switching activity, the sequence in which data words are transmitted is of great importance. We implemented a GA to solve this problem and carried out a large set of experiments. A comparison of our results to previously published ones demonstrates the efficiency of our approach.
Rolf Drechsler, Bernd Becker, Nicole Göckel: A Genetic Algorithm for the Construction of Small and Highly Testable OKFDD Circuits 1996 Genetic Programming Conf., pages: 473-478 Abstract: A Genetic Algorithm (GA) is applied to derive circuits that combine area efficiency and testability. These circuits are obtained from Ordered Kronecker Functional Decision Diagrams (OKFDDs). Previously, a heuristic approach had been presented. In this paper we show how these results can be further improved by GAs. Finally, we apply our minimization algorithm to technology mapping for FPGAs. We present experimental results to show the efficiency of our approach. Nicole Göckel, G. Pudelko, Rolf Drechsler, Bernd Becker: A Hybrid Genetic Algorithm for the Channel Routing Problem 1996 IEEE Int'l Symp. on Circuits and Systems, pages: IV:675-IV:678 Abstract: We present a Hybrid Genetic Algorithm (HGA) for the Channel Routing Problem (CRP). To do so, we combine a Genetic Algorithm (GA) with domain-specific knowledge, i.e. the genetic operators make use of the rip-up and reroute technique. As a result, our method is faster than previously presented evolutionary approaches. Furthermore, concerning space complexity, we show by experiments that our HGA can handle large channels (with more than 100 columns). Rolf Drechsler, Andreas Hett, Bernd Becker: A Note on Symbolic Simulation using Decision Diagrams 1996 ulsi Abstract: If Decision Diagrams (DDs) are used for the representation of the logical behaviour of a combinational logic circuit, the circuit has to be traversed in topological order. At each gate the corresponding synthesis operation is carried out. This traversal process is called symbolic simulation. Obviously, the sequence in which the operations are performed at each gate influences the number of nodes needed during the computation.
In this paper we consider different traversal strategies for OBDDs and OMDDs. We compare them by means of experiments. Harry Hengster, Rolf Drechsler, S. Eckrich, T. Pfeiffer, Bernd Becker: AND/EXOR based Synthesis of Testable KFDD-Circuits with Small Depth 1996 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Abstract: Decision diagrams are used in design automation for efficient representation of Boolean functions. It is also possible to derive circuits directly from decision diagrams. In this paper we present an approach to synthesize circuits from a very general class of decision diagrams, the ordered Kronecker functional decision diagrams. We investigate area, depth, and testability of the synthesized circuits and compare them to circuit designs generated by other synthesis tools. Experimental results show that the presented approach is suitable to overcome the trade-off between depth and testability. Harry Hengster, Rolf Drechsler, S. Eckrich, T. Pfeiffer, Bernd Becker: AND/EXOR based Synthesis of Testable KFDD-Circuits with Small Depth 1996 IEEE Asian Test Symp., pages: 148-154 Abstract: Decision Diagrams are used in design automation for efficient representation of Boolean functions. It is also possible to directly derive circuits from Decision Diagrams. In this paper we present an approach to synthesize circuits from a very general class of Decision Diagrams, the ordered Kronecker Functional Decision Diagrams. These Decision Diagrams make use of Davio decompositions, which are based on exclusive-or operations and therefore allow the use of EXOR gates in the synthesized circuits. We investigate area, depth, and testability of these circuits and compare them to circuit designs generated by other synthesis tools.
Experimental results show that the presented approach is suitable to overcome the trade-off between depth and testability at the price of reasonable area overhead. Rolf Drechsler, A. Zuzek: Efficient Functional Diagnosis for Synchronous Sequential Circuits based on AND/OR Graphs, 1996 Bernd Becker, Rolf Drechsler: Exact Minimization of Kronecker Expressions for Symmetric Functions 1996 IEEE Int'l Symp. on Circuits and Systems, pages: IV:388-IV:391 Rolf Drechsler, H. Esbensen, Bernd Becker: Genetic Algorithms in Computer Aided Design of Integrated Circuits 1996 Online Workshop on Evolutionary Computation Rolf Drechsler, Nicole Göckel, Bernd Becker: Learning Heuristics for OBDD Minimization by Evolutionary Algorithms 1996 Parallel Problem Solving from Nature, Springer Verlag, volume: 1141, pages: 730-739 Abstract: Ordered Binary Decision Diagrams (OBDDs) are the state-of-the-art data structure in CAD for ICs. OBDDs are very sensitive to the chosen variable ordering, i.e. the size may vary from linear to exponential. In this paper we present an Evolutionary Algorithm (EA) that learns good heuristics for OBDD minimization starting from a given set of basic operations. The difference from previous approaches to OBDD minimization is that the EA does not solve the problem directly. Rather, it develops strategies for solving the problem. To demonstrate the efficiency of our approach, experimental results are given. The newly developed heuristics are more efficient than other previously presented methods. Harry Hengster, U. Sparmann, Bernd Becker, Sudhakar M. Reddy: Local Transformations and Robust Dependent Path Delay Faults 1996 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Abstract: Local transformations are used in several synthesis approaches. During application of such transformations, attention has to be paid to many important properties, e.g.
area, speed, power consumption, and testability. In this paper we study relations between local transformations and delay fault testability. In delay testing it is not necessary to test every path in a circuit to ascertain correct timing behavior. For example, a set of robust dependent path delay faults need not be considered for testing if all paths that are not robust dependent are tested. We will present sufficient conditions for local transformations which ensure that a test set for all non-robust-dependent paths in the original circuit is also a test set for all non-robust-dependent paths in the transformed circuit. These conditions are applied to some local transformations which are often used in logic synthesis, and it is shown that they preserve testability. The impact of local transformations on robust dependent testability is demonstrated by experimental results performed on benchmark circuits. Harry Hengster, U. Sparmann, Bernd Becker, Sudhakar M. Reddy: Local Transformations and Robust Dependent Path Delay Faults 1996 European Test Workshop Abstract: Local transformations are used in several synthesis approaches. During application of such transformations, attention has to be paid to many important properties, e.g. area, speed, power consumption, and testability. In this paper we study relations between local transformations and delay fault testability. In delay testing it is not necessary to test every path in a circuit to ascertain correct timing behavior. For example, a set of robust dependent path delay faults need not be considered for testing if all paths that are not robust dependent are tested. We present sufficient conditions for local transformations which ensure that a test set for all non-robust-dependent paths in the original circuit is also a test set for all non-robust-dependent paths in the transformed circuit.
These conditions are applied to some local transformations which are often used in logic synthesis, and it is shown that they preserve testability. The impact of local transformations on robust dependent testability is demonstrated by experimental results performed on benchmark circuits. Harry Hengster, U. Sparmann, Bernd Becker, Sudhakar M. Reddy: Local Transformations and Robust Dependent Path Delay Faults 1996 Int'l Test Conf. Abstract: Local transformations are used in several synthesis approaches. During application of such transformations, attention has to be paid to many important properties, e.g. area, speed, power consumption, and testability. In this paper we study relations between local transformations and delay fault testability. In delay testing it is not necessary to test every path in a circuit to ascertain correct timing behavior. For example, a set of robust dependent path delay faults need not be considered for testing if all paths that are not robust dependent are tested. We present sufficient conditions for local transformations which ensure that a test set for all non-robust-dependent paths in the original circuit is also a test set for all non-robust-dependent paths in the transformed circuit. These conditions are applied to some local transformations which are often used in logic synthesis, and it is shown that they preserve testability. The impact of local transformations on robust dependent testability is demonstrated by experimental results performed on benchmark circuits. Andreas Hett, Rolf Drechsler, Bernd Becker: MORE Optimization Techniques.
, 1996 Rolf Drechsler, Bernd Becker: OKFDDs - Algorithms, Applications and Extensions Kluwer Academic Publisher, pages: 163-190, 1996 Bernd Becker, Rolf Drechsler, Reinhard Enders: On the Computational Power of Bit-Level and Word-Level Decision Diagrams 1996 GI/ITG/GME Workshop “Methoden des Entwurfs und der Verifikation digitaler Systeme” Abstract: Several types of Decision Diagrams (DDs) have been proposed in the area of Computer Aided Design (CAD), among them bit-level DDs like OBDDs, OFDDs and OKFDDs. While the aforementioned types of DDs are suitable for representing Boolean functions at the bit level and have proved useful for a lot of applications in CAD, recently DDs representing integer-valued functions, like MTBDDs (=ADDs), EVBDDs, FEVBDDs, (*)BMDs, HDDs (=KBMDs), and K*BMDs, have attracted more and more interest; e.g., using *BMDs it was possible for the first time to verify multipliers of bit length up to n=256. In this paper we clarify the computational power of these DD classes. Several (inclusion) relations and (exponential) gaps between specific classes differing in the availability of additive and/or multiplicative edge weights and in the choice of decomposition types are shown. It turns out, for example, that K(*)BMDs, a generalization of OKFDDs to the word level, also "include" OBDDs, MTBDDs and (*)BMDs. On the other hand, it is demonstrated that a restriction of the K(*)BMD concept to subclasses, such as OBDDs, MTBDDs, and (*)BMDs, results in families of functions which lose their efficient representation. Rolf Drechsler: Pseudo Kronecker Expressions for Symmetric Functions, 1996 Rolf Drechsler, Harry Hengster, H. Schäfer, Bernd Becker: Testability of AND/EXOR Expressions 1996 European Test Workshop Abstract: It is often stated that AND/EXOR circuits are much easier to test than AND/OR circuits. In this paper we show that this is not true for all 2-level circuits based on EXOR.
The statement only holds for restricted classes of AND/EXOR expressions, like Positive Polarity Reed-Muller Expressions and Fixed Polarity Reed-Muller Expressions. AND/EXOR circuits are studied with respect to deterministic and random pattern testability. We present experimental results that show that in general minimized 2-level AND/OR circuits are as well (or as badly) testable as minimized 2-level AND/EXOR circuits. Andreas Hett, Rolf Drechsler, Bernd Becker: The DD Package PUMA - An Online Documentation https://ira.informatik.uni-freiburg.de/en//software/puma/, 1996 Abstract: PUMA is a DD package for the efficient representation and manipulation of Boolean functions with OKFDDs as well as Zero-Suppressed BDDs. Rolf Drechsler: Verification of Multi-Valued Logic Networks 1996 Int'l Symp. on Multi-Valued Logic, pages: 10-15 Abstract: In this paper a method for verification of Multi-Valued Logic Networks (MVLNs) using Ordered Multi-Valued Decision Diagrams (OMDDs) is presented. For tree-like MVLNs an upper bound on the OMDD size can be proven. Thus, heuristics known for OBDDs can also be used for OMDDs. A large set of experiments is presented that underlines the efficiency of the approach. Rolf Drechsler, Bernd Becker, Nicole Göckel: A Genetic Algorithm for 2-Level AND/EXOR Minimization 1995 SASIMI, pages: 49-56 Abstract: We show that there is a close relation between Ordered Kronecker Functional Decision Diagrams (OKFDDs) and 2-level AND/EXOR expressions. This relation, together with efficient OKFDD algorithms, can be utilized for exact and heuristic minimization of these AND/EXOR expressions, called Reduced Kronecker Expressions (RKROs). We propose several (exact and heuristic) minimization algorithms and demonstrate the efficiency of our approach by experimental results. A Genetic Algorithm (GA) is applied to minimize RKROs.
Minimized RKROs are determined for benchmark functions with up to 100 variables. Furthermore, we compare our solutions to results obtainable for other classes of 2-level AND/EXOR forms. In particular, AND/EXOR expressions derived from OKFDDs turn out to be much smaller than the well-known class of Fixed-Polarity Reed-Muller Expressions (FPRMs), although the complexity of the minimization algorithms is comparable. Our GA approach is a practical alternative to the exact algorithm. It produced optimal results for all considered benchmarks, and, due to its short runtimes, it is also applicable to functions with more than 20 variables. Rolf Drechsler, Bernd Becker, Nicole Göckel: A Genetic Algorithm for Variable Ordering of OBDDs 1995 Int'l Workshop on Logic Synth., pages: 5c:5.55-5.64 Abstract: A Genetic Algorithm (GA) is applied to find a variable ordering that minimizes the size of Ordered Binary Decision Diagrams (OBDDs). OBDDs are a data structure for representation and manipulation of Boolean functions often applied in CAD. The choice of the variable ordering largely influences the size of the OBDD, i.e. its size may vary from polynomial to exponential. Dynamic variable ordering is the state-of-the-art method for finding good variable orderings. In this paper we show by experimental results that better sizes can be obtained using GAs. Our GA approach is a practical alternative to the exact algorithm for variable ordering. It produced optimal results for most considered benchmarks, and, due to its short runtimes, it is also applicable to functions with more than 20 variables.
Rolf Drechsler, Bernd Becker, Nicole Göckel: A Genetic Algorithm for Variable Ordering of OBDDs, issue: 5/95, 1995 Harry Hengster, Rolf Drechsler, Bernd Becker: AND/OR/EXOR based Synthesis of KFDD-Circuits with Small Depth 1995 Reed-Muller Colloquium UK Abstract: Construction of circuits from Decision Diagrams (DDs) has shown several advantages, e.g. the resulting circuits have nice testability properties. One major drawback of circuits derived from ordered DDs is that their depth is proportional to the number of inputs. In this paper we present an approach to design DD circuits derived from Kronecker Functional DDs (KFDDs) with small depth. Experimental results are given that show the efficiency of our approach. Bernd Becker, Randal Bryant, Olivier Coudert, Christoph Meinel: Computer Aided Design and Test, Dagstuhl-Seminar-Report 105 (9507), 13.2.1995 - 17.2.1995 1995 Saarbrücken: Geschäftsstelle Schloss Dagstuhl Rolf Drechsler, Bernd Becker, S. Ruppertz: Dynamic Minimization of K*BMDs, 1995 Rolf Drechsler, Bernd Becker: Dynamic Minimization of OKFDDs, issue: 5/95, 1995 Bernd Becker, Rolf Drechsler: Exact Minimization of Kronecker Expressions for Symmetric Functions 1995 IFIP WG 10.5 Workshop on Applications of the Reed-Muller Expansion in Circuit Design, pages: 240-245 R. Krieger, Bernd Becker, R. Sinković: Necessary Assignments for an Accelerated OBDD-based Computation of Exact Fault Detection Probabilities 1995 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” R. Krieger, Bernd Becker, C. Ökmen: OBDD-based Optimization of Input Probabilities for Weighted Random Test 1995 Int'l Symp. on Defect and Fault Tolerance, pages: 120-129 Bernd Becker, Rolf Drechsler, M.
Theobald: OKFDDs versus OBDDs and OFDDs 1995 ICALP, Springer Verlag, volume: 944, pages: 475-486 Abstract: Ordered Decision Diagrams (ODDs) as a means for the representation of Boolean functions are used in many applications in CAD. Depending on the decomposition type, various classes of ODDs have been defined, the most important being the Ordered Binary Decision Diagrams (OBDDs), the Ordered Functional Decision Diagrams (OFDDs) and the Ordered Kronecker Functional Decision Diagrams (OKFDDs). In this paper we clarify the computational power of OKFDDs versus OBDDs and OFDDs from a (more) theoretical point of view. We prove several exponential gaps between specific types of ODDs. Combining these results, it follows that a restriction of the OKFDD concept to subclasses, such as OBDDs and OFDDs, results in families of functions which lose their efficient representation. Bernd Becker, Rolf Drechsler, Reinhard Enders: On the Computational Power of Bit-Level and Word-Level Decision Diagrams, 1995 Bernd Becker, Rolf Drechsler, R. Werchner: On the Relation Between BDDs and FDDs 1995 Information and Computation, volume: 123(2), pages: 185-197 Rolf Drechsler: Ordered Kronecker Functional Decision Diagrams und ihre Anwendungen Modell Verlag, Ph.D. thesis at J.W. Goethe-Universität, 1995 Rolf Drechsler, Bernd Becker: PUMA: An OKFDD-Package and its Implementation 1995 European Design and Test Conf. R. Krieger, Bernd Becker, Martin Keim: Symbolic Fault Simulation for Sequential Circuits and the Multiple Observation Time Test Strategy 1995 IEEE Design Automation Conference, pages: 339-344 Abstract: Fault simulation for synchronous sequential circuits is a very time-consuming task. The complexity of the task increases if there is no information about the initial state of the circuit. In this case an unknown initial state is assumed, which is usually handled by introducing a three-valued logic.
As is well known, fault simulation based on this logic determines only a lower bound on the fault coverage. Recently it was shown that fault simulation based upon the multiple observation time test strategy can improve the accuracy of the fault coverage. In this paper we describe how this strategy can be successfully implemented based on Ordered Binary Decision Diagrams. Our experiments demonstrate the efficiency of the fault simulation procedure developed. Bernd Becker, Rolf Drechsler: Synthesis for Testability: Circuits Derived from Ordered Kronecker Functional Decision Diagrams 1995 European Design and Test Conf., page: 592 Abstract: Testability properties of circuits derived from Ordered Kronecker Functional Decision Diagrams (OKFDDs) are studied with respect to the Stuck-At Fault Model (SAFM) and the Cellular Fault Model (CFM). The computation of complete test sets and of all occurring redundancies can be done easily and efficiently, and circuits with high testability can be obtained. The full paper including experimental results can be obtained from the authors. Rolf Drechsler: BiTeS: A BDD based test pattern generator for strong robust path delay faults 1994 GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen” Rolf Drechsler: BiTeS: A BDD based test pattern generator for strong robust path delay faults 1994 European Design Automation Conf., pages: 322-327 Abstract: This paper presents an algorithm for generation of test patterns for strong robust path delay faults, i.e. tests that propagate the fault along a single path and are additionally hazard-free. The algorithm uses Binary Decision Diagrams (BDDs) for the computation and representation of the test sets. It has been implemented as the program BiTeS. The implementation is very simple, since only BDD operations must be carried out.
BiTeS computes the complete set of patterns that lead to a test instead of a single test. Thus, methods for test set compaction can easily be applied. Experimental results are presented to demonstrate the efficiency of BiTeS.

Rolf Drechsler, M. Theobald, Bernd Becker: Fast FDD based Minimization of Generalized Reed-Muller Forms. 1994. Workshop über Komplexitätstheorie, Datenstrukturen und effiziente Algorithmen.

Rolf Drechsler, H. Esbensen, Bernd Becker: Genetic Algorithms in Computer Aided Design of Integrated Circuits. Issue 17/94, 1994.
Abstract: An increasing number of successful applications of Genetic Algorithms (GAs) within Computer Aided Design (CAD) of Integrated Circuits (ICs) is reported in the literature. The problems dealt with in this field consist of sequences of sub-problems, which are characterized by being NP-hard, large, and mutually dependent. This very high level of complexity should make the GA a well-suited approach, at least in principle. However, the GA is of practical interest to the CAD community if and only if it is competitive with the existing approaches with respect to performance. With this fact as the starting point, the purpose of this paper is to discuss how to develop high-performance GAs for CAD of ICs. After reviewing a number of recent, performance-competitive GAs for key problems in CAD of ICs, the common characteristics of these algorithms are discussed. They all exploit problem-specific knowledge in various ways or are integrated with other heuristics. The paper also discusses performance evaluation principles. To make an impact, it is crucial that performance is evaluated using the measures commonly applied in the CAD field. The practical implications hereof for GA-based applications are addressed.

Bernd Becker, Rolf Drechsler, M.
Theobald: Minimization of 2-level AND/XOR Expressions using Ordered Kronecker Functional Decision Diagrams. Issue 3/94, 1994.

Rolf Drechsler, Bernd Becker, A. Jahnke: On Variable Ordering and Decomposition Type Choice in OKFDDs. 1994. Workshop über Komplexitätstheorie, Datenstrukturen und effiziente Algorithmen.

Rolf Drechsler, Bernd Becker, A. Jahnke: On Variable Ordering and Decomposition Type Choice in OKFDDs. Issue 11/94, 1994.

Bernd Becker, Rolf Drechsler, M. Theobald: On Variable Ordering of Functional Decision Diagrams. Issue TR-94-006, 1994.

Bernd Becker, Rolf Drechsler, M. Theobald: On Variable Ordering of Ordered Functional Decision Diagrams. 1994. GI/GME/ITG-Fachtagung “Rechnergestützter Entwurf und Architektur mikroelektronischer Systeme”, Springer Verlag, pages 62-71.

Rolf Drechsler, A. Sarabi, M. Theobald, Bernd Becker, M. A. Perkowski: On the Computational Power of Ordered Kronecker Functional Decision Diagrams. Issue 4/94, 1994.

Rolf Drechsler, A. Sarabi, M. Theobald, Bernd Becker, M. A. Perkowski: Ordered Kronecker Functional Decision Diagrams: An Efficient Tool for Synthesis and Verification. 1994. GI/ITG Workshop “Anwendung formaler Methoden im Systementwurf”.

D. Möller, Paul Molitor, Rolf Drechsler: Symmetry based Variable Ordering for ROBDDs. 1994. IFIP Workshop on Logic and Architecture Synthesis, Grenoble, pages 47-53.

Bernd Becker, Rolf Drechsler: Synthesis for Testability: Circuits Derived from Ordered Kronecker Functional Decision Diagrams. 1994. Int'l Test Synthesis Workshop.

Bernd Becker, Rolf Drechsler: Synthesis for Testability: Circuits Derived from Ordered Kronecker Functional Decision Diagrams. Issue 14/94, 1994.

Harry Hengster, Rolf Drechsler, Bernd Becker: Testability Properties of Local Circuit Transformations with Respect to the Robust Path-Delay-Fault Model. 1994. Int'l Conf.
on VLSI Design, pages 123-126.
Abstract: We present a new approach to show that local circuit transformations which improve the area of a circuit preserve or improve robust path-delay-fault testability. In contrast to previously published methods, which had to consider the whole circuit, we examine only the subcircuits to be transformed. Furthermore, we present some transformations which preserve or improve testability.

Bernd Becker, Rolf Drechsler: Testability of Circuits Derived from Functional Decision Diagrams. 1994. European Design and Test Conf., page 667.
Abstract: We investigate the testability properties of Boolean networks derived from a special class of multi-level AND/EXOR expressions, called ordered functional decision diagrams (OFDDs). We consider the stuck-at fault model (SAFM) and the cellular fault model (CFM). All occurring redundancies are classified. The resulting circuits are highly testable, with nearly 100% fault coverage on average. The full paper including all experimental results can be obtained from the authors.

Bernd Becker, Rolf Drechsler, Harry Hengster, R. Krieger, R. Sinković: Binary Decision Diagrams and Testing. 1993. GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.
Abstract: Binary Decision Diagrams (BDDs) can be successfully applied to a wide variety of tasks in digital system design, verification, and testing. In this paper we focus on the area of testing and demonstrate how two key problems, the computation of signal and fault detection probabilities and the synthesis of testable circuits, can profit from a BDD-based approach.

Bernd Becker, Randal Bryant, Christoph Meinel: Computer Aided Design and Test: 15.2.1993-19.2.1993 (9307), Dagstuhl Seminar Report 56. 1993. Saarbrücken: Geschäftsstelle Schloss Dagstuhl.

Rolf Drechsler, A. Sarabi, M. Theobald, Bernd Becker, M. A.
Perkowski: Efficient Representation and Manipulation of Switching Functions Based on Ordered Kronecker Functional Decision Diagrams. Issue 14/93, 1993.

Rolf Drechsler, M. Theobald, Bernd Becker: Fast FDD based Minimization of Generalized Reed-Muller Forms. Issue 15/93, 1993.

Bernd Becker, Rolf Drechsler, Harry Hengster: Local Circuit Transformations Preserving Robust Path-Delay-Fault Testability. Issue 1/93, 1993.

Bernd Becker, Rolf Drechsler, Paul Molitor: On Generation of Area-Time Optimal Testable Adders. 1993. Technical Report 3/93.

Bernd Becker, Rolf Drechsler, M. Theobald: On Variable Ordering of Functional Decision Diagrams. 1993.

Bernd Becker, Rolf Drechsler: On the Computational Power of Functional Decision Diagrams. 1993. Workshop über Komplexitätstheorie, Datenstrukturen und effiziente Algorithmen.

Bernd Becker, Rolf Drechsler, M. Theobald: On the Implementation of a Package for Efficient Representation and Manipulation of Functional Decision Diagrams. 1993. IFIP WG 10.5 Workshop on Applications of the Reed-Muller Expansion in Circuit Design, pages 162-169.

Bernd Becker, Rolf Drechsler, R.
Werchner: On the Relation between BDDs and FDDs. Issue 12/93, 1993.

Bernd Becker, Rolf Drechsler, Christoph Meinel: On the Testability of Circuits Derived from Binary Decision Diagrams. Issue 9/93, 1993.

Rolf Drechsler, Bernd Becker: Rapid Prototyping of Fully Testable Multi-Level AND/EXOR Networks. 1993. IFIP WG 10.5 Workshop on Applications of the Reed-Muller Expansion in Circuit Design, pages 126-133.

Bernd Becker, Rolf Drechsler: Testability of Circuits Derived from Functional Decision Diagrams. 1993.

Ralf Hahn, Bernd Becker, Harry Hengster: The Fault Graph and its Application to Combinational Fault Simulation. 1993. GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.
Abstract: The fault graph is a data structure for storing faults and managing relations between faults. It makes it possible to exploit these relations for testing purposes and to control the order in which faults are considered. To demonstrate the effectiveness of the fault graph, we developed an algorithm for fault simulation in combinational circuits that is based on this data structure. The algorithm is conceptually simple. Experiments show that it is superior to a sophisticated fanout-oriented fault simulation algorithm.

R. Krieger, Bernd Becker, Harry Hengster: lgc++: Ein Werkzeug zur Implementierung von Logiken als abstrakte Datentypen in C++. 1993.

R. Krieger, Ralf Hahn, Bernd Becker: test_circ: Ein abstrakter Datentyp zur Repräsentation von hierarchischen Schaltkreisen. 1993.

Rolf Drechsler, Bernd Becker, Paul Molitor: A Performance Oriented Generator for Robust Path-Delay-Fault Testable Adders. 1992. GI/ITG Workshop “Testmethoden und Zuverlässigkeit von Schaltungen und Systemen”.

Bernd Becker, Rolf Drechsler, Paul Molitor: On the Implementation of an Efficient Performance Driven Generator for Conditional-Sum-Adders.
1992. Technical Report 2/93.

Rolf Drechsler, Bernd Becker: Rapid Prototyping of Robust Path-Delay-Fault Testable Circuits Derived from Binary Decision Diagrams. Issue TR-17/92, SFB 124, 1992.

Bernd Becker, Christoph Meinel: Entwerfen, Prüfen, Testen: 18.2.1991-22.2.1991 (9108), Dagstuhl-Seminar-Report 6. 1991. Saarbrücken: Geschäftsstelle Schloss Dagstuhl.

Bernd Becker, U. Sparmann: A Uniform Test Approach for RCC-Adders. 1988. Aegean Workshop on Parallel Computation and VLSI Theory, Springer Verlag, volume 319, pages 288-300.

Bernd Becker: An Easily Testable Optimal-Time VLSI-Multiplier. 1987. Acta Informatica, volume 24, pages 363-380.

Bernd Becker, G. Hotz, R. Kolla, Paul Molitor, H. G. Osthof: CADIC - Ein System zum hierarchischen Entwurf integrierter Schaltungen. 1987. E.I.S.-Workshop, pages 235-245.

Bernd Becker, H. G. Osthof: Layouts with Wires of Balanced Length. 1987. Information and Computation, volume 73(1), pages 45-58.

Bernd Becker, G. Hotz, R. Kolla, Paul Molitor: Ein logisch-topologischer Kalkül zur Konstruktion von integrierten Schaltkreisen. 1986. INFORMATIK Forschung und Entwicklung 1, pages 38-47, 72-.

Bernd Becker: On the Crossing-free, Rectangular Embedding of Weighted Graphs in the Plane. 1983. Theor. Comput. Sci., GI-Conf., Springer Verlag, volume 145, pages 61-72.