
IN THIS ISSUE

### Research Papers

J. Comput. Inf. Sci. Eng. 2017;17(4):041001-041001-8. doi:10.1115/1.4036198.

With the arrival of the cyber-physical world and the extensive support of advanced information technology (IT) infrastructure, it is now possible to obtain the footprints of design activities through emails, design journals, change logs, and different forms of social data. To manage a more effective design process, it is essential to learn from the past by utilizing these valuable sources and understanding, for example, which design tasks are actually carried out, how they interact, and how they impact each other. In this paper, a computational approach based on deep belief nets (DBN) is proposed to automatically uncover design tasks and quantify their interactions from design document archives. First, a DBN topic model with real-valued units is developed to learn a set of intrinsic topic features from a simple word-frequency-based input representation. The trained DBN model is then used to discover design tasks by unfolding hidden units into sets of strongly connected words, followed by estimating the interactions among tasks on the basis of their co-occurrence frequency in a hidden topic space. Finally, the proposed approach is demonstrated through a real-life case study using a design email archive spanning more than two years.
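
The co-occurrence step lends itself to a small illustration. The sketch below is a toy stand-in, not the paper's DBN model: the function name, the normalization by document count, and the set-based task labels are all assumptions made for clarity.

```python
from collections import Counter
from itertools import combinations

def task_interactions(doc_topics):
    """Estimate pairwise task interaction strength from how often two
    discovered tasks (topics) co-occur in the same document,
    normalized by the total number of documents."""
    pair_counts = Counter()
    for topics in doc_topics:
        for a, b in combinations(sorted(set(topics)), 2):
            pair_counts[(a, b)] += 1
    n = len(doc_topics)
    return {pair: count / n for pair, count in pair_counts.items()}

# three emails, each tagged with the tasks they touch (hypothetical labels)
docs = [{"layout", "wiring"}, {"layout", "testing"}, {"layout", "wiring"}]
print(task_interactions(docs))
```

In a real pipeline the task labels would come from the trained topic model rather than manual tags; only the counting logic is shown here.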

Topics: Design, Modeling
Commentary by Dr. Valentin Fuster
J. Comput. Inf. Sci. Eng. 2017;17(4):041002-041002-8. doi:10.1115/1.4036487.

A rapidly changing environment has affected organizations' ability to maintain viability. As a result, traditional performance evaluation based on precise, deterministic data runs into problems when confronted with varied criteria and uncertain situations in a complex environment. The purpose of this paper is to propose an applicable model for evaluating the performance of the overall supply chain (SC) network and its members. Performance evaluation methods that do not account for uncertainty obtain inferior results. To overcome this, rough set theory (RST) is used to deal with such uncertain data, and a rough noncooperative Stackelberg data envelopment analysis (DEA) game is extended to construct a model for evaluating the performance of the supply chain under uncertainty. The concept of the Stackelberg (leader–follower) game is applied to develop the performance measurement models, and the ranking method of the noncooperative two-stage rough DEA model is discussed. The developed model is suitable for evaluating the performance of the supply chain network and its members when the chain operates in uncertain situations involving a high degree of vagueness. The application presented in this paper provides a valuable procedure for performance evaluation in other industries, and the proposed model offers useful insights for managers on measuring supply chain efficiency in uncertain environments. This paper thus creates a new perspective on the use of performance evaluation models to support managerial decision-making in dynamic environments and uncertain situations.
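
The intuition behind interval-valued (rough) efficiency can be sketched without the full DEA game. Assuming a single output and a single input, each known only as an interval, the pessimistic bound pairs the lowest output with the highest input and the optimistic bound does the reverse; this is an illustration of the idea, not the paper's two-stage rough DEA model.

```python
def efficiency_bounds(output_interval, input_interval):
    """Lower/upper efficiency ratio under interval (rough) data.
    Pessimistic: lowest output over highest input.
    Optimistic:  highest output over lowest input."""
    out_lo, out_hi = output_interval
    in_lo, in_hi = input_interval
    return out_lo / in_hi, out_hi / in_lo

# a supplier whose output and input are each known only as an interval
print(efficiency_bounds((80, 100), (40, 50)))  # → (1.6, 2.5)
```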

J. Comput. Inf. Sci. Eng. 2017;17(4):041003-041003-14. doi:10.1115/1.4036119.

Different sensor technologies are available for dimensional metrology and reverse engineering processes. Tactile systems, optical sensors, and computed tomography (CT) are being used to an increasing extent in various industrial contexts. However, each technique has its own peculiarities, which may limit its usability in demanding applications. The measurement of complex shapes, such as those including hidden and twisted geometries, can be better addressed by multisensor systems combining the advantages of two or more data acquisition technologies. In this paper, a fully automatic multisensor methodology has been developed with the aim of performing accurate and reliable measurements of both external and internal geometries of industrial components. The methodology is based on tracking a customized hand-held tactile probe with a passive stereo vision system. The imaging system automatically tracks the probe by means of photogrammetric measurements of markers distributed over a plate rigidly assembled to the tactile frame. Moreover, the passive stereo system is combined with a structured-light projector in order to provide full-field scanning data, which complement the point-by-point measurements. Using the same stereo vision system for both tactile probe tracking and structured-light scanning allows the two different sensors to express measurement data in the same reference system, thus preventing inaccuracies due to misalignment errors occurring in the registration phase. The tactile methodology has been validated by measuring primitive shapes. Moreover, the effectiveness of the integration between tactile probing and optical scanning has been demonstrated by reconstructing twisted and internal shapes of industrial impellers.
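
The benefit of a shared reference system is that every probe-tip point only needs one rigid transform to land in the camera frame. The sketch below shows such a transform for the simple case of a rotation about the z axis plus a translation; the function name and the single-axis rotation are simplifying assumptions, not the paper's photogrammetric model.

```python
import math

def to_camera_frame(point, rotation_z_deg, translation):
    """Express a probe-tip point in the stereo camera frame via a rigid
    transform: rotate about z, then translate. With both sensors using
    this one frame, no later registration step can introduce misalignment."""
    theta = math.radians(rotation_z_deg)
    x, y, z = point
    tx, ty, tz = translation
    x_rot = x * math.cos(theta) - y * math.sin(theta)
    y_rot = x * math.sin(theta) + y * math.cos(theta)
    return (x_rot + tx, y_rot + ty, z + tz)

# a tip point 1 unit along x, seen by a camera rotated 90° and offset in z
print(to_camera_frame((1.0, 0.0, 0.0), 90, (0.0, 0.0, 5.0)))
```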

Topics: Impellers, Probes
J. Comput. Inf. Sci. Eng. 2017;17(4):041004-041004-11. doi:10.1115/1.4036120.

Content-based retrieval is particularly important for exploiting company model databases and online catalogs. To allow the identification of reusable part models that may fit the product under development, methods for assessing the similarity between shapes should be provided in terms of both global and partial shape matching. In this perspective, this paper proposes a 3D model retrieval method that works directly on B-rep models, requires no conversion to triangular meshes, and, in addition to global and partial matching, allows the identification of components that are likely to be assembled with a given model.
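
Global similarity assessment is often reduced to comparing fixed-length shape descriptors. The snippet below shows one common choice, cosine similarity; the descriptor vectors here are hypothetical, since the paper's method extracts its features directly from B-rep entities.

```python
import math

def cosine_similarity(a, b):
    """Similarity of two shape-descriptor vectors in [-1, 1];
    1 means identical direction (same shape signature up to scale)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# two parts whose (made-up) descriptors differ only by scale
print(cosine_similarity([1, 0, 2], [2, 0, 4]))  # ≈ 1.0
```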

J. Comput. Inf. Sci. Eng. 2017;17(4):041005-041005-7. doi:10.1115/1.4034739.

This paper presents a new method for complex system failure analysis and adaptive mission planning that provides both an overall failure analysis of a system's performance and a mission-based failure analysis. The adaptive mission planning and analysis (AMPA) method presented here uses physics-based governing equations to identify the system's overall behavior during both nominal and faulty conditions. The AMPA method is unique in that it first identifies a specific failure or combination of failures within a system and then determines how each failure scenario will affect the system's overall performance characteristics, i.e., its functionality. AMPA then uses this failure information to assess and optimize the various missions the system may be asked to perform. The method is designed to identify functional failures of a given system and, depending on the types of failures that have occurred and the tasks the system will be asked to perform, to identify the optimal functional approach for successfully completing its mission. Ultimately, this method could be applied in situ, using sensor data rather than simulations, to allow autonomous systems to adapt to failures automatically: by using the remaining healthy components in a new or different way to compensate for the faulty ones, a system can extend its lifespan and optimize the chance of mission completion.
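
The core idea of checking whether a mission survives a failure scenario can be sketched with a function-to-component map. This toy example (component names and the feasibility test are assumptions, not the AMPA equations) asks whether the remaining healthy components still cover every required function.

```python
def mission_feasible(required_functions, component_functions, failed):
    """Return True if the functions required by a mission are still
    provided by at least one healthy component after the given failures."""
    healthy = {c: funcs for c, funcs in component_functions.items()
               if c not in failed}
    available = set().union(*healthy.values()) if healthy else set()
    return required_functions <= available

# redundant pumps: losing one still leaves 'pressurize' available
parts = {"pump_a": {"pressurize"}, "pump_b": {"pressurize"}, "valve": {"route"}}
print(mission_feasible({"pressurize", "route"}, parts, failed={"pump_a"}))  # → True
```

A planner would iterate this check over candidate missions and failure combinations, then pick the mission variant with the best chance of completion.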

J. Comput. Inf. Sci. Eng. 2017;17(4):041006-041006-19. doi:10.1115/1.4036121.

As engineering change (EC) is an inevitable activity in industry and consumes substantial engineering design resources, the management of EC has become a crucial discipline. In current research, most of the data related to product design changes are scattered across different forms, and product data are acquired manually from various files during EC management, which is time-consuming and error-prone. In this work, a design change-oriented model-based definition (DCMBD) model is defined as the sole data source. Based on the proposed DCMBD model, this work presents a method to acquire product changes automatically and evaluate design change propagation proactively in a uniform way. The objective of the proposed method is to manage ECs effectively and efficiently. In this paper, first, the DCMBD model is defined specifically, recording the product data: geometry, material, tolerances and annotations, relations of product items, lifecycle data, etc. Then, based on the defined DCMBD model, algorithms are presented to automatically acquire two types of product change: parameter changes and topology face changes. Next, relation models for the product items (parameters and topology faces) are demonstrated. After that, change propagation in terms of parameters and topology faces is clarified. Meanwhile, indices of parameter change influence (PCI) and topology face change influence (TFCI) are presented to evaluate the change impact. Finally, a prototype system for product design change is developed, and a case study shows how the proposed method can be applied to product design change.
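
Change propagation over related parameters is essentially graph traversal. The sketch below walks a parameter dependency graph breadth-first; using the size of the affected set as an influence score is a crude stand-in for the paper's PCI index, and the parameter names are invented.

```python
def propagate_change(dependencies, changed):
    """Collect every parameter reachable from the changed one through
    the dependency graph (adjacency lists), i.e., the propagation set."""
    affected, frontier = {changed}, [changed]
    while frontier:
        param = frontier.pop()
        for dep in dependencies.get(param, ()):
            if dep not in affected:
                affected.add(dep)
                frontier.append(dep)
    return affected

# changing the bore drives the piston diameter, which drives two more parameters
deps = {"bore": ["piston_dia"], "piston_dia": ["ring_dia", "mass"]}
print(propagate_change(deps, "bore"))
```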

J. Comput. Inf. Sci. Eng. 2017;17(4):041007-041007-11. doi:10.1115/1.4035530.

Incomplete component information may lead to wide bounds for system reliability prediction, making decisions difficult in the system design stage. The missing information is often the component dependence, which is a crucial source for exact system reliability estimation. Component dependence exists due to the shared environment and operating conditions, but it is difficult for system designers to model because they may have limited information about component design details if outside suppliers designed and manufactured the components. This research intends to produce narrow system reliability bounds by giving system designers a new way to consider component dependence implicitly and automatically, without knowing component design details. The proposed method is applicable to a wide range of applications where a time-dependent stochastic system load is shared by the components of the system. Simulation is used to obtain the extreme value of the system load for a given period of time, and optimization is employed to estimate the system reliability bounds, which are narrower than those from the traditional method with its assumptions of independent or completely dependent components. Examples are provided to demonstrate the proposed method.
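
The traditional bounds the paper improves on are worth making concrete. For a series system, assuming complete dependence gives the largest single component failure probability, while assuming independence gives the complement of the joint survival product; the true probability for shared-load (positively dependent) components lies between these. This baseline computation, not the paper's simulation-plus-optimization method, is sketched below.

```python
def series_system_bounds(component_failure_probs):
    """Classical failure-probability bounds for a series system with
    unknown dependence: (fully dependent bound, independent bound)."""
    lower = max(component_failure_probs)          # completely dependent
    survive = 1.0
    for p in component_failure_probs:
        survive *= (1.0 - p)                      # independent survival
    upper = 1.0 - survive
    return lower, upper

# three components sharing one stochastic load
print(series_system_bounds([0.01, 0.02, 0.03]))
```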

J. Comput. Inf. Sci. Eng. 2017;17(4):041008-041008-9. doi:10.1115/1.4036556.

Finite element analysis (FEA) has been one of the most successful tools for studying the mechanical behavior of biological materials. There are many instances where creating FE models requires extensive time and effort, including the analysis of tree branches with complex geometries and varying mechanical properties. Once an FE model of a tree branch is created, the model is not applicable to another branch, and all the modeling steps must be repeated for each new branch with a different geometry and, in some cases, material. In this paper, we describe a new program, "Immediate-TREE," and its associated graphical user interface (GUI). This program provides researchers a fast and efficient tool for creating finite element analyses of a large variety of tree branches. Immediate-TREE automates the process of creating finite element models through computer-generated Python files: it takes tree branch data (geometry, mechanical, and material properties) and generates Python files, which are then run in the finite element analysis software (abaqus) to complete the analysis. Immediate-TREE is approximately 240 times faster than creating the same model directly in the FEA software (abaqus). This process can be used with a large variety of biological applications, including analyses of bones and teeth as well as other biological materials.
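
Generating per-branch analysis scripts from data is a templating task. The sketch below writes one script per branch from a template; the field names and template are hypothetical, standing in for the abaqus input scripts Immediate-TREE actually emits.

```python
from pathlib import Path

TEMPLATE = """# auto-generated branch analysis script (illustrative)
branch_length = {length}
branch_diameter = {diameter}
elastic_modulus = {modulus}
"""

def generate_script(branch, out_dir="."):
    """Write a Python analysis script for one branch from its geometry
    and material data, returning the path to the generated file."""
    path = Path(out_dir) / f"branch_{branch['id']}.py"
    fields = {k: branch[k] for k in ("length", "diameter", "modulus")}
    path.write_text(TEMPLATE.format(**fields))
    return path

script = generate_script(
    {"id": 1, "length": 1.2, "diameter": 0.04, "modulus": 9.0e9})
print(script.read_text())
```

Batch generation over hundreds of branches then reduces to a loop over branch records, which is where the reported speedup comes from.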

J. Comput. Inf. Sci. Eng. 2017;17(4):041009-041009-7. doi:10.1115/1.4036923.

Existing techniques for motion imitation often suffer from a certain level of latency due to their computational overhead or the large set of correspondence samples to be searched. To achieve real-time imitation with small latency, we present a framework in this paper to reconstruct motion on humanoids based on sparsely sampled correspondence. The imitation problem is formulated as finding the projection of a point from the configuration space of a human's poses into the configuration space of a humanoid. An optimal projection is defined as the one that minimizes a back-projected deviation among a group of candidates, which can be determined in a very efficient way. Benefiting from this formulation, effective projections can be obtained using sparsely sampled correspondence, for which a generation scheme is also introduced in this paper. Our method is evaluated by applying human motion captured by an RGB-depth (RGB-D) sensor to a humanoid in real time. Continuous motion can be realized, as shown in the example application of teleoperation.
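
A stripped-down version of projecting through sampled correspondence: among stored (human pose, robot pose) pairs, return the robot pose whose human sample deviates least from the observed pose. This nearest-sample lookup is a toy stand-in for the paper's back-projected-deviation minimization, with made-up two-dimensional poses.

```python
def project_pose(human_pose, correspondence_samples):
    """Pick the robot pose whose paired human sample is closest (squared
    Euclidean distance) to the observed human pose."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best_human, best_robot = min(
        correspondence_samples, key=lambda pair: sq_dist(pair[0], human_pose))
    return best_robot

# sparse samples mapping (toy) human poses to named robot poses
samples = [((0.0, 0.0), "stand"), ((1.0, 0.0), "reach"), ((0.0, 1.0), "wave")]
print(project_pose((0.9, 0.1), samples))  # → reach
```

Because only the sparse sample set is searched, the per-frame cost stays small, which is the property the framework exploits for real-time imitation.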

J. Comput. Inf. Sci. Eng. 2017;17(4):041010-041010-11. doi:10.1115/1.4036615.

The goal in this paper is to enable collaboration in the codesign of engineering artifacts when participants are reluctant to share their design-related confidential and proprietary information with other codesigners, even though such information is needed to analyze and validate the overall design. We demonstrate the viability of codesign by multiple entities who view the parameters of their contributions to the joint design to be confidential. In addition to satisfying this confidentiality requirement, an online codesign process must result in a design that is of the same quality as if full sharing of information had taken place between the codesigners. We present online codesign protocols that satisfy both requirements and demonstrate their practicality using a simple example of codesign of an automotive suspension system and the tires. Our protocols do not use any cryptographic primitives—they only use the kinds of mathematical operations that are currently used in single-designer situations. The participants in the online design protocols include the codesigners and a cloud server that facilitates the process while learning nothing about the participants' confidential information or about the characteristics of the codesigned system. The only assumption made about this cloud server is that it does not collude with some participants against other participants. We do not assume that the server refrains from computing, on its own, as much information as it can about the confidential inputs and outputs of the codesign process: it can make a transcript of the protocol and later attempt to infer all possible information from it, so it is a feature of our protocols that the cloud server can infer nothing from such a transcript.
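
The flavor of crypto-free confidentiality can be illustrated with additive masking: each designer perturbs its private value with a random mask, the server totals only masked values, and the designers jointly remove the masks afterward. This is a generic illustration of hiding inputs from a non-colluding server using ordinary arithmetic, not the paper's actual protocol.

```python
import random

def masked_sum(secret_values, modulus=2**32):
    """Sum private values so that the aggregating server never sees any
    individual value: it only ever handles masked shares."""
    masks = [random.randrange(modulus) for _ in secret_values]
    masked = [(v + m) % modulus for v, m in zip(secret_values, masks)]
    server_total = sum(masked) % modulus           # all the server learns
    return (server_total - sum(masks)) % modulus   # designers jointly unmask

# three designers contribute confidential parameters; the total is public
print(masked_sum([11, 22, 33]))  # → 66
```

Note the non-collusion assumption from the abstract is what keeps the masks and the masked values apart: a server conspiring with a mask holder could subtract them out.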

J. Comput. Inf. Sci. Eng. 2017;17(4):041011-041011-9. doi:10.1115/1.4036558.

This article proposes the use of polytopes in $HV$-description to solve tolerance analysis problems. Polytopes are defined by a finite set of half-spaces representing geometric, contact, or functional specifications. However, the list of vertices of a polytope is needed for computing other operations, such as Minkowski sums. This paper therefore proposes a truncation algorithm to obtain the $V$-description of polytopes in $\mathbb{R}^n$ from their $H$-description. It is detailed how intersections of polytopes can be calculated by means of the truncation algorithm. Minkowski sums can likewise be computed with this algorithm by making use of the duality property of polytopes: a Minkowski sum can be calculated by intersecting some half-spaces in the dual space. Finally, the approach based on $HV$-polytopes is illustrated by the tolerance analysis of a real industrial case using the open-source software politocat and politopix.
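
The H-to-V conversion can be shown in the plane, where it reduces to intersecting pairs of boundary lines and keeping the points that satisfy every half-space. This brute-force enumeration is only an illustration of what a V-description is; the paper's truncation algorithm works in $\mathbb{R}^n$ and is far more efficient.

```python
from itertools import combinations

def polytope_vertices_2d(halfspaces, tol=1e-9):
    """Vertices of a 2D polytope given as half-spaces a*x + b*y <= c:
    intersect each pair of boundary lines (Cramer's rule) and keep
    intersection points satisfying all constraints."""
    verts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(halfspaces, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < tol:
            continue  # parallel boundary lines: no vertex
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + tol for a, b, c in halfspaces):
            verts.append((round(x, 9), round(y, 9)))
    return sorted(set(verts))

# unit square: x >= 0, y >= 0, x <= 1, y <= 1
square = [(-1, 0, 0), (0, -1, 0), (1, 0, 1), (0, 1, 1)]
print(polytope_vertices_2d(square))
```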

J. Comput. Inf. Sci. Eng. 2017;17(4):041012-041012-10. doi:10.1115/1.4037738.

Topology optimization has been considered a promising tool for conceptual design due to its capability of generating innovative design candidates without depending on the designer's intuition and experience. Various optimization methods have been developed over the years, and one promising option is the level-set-based topology optimization method. The benefit of this method is that the design is characterized by its clear boundaries. This advantage largely avoids the postprocessing work of the conventional topology optimization process and enables direct integration between topology optimization and additive manufacturing (AM). In this paper, practical algorithms and a matlab-based open source framework are developed to seamlessly integrate the level-set-based topology optimization procedure with the AM process by converting the design to STereoLithography (STL) files, the de facto standard format for three-dimensional (3D) printing. The proposed algorithm and code are evaluated by a proof-of-concept demonstration with 3D printing of both single-material and multimaterial topology optimization results. The algorithm and the open source framework proposed in this paper will be beneficial to the areas of computational design and AM.
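
The final hand-off to a printer is an STL file, whose ASCII form is simple enough to sketch. The writer below (Python rather than the paper's matlab framework, and with zero normals, which most slicers recompute) shows the file structure a triangulated design boundary is serialized into.

```python
def write_ascii_stl(triangles, path="design.stl"):
    """Write triangles (each a 3-tuple of (x, y, z) vertices) to an
    ASCII STL file named 'solid design'."""
    lines = ["solid design"]
    for tri in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid design")
    with open(path, "w") as f:
        f.write("\n".join(lines))
    return path

# a single triangle in the z = 0 plane
write_ascii_stl([((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

For multimaterial results, one file per material region is the usual workaround, since plain STL carries geometry only.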
