
Accepted Manuscripts

research-article  
Xingchen Liu and Vadim Shapiro
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036552
Spatial variation of material structures is a principal mechanism for creating and controlling spatially varying material properties in nature and engineering. While the spatially varying homogenized properties can be represented by scalar and vector fields on the macroscopic scale, explicit microscopic structures of constituent phases are required to facilitate the visualization, analysis, and manufacturing of functionally graded materials (FGM). The challenge of FGM structure modeling lies in the integration of these two scales. We propose to represent and control material properties of an FGM at the macroscale using the notion of material descriptors, which include common geometric, statistical, and topological measures such as volume fraction, correlation functions, and Minkowski functionals. At the microscale, the material structures are modeled as Markov random fields: we formulate the problem of design and (re)construction of an FGM structure as a process of selecting neighborhoods from a reference FGM, based on target material descriptor fields. The effectiveness of the proposed method in generating a spatially varying structure of FGM with target properties is demonstrated by two examples: the design of a graded bone structure and the generation of functionally graded lattice structures with target volume fraction fields.
TOPICS: Design, Functionally graded materials, Materials properties, Bone, Scalars, Manufacturing, Construction, Microscale devices, Modeling, Visualization
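
A minimal sketch of the neighborhood-selection idea described above, not the authors' implementation: it matches a single descriptor (local volume fraction) when copying material from a reference structure. The function names, window size, and tolerance are all assumptions.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_volume_fraction(img, k):
    """Volume fraction of the solid phase in each k x k window."""
    windows = sliding_window_view(img, (k, k))
    return windows.mean(axis=(-1, -2))

def synthesize(reference, target_vf, k=9, seed=None):
    """Grow a structure whose local volume fraction tracks target_vf by
    copying from reference neighborhoods that best match the target."""
    rng = np.random.default_rng(seed)
    ref_vf = local_volume_fraction(reference, k).ravel()
    h, w = target_vf.shape
    out = np.zeros((h, w), dtype=reference.dtype)
    for i in range(h):
        for j in range(w):
            # candidate neighborhoods ranked by descriptor mismatch
            errs = np.abs(ref_vf - target_vf[i, j])
            best = rng.choice(np.flatnonzero(errs <= errs.min() + 1e-3))
            ri, rj = np.unravel_index(best, (reference.shape[0] - k + 1,
                                             reference.shape[1] - k + 1))
            out[i, j] = reference[ri + k // 2, rj + k // 2]
    return out

# Example: reference with its own gradient, linear target volume fraction.
rng = np.random.default_rng(0)
ref = (rng.random((64, 64)) < np.linspace(0.1, 0.9, 64)).astype(float)
target = np.tile(np.linspace(0.2, 0.8, 32), (32, 1))
out = synthesize(ref, target, seed=0)
```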
research-article  
Zahra Shahbazi, Devon Keane, Domenick Avanzi and Lance Evans
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036556
Finite element analysis (FEA) has been one of the most successful tools for studying the mechanical behavior of biological materials. There are many instances where creating FE models requires extensive time and effort. Such instances include finite element analysis of tree branches with complex geometries and varying mechanical properties. Once an FE model of a tree branch is created, the model is not applicable to another branch, and all the modeling steps must be repeated for each new branch with a different geometry and, in some cases, material. In this paper, we describe a novel program, "Immediate-TREE", and its associated graphical user interface (GUI). This program provides researchers a fast and efficient tool for finite element analysis of a large variety of tree branches. Immediate-TREE automates the creation of finite element models by generating Python files from tree branch data (geometry, mechanical and material properties). These files are then run in finite element analysis software (Abaqus) to complete the analysis. Immediate-TREE is approximately 240 times faster than creating the same model directly in the FEA software (Abaqus). This new process can be used with a large variety of biological applications, including analyses of bones and teeth, as well as non-biological materials.
TOPICS: Finite element analysis, Finite element model, Geometry, Computer software, Materials properties, Mechanical properties, Bone, User interfaces, Graphical user interfaces, Mechanical behavior, Modeling, Computers
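
The abstract above describes generating Python files from branch data and running them in Abaqus. A rough sketch of that generate-then-run pattern, assuming made-up field names and leaving the Abaqus-specific model-building calls as placeholders (the real Immediate-TREE templates are not reproduced here):

```python
from pathlib import Path

TEMPLATE = """\
# Auto-generated analysis script for branch '{name}'
# (Abaqus-specific model-building calls would go here; they are omitted
#  because the exact API usage is specific to the Immediate-TREE tool.)
BRANCH_LENGTH = {length}      # mm
BRANCH_DIAMETER = {diameter}  # mm
ELASTIC_MODULUS = {modulus}   # MPa
"""

def write_branch_script(branch, out_dir="jobs"):
    """Render one Python analysis script per branch record."""
    Path(out_dir).mkdir(exist_ok=True)
    path = Path(out_dir) / f"{branch['name']}.py"
    path.write_text(TEMPLATE.format(**branch))
    return path

# Example branch record; field names are illustrative, not Immediate-TREE's.
write_branch_script({"name": "branch_01", "length": 420.0,
                     "diameter": 18.5, "modulus": 6800.0})
```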
research-article  
Santiago Arroyave-Tobon, Denis Teissandier and Vincent Delos
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036558
This article proposes the use of polytopes in HV-description to solve tolerance analysis problems. Polytopes are defined by a finite set of half-spaces representing geometric, contact, or functional specifications. However, the list of vertices of a polytope is useful for computing other operations, such as Minkowski sums. This paper therefore proposes a truncation algorithm to obtain the V-description of a polytope in R^n from its H-description. It is detailed how intersections of polytopes can be calculated by means of the truncation algorithm. Minkowski sums can likewise be computed with this algorithm by exploiting the duality property of polytopes: a Minkowski sum can be calculated by intersecting half-spaces in the dual space. Finally, the approach based on HV-polytopes is illustrated by the tolerance analysis of a real industrial case using the open-source software PolitoCAT and politopix.
TOPICS: Tolerance analysis, Algorithms, Space, Computer software
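
For orientation, a brute-force baseline for the H-to-V conversion the paper addresses: enumerate all n-subsets of half-space boundaries of {x : Ax <= b}, solve each square system, and keep the feasible intersection points. This is exponential and serves only to illustrate the problem; the paper's truncation algorithm is the scalable alternative.

```python
import itertools
import numpy as np

def h_to_v(A, b, tol=1e-9):
    """Naive vertex enumeration for the polytope {x : A x <= b}."""
    m, n = A.shape
    verts = []
    for rows in itertools.combinations(range(m), n):
        As, bs = A[list(rows)], b[list(rows)]
        if abs(np.linalg.det(As)) < tol:
            continue  # these boundaries do not meet in a single point
        x = np.linalg.solve(As, bs)
        if np.all(A @ x <= b + tol):
            verts.append(x)  # intersection point satisfies every half-space
    return np.unique(np.round(verts, 9), axis=0)

# Unit square: x >= 0, y >= 0, x <= 1, y <= 1
A = np.array([[-1., 0.], [0., -1.], [1., 0.], [0., 1.]])
b = np.array([0., 0., 1., 1.])
print(h_to_v(A, b))  # the four corners
```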
research-article  
Morteza Shafiee
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036487
A rapidly changing environment has affected organizations' ability to maintain viability. As a result, traditional performance evaluation based on precise and deterministic data runs into problems when various criteria and uncertain situations arise in a complex environment. The purpose of this paper is to propose an applicable model for evaluating the performance of an overall supply chain network and its members. Performance evaluation methods that do not account for uncertainty obtain inferior results. To overcome this, rough set theory is used to deal with such uncertain data, and a rough non-cooperative Stackelberg DEA game is developed to construct a model for evaluating the performance of a supply chain under uncertainty. The model applies the Stackelberg (leader-follower) game concept to measure performance, and the ranking method of the non-cooperative two-stage rough DEA model is discussed. The resulting model is suitable for evaluating the performance of a supply chain network and its members operating in uncertain situations with a high degree of vagueness. The approach provides a valuable procedure for performance evaluation in other industries and offers managers useful insights on measuring supply chain efficiency in an uncertain environment. This paper creates a new perspective on the use of performance evaluation models to support managerial decision making in dynamic environments and uncertain situations.
TOPICS: Performance evaluation, Supply chains, Uncertainty, Set theory, Decision making
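
As background for readers unfamiliar with DEA, a plain crisp input-oriented CCR efficiency program in multiplier form is sketched below with scipy; the paper's rough, two-stage Stackelberg model builds on programs of this kind but is substantially richer. The toy data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR DEA efficiency of DMU o (multiplier form).
    X: inputs (n_dmu x n_in), Y: outputs (n_dmu x n_out)."""
    n, m = X.shape
    s = Y.shape[1]
    # variables: [u (s), v (m)]; maximize u . Y[o]  ->  minimize -u . Y[o]
    c = np.concatenate([-Y[o], np.zeros(m)])
    A_eq = [np.concatenate([np.zeros(s), X[o]])]   # normalization v . X[o] == 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                      # u . Y[j] - v . X[j] <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun

X = np.array([[4., 3.], [7., 3.], [8., 1.]])   # two inputs per DMU
Y = np.array([[1.], [1.], [1.]])               # one output per DMU
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```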
research-article  
Yao Cheng, Daniel C Conrad and Xiaoping Du
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4035530
Incomplete component information may lead to wide system reliability bounds, which make it difficult to make decisions during the system design stage. The missing information is often the component dependence, which is a crucial source for exact system reliability estimation. Component dependence exists due to the shared environment and operating conditions, but it is difficult for system designers to model because they may not have access to detailed component design information if the components are designed and manufactured by outside suppliers. This research intends to produce narrow system reliability bounds with a new way for system designers to consider the component dependence implicitly and automatically without knowing component design details. The proposed method is applicable for a wide range of applications where a time-dependent stochastic system load is shared by the components of the system. Simulation is used to obtain the extreme value of the system load for a given period of time, and optimization is employed to estimate the system reliability interval. As a result, the system reliability can be estimated within a narrower interval than that obtained from the traditional method with its assumptions of independent components and completely dependent components. Examples are given to demonstrate the proposed method.
TOPICS: Reliability, Stochastic processes, Design, Stress, Optimization, Simulation
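
A toy illustration, with invented distributions, of why modeling the shared time-dependent load narrows the reliability interval: the shared-load Monte Carlo estimate for a two-component series system falls between the completely-dependent and independent assumptions that the traditional method uses as bounds.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100_000, 50

# Shared stochastic load: the extreme value over a service period of T
# steps drives both components (distributions are invented for the demo).
L_max = rng.normal(100.0, 15.0, size=(N, T)).max(axis=1)

# Component capacities, sampled independently across components.
R1 = rng.normal(150.0, 10.0, size=N)
R2 = rng.normal(155.0, 12.0, size=N)

p1 = np.mean(R1 < L_max)                       # component failure probabilities
p2 = np.mean(R2 < L_max)
p_sys = np.mean((R1 < L_max) | (R2 < L_max))   # series system, shared load

dep = max(p1, p2)                # completely dependent component assumption
ind = 1 - (1 - p1) * (1 - p2)    # independent component assumption
print(f"traditional interval [{dep:.4f}, {ind:.4f}], "
      f"shared-load estimate {p_sys:.4f}")
```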
research-article  
Lijun Lan, Ying Liu and Wen Feng Lu
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036198
With the arrival of the cyber-physical world and extensive support from advanced IT infrastructure, it is nowadays possible to obtain the footprints of design activities through emails, design journals, change logs, and different forms of social data. In order to manage a more effective design process, it is essential to learn from the past by utilizing these valuable sources to understand, for example, what design tasks are actually carried out, how they interact, and how they impact each other. In this paper, a computational approach based on deep belief nets (DBN) is proposed to automatically uncover design tasks and quantify their interactions from design document archives. First, a DBN topic model with real-valued units is developed to learn a set of intrinsic topic features from a simple word-frequency-based input representation. The trained DBN model is then utilized to discover design tasks by unfolding hidden units into sets of strongly connected words, followed by estimating their interactions from their co-occurrence frequency in the hidden representation space. Finally, the proposed approach is demonstrated through a real-life case study using a design email archive spanning more than two years.
TOPICS: Design
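
A compact sketch of the DBN building block the abstract refers to: one RBM layer trained with contrastive divergence (CD-1) on word-frequency rows, with hidden-unit co-activation as a crude stand-in for the task-interaction estimate. Sizes, learning rate, and data are placeholders, not the authors' settings.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """One CD-1 RBM layer, the building block a DBN stacks; real-valued
    inputs in [0, 1] are treated as activation probabilities here."""
    def __init__(self, n_vis, n_hid, rng):
        self.rng = rng
        self.W = rng.normal(0, 0.01, (n_vis, n_hid))
        self.a = np.zeros(n_vis)
        self.b = np.zeros(n_hid)

    def fit(self, V, lr=0.05, epochs=50):
        for _ in range(epochs):
            ph = sigmoid(V @ self.W + self.b)               # hidden probs
            h = (self.rng.random(ph.shape) < ph).astype(float)
            pv = sigmoid(h @ self.W.T + self.a)             # reconstruction
            ph2 = sigmoid(pv @ self.W + self.b)
            self.W += lr * (V.T @ ph - pv.T @ ph2) / len(V)
            self.b += lr * (ph - ph2).mean(axis=0)
            self.a += lr * (V - pv).mean(axis=0)

    def transform(self, V):
        return sigmoid(V @ self.W + self.b)

# Documents as normalized word frequencies (rows); vocabulary is tiny here.
rng = np.random.default_rng(1)
docs = rng.random((200, 30))
rbm = RBM(30, 8, rng)
rbm.fit(docs)
H = rbm.transform(docs)
interaction = np.corrcoef(H.T)   # co-activation of hidden "topic" units
```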
research-article  
Sandro Barone, Alessandro Paoli and Armando V. Razionale
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036119
Different sensor technologies are available for dimensional metrology and reverse engineering processes. Tactile systems, optical sensors, and computed tomography are being used to an increasing extent in various industrial contexts. However, each technique has its own peculiarities, which may limit its usability in demanding applications. The measurement of complex shapes, such as those including hidden and twisted geometries, could be better afforded by multi-sensor systems combining the advantages of two or more data acquisition technologies. In this paper, a fully automatic multi-sensor methodology has been developed with the aim of performing accurate and reliable measurements of both external and internal geometries of industrial components. The methodology is based on tracking a customized hand-held tactile probe with a passive stereo vision system. The imaging system automatically tracks the probe by means of photogrammetric measurements of markers distributed over a plate rigidly assembled to the tactile frame. Moreover, the passive stereo system is combined with a structured light projector in order to provide full-field scanning data, which complement the point-by-point measurements. The tactile methodology has been validated by measuring primitive shapes. Moreover, the effectiveness of the integration between tactile probing and optical scanning has been demonstrated by reconstructing twisted and internal shapes of industrial impellers.
TOPICS: Probes, Impellers, Reverse engineering, Sensors, Shapes, Imaging, Dimensional metrology, Data acquisition, Computerized tomography
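
One standard way to realize the marker-based tracking step described above is a least-squares rigid registration between the probe-frame marker coordinates and their camera-frame observations (the Kabsch/Umeyama SVD method). This is a generic sketch, not necessarily the authors' solver.

```python
import numpy as np

def rigid_pose(P_ref, P_obs):
    """Least-squares rigid transform (R, t) mapping marker coordinates in
    the probe frame (P_ref) to camera-frame observations (P_obs)."""
    c_ref, c_obs = P_ref.mean(axis=0), P_obs.mean(axis=0)
    H = (P_ref - c_ref).T @ (P_obs - c_obs)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = c_obs - R @ c_ref
    return R, t

# Synthetic check: recover a known rotation about z plus a translation.
rng = np.random.default_rng(2)
P = rng.random((6, 3))
th = 0.3
R_true = np.array([[np.cos(th), -np.sin(th), 0],
                   [np.sin(th),  np.cos(th), 0],
                   [0, 0, 1]])
R, t = rigid_pose(P, P @ R_true.T + np.array([0.1, -0.2, 0.05]))
print(np.allclose(R, R_true))   # True up to numerical noise
```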
research-article  
Franca Giannini, Katia Lupinetti and Marina Monti
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036120
Content-based retrieval is particularly important for exploiting CAD model databases and online catalogues. To allow the identification of reusable part models that may fit the product under development, methods for assessing the similarity between shapes should be provided, in terms of both global and partial shape matching. In this perspective, this paper proposes a method for 3D model retrieval that works directly on CAD B-rep models, requires no conversion to triangular meshes, and, in addition to global and partial matching, allows the identification of components that may likely be assembled with a given model.
TOPICS: Computer-aided design, Shapes, Three-dimensional models, Databases, Fittings
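
The abstract gives no algorithmic detail, so the following is only a crude stand-in for a global B-rep signature: an area-weighted histogram of surface types compared by histogram intersection. The face types and data are invented.

```python
import numpy as np

FACE_TYPES = ["plane", "cylinder", "cone", "sphere", "torus", "freeform"]

def face_histogram(faces):
    """Crude global shape signature: area-weighted distribution of surface
    types over a model's B-rep faces. A stand-in for the richer descriptors
    a real B-rep matcher would extract."""
    h = np.zeros(len(FACE_TYPES))
    for kind, area in faces:
        h[FACE_TYPES.index(kind)] += area
    return h / h.sum()

def similarity(a, b):
    """Histogram intersection, in [0, 1]."""
    return np.minimum(face_histogram(a), face_histogram(b)).sum()

query = [("plane", 4.0), ("cylinder", 2.5), ("plane", 1.0)]
candidate = [("plane", 5.2), ("cylinder", 2.1), ("cone", 0.4)]
print(round(similarity(query, candidate), 3))
```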
research-article  
Leilei Yin, Dunbing Tang, Qi Wang, Inayat Ullah and Haitao Zhang
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4036121
As engineering change (EC) is an inevitable activity in industry and consumes considerable engineering design resources, the management of EC has become a crucial discipline. In current research, most of the data related to product design changes is scattered in different forms, and the product data is acquired manually from various files during EC management, which is time-consuming and error-prone. In this work, a design change-oriented model-based definition (DCMBD) model is defined as the sole data source. Based on the proposed DCMBD model, this work presents a method to acquire product changes automatically and evaluate design change propagation proactively in a uniform way. The objective of the proposed method is to manage engineering changes effectively and efficiently. First, the DCMBD model is defined specifically, recording the product data: geometry, material, tolerances and annotations, relations of product items, lifecycle data, etc. Then, based on the proposed DCMBD model, algorithms are presented to automatically acquire two types of product change: parameter changes and topology face changes. Next, relation models for the product items (parameters and topology faces) are demonstrated. After that, change propagation in terms of parameters and topology faces is clarified. Meanwhile, indices of parameter change influence and topology face change influence are presented to evaluate the change impact. Finally, a prototype system for product design change is developed, and a case study tentatively shows how the MBD-based method can be applied to product design change.
TOPICS: Change management, Product design, Topology, Design, Engineering disciplines, Engineering design, Engineering prototypes, Algorithms, Errors, Geometry
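
A minimal sketch of change propagation with an influence index, assuming a weighted dependency graph over product items; the item names, weights, and threshold are hypothetical, and the paper's DCMBD relation models are far more detailed.

```python
from collections import deque

# Directed dependency edges between product items (parameters / faces),
# with a per-edge propagation weight; names are illustrative only.
DEPENDS = {
    "hole_diameter": [("shaft_diameter", 0.9)],
    "shaft_diameter": [("bearing_bore", 0.8), ("key_width", 0.4)],
    "bearing_bore": [],
    "key_width": [],
}

def propagate(changed, threshold=0.1):
    """Breadth-first change propagation: each item's influence index is
    the strongest dependency chain linking it to the changed item."""
    influence = {changed: 1.0}
    queue = deque([changed])
    while queue:
        item = queue.popleft()
        for target, w in DEPENDS.get(item, []):
            score = influence[item] * w
            if score > influence.get(target, 0.0) and score >= threshold:
                influence[target] = score
                queue.append(target)
    return influence

print(propagate("hole_diameter"))
# {'hole_diameter': 1.0, 'shaft_diameter': 0.9,
#  'bearing_bore': 0.72, 'key_width': 0.36}
```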
research-article  
Q. J. Ge, Anurag Purwar, Ping Zhao and Shrinath Deshpande
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4035528
This paper studies the problem of planar four-bar motion approximation from the viewpoint of extraction of geometric constraints from a given set of planar displacements. Using the Image Space of planar displacements, we obtain a class of quadrics, called Generalized- or G-manifolds, with eight linear and homogeneous coefficients as a unified representation for the constraint manifolds of all four types of planar dyads: RR, RP, PR, and PP. Given a set of image points that represent planar displacements, the problem of synthesizing a planar four-bar linkage is reduced to finding a pencil of G-manifolds that best fit the image points in the least-squares sense. This least-squares problem is solved using singular value decomposition (SVD). The linear coefficients associated with the smallest singular values are used to define a pencil of quadrics. Additional constraints on the linear coefficients are then imposed to obtain a planar four-bar linkage that best guides the coupler through the given displacements. The result is an efficient and linear algorithm that naturally extracts the geometric constraints of a motion and leads directly to the type and dimensions of a mechanism for motion generation.
TOPICS: Linkages, Fittings, Manifolds, Algebra, Algorithms, Approximation, Dimensions
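
The SVD step in the abstract reduces to homogeneous least squares: stack one row per image point evaluating the eight G-manifold basis terms, then read candidate coefficient vectors from the right singular vectors with the smallest singular values. The sketch below shows only that generic step; the basis evaluation itself follows the paper, and a random matrix stands in for real data.

```python
import numpy as np

def fit_pencil(M, k=2):
    """Given M with one row per image point (each row evaluating the eight
    G-manifold basis terms at that point), solve M q ~ 0 in the
    least-squares sense: the right singular vectors belonging to the k
    smallest singular values span a pencil of best-fit quadrics."""
    _, S, Vt = np.linalg.svd(M)
    return Vt[-k:], S[-k:]

# Random stand-in for 20 displacements x 8 basis terms (real rows would
# come from the paper's image-space coordinates).
M = np.random.default_rng(3).random((20, 8))
pencil, smallest = fit_pencil(M)
print(pencil.shape, smallest)   # (2, 8) coefficient vectors, residual scales
```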
research-article  
Charlie Destefano and David Jensen
J. Comput. Inf. Sci. Eng   doi: 10.1115/1.4034739
This paper presents a new method for complex system failure analysis and adaptive mission planning that provides both an overall failure analysis of a system's performance and a mission-based failure analysis. The Adaptive Mission Planning and Analysis (AMPA) method presented here uses physics-based governing equations to identify the system's overall behavior during both nominal and faulty conditions. The AMPA method is unique in that it first identifies a specific failure or combination of failures within a system and then determines how each failure scenario will affect the system's overall performance characteristics, i.e., its functionality. AMPA then uses this failure information to assess and optimize the various missions the system may be asked to perform. The method is designed to identify functional failures of a given system and then, depending on the types of failures that have occurred and the tasks the system will be asked to perform, identify the optimal functional approach needed to successfully complete its mission. Ultimately, this method could be applied in situ, using sensor data rather than simulations, to allow autonomous systems to adapt to failures automatically, that is, to use the remaining healthy components in a new or different way to compensate for the faulty components, extending the system's lifespan and optimizing the chance of mission completion.
TOPICS: Complex systems, Failure, Failure analysis, Performance characterization, Physics, Sensors, Simulation, Engineering simulation
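
A minimal sketch of the re-planning idea, assuming each function can be realized by alternative component sets and a mission is feasible when every required function still has a healthy realization. All component and function names are hypothetical.

```python
# Each function can be realized by alternative component sets; after
# failures, pick a realization that avoids every failed component.
REALIZATIONS = {
    "provide_thrust": [{"motor_A"}, {"motor_B"}],
    "sense_attitude": [{"imu"}, {"gyro", "magnetometer"}],
}

def replan(mission_functions, failed):
    """Return a healthy realization per required function, or None if the
    mission is infeasible with the current failures."""
    chosen = {}
    for fn in mission_functions:
        options = [r for r in REALIZATIONS[fn] if r.isdisjoint(failed)]
        if not options:
            return None
        chosen[fn] = min(options, key=len)  # prefer the simplest healthy option
    return chosen

print(replan(["provide_thrust", "sense_attitude"],
             failed={"motor_A", "imu"}))
# {'provide_thrust': {'motor_B'}, 'sense_attitude': {'gyro', 'magnetometer'}}
```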
