Review Article

J. Comput. Inf. Sci. Eng. 2016;17(1):010801-010801-10. doi:10.1115/1.4034325.

Exchanging computer-aided design (CAD) model data among heterogeneous CAD systems is indispensable for collaborative product development. Currently, industry mainly uses standardized neutral-file-based methods to implement such exchange, while the application of web ontology language (OWL) files and the underlying semantic web technologies to CAD model data exchange is gaining importance and popularity in academia. The coexistence of these different types of methods has generated a series of controversies and questions within industry and academia: Can the neutral-file-based exchange methods completely implement model data exchange among heterogeneous CAD systems? What challenges have been addressed to date by the developed CAD model data exchange standards? Why has OWL been introduced to CAD model data exchange? Does CAD model data exchange really need OWL? Are there any issues in existing neutral-file-based and OWL file-based exchange methods that need to be addressed in future studies? This paper conducts a study of the standardized neutral-file-based exchange methods and the OWL file-based exchange methods. An in-depth analysis of the widely used standard for the exchange of product model data (STEP) method and the newly emerging OWL methods is first provided. The paper then makes a detailed comparison between these two types of methods based on this analysis. Finally, some issues in the two types of methods that need to be addressed in the future are discussed.
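
As a rough, purely illustrative sketch of the kind of machine-interpretable semantics that motivates the OWL-based approaches surveyed above, the following Python snippet uses the rdflib library to express one CAD feature as OWL/RDF statements. The vocabulary (cad:Hole, cad:hasDiameter) is hypothetical and is not drawn from STEP or from any ontology discussed in the article.

```python
# Illustrative sketch only: a minimal example of expressing CAD feature data as
# OWL/RDF triples with rdflib. The ontology terms (cad:Hole, cad:hasDiameter)
# are hypothetical placeholders.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL, XSD

CAD = Namespace("http://example.org/cad#")

g = Graph()
g.bind("cad", CAD)

# Declare an OWL class for a machining feature and a datatype property.
g.add((CAD.Hole, RDF.type, OWL.Class))
g.add((CAD.hasDiameter, RDF.type, OWL.DatatypeProperty))
g.add((CAD.hasDiameter, RDFS.domain, CAD.Hole))
g.add((CAD.hasDiameter, RDFS.range, XSD.double))

# Describe one feature instance of a part model.
g.add((CAD.hole_17, RDF.type, CAD.Hole))
g.add((CAD.hole_17, CAD.hasDiameter, Literal(6.5, datatype=XSD.double)))

# Serialize to an OWL-compatible syntax that another system or a reasoner can
# consume; this explicit semantics is what neutral files such as STEP do not
# directly provide.
print(g.serialize(format="turtle"))
```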


Research Papers

J. Comput. Inf. Sci. Eng. 2016;17(1):011001-011001-7. doi:10.1115/1.4034267.
OPEN ACCESS

The research presented here describes an industry case study of the use of immersive virtual reality (VR) as a general design tool with a focus on the decision making process. A group of design and manufacturing engineers, who were involved in an active new product development project, were invited to participate in three design reviews in an immersive environment. Observations, interviews, and focus groups were conducted to evaluate the effect of using this interface on decision making in early product design. Because the team members were actively engaged in a current product design task, they were motivated to use the immersive technology to address specific challenges they needed to solve to move forward with detailed product design. This case study takes the approach of asking not only what users can do from a technology standpoint but also how their actions in the virtual environment influence decision making. The results clearly show that the team identified design issues and potential solutions that were not identified or verified using traditional computer tools. The design changes that were the outcome of the experience were implemented in the final product design. Another result was that software familiarity played a significant role in the comfort level and subsequent effectiveness of the team discussions. Finally, participants commented on how the immersive VR environment encouraged an increased sense of team engagement that led to better discussions and fuller participation of the team members in the decision process.

J. Comput. Inf. Sci. Eng. 2016;17(1):011002-011002-7. doi:10.1115/1.4034434.

In particle finite element simulations, a continuous body is represented by a set of particles that carry all physical information of the body, such as the deformation. To form this body, the boundary of the particle set needs to be determined. This is accomplished by the α-shape method, where the crucial parameter α controls the level of detail of the detected shape. However, in solid mechanics, it can be observed that α has an influence on the structural integrity as well. In this paper, we study a single boundary segment of a body during a deformation and show that α can be interpreted as the maximum stretch of this segment. On the continuum level, a relation between α and the eigenvalues of the right Cauchy–Green tensor is presented.
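
The stretch interpretation summarized above can be sketched in standard continuum-mechanics notation; the authors' exact derivation may differ in detail.

```latex
% Sketch in standard continuum-mechanics notation; not necessarily the
% authors' exact formulation.
\[
  \lambda = \frac{l}{L_0} \le \alpha ,
  \qquad
  \lambda(\mathbf{N}) = \sqrt{\mathbf{N} \cdot \mathbf{C}\,\mathbf{N}} ,
  \qquad
  \mathbf{C} = \mathbf{F}^{\mathsf{T}}\mathbf{F} .
\]
% A boundary segment of reference length L_0 and current length l has stretch
% lambda; the alpha-shape criterion removes the segment once its stretch
% exceeds alpha, so alpha acts as a maximum admissible stretch. Since the
% stretch of a material direction N is governed by the right Cauchy--Green
% tensor C = F^T F, the same bound limits the largest eigenvalue mu_max of C:
% sqrt(mu_max(C)) <= alpha.
```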

J. Comput. Inf. Sci. Eng. 2016;17(1):011003-011003-9. doi:10.1115/1.4034324.

Tracking refers to a set of techniques that allows one to calculate the position and orientation of an object with respect to a global reference coordinate system in real time. A common method for tracking with point clouds is the iterative closest point (ICP) algorithm, which relies on the continuous matching of sequential sampled point clouds with a reference point cloud. Modern commodity range cameras provide point cloud data that can be used for that purpose. However, this point cloud data is generally considered low-fidelity and insufficient for accurate object tracking. Mesh reconstruction algorithms can improve the fidelity of the point cloud by reconstructing the overall shape of the object. This paper explores the potential for point cloud fidelity improvement via the Poisson mesh reconstruction (PMR) algorithm and compares the accuracy with a common ICP-based tracking technique and a local mesh reconstruction operator. The results of an offline simulation are promising.
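
As a hedged illustration of such a pipeline (not the authors' implementation), the sketch below uses the open-source Open3D library to reconstruct a mesh from a low-fidelity camera frame via Poisson reconstruction, resample it into a denser point cloud, and run point-to-point ICP against a reference cloud. File names and parameter values are placeholders.

```python
# Illustrative offline sketch, not the authors' implementation: Open3D is used
# to smooth a low-fidelity depth-camera frame via Poisson surface reconstruction
# before ICP alignment. File names are placeholders.
import open3d as o3d
import numpy as np

reference = o3d.io.read_point_cloud("reference_scan.ply")   # placeholder file
frame = o3d.io.read_point_cloud("camera_frame.ply")         # placeholder file

# Poisson reconstruction needs oriented normals; estimate them from neighbors.
frame.estimate_normals(
    o3d.geometry.KDTreeSearchParamHybrid(radius=0.02, max_nn=30))
mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    frame, depth=8)

# Resample the reconstructed mesh to obtain a denoised, denser point cloud.
improved_frame = mesh.sample_points_uniformly(number_of_points=50000)

# Point-to-point ICP between the improved frame and the reference cloud.
threshold = 0.01                      # max correspondence distance (scene units)
init = np.eye(4)                      # initial guess for the rigid transform
result = o3d.pipelines.registration.registration_icp(
    improved_frame, reference, threshold, init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("Estimated pose:\n", result.transformation)
print("Fitness / inlier RMSE:", result.fitness, result.inlier_rmse)
```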

J. Comput. Inf. Sci. Eng. 2016;17(1):011004-011004-15. doi:10.1115/1.4034472.

When implementing configuration management methods, the amount of data required can make validating changes to the database problematic. This is especially true for rule-based configuration management techniques. This paper presents a graph visualization tool to assist in validating changes to the rule database. The development and implementation of the tool is presented, along with the execution and results of two user studies designed to test specific aspects of the support tool. The paper then presents how the visualization tool was implemented for four ongoing configuration changes at an original equipment manufacturer (OEM) to demonstrate the effectiveness of the tool in assisting with the validation of configuration changes.

Topics: Visualization
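
For illustration only, the sketch below shows how configuration rules might be represented and inspected as a directed graph using the networkx and matplotlib libraries. The rules and the dependency check are hypothetical and do not reproduce the tool described in the paper.

```python
# Illustrative sketch only: representing product-configuration rules as a
# directed graph so that a proposed rule change can be inspected visually.
# The rules themselves are hypothetical placeholders.
import networkx as nx
import matplotlib.pyplot as plt

# Each rule links an option to options it requires or excludes.
rules = [
    ("Engine:V6",  "Transmission:HD", "requires"),
    ("Engine:V6",  "Cooling:Large",   "requires"),
    ("Trim:Sport", "Engine:V6",       "requires"),
    ("Trim:Eco",   "Engine:V6",       "excludes"),
]

g = nx.DiGraph()
for source, target, relation in rules:
    g.add_edge(source, target, relation=relation)

# A proposed change (e.g., dropping the V6 engine) can be checked by looking
# at everything that depends on the affected node before the database is edited.
affected = sorted(nx.ancestors(g, "Engine:V6"))
print("Options whose rules reference Engine:V6:", affected)

# Simple visualization of the rule network for a design review.
pos = nx.spring_layout(g, seed=1)
nx.draw_networkx(g, pos, node_color="lightgray", font_size=8)
nx.draw_networkx_edge_labels(
    g, pos, edge_labels=nx.get_edge_attributes(g, "relation"), font_size=7)
plt.axis("off")
plt.show()
```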
J. Comput. Inf. Sci. Eng. 2016;17(1):011005-011005-8. doi:10.1115/1.4034129.

Environmental effectiveness refers to the influence on and harm to products and materials resulting from various environmental factors. When products are used in complex real-world environments, environmental effectiveness raises a series of urgent engineering problems that must be addressed. However, environmental effectiveness is not extensively studied, and it is not sufficiently considered in the process of product reliability design and analysis. To address these issues, we apply an ontology and rule-reasoning method to design an ontology-based environmental effectiveness knowledge application system. The system comprises four layers: ontology, reasoning, data storage, and knowledge application. With this system, specific measures for possible product failures caused by the environment can be deduced on the basis of existing environment and failure data. The system can satisfy the requirements for extracting useful environmental effectiveness knowledge from large volumes of data to assist reliability designers in producing complete reliability designs, and a semi-intelligent analysis of environmental effectiveness can be applied to reliability analysis and design work. Finally, a case study of a rubber seal for environmental protection design is presented to illustrate the applications of the system.
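
A minimal, purely illustrative sketch of the kind of rule-based deduction described above is given below. The rules, environmental factors, failure modes, and measures are hypothetical placeholders rather than content from the paper's ontology or rule base.

```python
# Minimal sketch, not the paper's system: a toy rule base mapping an observed
# environment and failure mode to suggested reliability-design measures.
RULES = [
    # (environmental factor, failure mode)   -> suggested measure
    (("high_temperature", "rubber_hardening"), "select a heat-resistant elastomer (e.g., FKM)"),
    (("ozone_exposure",   "surface_cracking"), "add an ozone-protective wax or switch to EPDM"),
    (("high_humidity",    "swelling"),         "verify material compatibility; add drainage"),
]

def deduce_measures(environment_factors, observed_failures):
    """Return design measures whose rule matches the given environment/failure data."""
    measures = []
    for (factor, failure), measure in RULES:
        if factor in environment_factors and failure in observed_failures:
            measures.append(measure)
    return measures

# Example query for a rubber seal operating in a hot, ozone-rich environment.
print(deduce_measures(
    environment_factors={"high_temperature", "ozone_exposure"},
    observed_failures={"surface_cracking"}))
# -> ['add an ozone-protective wax or switch to EPDM']
```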

J. Comput. Inf. Sci. Eng. 2016;17(1):011006-011006-13. doi:10.1115/1.4034130.

Computer-aided design (CAD) models of thin-walled solids such as sheet metal or plastic parts are often reduced dimensionally to their corresponding midsurfaces for quicker and fairly accurate computer-aided engineering (CAE) analysis. Computation of the midsurface is still a time-consuming and mostly manual task due to the lack of robust and automated techniques. Most of the existing techniques work on the final shape (typically in the form of a boundary representation, B-rep). Complex B-reps make it hard to detect the subshapes for which the midsurface patches are computed and joined, forcing the use of hard-coded heuristic rules developed on a case-by-case basis. Midsurface failures manifest in the form of gaps, overlaps, a midsurface that does not mimic the input model, etc., and can take hours or even days to correct. The research presented here proposes to address these problems by leveraging the feature information available in modern CAD models and by effectively using techniques such as simplification, abstraction, and decomposition. In the proposed approach, first, the irrelevant features are identified and removed from the input feature-based CAD (FbCAD) model to compute its simplified gross shape. The remaining features then undergo abstraction to transform into their corresponding generic Loft-equivalents, each having a profile and a guide curve. The model is then decomposed into cellular bodies, and a graph is populated with cellular bodies at the nodes and fully overlapping surface interfaces at the edges. The nodes are classified into midsurface-patch-generating nodes (called "solid cells" or sCells) and interaction-resolving nodes ("interface cells" or iCells). In an sCell, a midsurface patch is generated either by offset or by sweeping the midcurve of the owner Loft-feature's profile along its guide curve. Midsurface patches are then connected in the iCells in a generic manner, resulting in a well-connected midsurface with minimal failures. The output midsurface is then validated topologically for correctness. At the end of this paper, real-life parts are used to demonstrate the efficacy of the proposed approach.
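
The cell-graph bookkeeping described above can be sketched as follows. Geometry operations are replaced by string placeholders and the cell data are hypothetical, so this only illustrates the sCell/iCell classification and patch-connection flow, not the authors' CAD-kernel implementation.

```python
# Structural sketch only: a toy version of the cell-graph bookkeeping described
# in the abstract, with real geometry (offsets, sweeps, trimming) replaced by
# string placeholders. The cell data and classification rule are hypothetical.
import networkx as nx

# Decomposed cellular bodies: each cell records which Loft-equivalent
# feature(s) own it. Cells owned by exactly one feature generate a midsurface
# patch (sCell); cells where features interact resolve connections (iCell).
cells = {
    "c1": {"owners": ["base_plate"]},
    "c2": {"owners": ["rib"]},
    "c3": {"owners": ["base_plate", "rib"]},   # interaction region
}
overlapping_interfaces = [("c1", "c3"), ("c2", "c3")]

graph = nx.Graph()
for name, data in cells.items():
    kind = "sCell" if len(data["owners"]) == 1 else "iCell"
    graph.add_node(name, kind=kind, owners=data["owners"])
graph.add_edges_from(overlapping_interfaces)

# Generate one midsurface patch per sCell (placeholder for offsetting, or
# sweeping the midcurve of the owner feature's profile along its guide curve).
patches = {n: f"midsurface_patch({graph.nodes[n]['owners'][0]})"
           for n in graph if graph.nodes[n]["kind"] == "sCell"}

# Resolve connections in each iCell by joining the patches of its sCell neighbors.
for n in graph:
    if graph.nodes[n]["kind"] == "iCell":
        incident = [patches[m] for m in graph.neighbors(n) if m in patches]
        print(f"{n}: connect {incident}")
```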

J. Comput. Inf. Sci. Eng. 2016;17(1):011007-011007-12. doi:10.1115/1.4034034.

There are many real-life processes whose smart control requires processing context information. Though processing time-varying context information is addressed in the literature, domain-independent solutions for reasoning about time-varying process scenarios are scarce. This paper proposes a method for dynamic context computation concerning spatial and attributive information. Context is interpreted as a body of information dynamically created by a pattern of entities and relationships over a history of situations. Time is conceived as a causative force capable of changing situations and acting on people and objects. The invariant and variant spatial information is captured by a two-dimensional spatial feature representation matrix (SFR-matrix). The time-dependent changes in the context information are computed based on a dynamic context information (DCI) management hyper-matrix. This humble but powerful representation lends itself to quasi-real-time computing and is able to provide information about foreseeable happenings over multiple situations. Based on this, the reasoning mechanism proposed in this paper is able to provide informative instructions for users who need to be informed in a dynamically changing situation. This paper uses the practical case of the evacuation of a burning building both as an explorative case for conceptualizing the functionality of the computational mechanism and as a demonstrative and testing application. Our intention is to use the dynamic context computation mechanism as a kernel component of a reasoning platform for informing cyber-physical systems (I-CPSs). Our future research will address the issue of context information management for multiple interrelated spaces.
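
A minimal sketch of how an SFR-matrix and a DCI hyper-matrix might be encoded is given below using NumPy arrays. The entities, attributes, and encoding are assumptions made for illustration and may differ from the authors' representation.

```python
# Minimal sketch, assuming simple encodings: the SFR-matrix and DCI hyper-matrix
# are illustrated as NumPy arrays; the actual encoding of entities, relations,
# and situations used by the authors may differ.
import numpy as np

# 2D spatial feature representation (SFR) matrix: rows = spatial entities
# (rooms, corridors, exits), columns = attributes/relations; the invariant
# layout information is filled once.
entities = ["room_A", "corridor_1", "exit_west"]
attributes = ["connected_to_corridor_1", "has_exit", "occupancy"]
sfr = np.array([
    [1, 0, 3],     # room_A: connected to corridor_1, no exit, 3 occupants
    [0, 1, 0],     # corridor_1: leads to an exit
    [0, 1, 0],     # exit_west
], dtype=float)

# Dynamic context information (DCI) hyper-matrix: one SFR-like slice per
# situation (time step), so time-dependent changes are stacked along axis 0.
n_situations = 4
dci = np.repeat(sfr[np.newaxis, :, :], n_situations, axis=0)

# A time-varying hazard (e.g., smoke reaching corridor_1 at t = 2) is written
# into the corresponding slices and can be queried to re-plan an instruction.
dci[2, 1, 2] = -1.0    # mark corridor_1 as unusable from situation 2 onward
dci[3, 1, 2] = -1.0

for t in range(n_situations):
    usable = dci[t, 1, 2] >= 0
    print(f"t={t}: route via corridor_1 {'allowed' if usable else 'blocked'}")
```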

J. Comput. Inf. Sci. Eng. 2016;17(1):011008-011008-11. doi:10.1115/1.4034131.

Exchange and reuse of three-dimensional (3D) product models are hampered by the absence of trust in product-lifecycle data quality. The root cause of the missing trust is years of "silo" functions (e.g., engineering, manufacturing, and quality assurance) using independent and disconnected processes. Those disconnected processes result in data exchanges that do not contain all of the required information for each downstream lifecycle process, which inhibits the reuse of product data and results in duplicate data. The X.509 standard, maintained by the Telecommunication Standardization Sector of the International Telecommunication Union (ITU-T), was first issued in 1988. Although originally intended as the authentication framework for the X.500 series for electronic directory services, the X.509 framework is used in a wide range of implementations outside the originally intended paradigm. These implementations range from encrypting websites to software-code signing, yet X.509 certificate use has not widely penetrated engineering and product realms. Our approach does not attempt to provide security mechanisms; rather, and equally important, it aims to provide insight into what is happening with product data in order to support trust in that data. This paper provides a review of the use of X.509 certificates and proposes a solution for embedding X.509 digital certificates in 3D models for authentication, authorization, and traceability of product data. This paper also describes an application within the aerospace domain. Finally, the paper draws conclusions and provides recommendations for further research into using X.509 certificates in product lifecycle management (PLM) workflows to enable a product lifecycle of trust.
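
A conceptual sketch of signing and verifying 3D model data with an X.509 certificate, using the Python cryptography package, is shown below. It illustrates the general mechanism rather than the embedding workflow proposed in the paper, and the self-signed certificate stands in for one issued by a proper certificate authority.

```python
# Conceptual sketch only, not the workflow proposed in the paper: sign the bytes
# of a 3D model file and bundle the signature with an X.509 certificate so a
# downstream consumer can verify authenticity and traceability.
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Self-signed certificate for demonstration; a real PLM deployment would use a
# certificate issued by an organizational or external certificate authority.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "design-release-authority")])
cert = (
    x509.CertificateBuilder()
    .subject_name(name).issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(key, hashes.SHA256())
)

model_bytes = b"...contents of the released 3D model file..."   # placeholder payload

# Sign the model data; the certificate plus signature travel with the model.
signature = key.sign(model_bytes, padding.PKCS1v15(), hashes.SHA256())

# A consumer verifies the signature against the certificate's public key;
# any tampering with the model data raises InvalidSignature.
cert.public_key().verify(signature, model_bytes, padding.PKCS1v15(), hashes.SHA256())
print("model data verified against certificate:", cert.subject.rfc4514_string())
```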

J. Comput. Inf. Sci. Eng. 2016;17(1):011009-011009-14. doi:10.1115/1.4034435.

A method is presented for formulating and numerically integrating ordinary differential equations of motion for nonholonomically constrained multibody systems. Tangent space coordinates are defined in configuration and velocity spaces as independent generalized coordinates that serve as state variables in the formulation, yielding ordinary differential equations of motion. Orthogonal dependent coordinates and velocities are used to enforce constraints at the position, velocity, and acceleration levels. Criteria that assure accuracy of constraint satisfaction and well conditioning of the reduced mass matrix in the equations of motion are used as the basis for updating local coordinates on the configuration and velocity constraint manifolds, transparent to the user and at minimal computational cost. The formulation is developed for multibody systems with nonlinear holonomic constraints and nonholonomic constraints that are linear in the velocity coordinates and nonlinear in the configuration coordinates. A computational algorithm for implementing the approach is presented and used in the solution of three examples: one planar and two spatial. Numerical results using a fifth-order Runge–Kutta–Fehlberg explicit integrator verify that accurate results are obtained, satisfying all three forms of kinematic constraint to within error tolerances that are embedded in the formulation.
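
The constraint structure described above can be sketched in standard multibody notation; the authors' notation and tangent-space construction may differ in detail.

```latex
% Sketch in standard multibody notation; the authors' formulation may differ
% in detail. Generalized coordinates q, nonlinear holonomic constraints, and
% nonholonomic constraints that are linear in the velocities:
\[
  \Phi(q) = 0 , \qquad B(q)\,\dot{q} = 0 .
\]
% Velocity- and acceleration-level forms of the holonomic constraints:
\[
  \Phi_{q}(q)\,\dot{q} = 0 , \qquad
  \Phi_{q}(q)\,\ddot{q} = -\bigl(\Phi_{q}(q)\,\dot{q}\bigr)_{q}\,\dot{q} .
\]
% Tangent-space coordinates v parameterize the constraint manifolds through a
% basis U(q) of the null space of the combined constraint Jacobian, so that
% dot{q} = U(q) v and the constrained equations of motion reduce to ordinary
% differential equations
\[
  \bar{M}(q)\,\dot{v} = \bar{Q}(q, v) , \qquad \bar{M} = U^{\mathsf{T}} M\, U ,
\]
% which can be integrated with an explicit fifth-order Runge--Kutta--Fehlberg
% scheme while the dependent coordinates are corrected so that the position,
% velocity, and acceleration constraints remain satisfied.
```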

J. Comput. Inf. Sci. Eng. 2016;17(1):011010-011010-5. doi:10.1115/1.4035000.

To improve the quality of point cloud data, as well as maintain edge and detail information in the course of filtering intensity data, a three-dimensional (3D) diffusion filtering equation based on the general principle of diffusion filtering is established in this paper. Moreover, we derive theoretical formulas for the scale parameter and maximum iteration number and achieve self-adaptive denoising, fine control of the point cloud filtering, and accurate prediction of the diffusion convergence. Through experiments with three types of typical point cloud intensity data, the theoretical formulas for the scale parameter and iteration number are verified. Comparative experiments with point cloud data of different types show that the 3D diffusion filtering method has significant denoising and edge-preserving abilities. Compared with the traditional median filtering algorithm, the signal-to-noise ratio (SNR) of the filtered point cloud is increased by more than 10% on average, and by more than 40% in the best case.
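
As a generic illustration of diffusion filtering on point cloud intensity data (not the paper's filter, whose scale parameter and iteration number are derived theoretically), the following sketch applies a Perona–Malik-style neighborhood diffusion using NumPy and SciPy; the scale parameter and iteration count here are chosen ad hoc.

```python
# Generic sketch, not the paper's filter: a simple neighborhood diffusion of
# point-cloud intensity values with an edge-stopping (Perona-Malik-style)
# conductance, to illustrate the kind of 3D diffusion filtering discussed.
import numpy as np
from scipy.spatial import cKDTree

def diffuse_intensity(points, intensity, k_scale=10.0, n_neighbors=8,
                      n_iter=20, dt=0.1):
    """One scalar intensity value per 3D point; returns the filtered values."""
    tree = cKDTree(points)
    # Neighbor indices (first column is the point itself, so drop it).
    _, idx = tree.query(points, k=n_neighbors + 1)
    idx = idx[:, 1:]

    u = intensity.astype(float)
    for _ in range(n_iter):
        grad = u[idx] - u[:, None]                   # differences to neighbors
        cond = np.exp(-(grad / k_scale) ** 2)        # small flux across strong edges
        u = u + dt * np.sum(cond * grad, axis=1)     # explicit diffusion step
    return u

# Toy usage: noisy intensities on random points.
rng = np.random.default_rng(0)
pts = rng.random((500, 3))
noisy = 100.0 + 5.0 * rng.standard_normal(500)
filtered = diffuse_intensity(pts, noisy)
print("intensity std before/after:", noisy.std(), filtered.std())
```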

