
Interactive control of manufacturing assemblies with Mixed Reality

Liverani, Alfredo; Amati, Giancarlo; Caligiana, Gianni
2006

Abstract

The aim of this work is the development of an efficient methodology for real-time control of human assembly sequences of mechanical components. The method involves a CAD environment, a hardware system referred to as a PAA (Personal Active Assistant), and a set of Mixed Reality features. The whole scheme is designed to strengthen the connection between CAD and Mixed Reality in order to reduce the gap between engineers and manual operators. The system is based on a CAD assembly module and on Mixed Reality wearable equipment. It can be used to improve several activities in the industrial field, such as operator professional training, optimal assembly sequence seeking, or on-field teleconferencing (suitable for remote collaboration or for full exploitation of Concurrent Engineering suggestions during the design and set-up stages). The main characteristic of the PAA is a real-time wireless link to a remote server or designer workstation, where the project's geometric database is stored. The Mixed Reality wearable equipment consists of an optical see-through display device and a PAA head-mounted camera. The user can operate freely in the mixed environment, while the camera records the human-driven assembly sequence and checks its efficiency and correctness via object recognition: an incremental sub-assembly detection algorithm has been developed to achieve complex dataset monitoring. Conversely, the designer or assembly planner can exploit the peculiarities of Mixed Reality-based assembly: straightforward interaction with the assembly operator can be obtained by sending vocal advice or by displaying superimposed visual information on the real scene. In this paper a new method for integrating CAD models with a Mixed Reality environment is presented and discussed, with the aim of improving and simplifying personnel training and warehouse part seeking.
Files in this record:
Attachments, if any, are not shown

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/11585/27606
Warning! The data displayed have not been validated by the university.

Citations
  • PMC: ND
  • Scopus: 7
  • Web of Science (ISI): 8