R2021x DELMIA GA Cloud Role Portfolio Work Safety Engineer (EWK-OC)

Create virtual workers with standard anthropometry (human body dimensions) to evaluate human interaction with the workplace

Work Safety Engineer (EWK) lets planners create virtual workers with standard anthropometry (human body dimensions). They can place these workers into a 3D model and evaluate human interaction with a workplace. EWK makes it simple to define common tasks using predefined actions such as picking up and placing objects, walking, operating a device, or using a tool in a manufacturing environment. EWK delivers a wide range of manipulation and analysis tools to make ergonomics assessment accessible to all 3DEXPERIENCE® stakeholders.

Benefits

Save costs by early integration of human factors in workplace design

EWK makes it simple to place a virtual worker in an immersive environment to see how humans will perform in a manufacturing context. Planners can do ergonomics validation early in the design phase to avoid costly changes later.

 

Reduce risk of injury

Designers can avert workplace injuries by identifying potential ergonomics problems early.

 

Capture and reuse ergonomic enterprise standards

A customized manikin, including its attributes and skills, can be saved for use in future evaluation scenarios. This saves time and effort for downstream users while capturing company know-how.

 

Switch target populations without updating tasks

EWK can be used to teach a manikin from one population, such as a European 80th-percentile male, a series of motion postures. After the analysis for that manikin is complete, the user can easily switch to a manikin from a different population, such as an Asian 50th-percentile female, without needing to update the taught tasks.


Highlights

Task sequencing for multiple humans, robots and devices

A human task is made up of activities that describe a worker's assignment. EWK provides functions for sequencing activities, editing the sequence, and assigning tasks so that users can evaluate how humans will interact in a manufacturing environment. Users can sequence and simulate the tasks of each individually programmed human, robot, and device to validate the synchronized behavior of the workcell.
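
As a rough illustration of this task structure only (not EWK's actual data model or API), the Python sketch below treats a task as an ordered list of activities assigned to a resource; all names and durations are hypothetical.

from dataclasses import dataclass, field

# Conceptual sketch: a task is an ordered sequence of activities assigned to
# a resource (human, robot, or device). Names and values are hypothetical.

@dataclass
class Activity:
    name: str          # e.g. "Pick", "Place", "Walk", "Operate"
    duration_s: float  # simulated time for this elementary activity

@dataclass
class Task:
    resource: str                           # e.g. "Worker_1", "Robot_A"
    activities: list[Activity] = field(default_factory=list)

    def total_time(self) -> float:
        return sum(a.duration_s for a in self.activities)

# Sequence tasks for several resources and check the workcell behavior.
tasks = [
    Task("Worker_1", [Activity("Walk", 3.0), Activity("Pick", 1.5), Activity("Place", 1.5)]),
    Task("Robot_A", [Activity("Load", 2.0), Activity("Weld", 4.0)]),
]
cycle_time = max(t.total_time() for t in tasks)  # resources run in parallel
print(f"Workcell cycle time: {cycle_time:.1f} s")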

 

Building common actions

EWK helps build activities such as reaching for an object, picking it up, or gesturing. In 3D, users can simply select the desired activity and apply it to the manikin and an object or location. The series of motions required to achieve that action is automatically generated, saving time and effort. Postures can then be fine-tuned to specific needs. Creation of human tasks follows a natural, easily understood process that makes authoring and simulating human activities adoptable by all 3DEXPERIENCE® stakeholders.

 

Time management from task level to motion elements

EWK provides default times and speeds for a human to perform basic movements. Users can change the speeds or simulation times of elementary motions to assess the overall time needed to perform a task and check it against the cycle time defined at the planning level. This allows users to simulate a task within a given time and validate its plausibility.
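
The Python sketch below illustrates this arithmetic only; the motion times, speed factor, and cycle time shown are hypothetical placeholders, not EWK's built-in defaults.

# Hypothetical elementary-motion times in seconds (not EWK defaults).
elementary_motions = {
    "reach": 1.2,
    "grasp": 0.8,
    "move_object": 2.0,
    "release": 0.5,
}
speed_factor = 0.8        # worker performs at 80% of default speed
planned_cycle_time = 6.0  # cycle time defined at the planning level, seconds

# Scale each elementary motion by the speed factor and sum the task time.
task_time = sum(t / speed_factor for t in elementary_motions.values())
verdict = "fits" if task_time <= planned_cycle_time else "exceeds"
print(f"Task time: {task_time:.2f} s ({verdict} the {planned_cycle_time:.1f} s cycle time)")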

 

Capture and reuse of basic skills or complex tasks

EWK lets users define a sequence of postures, such as kneeling and sitting.  These sequences can be saved in a library for application to a manikin, simplifying task definition. Similarly, users can save more complex tasks as templates in a library and re-use them in other contexts.

 

Associativity for automatic layout updates

The location of a Lifelike Human can be associated with a layout so that a manikin can be taught walk activities. Similarly, users can associate a human interface with an object and use it for interactions such as Get and Put. The use of human locations and human interfaces makes activities associative to the layout: if the layout is changed, associated activities are automatically adapted.

 

Easier manikin manipulation

Any part of the manikin's body can be selected and moved with the corresponding manipulator, using inverse or forward kinematics depending on the body segment selected. Users can define the way manikins reach and grasp any object. These predefined human skills can be reused for similar objects in the same virtual model.
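
As a generic illustration of forward kinematics (not DELMIA's manikin model), the sketch below computes a wrist position from two hypothetical segment lengths and joint angles.

import math

# Two-segment planar forward kinematics (e.g. upper arm + forearm).
# Segment lengths and joint angles are hypothetical.
def wrist_position(shoulder_xy, l_upper, l_fore, q_shoulder, q_elbow):
    """Wrist position given shoulder and elbow joint angles in radians."""
    sx, sy = shoulder_xy
    ex = sx + l_upper * math.cos(q_shoulder)               # elbow position
    ey = sy + l_upper * math.sin(q_shoulder)
    wx = ex + l_fore * math.cos(q_shoulder + q_elbow)      # wrist position
    wy = ey + l_fore * math.sin(q_shoulder + q_elbow)
    return wx, wy

print(wrist_position((0.0, 1.4), 0.30, 0.27, math.radians(-30), math.radians(45)))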

 

Biomechanics analysis
Based on the manikin's posture and the specified weight of an object on the manikin's segments, this analysis calculates the moments and forces applied to each joint. It also determines the percentage of the manikin's population that would be unable to perform the action.
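
The simplified Python sketch below shows the kind of static-moment arithmetic involved; the masses and lever arms are hypothetical, and EWK's biomechanics solver is more detailed.

# Simplified static-moment sketch (not EWK's solver).
# Moment at a joint = force x perpendicular distance (lever arm).
G = 9.81                # gravitational acceleration, m/s^2

object_mass = 10.0      # kg, load carried in the hands (hypothetical)
load_lever_arm = 0.45   # m, horizontal distance from the lower-back joint to the load
torso_mass = 35.0       # kg, hypothetical upper-body mass
torso_lever_arm = 0.20  # m, hypothetical lever arm of the torso's centre of mass

# Static moment about the lower-back joint, in newton-metres
moment = G * (object_mass * load_lever_arm + torso_mass * torso_lever_arm)
print(f"Lower-back moment: {moment:.0f} N*m")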

 

Push, pull, and carry analysis
EWK provides push, pull, and carry analysis based on the Snook and Ciriello equations. Distance and population samples are used as input to provide a maximum acceptable carrying weight. The pushing or pulling force can be compared with what is considered a safe force for the activity.
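
The minimal sketch below illustrates only the comparison step; the maximum acceptable force would normally come from the Snook and Ciriello tables for the chosen population sample and distance, and is supplied here as a plain input rather than reproduced.

# Compare a required push/pull force against an externally supplied
# maximum acceptable force (e.g. taken from the Snook and Ciriello tables).
def push_pull_check(required_force_n: float, max_acceptable_force_n: float) -> str:
    ratio = required_force_n / max_acceptable_force_n
    if ratio <= 1.0:
        return "acceptable"
    return f"exceeds the acceptable force by {100 * (ratio - 1):.0f}%"

print(push_pull_check(required_force_n=220.0, max_acceptable_force_n=200.0))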

 

Lifting and lowering analysis
EWK lifting and lowering analysis uses the NIOSH 1981/1991 guidelines as well as the Snook and Ciriello guidelines. Duration, frequency, the start and finish lifting postures, and coupling conditions are used as input to provide the recommended and maximum weight of the object to be lifted. The analysis results can be exported in text or HTML format.
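
For orientation, the sketch below implements the published NIOSH 1991 revised lifting equation (recommended weight limit and lifting index); EWK's implementation may differ in detail, and the frequency and coupling multipliers are assumed to be read from the NIOSH tables and passed in as inputs.

# NIOSH 1991 revised lifting equation (published form, shown for illustration).
def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm, cm):
    lc = 23.0                            # load constant, kg
    hm = 25.0 / h_cm                     # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)  # vertical multiplier
    dm = 0.82 + 4.5 / d_cm               # distance multiplier
    am = 1.0 - 0.0032 * a_deg            # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

# fm and cm are table values (frequency, coupling); the inputs are hypothetical.
rwl = recommended_weight_limit(h_cm=40, v_cm=60, d_cm=50, a_deg=0, fm=0.94, cm=1.0)
lifting_index = 12.0 / rwl   # load weight / RWL; values above 1 indicate risk
print(f"RWL = {rwl:.1f} kg, LI = {lifting_index:.2f}")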

 

Rapid upper limb assessment
In a static workplace design, planners can detect risks of work-related upper-limb disorders using the Rapid Upper Limb Assessment (RULA) survey. RULA is a screening tool that assesses biomechanical and static postural workload on the whole body, with particular attention to the neck, trunk, and upper limbs. Color-coded analysis results are displayed on the manikin's upper-body segments (neck, trunk, wrists, and arms). A RULA report can be generated and exported in text or HTML format.
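
The published RULA method maps the final grand score to four action levels; the small sketch below reproduces that mapping for illustration and is not EWK's report format.

# Action levels from the published RULA method (McAtamney & Corlett).
def rula_action_level(grand_score: int) -> str:
    if grand_score <= 2:
        return "Action level 1: posture acceptable if not held for long periods"
    if grand_score <= 4:
        return "Action level 2: further investigation needed; changes may be required"
    if grand_score <= 6:
        return "Action level 3: investigation and changes required soon"
    return "Action level 4: investigation and changes required immediately"

print(rula_action_level(5))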

 

Manikin vision assessment

A separate vision window displays the field of vision from the manikin's eyes. The line of sight, the vision cones, and the field of view can be displayed. They can be customized to simulate visual limits created by equipment such as a helmet.
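
As a purely geometric illustration (not DELMIA's vision window), the sketch below tests whether a point lies inside a vision cone defined by an eye position, a line of sight, and a half-angle.

import math

# A point is inside the vision cone if the angle between the line of sight
# and the direction to the point is below the cone's half-angle.
def in_vision_cone(eye, line_of_sight, point, half_angle_deg):
    vx, vy, vz = (p - e for p, e in zip(point, eye))
    norm_v = math.sqrt(vx * vx + vy * vy + vz * vz)
    norm_s = math.sqrt(sum(c * c for c in line_of_sight))
    cos_angle = (vx * line_of_sight[0] + vy * line_of_sight[1]
                 + vz * line_of_sight[2]) / (norm_v * norm_s)
    return math.acos(max(-1.0, min(1.0, cos_angle))) <= math.radians(half_angle_deg)

# A control panel 2 m ahead and slightly to the right of the manikin's eyes
print(in_vision_cone(eye=(0, 0, 1.6), line_of_sight=(1, 0, 0),
                     point=(2.0, 0.4, 1.5), half_angle_deg=60))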
