Using a Simulation Model for Better Execution
THK of America broke the Plant Simulation plan down into three major steps:
- Capture floor data to build an accurate model.
- Validate the simulation model to ensure accuracy.
- Generate optimization parameters that allow the simulation to increase efficiency.
Three Steps Towards a Working Model
For the first goal, capturing floor data, the team faced a plethora of challenges. Because THK’s rails are highly customizable, many different properties can affect processing time and overall workflow.

THK solved this by building a timing library that includes all of the machines, part types, and processing times. The team spent long hours working closely with production staff to fully understand the intricacies of each step in the cutting process.
To tie this all together, the team used custom SimTalk functions to parse each rail and determine what would happen to it before, during, and after being cut.
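As a rough illustration of what such a timing library captures, the sketch below shows a lookup of processing time by machine and part type. It is written in TypeScript purely for illustration (THK’s implementation lives in SimTalk inside Plant Simulation), and every machine name, part type, and time value is an assumption, not THK data.

```typescript
// Illustrative sketch of a timing-library lookup; the real logic is SimTalk
// inside Plant Simulation. All names and values below are assumptions.

interface TimingEntry {
  machine: string;            // cutting machine identifier
  partType: string;           // rail part type
  setupMinutes: number;       // machine preparation before the cut
  cutMinutesPerRail: number;  // per-rail cutting time
  postMinutes: number;        // handling after the cut
}

const timingLibrary: TimingEntry[] = [
  { machine: "CUT-01", partType: "HSR-25", setupMinutes: 12, cutMinutesPerRail: 3.5, postMinutes: 2 },
  { machine: "CUT-02", partType: "SHS-35", setupMinutes: 15, cutMinutesPerRail: 4.2, postMinutes: 2.5 },
];

// Total processing time for a batch of rails of one part type on one machine.
function batchTime(machine: string, partType: string, railCount: number): number {
  const entry = timingLibrary.find(
    (e) => e.machine === machine && e.partType === partType,
  );
  if (!entry) throw new Error(`No timing data for ${partType} on ${machine}`);
  return entry.setupMinutes + railCount * entry.cutMinutesPerRail + entry.postMinutes;
}

console.log(batchTime("CUT-01", "HSR-25", 20)); // 12 + 20 * 3.5 + 2 = 84 minutes
```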
The next major milestone was to validate the simulation model. Because of the variability in the product flow, the team took daily historical data and ran it through the simulation. Using the Experiment Manager, each day was run as an experiment that generated multiple possible scenarios, accounting for machines under maintenance, employees taking the day off, and so on. The team kept iterating on the model until the simulation results fell within a satisfactory threshold of the real-life daily results.
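The validation check itself can be pictured as a simple comparison of simulated versus recorded daily output. The sketch below shows one way to flag days that fall outside a tolerance; the 10% threshold and field names are assumptions for illustration, not THK’s actual acceptance criteria.

```typescript
// Compare each simulated day against the recorded actual output and return
// the days that miss a relative tolerance (illustrative sketch only).

interface DailyResult {
  date: string;      // e.g. "2023-05-04"
  railsCut: number;  // rails completed that day
}

function findOutliers(
  simulated: DailyResult[],
  actual: Map<string, number>,   // date -> actual rails cut
  tolerance = 0.1,               // assumed 10% threshold for illustration
): { date: string; simulated: number; actual: number }[] {
  const outliers: { date: string; simulated: number; actual: number }[] = [];
  for (const day of simulated) {
    const real = actual.get(day.date);
    if (real === undefined) continue; // no historical record for this day
    const relativeError = Math.abs(day.railsCut - real) / real;
    if (relativeError > tolerance) {
      outliers.push({ date: day.date, simulated: day.railsCut, actual: real });
    }
  }
  return outliers; // days where the model still needs tuning
}
```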
“I came from not knowing what Plant Simulation was, to implementing this solution in just six months. Learning from the Siemens Xcelerator Academy, reading through the forums, and doing a deep dive on the user manual all helped my team get up to speed quickly so we could produce this output.”
Daniel Abdelsamed, Manager DX team, THK Manufacturing
After that, the results are captured in external software that sends them to the central DX data system. As an overview, Plant Simulation runs the optimizer and generates results. A JavaScript server running on Windows then captures those results via Plant Simulation’s COM interface. The same COM interface is used to start the optimizer on request or on a schedule, such as once a day.
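The server-side flow can be sketched roughly as follows, assuming Node.js 18 or later (where fetch is built in). The runOptimizer() function is only a stand-in for the COM interaction with Plant Simulation, and the DX endpoint URL is a hypothetical placeholder.

```typescript
// Simplified sketch of the capture-and-forward loop described above.
// runOptimizer() stands in for the COM-based call into Plant Simulation,
// and the endpoint URL is a hypothetical placeholder, not THK's actual API.

async function runOptimizer(): Promise<object> {
  // Placeholder: the real server triggers Plant Simulation's optimizer over
  // COM on Windows and reads back the generated schedule.
  return { generatedAt: new Date().toISOString(), groups: [] };
}

async function runAndForward(): Promise<void> {
  const results = await runOptimizer();
  // Forward the optimizer output to the central DX data system.
  await fetch("https://dx.example.internal/api/optimizer-results", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(results),
  });
}

// Time-based trigger, matching the once-a-day schedule mentioned above;
// on-request runs would simply call runAndForward() directly.
setInterval(runAndForward, 24 * 60 * 60 * 1000);
```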
Finally, all this data ends up in the central DX data system, a custom system developed to handle all of THK’s data from Plant Simulation and all other sources; in essence, a single source of truth for digital information. The DX system is built on Linux servers running Kubernetes, which allows THK to run all applications (including Plant Simulation) in a redundant, resilient way. This central system outputs a data graph that other applications can consume.
From Simulation to Better Execution
THK Manufacturing: Siemens Digital Industries Software’s Tecnomatix Plant Simulation provides the means to improve both automated and manual processes

In the end, the simulation generates rail groupings, each of which is run as a single batch on a single machine. Each grouping specifies exactly how much stock has to be released from the warehouse and which drop rail to use. In addition, the warehouse team knows exactly when the order has to be released to stay on schedule. Once the drop rail arrives at the cut process, the cut team knows when processing of that group needs to start.
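One possible shape for such a grouping record is sketched below; the field names are assumptions chosen to mirror the description above, not THK’s actual schema.

```typescript
// Illustrative data structure for a generated rail grouping (assumed fields).

interface RailGroup {
  groupId: string;
  machine: string;          // the single machine that runs this batch
  partType: string;
  stockToRelease: number;   // quantity the warehouse must release
  dropRailId?: string;      // drop rail to reuse, if one is assigned
  releaseBy: string;        // ISO timestamp: when the warehouse must release the order
  cutStartAt: string;       // ISO timestamp: when the cut team must start this group
  orderIds: string[];       // customer orders satisfied by this batch
}

const example: RailGroup = {
  groupId: "G-1042",
  machine: "CUT-01",
  partType: "HSR-25",
  stockToRelease: 8,
  dropRailId: "DR-553",
  releaseBy: "2024-03-12T06:30:00Z",
  cutStartAt: "2024-03-12T08:00:00Z",
  orderIds: ["SO-99110", "SO-99134"],
};
```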
Providing Manufacturing Information
Successfully running a process requires information. At THK, the following sample applications pull information from the central DX data system.
A web application shows the warehouse team what needs to be released next and when. They can see which orders are on each cart and what stock needs to be released for it. They can also print the cut plan for the operators to use.
The operators can view digital signage at the machines. These displays show the operators what needs to be run and what its current priority is, and they are kept up to date in case a high-priority order comes through. The operators also use the plans printed by the warehouse team to set up the machines for the cuts.
The scheduling team can access the current orders and their machine allocations. They can analyze the simulation forecast and update order due dates and future process scheduling to account for it. They can also download the order queue dataset for further analysis.
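A client application pulling this data from the central DX data system might look roughly like the sketch below. The endpoint URL and response shape are hypothetical placeholders for illustration, not THK’s actual API.

```typescript
// Illustrative sketch of fetching the order queue from the DX data system.
// The endpoint and fields are hypothetical, chosen to match the description above.

interface QueuedOrder {
  orderId: string;
  machine: string;        // machine allocation from the simulation
  dueDate: string;        // ISO timestamp
  forecastFinish: string; // finish time forecast by the simulation (ISO timestamp)
}

async function fetchOrderQueue(): Promise<QueuedOrder[]> {
  const response = await fetch("https://dx.example.internal/api/order-queue");
  if (!response.ok) {
    throw new Error(`DX system returned ${response.status}`);
  }
  return (await response.json()) as QueuedOrder[];
}

// Example: list orders whose simulation forecast runs past the due date
// (ISO timestamps compare correctly as strings).
fetchOrderQueue().then((orders) =>
  orders
    .filter((o) => o.forecastFinish > o.dueDate)
    .forEach((o) => console.log(`${o.orderId} is forecast late on ${o.machine}`)),
);
```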
Overall Results
“After only 15 days of running the system, we saw a 50% decrease in drop inventory. The old system wasn’t able to fully utilize the drop or required manual intervention to do so. Our system was able to re-release those rails more efficiently and completely autonomously.”
Daniel Abdelsamed, Manager DX team, THK Manufacturing

Productivity in the cut process increased, with more predictable output. The scheduling team also saw a decreased workload in the cut facility, freeing their time and effort for communicating with customers and forecasting.