Designing, developing and manufacturing high-performance, ground-breaking products requires engineers to integrate multiple disciplinary skills into an effective and efficient process of design and optimization.
Furthermore, the need to bring Industry 4.0-ready solutions to market at competitive production and operational costs demands a systemic vision capable of implementing virtual prototypes that not only emulate functional and physical behaviour, but also support decision-making in every scenario of the product's life cycle.
In this context, the engineer, in the broadest sense of the term, guarantees that proven simulation methodologies, combined with new enabling technologies, are effectively developed and productively used. Specifically, this means that anyone involved in numerical simulation and Computer Aided Engineering today must have solid expertise in their specific discipline and must keep their practical experience with the available tools up to date. The ability to use appropriate, proven models that are consistent with the utilization phase and capable of providing the necessary information is a prerequisite for contributing successfully to planning, operational and decision-making processes.
The conceptual and pre-design phases can then be organized in the best way, with the vision of the whole design process and its connections to the individual disciplines as the focal point. The people involved in each discipline are identified and assigned to specific activities, avoiding overlapping work areas. To spread knowledge, all results are stored in a shared environment, making them available to all authorized users.
Time to market needs to be improved constantly, reducing the costs of large numbers of numerical simulations and physical prototypes. From this perspective, the automation of processes is the essential step towards the Industry 4.0 environment, making it possible to reduce the man-hours dedicated to actions that could be performed by artificial intelligence.
These goals can be achieved by using platforms expressly created to manage the amount of data generated by a company.
What is Simulation Process and Data Management (SPDM)?
Simulation Process and Data Management is a technology that organises data in a shared environment. Companies today need effective management of data coming from various sources, such as numerical simulations, experimental data and financial analyses. It is because of this need that technologies such as SPDM are being adopted more and more.
Sharing data and knowledge
Technicians may be used to working alone, in a local environment, relying on cloud services and shared folders to communicate with their colleagues. When many different departments have to collaborate, issues with data sharing can arise: that is why the creation of a shared environment is one of the focal points of an SPDM system.
A company can be the best in its field thanks to the knowledge of its people: from this perspective, the loss of a technician can mean the loss of an important slice of knowledge for the whole company. An SPDM makes it possible to record and retain knowledge inside the company by tracking and storing all the essential files, reports and analyses. This could also be done with a basic database, but the power of an SPDM is that data are organized in a smart way: assigned labels and metadata structures allow an advanced search for what is needed in the shortest amount of time. In this way, it is possible to choose the most profitable solutions and shorten product-development cycles.
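The idea of label- and metadata-driven search can be sketched in a few lines. This is a minimal illustration, not a real SPDM API: the names `SimulationRecord` and `find` are assumptions made for the example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: records carry labels and metadata, and a search
# filters on both at once, which is what makes retrieval fast and precise.

@dataclass
class SimulationRecord:
    name: str
    discipline: str                      # e.g. "structural", "CFD"
    labels: set = field(default_factory=set)
    metadata: dict = field(default_factory=dict)

def find(records, labels=None, **metadata):
    """Return records matching every requested label and metadata key/value."""
    wanted = set(labels or [])
    return [
        r for r in records
        if wanted <= r.labels
        and all(r.metadata.get(k) == v for k, v in metadata.items())
    ]

records = [
    SimulationRecord("crash_v3", "structural", {"released"}, {"mesh": "fine"}),
    SimulationRecord("aero_v1", "CFD", {"draft"}, {"mesh": "coarse"}),
]
hits = find(records, labels=["released"], mesh="fine")   # only "crash_v3"
```

A real platform would back this with an indexed database, but the principle is the same: structured metadata turns a pile of files into a searchable archive.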
Processes do not only need to be shared and stored; it is also essential to have them automated: transferring data from one discipline to another by hand takes a lot of time and carries a high probability of errors. Automating the process lets the software do the dirty work, without errors, leaving higher-value actions to the human.
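A cross-discipline hand-off can be pictured as a pipeline of steps that pass their output forward automatically. This is a toy sketch with made-up discipline functions and numbers, standing in for real solvers:

```python
# Hypothetical sketch: automating the hand-off of data between disciplines.
# Each step is a function; the pipeline runs them in order without manual
# copy/paste, removing the transcription errors of a manual transfer.

def structural_loads(geometry):
    # stand-in for a structural solver: derive a load from the geometry
    return {"load_N": geometry["area_mm2"] * 0.5}

def thermal_check(loads):
    # stand-in for a thermal analysis consuming the structural output
    return {"temp_ok": loads["load_N"] < 100.0}

def run_pipeline(data, steps):
    """Feed each step's output into the next one."""
    for step in steps:
        data = step(data)
    return data

result = run_pipeline({"area_mm2": 150.0}, [structural_loads, thermal_check])
```

Each discipline keeps ownership of its own step, while the pipeline guarantees that the data arriving at the next step is exactly what the previous one produced.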
Versioning
Simulation is not a static process: it produces many different versions of the same basic design. How can we track all the changes: who made them, when, and what was changed? The versioning feature is the answer: tracking the versions of the same model builds the history of that model, giving information about its life from start to end. From this perspective, recovering an old design is immediate, since it is possible to address the right version directly.
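The who/when/what record behind versioning can be sketched as a simple history object. The names `ModelHistory`, `commit` and `recover` are illustrative assumptions, not the interface of any particular SPDM:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch of version tracking: each save records who changed
# what and when, so any older design can be recovered immediately.

@dataclass(frozen=True)
class Version:
    number: int
    author: str
    timestamp: str
    description: str    # what was changed
    payload: dict       # the model data at this version

class ModelHistory:
    def __init__(self, name):
        self.name = name
        self.versions = []

    def commit(self, author, description, payload):
        v = Version(len(self.versions) + 1, author,
                    datetime.now(timezone.utc).isoformat(),
                    description, payload)
        self.versions.append(v)
        return v.number

    def recover(self, number):
        """Address an old version directly by its number."""
        return self.versions[number - 1]

history = ModelHistory("bracket")
history.commit("ada", "initial design", {"thickness_mm": 4.0})
history.commit("ada", "thinner wall after optimization", {"thickness_mm": 3.2})
old = history.recover(1)   # immediate access to the first design
```

The full list of `Version` entries is exactly the "life of the model" described above: author, timestamp and description for every change, with the design itself attached.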
User education
The technical features are provided by the technology, but they are nothing without the correct education of users: labels, versions and metadata are suggested by the platform but must, to some extent, be filled in by a human.
The company must therefore give a clear message to the departments regarding the correct behaviour of users, in order to properly exploit the capabilities of the SPDM. Some guidelines must be outlined:
- All users must agree to the change in working methods brought by the SPDM: processes are mapped and completely clear from the beginning.
- Activities must be assigned to specific users, with their boundaries clearly defined.
- Rules about naming, folder creation, the saving of simulations and the general deployment environment must be sketched out and defined at the beginning of the project.
One important factor to consider is that the targets of individual disciplines may not be aligned with the broader business strategy: this is why awareness of the multidisciplinary framework must be maintained.
Effects on hardware and company technologies
The use of an SPDM affects not only the people who will work with it; the hardware that supports the technology must also be up to date. It is therefore fundamental that the IT department knows exactly how the SPDM will be deployed and what effect it will have on the hardware. In this context the automation of processes plays a major role: the links with software and technologies already installed must be analysed to make the connection effective. For example, most medium and large companies have platforms such as PLM, which are essential tools. The SPDM must interact with the PLM, creating a bridge towards an all-in-one collaborative way of working. The elements of the integration range from CAD and materials to product performance data and configurations.
Simulations are nothing without the comprehension of their results: post-processing is therefore an essential part of the technology. It needs to be rich enough in charts and statistics to let the user understand the behaviour of the data and then make the best decision. The automation of processes also includes the automatic creation of reports, enforcing one fixed layout that is easy to understand and shared across departments.
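The fixed-layout report idea can be illustrated with a tiny generator. Plain text stands in for the charts and statistics a real SPDM post-processor would produce; the function name and metrics are made up for the example:

```python
# Hypothetical sketch: automatic report generation with one fixed layout,
# so every department reads results in the same, predictable format.

def make_report(title, results):
    """Render simulation results into a fixed, shared report layout."""
    lines = [f"=== {title} ===", ""]
    for metric, value in sorted(results.items()):
        lines.append(f"{metric:<20} {value:>10}")
    lines += ["", f"metrics reported: {len(results)}"]
    return "\n".join(lines)

report = make_report("Crash case 42",
                     {"max_stress_MPa": 315.2, "mass_kg": 12.7})
print(report)
```

Because the layout lives in one place, every automated run produces a report that looks the same, which is what makes it easy to compare results across departments.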
Security of data
Security and privacy of data are nowadays an essential factor when a new technology is adopted. For this reason, the management of security barriers is well developed in an SPDM. Storage, access and viewing of data are available only to users with the appropriate permissions, granted by the administrator of the platform. Permissions can be changed from project to project in order to replicate real situations.
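Per-project permissions can be sketched as a simple grant table checked on every access. The class and method names here are illustrative assumptions, not a real platform's security API:

```python
# Hypothetical sketch of per-project permissions: the administrator grants
# each user a set of rights ("view", "edit", ...) on each project, and
# every access is checked against that grant.

class PermissionManager:
    def __init__(self):
        self._grants = {}   # (user, project) -> set of rights

    def grant(self, user, project, *rights):
        """Administrator action: give a user rights on one project."""
        self._grants.setdefault((user, project), set()).update(rights)

    def allowed(self, user, project, right):
        """Check performed before any storage, access or viewing of data."""
        return right in self._grants.get((user, project), set())

pm = PermissionManager()
pm.grant("alice", "wing_design", "view", "edit")
pm.grant("bob", "wing_design", "view")     # read-only on this project
```

Keying the grants on the (user, project) pair is what lets permissions change from project to project, as described above.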
Simulation Process and Data Management technology is the up-to-date development platform for companies that want to be part of the Industry 4.0 revolution.
Its features meet contemporary needs across the new pillars of technology: shared knowledge, security of data, smart exploitation of hardware resources and automation of processes.
The objective is to cover all activities in order to organize and manage engineering workflows and to support the management of data, whether as input for or results of computational engineering tasks and tools. In these times of remote work it is fundamental to understand the importance of technologies able to preserve a company's productivity without losing time and resources.