Abstract
In recent years, the rapid development of computational science, software, and the internet, together with the widespread availability of high-capacity storage devices, has greatly expanded the use of computational science across many fields. The amount of data generated and archived by computer simulations, engineering experiments, and information collected from the internet has grown enormously, to the point where traditional data processing methods can no longer handle it and new approaches are required; as a result, data mining has developed rapidly.

The purpose of this study is to develop a standard operating procedure for data mining, written in Python, that is highly extensible and easy to compile. With this procedure, a predicting model corresponding to a massive amount of engineering input data can be obtained using artificial neural network algorithms, and the engineering problem can then be optimized using genetic algorithms. In this study, 130 sets of data collected from an existing study in the literature, each containing four input parameters and two output targets, were fed into the proposed data mining procedure to establish the corresponding predicting model. The root mean square of the relative error of the two output targets is 2.27% and 3.97%, respectively, which verifies the reliability of the predicting model established by the proposed procedure. Furthermore, this study uses the Rastrigin function to verify the feasibility and correctness of the proposed procedure in searching for the optimal combination of input parameters, and also discusses how varying the important control parameters of the two algorithms affects the results.

Finally, this study establishes a predicting model and performs multi-objective optimization using 64 sets of data simulated by ANSYS. The root mean square relative errors of the three output targets are 1.72%, 2.42%, and 4.51%, respectively, which again verifies the reliability of the predicting model. Using genetic algorithms, the multi-objective optimization yields values of 483.28, 359.33, and 176.39, all of which are lower than the corresponding minima in the dataset, verifying the multi-objective optimization procedure once more.
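To illustrate the verification step described above, the following is a minimal Python sketch, not the code developed in this study: a simple real-coded genetic algorithm searching for the minimum of the Rastrigin function, whose global optimum is 0 at the origin. The population size, crossover rate, mutation rate, and other settings are illustrative assumptions and do not reflect the control-parameter values examined in the thesis.

import numpy as np

def rastrigin(x):
    # Rastrigin function: f(x) = 10*n + sum(x_i^2 - 10*cos(2*pi*x_i))
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

def genetic_algorithm(fitness, dim=4, bounds=(-5.12, 5.12),
                      pop_size=60, generations=200,
                      crossover_rate=0.9, mutation_rate=0.1, seed=0):
    # Illustrative parameter values only; they are not the settings used in this study.
    rng = np.random.default_rng(seed)
    low, high = bounds
    pop = rng.uniform(low, high, size=(pop_size, dim))

    for _ in range(generations):
        scores = fitness(pop)

        # Tournament selection: keep the better of two randomly chosen individuals.
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((scores[i] < scores[j])[:, None], pop[i], pop[j])

        # Arithmetic (blend) crossover between randomly paired parents.
        partners = parents[rng.permutation(pop_size)]
        alpha = rng.uniform(size=(pop_size, 1))
        children = np.where(rng.random((pop_size, 1)) < crossover_rate,
                            alpha * parents + (1 - alpha) * partners,
                            parents)

        # Gaussian mutation, clipped back into the search bounds.
        mask = rng.random(children.shape) < mutation_rate
        children = np.clip(children + mask * rng.normal(0.0, 0.3, children.shape),
                           low, high)

        # Elitism: carry the best individual of the current generation forward.
        children[0] = pop[np.argmin(scores)]
        pop = children

    scores = fitness(pop)
    best = np.argmin(scores)
    return pop[best], scores[best]

if __name__ == "__main__":
    x_best, f_best = genetic_algorithm(rastrigin)
    # The result should lie close to the origin with an objective value near 0.
    print("best x =", np.round(x_best, 3), "f(x) =", round(float(f_best), 4))

In the actual procedure, the Rastrigin function would be replaced by the artificial-neural-network predicting model trained on the engineering data, so that the genetic algorithm searches for the optimal combination of the four input parameters.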