Utilizing Sensitivity Analysis in Data Generation for Machine Learning Training

Author

  • Chrissula Bermperi, Hochschule Bielefeld

Abstract

This poster explores sensitivity analysis, a technique for attributing uncertainty in a mathematical model's output to variations in its inputs. It focuses on sensitivity indices, particularly the total-order index, which quantifies each input parameter's overall contribution to the variance of the model's output, including interaction effects. The main objective is to identify and remove parameters with minimal influence, thereby simplifying the model. The study centers on the Adapted Rosenbrock function, a challenging optimization problem often used for algorithm testing. Multiple neural networks were trained on datasets generated with varying input configurations. Employing the SALib Python library, the study followed a modular approach to sensitivity analysis. Initial findings suggest that insignificant parameters can potentially be excluded, but further investigation with extended training and larger datasets is necessary for conclusive insights.
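The total-order index described above can be estimated by a variance-based (Sobol) method, as implemented in SALib. The following is a minimal NumPy-only sketch of the Jansen estimator applied to the standard two-dimensional Rosenbrock function; the poster's "Adapted" Rosenbrock variant, its input bounds, and the exact SALib workflow are not specified here, so all of those details are illustrative assumptions.

```python
import numpy as np

def rosenbrock(X):
    # Standard 2-D Rosenbrock function; stands in for the poster's
    # (unspecified) Adapted Rosenbrock function.
    x, y = X[:, 0], X[:, 1]
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

def total_order_indices(model, bounds, n=4096, seed=0):
    """Estimate total-order Sobol indices with the Jansen estimator.

    bounds: list of (low, high) pairs, one per input dimension.
    """
    rng = np.random.default_rng(seed)
    d = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    # Two independent sample matrices over the input domain.
    A = lo + (hi - lo) * rng.random((n, d))
    B = lo + (hi - lo) * rng.random((n, d))
    fA = model(A)
    var = np.var(np.concatenate([fA, model(B)]))
    ST = np.empty(d)
    for i in range(d):
        # Replace column i of A with column i of B and measure
        # how much the output changes: the total effect of input i.
        AB = A.copy()
        AB[:, i] = B[:, i]
        ST[i] = np.mean((fA - model(AB)) ** 2) / (2.0 * var)
    return ST

# Illustrative bounds; the poster does not state the actual domain.
ST = total_order_indices(rosenbrock, [(-2.0, 2.0), (-1.0, 3.0)])
# Inputs whose total-order index is close to zero are candidates
# for removal, which is the simplification step the poster proposes.
```

In SALib the same analysis would use its Saltelli sampler and `SALib.analyze.sobol`, which return the total-order indices in the `ST` field of the result; the estimator above mirrors that computation without the library dependency.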

Published

03.06.2024