In numerous real-world problems, including a broad range of modeling tasks, we are faced with a diversity of locally available, distributed sources of data and expert knowledge. These sources have to be interacted with and reconciled in order to form a global, user-oriented model of the system under consideration, whose quality must also be assessed. While the technology of Computational Intelligence (CI) has been playing a vital role in this regard, a number of challenges inherently manifest themselves in these problems.
To address these challenges, in this talk we introduce the concept of information granules, which embraces a variety of formal constructs such as intervals (sets), fuzzy sets, and rough sets. We highlight the emergence of higher-type and higher-order information granules in the analysis and synthesis of granular models. The fundamental problem central to all of these investigations concerns the formation of information granules. We elaborate on the principle of justifiable granularity and discuss its role as a key design vehicle facilitating the construction of information granules on the basis of available experimental evidence (which could be either numeric or granular).
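As a brief illustration, the following is a minimal sketch of how the principle of justifiable granularity might be instantiated for interval granules built from numeric data: an interval is grown around a numeric representative (here the median) so as to maximize the product of coverage (the fraction of data falling inside the interval) and specificity (here decaying linearly with interval length). The median anchor and the linear specificity measure are illustrative assumptions; other representatives and decay functions appear in the literature.

```python
import numpy as np

def justifiable_interval(data):
    """Build an interval granule [a, b] around the median of `data` by
    maximizing coverage * specificity (one common formulation of the
    principle of justifiable granularity; choices here are illustrative)."""
    data = np.asarray(data, dtype=float)
    med = np.median(data)
    lo, hi = data.min(), data.max()

    def best_bound(candidates, anchor, extreme):
        # span over which specificity decays linearly from 1 to 0
        span = abs(extreme - anchor) or 1.0
        best, best_v = anchor, 0.0
        for c in candidates:
            low, high = sorted((anchor, c))
            coverage = np.mean((data >= low) & (data <= high))
            specificity = 1.0 - abs(c - anchor) / span
            v = coverage * specificity
            if v > best_v:
                best, best_v = c, v
        return best

    a = best_bound(data[data <= med], med, lo)   # optimize lower bound
    b = best_bound(data[data >= med], med, hi)   # optimize upper bound
    return a, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(5.0, 1.0, size=200)
    print(justifiable_interval(sample))  # an interval centered near 5
```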
We then elaborate on a number of conceptual and design issues of granular models. In particular, it is demonstrated that granular models developed on the basis of existing numeric models of CI lead to their substantial augmentation and offer comprehensive ways of evaluating their performance. Two general approaches under investigation are associated with the formation of granular parameter spaces and granular output spaces. The proposed assessment of model quality embraces two generic criteria: the coverage of experimental data and the specificity of the granular results. It is shown that a hierarchy of information granules gives rise to granular models of both higher type and higher order.
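To make the two criteria concrete, the following hedged sketch granulates the parameter space of a simple numeric linear model: each parameter is replaced by a symmetric interval of relative width eps (an assumed allocation scheme), interval outputs are obtained by elementary interval arithmetic, and the resulting granular model is scored by coverage and specificity. The function names and the linear-model setting are hypothetical, chosen only to keep the example self-contained.

```python
import numpy as np

def granular_linear_outputs(X, w, eps):
    """Granulate the parameters of a numeric linear model y = X @ w into
    intervals [w - eps*|w|, w + eps*|w|] (assumed symmetric allocation)
    and propagate them through the model with interval arithmetic."""
    w_lo, w_hi = w - eps * np.abs(w), w + eps * np.abs(w)
    # each term x*w is monotone in w, so its extremes sit at the bounds
    lo = np.minimum(X * w_lo, X * w_hi).sum(axis=1)
    hi = np.maximum(X * w_lo, X * w_hi).sum(axis=1)
    return lo, hi

def coverage_and_specificity(y, lo, hi):
    """Coverage: fraction of targets inside their output interval.
    Specificity: average interval narrowness relative to the target range."""
    coverage = np.mean((y >= lo) & (y <= hi))
    y_range = np.ptp(y) or 1.0
    specificity = np.mean(np.clip(1.0 - (hi - lo) / y_range, 0.0, 1.0))
    return coverage, specificity

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 3))
    w_true = np.array([2.0, -1.0, 0.5])
    y = X @ w_true + rng.normal(scale=0.2, size=100)
    w = np.linalg.lstsq(X, y, rcond=None)[0]   # the underlying numeric model
    for eps in (0.05, 0.1, 0.2):
        lo, hi = granular_linear_outputs(X, w, eps)
        cov, sp = coverage_and_specificity(y, lo, hi)
        print(f"eps={eps:.2f}  coverage={cov:.2f}  specificity={sp:.2f}")
```

Widening eps raises coverage while lowering specificity, which makes the tradeoff between the two criteria explicit and turns the allocation of information granularity into an optimization problem.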
The detailed investigations focus on selected problems, including rule-based models and the construction of auto-encoders in deep learning architectures.