Graduate Program in Computational Science and Engineering.
Badur, Bertan Yılmaz.
Gursu, Ali Emre.
2025-04-14
2023
PHYS 2023 O87 PhD (Thes BM 2023 H56
https://digitalarchive.library.bogazici.edu.tr/handle/123456789/21509

Deep generative models are a powerful class of machine learning models. However, training them requires substantial computing power and technical expertise, and even hyperparameter search incurs a high computational cost. Moreover, methods for evaluating generative models are still an active area of research, and owing to the lack of a robust and consistent metric, comparisons between generative model architectures and algorithms remain limited. In this study, we compare two types of generative model architectures, Generative Adversarial Networks (GANs) and Real-valued Non-Volume Preserving (RealNVP) flows, on synthetic datasets as well as on the well-known MNIST image dataset. We evaluate their ability to capture the data distribution with respect to data dimensionality and variability. We propose a Minimum Description Length (MDL) based metric to examine the effect of model complexity, measured as the model's parameter count. We report estimated Kullback-Leibler (KL) divergence values along with results for the proposed MDL-based metric. Our findings indicate that, for lower-dimensional datasets, RealNVP models can encode more data variability with fewer parameters than GANs. The proposed MDL-based metric facilitates selecting a suitable architecture, in terms of model complexity, for a given dataset based on its variability and dimensionality.

Keywords: Generative Models, Generative Adversarial Networks, RealNVP, Deep Learning.
Deep learning (Machine learning).
Generative adversarial networks.
Generative models.
An analysis on dimensionality and architecture on generative models
xii, 53 leaves
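
As a rough, illustrative sketch only (the thesis text itself is not part of this record): an MDL-style score of the kind described in the abstract could combine a model's negative log-likelihood on held-out data with a cost for encoding its parameters, and the KL divergence could be estimated by Monte Carlo from samples. The function names, the bits-per-parameter cost, and the Gaussian example below are assumptions, not the thesis's actual formulation.

import numpy as np

def mdl_score(nll_bits, param_count, bits_per_param=32.0):
    # Two-part MDL-style score (assumed form): description length of the data
    # under the model, in bits, plus the cost of encoding the model parameters.
    return nll_bits + param_count * bits_per_param

def estimated_kl(p_samples, log_p, log_q):
    # Monte Carlo estimate of KL(p || q) from samples drawn from p, given
    # callables returning log-densities under p and q.
    return float(np.mean(log_p(p_samples) - log_q(p_samples)))

# Illustrative check with two 1-D Gaussians: KL(N(0,1) || N(1,1)) = 0.5.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=100_000)
log_p = lambda s: -0.5 * s**2 - 0.5 * np.log(2 * np.pi)
log_q = lambda s: -0.5 * (s - 1.0)**2 - 0.5 * np.log(2 * np.pi)
print(estimated_kl(x, log_p, log_q))  # approximately 0.5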