How to Generate Synthetic Paintings to Improve Art Style Classification

Sarah Pires Pérez, Fabio Gagliardi Cozman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Indexing artwork is not only tedious; given the volume of art available online, it is an impossible task to complete manually. Automatic classification of art styles is also challenging, however, due to the relative scarcity of labeled data and the complexity of the subject matter. This complexity means that common data augmentation techniques may not generate useful data and may in fact degrade performance in practice. In this paper, we use Generative Adversarial Networks for data augmentation to improve the accuracy of an art style classifier, showing that we can improve the performance of EfficientNet B0, a state-of-the-art classifier. To achieve this result, we introduce Class-by-Class Performance Analysis; we also present a modified version of the SAGAN training configuration that allows better control against mode collapse and vanishing gradients in the context of artwork.
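The paper's Class-by-Class Performance Analysis is not detailed in this abstract; as a rough illustration of the general idea, the sketch below computes per-class accuracy on a validation set and flags art styles whose accuracy falls below a threshold as candidates for GAN-based augmentation. The function names, the threshold value, and the selection rule are illustrative assumptions, not the authors' actual procedure.

```python
from collections import defaultdict

def per_class_accuracy(y_true, y_pred):
    """Compute accuracy separately for each art-style class."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    return {c: correct[c] / total[c] for c in total}

def classes_to_augment(y_true, y_pred, threshold=0.6):
    """Flag classes whose accuracy falls below the threshold
    as candidates for synthetic-data augmentation (illustrative rule)."""
    acc = per_class_accuracy(y_true, y_pred)
    return sorted(c for c, a in acc.items() if a < threshold)

# Hypothetical validation labels: "cubism" is often misclassified.
y_true = ["baroque", "baroque", "cubism", "cubism", "cubism"]
y_pred = ["baroque", "baroque", "cubism", "baroque", "baroque"]
print(classes_to_augment(y_true, y_pred))  # → ['cubism']
```

Under this reading, only the under-performing styles would receive GAN-generated paintings, avoiding unnecessary synthetic data for classes the network already handles well.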

Original language: English
Title of host publication: Intelligent Systems - 10th Brazilian Conference, BRACIS 2021, Proceedings, Part 2
Editors: André Britto, Karina Valdivia Delgado
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 16
ISBN (Print): 9783030916985
State: Published - 2021
Externally published: Yes
Event: 10th Brazilian Conference on Intelligent Systems, BRACIS 2021 - Virtual, Online
Duration: 29 Nov 2021 - 3 Dec 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13074 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 10th Brazilian Conference on Intelligent Systems, BRACIS 2021
City: Virtual, Online


  • Art style classification
  • Computer vision
  • GAN


