Enriching synthetic data with real noise using Neural Style Transfer

Keywords

Machine learning
Geophysics
Neural style transfer

How to Cite

TAKEMOTO, Naomi; COIMBRA, Tiago; ARAÚJO, Lucas; TYGEL, Martin; AVILA, Sandra; BORIN, Edson. Enriching synthetic data with real noise using Neural Style Transfer. Revista dos Trabalhos de Iniciação Científica da UNICAMP, Campinas, SP, n. 27, p. 1–1, 2019. DOI: 10.20396/revpibic2720192342. Available at: https://econtents.bc.unicamp.br/eventos/index.php/pibic/article/view/2342. Accessed: 25 Apr. 2024.

Abstract

Deep Learning experiments require large amounts of labeled data, but few annotated seismic datasets are available and annotation is a time-consuming, expensive activity. Synthetic modeled datasets may be a viable alternative; however, they lack the variability and intricacies of a real data signal, and methods that simply add colored noise are not enough to represent that variability. Our goal is therefore to produce a noise type characteristic of real data, creating a synthetic dataset suitable for training Deep Learning models. In this context, we apply the Neural Style Transfer technique, which combines the structural content of one image with the textural style of another, to produce synthetic data with noise characteristics extracted from a real dataset. The results show that the stylized synthetic seismic data preserves the modeled content while incorporating characteristics of the real data chosen as the style, yielding synthetic data with a more realistic noise profile.
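To make the idea concrete, the sketch below shows how a Gatys-style transfer (Gatys et al., 2016) could be applied to seismic sections: VGG19 feature maps of a synthetic section provide the content target, Gram matrices of a real-data patch provide the style target, and the pixels of the stylized output are optimized against both. This is a minimal illustration assuming a PyTorch/torchvision stack; the layer indices, loss weights, and optimizer are illustrative choices, not the configuration used in this work.

```python
# Minimal sketch of Gatys-style Neural Style Transfer adapted to seismic
# images. Assumes `content_img` (a synthetic section) and `style_img` (a real
# data patch) are 2-D NumPy arrays scaled to [0, 1]. Layer indices, weights,
# and the optimizer are illustrative; ImageNet normalization is omitted.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"

def to_tensor(img_2d):
    """Turn a 2-D array into the 3-channel batch tensor VGG expects."""
    t = torch.as_tensor(img_2d, dtype=torch.float32)
    return t.repeat(3, 1, 1).unsqueeze(0).to(device)

def gram(feat):
    """Gram matrix of a feature map: the texture ('style') statistics."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def extract(model, x, layers):
    """Run x through the VGG feature stack, keeping selected activations."""
    feats, out = {}, x
    for i, layer in enumerate(model):
        out = layer(out)
        if i in layers:
            feats[i] = out
    return feats

def style_transfer(content_img, style_img, steps=300,
                   content_weight=1.0, style_weight=1e4):
    vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.to(device).eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    content_layers = [21]               # conv4_2: structural content
    style_layers = [0, 5, 10, 19, 28]   # conv1_1..conv5_1: texture/noise style

    content = to_tensor(content_img)
    style = to_tensor(style_img)
    target_c = extract(vgg, content, content_layers)
    target_s = {i: gram(f) for i, f in extract(vgg, style, style_layers).items()}

    # Start from the synthetic section and optimize its pixels directly.
    x = content.clone().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=0.02)
    wanted = set(content_layers) | set(style_layers)

    for _ in range(steps):
        opt.zero_grad()
        feats = extract(vgg, x, wanted)
        c_loss = sum(F.mse_loss(feats[i], target_c[i]) for i in content_layers)
        s_loss = sum(F.mse_loss(gram(feats[i]), target_s[i]) for i in style_layers)
        (content_weight * c_loss + style_weight * s_loss).backward()
        opt.step()

    # Collapse back to a single-channel stylized seismic section.
    return x.detach().squeeze(0).mean(dim=0).cpu().numpy()
```

Calling `style_transfer(synthetic_section, real_patch)` would then return a section that keeps the modeled reflections while borrowing the real data's noise texture; the style weight controls how strongly that texture dominates over the modeled content.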

https://doi.org/10.20396/revpibic2720192342

References

Gatys, L. A., Ecker, A. S., and Bethge, M., 2016, Image style transfer using convolutional neural networks: IEEE Conference on Computer Vision and Pattern Recognition, 2414–2423.

