Doklady Rossijskoj Akademii Nauk. Mathematika, Informatika, Processy Upravlenia
Doklady Rossijskoj Akademii Nauk. Mathematika, Informatika, Processy Upravlenia, 2023, Volume 509, Pages 94–100
DOI: https://doi.org/10.31857/S2686954322600562
(Mi danma368)
 

This article is cited in 1 scientific paper.

INFORMATICS

Suppression of speckle noise in medical images via segmentation-grouping of 3D objects using sparse contourlet representation

V. F. Kravchenko (a,b), Yu. V. Gulyaev (a), V. I. Ponomarev (c), G. Aranda Bojorges (c)

(a) Kotelnikov Institute of Radioengineering and Electronics, Russian Academy of Sciences, Moscow, Russia
(b) Bauman Moscow State Technical University, Moscow, Russia
(c) Instituto Politecnico Nacional de Mexico, Mexico City, Mexico
Abstract: A novel method for filtering ultrasonic and magnetic resonance images contaminated by multiplicative (speckle) noise is justified and implemented. The method consists of several stages: segmentation of image areas, grouping of similar structures in 3D, a homomorphic transformation, 3D filtering based on a sparse representation in the contourlet transform (CLT) space with subsequent filtering according to MI weights of similar 2D structures, and a final inverse homomorphic transformation. A physical interpretation of the filtering procedure in the case of speckle noise is given, and a structural scheme for noise suppression is developed. Simulations based on the new filtering procedure confirm its superiority in terms of generally accepted criteria, such as the structural similarity index measure, peak signal-to-noise ratio, edge preservation index, and the resolution index alpha, as well as in visual comparison of the filtered images.
Keywords: ultrasonic and magnetic resonance images, superpixel segmentation methods, filtering, speckle noise, grouping of objects, homomorphic transformation, peak signal-to-noise ratio.
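The homomorphic stage of the pipeline can be illustrated with a short, self-contained sketch. The fragment below is not the authors' implementation: a wavelet soft-thresholding denoiser (PyWavelets) stands in for the sparse contourlet-domain filtering, the segmentation and 3D grouping stages are omitted, and the function name homomorphic_despeckle and all parameter choices are illustrative assumptions. It only shows how the logarithmic transform turns multiplicative speckle into approximately additive noise, which is then filtered in a sparse transform domain and mapped back with the exponential.

# Illustrative sketch only: wavelet soft thresholding stands in for the
# contourlet-domain sparse filtering described in the paper; the superpixel
# segmentation and 3D grouping of similar structures are omitted.
import numpy as np
import pywt

def homomorphic_despeckle(img, wavelet="db4", level=3, eps=1e-6):
    """Suppress multiplicative (speckle) noise via a homomorphic transform.

    img : 2D array of positive intensities.
    """
    # 1. Homomorphic (log) transform: multiplicative noise becomes additive.
    log_img = np.log(img + eps)

    # 2. Sparse-domain filtering (here: wavelet soft thresholding as a
    #    simple stand-in for the contourlet transform used in the paper).
    coeffs = pywt.wavedec2(log_img, wavelet, level=level)
    # Estimate the noise level from the finest diagonal detail subband.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(log_img.size))  # universal threshold
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    log_filtered = pywt.waverec2(denoised, wavelet)

    # 3. Inverse homomorphic (exp) transform back to the intensity domain.
    out = np.exp(log_filtered) - eps
    # waverec2 may pad odd-sized inputs by one pixel; crop to the input shape.
    return out[: img.shape[0], : img.shape[1]]

Applied to a test phantom multiplied by unit-mean speckle (for example, gamma-distributed noise), this sketch reproduces the basic homomorphic behaviour that can be assessed with PSNR and SSIM; the gains reported in the paper come from the additional superpixel segmentation, 3D grouping of similar structures, and contourlet-domain processing.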
Received: 09.09.2022
Revised: 24.11.2022
Accepted: 26.12.2022
English version:
Doklady Mathematics, 2023, Volume 107, Issue 1, Pages 77–82
DOI: https://doi.org/10.1134/S1064562423700461
Document Type: Article
UDC: 621.391.2
Language: Russian
Citation: V. F. Kravchenko, Yu. V. Gulyaev, V. I. Ponomarev, G. Aranda Bojorges, “Suppression of speckle noise in medical images via segmentation-grouping of 3D objects using sparse contourlet representation”, Dokl. RAN. Math. Inf. Proc. Upr., 509 (2023), 94–100; Dokl. Math., 107:1 (2023), 77–82
Citation in format AMSBIB
\Bibitem{KraGulPon23}
\by V.~F.~Kravchenko, Yu.~V.~Gulyaev, V.~I.~Ponomarev, G.~Aranda Bojorges
\paper Suppression of speckle noise in medical images via segmentation-grouping of 3D objects using sparse contourlet representation
\jour Dokl. RAN. Math. Inf. Proc. Upr.
\yr 2023
\vol 509
\pages 94--100
\mathnet{http://mi.mathnet.ru/danma368}
\crossref{https://doi.org/10.31857/S2686954322600562}
\elib{https://elibrary.ru/item.asp?id=50436210}
\transl
\jour Dokl. Math.
\yr 2023
\vol 107
\issue 1
\pages 77--82
\crossref{https://doi.org/10.1134/S1064562423700461}
Linking options:
  • https://www.mathnet.ru/eng/danma368
  • https://www.mathnet.ru/eng/danma/v509/p94