Neural Network-Based Video Compression Artifact Reduction Using Temporal Correlation and Sparsity Prior Predictions

dc.contributor.authorChen, Wei-Gang
dc.contributor.authorYu, Runyi
dc.contributor.authorWang, Xun
dc.date.accessioned2026-02-06T18:49:38Z
dc.date.issued2020
dc.departmentDoğu Akdeniz Üniversitesi
dc.description.abstractQuantization in lossy video compression may incur severe quality degradation, especially at low bit-rates. Developing post-processing methods that improve the visual quality of decoded images is of great importance, as they can be directly incorporated into any existing compression standard or paradigm. In this article, we propose a two-stage method, a texture detail restoration stage followed by a deep convolutional neural network (CNN) fusion stage, for video compression artifact reduction. The first stage operates patch by patch. For each patch in the current decoded frame, one prediction is formed based on the sparsity prior, which assumes that natural image patches can be represented by a sparse activation of dictionary atoms. Under the temporal correlation hypothesis, we search for the best-matching patch in each reference frame, and select several matches with more texture details to tile motion-compensated predictions. The second stage stacks the predictions obtained in the preceding stage along with the decoded frame itself to form a tensor, and uses a deep CNN to learn the mapping from the tensor as input to the original uncompressed image as output. Experimental results demonstrate that the proposed two-stage method can remarkably improve, both subjectively and objectively, the quality of the compressed video sequence.
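The sparsity-prior prediction described in the abstract can be illustrated with a minimal sketch: a decoded patch is approximated as a sparse combination of dictionary atoms, here via orthogonal matching pursuit (OMP). This is not the authors' implementation; the dictionary, patch size, and sparsity level below are illustrative assumptions (a toy orthonormal dictionary stands in for one learned from training patches).

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: approximate patch y as a k-sparse
    combination of the columns (atoms) of dictionary D."""
    residual = y.copy()
    support = []
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        # Greedily pick the atom most correlated with the residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected atoms, then update residual.
        sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ sol
    coeffs[support] = sol
    return coeffs

# Toy demo: an orthonormal 8-atom dictionary over 4x4 (=16-dim) patches.
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((16, 8)))  # unit-norm atoms
y = 2.0 * D[:, 3] - 1.5 * D[:, 5]  # a patch built from atoms 3 and 5
x = omp(D, y, k=2)
prediction = D @ x  # sparsity-prior prediction of the patch
print(np.allclose(prediction, y))  # → True: exact 2-sparse recovery
```

In the method described above, one such prediction per patch is stacked with the motion-compensated predictions and the decoded frame itself to form the input tensor for the CNN fusion stage.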
dc.description.sponsorshipNational Natural Science Foundation of China [61672460]; Public Welfare Technology Research Project of Zhejiang Province [LGG20F020005]; Science and Technology Program of Zhejiang Province (Key Research and Development Plan) [2020C01049]
dc.description.sponsorshipThis work was supported in part by the National Natural Science Foundation of China under Grant 61672460, in part by the Public Welfare Technology Research Project of Zhejiang Province under Grant LGG20F020005, and in part by the Science and Technology Program of Zhejiang Province (Key Research and Development Plan) under Grant 2020C01049.
dc.identifier.doi10.1109/ACCESS.2020.3020388
dc.identifier.endpage162490
dc.identifier.issn2169-3536
dc.identifier.scopus2-s2.0-85102882116
dc.identifier.scopusqualityQ1
dc.identifier.startpage162479
dc.identifier.urihttps://doi.org/10.1109/ACCESS.2020.3020388
dc.identifier.urihttps://hdl.handle.net/11129/14972
dc.identifier.volume8
dc.identifier.wosWOS:000572885700001
dc.identifier.wosqualityQ2
dc.indekslendigikaynakWeb of Science
dc.indekslendigikaynakScopus
dc.language.isoen
dc.publisherIEEE-Inst Electrical Electronics Engineers Inc
dc.relation.ispartofIEEE Access
dc.relation.publicationcategoryMakale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı
dc.rightsinfo:eu-repo/semantics/openAccess
dc.snmzKA_WoS_20260204
dc.subjectImage coding
dc.subjectImage restoration
dc.subjectDictionaries
dc.subjectCorrelation
dc.subjectVideo compression
dc.subjectQuantization (signal)
dc.subjectDiscrete cosine transforms
dc.subjectCompression artifact reduction
dc.subjectconvolutional neural networks
dc.subjecthigh efficiency video coding
dc.subjectsparse representation
dc.subjecttemporal correlation
dc.titleNeural Network-Based Video Compression Artifact Reduction Using Temporal Correlation and Sparsity Prior Predictions
dc.typeArticle
