[1] Xiaoxue Zhao, Zhuchun Li. Synchronization of a Kuramoto-like model for power grids with frustration. Networks and Heterogeneous Media, 2020, 15(3): 543-553. doi: 10.3934/nhm.2020030
[2] Tingting Zhu. Synchronization of the generalized Kuramoto model with time delay and frustration. Networks and Heterogeneous Media, 2023, 18(4): 1772-1798. doi: 10.3934/nhm.2023077
[3] Seung-Yeal Ha, Yongduck Kim, Zhuchun Li. Asymptotic synchronous behavior of Kuramoto type models with frustrations. Networks and Heterogeneous Media, 2014, 9(1): 33-64. doi: 10.3934/nhm.2014.9.33
[4] Tingting Zhu. Emergence of synchronization in Kuramoto model with frustration under general network topology. Networks and Heterogeneous Media, 2022, 17(2): 255-291. doi: 10.3934/nhm.2022005
[5] Seung-Yeal Ha, Jaeseung Lee, Zhuchun Li. Emergence of local synchronization in an ensemble of heterogeneous Kuramoto oscillators. Networks and Heterogeneous Media, 2017, 12(1): 1-24. doi: 10.3934/nhm.2017001
[6] Seung-Yeal Ha, Jeongho Kim, Jinyeong Park, Xiongtao Zhang. Uniform stability and mean-field limit for the augmented Kuramoto model. Networks and Heterogeneous Media, 2018, 13(2): 297-322. doi: 10.3934/nhm.2018013
[7] Seung-Yeal Ha, Hansol Park, Yinglong Zhang. Nonlinear stability of stationary solutions to the Kuramoto-Sakaguchi equation with frustration. Networks and Heterogeneous Media, 2020, 15(3): 427-461. doi: 10.3934/nhm.2020026
[8] Seung-Yeal Ha, Se Eun Noh, Jinyeong Park. Practical synchronization of generalized Kuramoto systems with an intrinsic dynamics. Networks and Heterogeneous Media, 2015, 10(4): 787-807. doi: 10.3934/nhm.2015.10.787
[9] Vladimir Jaćimović, Aladin Crnkić. The general non-abelian Kuramoto model on the 3-sphere. Networks and Heterogeneous Media, 2020, 15(1): 111-124. doi: 10.3934/nhm.2020005
[10] Young-Pil Choi, Seung-Yeal Ha, Seok-Bae Yun. Global existence and asymptotic behavior of measure valued solutions to the kinetic Kuramoto-Daido model with inertia. Networks and Heterogeneous Media, 2013, 8(4): 943-968. doi: 10.3934/nhm.2013.8.943
Synchronization in complex networks has been a focus of interest for researchers from different disciplines [1,2,4,8,15]. In this paper, we investigate synchronous phenomena in an ensemble of Kuramoto-like oscillators, which is regarded as a model for power grids. In [9], a mathematical model for the power grid is given by
$$P_i^{\mathrm{source}} = I\ddot{\theta}_i\dot{\theta}_i + K_D(\dot{\theta}_i)^2 - \sum_{l=1}^{N} a_{il}\sin(\theta_l - \theta_i), \quad i = 1,2,\dots,N, \tag{1}$$
where $\theta_i$ is the rotor angle of the $i$-th node, $I$ is the moment of inertia, $K_D$ is the friction (damping) coefficient, $P_i^{\mathrm{source}}$ is the power generated at the $i$-th node, and $a_{il} \ge 0$ is the coupling strength between the $i$-th and $l$-th nodes.
By neglecting the inertia term (setting $I = 0$) and denoting $\omega_i := P_i^{\mathrm{source}}/K_D$ and $a_{il} := K K_D/N$, the model (1) reduces to
$$(\dot{\theta}_i)^2 = \omega_i + \frac{K}{N}\sum_{l=1}^{N}\sin(\theta_l - \theta_i), \quad \dot{\theta}_i > 0, \quad i = 1,2,\dots,N. \tag{2}$$
Here, the setting $\dot{\theta}_i > 0$ means that we take the positive branch of the square root, i.e., every node rotates in the positive direction.
If the interaction between nodes is subject to a uniform phase shift (frustration) $\alpha$, the model (2) becomes
$$(\dot{\theta}_i)^2 = \omega_i + \frac{K}{N}\sum_{l=1}^{N}\sin(\theta_l - \theta_i + \alpha), \quad \dot{\theta}_i > 0, \quad i = 1,2,\dots,N. \tag{3}$$
We will find a trapping region such that any nonstationary state located in this region will evolve to a synchronous state.
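As a quick illustration of the dynamics (this sketch is ours, not part of the original analysis), the following Python snippet integrates (3) with a forward Euler scheme, taking the positive square-root branch as required by $\dot{\theta}_i > 0$. All numerical values (N, K, $\alpha$, the ranges of $\theta^0$ and $\omega$) are assumptions chosen only so that the right-hand side of (3) stays positive; they are not taken from the paper.

```python
# Forward-Euler sketch of the Kuramoto-like model (3)/(4); illustrative parameters only.
import numpy as np

def simulate(theta0, omega, K, alpha=0.0, dt=1e-3, steps=20000):
    """Integrate (theta_i')^2 = omega_i + (K/N) sum_l sin(theta_l - theta_i + alpha),
    taking the positive branch, and return the history of the phase diameter D(theta)."""
    theta = np.array(theta0, dtype=float)
    N = len(theta)
    diam = []
    for _ in range(steps):
        coupling = (K / N) * np.sin(theta[None, :] - theta[:, None] + alpha).sum(axis=1)
        rhs = np.clip(omega + coupling, 0.0, None)   # clip only as a numerical safeguard
        theta = theta + dt * np.sqrt(rhs)            # positive branch: theta_i' > 0
        diam.append(theta.max() - theta.min())
    return np.array(diam)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 10
    theta0 = rng.uniform(0.0, 2.0, N)   # initial phases well inside a half circle
    omega = rng.uniform(4.0, 4.5, N)    # assumed natural frequencies
    D = simulate(theta0, omega, K=3.0, alpha=0.05)
    print("initial phase diameter:", D[0], "  final phase diameter:", D[-1])
```

With these values the phase diameter shrinks and stays bounded, in line with the trapping-region picture developed below.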
The contributions of this paper are twofold: First, for identical oscillators without frustration, we show that initial phase configurations located in a half circle converge to complete phase and frequency synchronization. This extends the analytical results in [5], in which the initial phase configuration for synchronization needs to be confined in a quarter of a circle. Second, we consider nonidentical oscillators with frustration and present a framework leading to the boundedness of the phase diameter and to complete frequency synchronization. To the best of our knowledge, this is the first result for the synchronization of (3) with nonidentical oscillators and frustration.
The rest of this paper is organized as follows. In Section 2, we recall the definitions for synchronization and summarize our main results. In Section 3, we give synchronization analysis and prove the main results. Finally, Section 4 is devoted to a concluding summary.
Notations. We use the following simplified notations throughout this paper:
$$\begin{aligned}
&\nu_i := \dot{\theta}_i, \quad i = 1,2,\dots,N, \qquad \omega := (\omega_1,\omega_2,\dots,\omega_N),\\
&\bar{\omega} := \max_{1\le i\le N}\omega_i, \qquad \underline{\omega} := \min_{1\le i\le N}\omega_i, \qquad D(\omega) := \bar{\omega} - \underline{\omega},\\
&\theta_M := \max_{1\le i\le N}\theta_i, \qquad \theta_m := \min_{1\le i\le N}\theta_i, \qquad D(\theta) := \theta_M - \theta_m,\\
&\nu_M := \max_{1\le i\le N}\nu_i, \qquad \nu_m := \min_{1\le i\le N}\nu_i, \qquad D(\nu) := \nu_M - \nu_m,\\
&\theta_{\nu_M} \in \{\theta_j \mid \nu_j = \nu_M\}, \qquad \theta_{\nu_m} \in \{\theta_j \mid \nu_j = \nu_m\}.
\end{aligned}$$
In this paper, we consider the system
$$(\dot{\theta}_i)^2 = \omega_i + \frac{K}{N}\sum_{l=1}^{N}\sin(\theta_l - \theta_i + \alpha), \quad \dot{\theta}_i > 0, \quad \alpha \in \left(-\frac{\pi}{4},\frac{\pi}{4}\right), \quad \theta_i(0) = \theta_i^0, \quad i = 1,2,\dots,N. \tag{4}$$
Next we introduce the concepts of complete synchronization and conclude this introductory section with the main result of this paper.
Definition 2.1. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a dynamical solution to the system (4). We say that
1. it exhibits asymptotically complete phase synchronization if
$$\lim_{t\to\infty}\big(\theta_i(t) - \theta_j(t)\big) = 0, \quad \forall\, i \neq j.$$
2. it exhibits asymptotically complete frequency synchronization if
$$\lim_{t\to\infty}\big(\dot{\theta}_i(t) - \dot{\theta}_j(t)\big) = 0, \quad \forall\, i \neq j.$$
For identical oscillators without frustration, we have the following result.
Theorem 2.2. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4) with $\omega_i = \omega_0$ for all $i$ and $\alpha = 0$. If the initial configuration satisfies
$$\theta^0 \in \mathcal{A} := \left\{\theta \in [0,2\pi)^N : D(\theta) < \pi\right\},$$
then there exist positive constants $\lambda_1$, $\lambda_2$ and a time $t_0 \ge 0$ such that
$$D(\theta(t)) \le D(\theta^0)e^{-\lambda_1 t}, \quad t \ge 0, \tag{5}$$
and
$$D(\nu(t)) \le D(\nu(t_0))e^{-\lambda_2(t-t_0)}, \quad t \ge t_0. \tag{6}$$
Next we introduce the main result for nonidentical oscillators with frustration. For $1 - \sqrt{2\bar{\omega}}\sin|\alpha| > 0$, we define the critical coupling strength
$$K_c := \frac{D(\omega)\sqrt{2\bar{\omega}}}{1 - \sqrt{2\bar{\omega}}\sin|\alpha|} > 0.$$
For suitable parameters, we denote by $D_1^\infty \in (0,\frac{\pi}{2})$ and $D_*^\infty \in (\frac{\pi}{2},\pi)$ the angles satisfying
$$\sin D_1^\infty = \sin D_*^\infty := \frac{\sqrt{\bar{\omega}+K}\,\big(D(\omega) + K\sin|\alpha|\big)}{K\sqrt{\underline{\omega}-K}}, \qquad 0 < D_1^\infty < \frac{\pi}{2} < D_*^\infty < \pi.$$
Theorem 2.3. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4), and suppose that the parameters satisfy $\underline{\omega} > K > K_c$ and $\bar{\omega} + K \le 2\bar{\omega}(\underline{\omega} - K)$, so that $D_1^\infty$ and $D_*^\infty$ are well defined. If the initial configuration satisfies
$$\theta^0 \in \mathcal{B} := \left\{\theta \in [0,2\pi)^N \mid D(\theta) < D_*^\infty - |\alpha|\right\},$$
then for any sufficiently small $\varepsilon > 0$ there exist a positive constant $\lambda_3$ and a time $T > 0$ such that
$$D(\nu(t)) \le D(\nu(T))e^{-\lambda_3(t-T)}, \quad t \ge T. \tag{7}$$
Remark 1. If the parametric conditions in Theorem 2.3 are fulfilled, the reference angles $D_1^\infty$ and $D_*^\infty$ are well defined. Indeed, $K > K_c$ means
$$\frac{D(\omega)\sqrt{2\bar{\omega}}}{1 - \sqrt{2\bar{\omega}}\sin|\alpha|} < K, \qquad 1 - \sqrt{2\bar{\omega}}\sin|\alpha| > 0.$$
This implies
$$\frac{\sqrt{2\bar{\omega}}\,\big(D(\omega) + K\sin|\alpha|\big)}{K} < 1.$$
Then, by $\bar{\omega} + K \le 2\bar{\omega}(\underline{\omega} - K)$, i.e., $\sqrt{\bar{\omega}+K} \le \sqrt{2\bar{\omega}}\sqrt{\underline{\omega}-K}$, we obtain
$$\sin D_1^\infty = \sin D_*^\infty = \frac{\sqrt{\bar{\omega}+K}\,\big(D(\omega) + K\sin|\alpha|\big)}{K\sqrt{\underline{\omega}-K}} \le \frac{\sqrt{2\bar{\omega}}\,\big(D(\omega) + K\sin|\alpha|\big)}{K} < 1.$$
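For concreteness, the following short sketch (with assumed parameter values, not taken from the paper) evaluates $K_c$ and the reference angles $D_1^\infty$, $D_*^\infty$, and checks numerically that $\sin D_*^\infty < 1$, as required for the angles to exist.

```python
# Evaluate K_c and the reference angles D_1^infty, D_*^infty for assumed parameters.
import numpy as np

omega = np.array([4.0, 4.2, 4.5])   # assumed natural frequencies
alpha = 0.05                        # assumed frustration
K = 3.0                             # assumed coupling strength (note K < min(omega))

w_bar, w_min = omega.max(), omega.min()
D_omega = w_bar - w_min

K_c = D_omega * np.sqrt(2 * w_bar) / (1 - np.sqrt(2 * w_bar) * np.sin(abs(alpha)))
s = np.sqrt(w_bar + K) * (D_omega + K * np.sin(abs(alpha))) / (K * np.sqrt(w_min - K))

print("K_c =", K_c, "  K > K_c:", K > K_c)
print("sin(D) =", s)                # must be < 1 for the angles to exist
if s < 1:
    D1 = np.arcsin(s)               # D_1^infty in (0, pi/2)
    Dstar = np.pi - D1              # D_*^infty in (pi/2, pi), same sine
    print("D_1^infty =", D1, "  D_*^infty =", Dstar)
```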
Remark 2. In order to make the set $\mathcal{B}$ in Theorem 2.3 nonempty, we need $D_*^\infty - |\alpha| > 0$; this is automatic, since $D_*^\infty > \frac{\pi}{2}$ and $|\alpha| < \frac{\pi}{4}$.
In this subsection we consider the system (4) with identical natural frequencies and zero frustration:
$$(\dot{\theta}_i)^2 = \omega_0 + \frac{K}{N}\sum_{l=1}^{N}\sin(\theta_l - \theta_i), \quad \dot{\theta}_i > 0, \quad i = 1,2,\dots,N. \tag{8}$$
To obtain the complete synchronization, we need to derive a trapping region. We start with two elementary estimates for the transient frequencies.
Lemma 3.1. Suppose $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ is a solution to the system (8). Then for any $i, j \in \{1,2,\dots,N\}$,
$$(\dot{\theta}_i - \dot{\theta}_j)(\dot{\theta}_i + \dot{\theta}_j) = \frac{2K}{N}\sum_{l=1}^{N}\cos\!\left(\theta_l - \frac{\theta_i + \theta_j}{2}\right)\sin\frac{\theta_j - \theta_i}{2}.$$
Proof. This follows immediately from (8).
Lemma 3.2. Suppose $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ is a solution to the system (8). Then for all $i$,
$$\dot{\theta}_i \le \sqrt{\omega_0 + K}.$$
Proof. It follows from (8) and $\sin(\theta_l - \theta_i) \le 1$ that
$$(\dot{\theta}_i)^2 = \omega_0 + \frac{K}{N}\sum_{l=1}^{N}\sin(\theta_l - \theta_i) \le \omega_0 + K.$$
Next we give an estimate for the trapping region and prove Theorem 2.2. For this aim, we will use the time derivatives of $D(\theta(t))$ and $D(\nu(t))$, which exist for almost all $t \ge 0$.
Lemma 3.3. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (8) with $\theta^0 \in \mathcal{A}$, and let $D^\infty \in (D(\theta^0),\pi)$. Then $D(\theta(t)) < D^\infty$ for all $t \ge 0$.
Proof. For any such $D^\infty$, we define the set
$$\mathcal{T} := \left\{T \in [0,+\infty) \mid D(\theta(t)) < D^\infty,\ \forall\, t \in [0,T)\right\}.$$
Since $D(\theta^0) < D^\infty$ and $D(\theta(t))$ is continuous, there exists $\eta > 0$ such that
$$D(\theta(t)) < D^\infty, \quad t \in [0,\eta).$$
Therefore, the set $\mathcal{T}$ is nonempty. Let $T^* := \sup\mathcal{T}$. We claim that
$$T^* = \infty. \tag{9}$$
Suppose to the contrary that $T^* < \infty$. Then
$$D(\theta(t)) < D^\infty,\ t \in [0,T^*), \qquad D(\theta(T^*)) = D^\infty.$$
We use Lemma 3.1 and Lemma 3.2 to obtain
$$\begin{aligned}
\frac{1}{2}\frac{d}{dt}D(\theta(t))^2 &= D(\theta(t))\frac{d}{dt}D(\theta(t)) = (\theta_M - \theta_m)(\dot{\theta}_M - \dot{\theta}_m)\\
&= (\theta_M - \theta_m)\frac{1}{\dot{\theta}_M + \dot{\theta}_m}\cdot\frac{2K}{N}\sum_{l=1}^{N}\cos\!\left(\theta_l - \frac{\theta_M + \theta_m}{2}\right)\sin\frac{\theta_m - \theta_M}{2}\\
&\le (\theta_M - \theta_m)\frac{1}{\dot{\theta}_M + \dot{\theta}_m}\cdot\frac{2K}{N}\sum_{l=1}^{N}\cos\frac{D^\infty}{2}\sin\frac{\theta_m - \theta_M}{2}\\
&\le (\theta_M - \theta_m)\frac{1}{\sqrt{\omega_0 + K}}\cdot\frac{K}{N}\sum_{l=1}^{N}\cos\frac{D^\infty}{2}\sin\frac{\theta_m - \theta_M}{2}\\
&= -\frac{2K\cos\frac{D^\infty}{2}}{\sqrt{\omega_0 + K}}\cdot\frac{D(\theta)}{2}\sin\frac{D(\theta)}{2}\\
&\le -\frac{K\cos\frac{D^\infty}{2}}{\pi\sqrt{\omega_0 + K}}D(\theta)^2, \quad t \in [0,T^*).
\end{aligned}$$
Here we used the relations
$$-\frac{D^\infty}{2} < -\frac{D(\theta)}{2} \le \frac{\theta_l - \theta_M}{2} \le 0 \le \frac{\theta_l - \theta_m}{2} \le \frac{D(\theta)}{2} < \frac{D^\infty}{2}$$
and
$$x\sin x \ge \frac{2}{\pi}x^2, \quad x \in \left[-\frac{\pi}{2},\frac{\pi}{2}\right].$$
Therefore, we have
$$\frac{d}{dt}D(\theta) \le -\frac{K\cos\frac{D^\infty}{2}}{\pi\sqrt{\omega_0 + K}}D(\theta), \quad t \in [0,T^*), \tag{10}$$
which implies that
$$D(\theta(T^*)) \le D(\theta^0)e^{-\frac{K\cos\frac{D^\infty}{2}}{\pi\sqrt{\omega_0 + K}}T^*} < D(\theta^0) < D^\infty.$$
This is contradictory to $D(\theta(T^*)) = D^\infty$. Therefore (9) holds, i.e., $D(\theta(t)) < D^\infty$ for all $t \ge 0$.
Now we can give a proof for Theorem 2.2.
Proof of Theorem 2.2. According to Lemma 3.3, we can substitute $T^* = \infty$ in (10), so that (10) holds for all $t \ge 0$; Gronwall's inequality then proves (5) with $\lambda_1 = \frac{K\cos\frac{D^\infty}{2}}{\pi\sqrt{\omega_0 + K}}$.
On the other hand, by (5) there exist $t_0 \ge 0$ and $\delta \in (0,\frac{\pi}{2})$ such that $D(\theta(t)) \le \delta$ for all $t \ge t_0$. Differentiating (8) with respect to $t$, we find
$$\dot{\nu}_i = \frac{K}{2N\nu_i}\sum_{l=1}^{N}\cos(\theta_l - \theta_i)(\nu_l - \nu_i).$$
Using Lemma 3.2, we now consider the temporal evolution of $D(\nu(t))$:
$$\begin{aligned}
\frac{d}{dt}D(\nu) &= \dot{\nu}_M - \dot{\nu}_m = \frac{K}{2N\nu_M}\sum_{l=1}^{N}\cos(\theta_l - \theta_{\nu_M})(\nu_l - \nu_M) - \frac{K}{2N\nu_m}\sum_{l=1}^{N}\cos(\theta_l - \theta_{\nu_m})(\nu_l - \nu_m)\\
&\le \frac{K\cos\delta}{2N\nu_M}\sum_{l=1}^{N}(\nu_l - \nu_M) - \frac{K\cos\delta}{2N\nu_m}\sum_{l=1}^{N}(\nu_l - \nu_m)\\
&\le \frac{K\cos\delta}{2N\sqrt{\omega_0 + K}}\sum_{l=1}^{N}(\nu_l - \nu_M) - \frac{K\cos\delta}{2N\sqrt{\omega_0 + K}}\sum_{l=1}^{N}(\nu_l - \nu_m)\\
&= \frac{K\cos\delta}{2N\sqrt{\omega_0 + K}}\sum_{l=1}^{N}(\nu_l - \nu_M - \nu_l + \nu_m) = -\frac{K\cos\delta}{2\sqrt{\omega_0 + K}}D(\nu), \quad t \ge t_0.
\end{aligned}$$
This implies that
$$D(\nu(t)) \le D(\nu(t_0))e^{-\frac{K\cos\delta}{2\sqrt{\omega_0 + K}}(t - t_0)}, \quad t \ge t_0,$$
which proves (6) with $\lambda_2 = \frac{K\cos\delta}{2\sqrt{\omega_0 + K}}$.
Remark 3. Theorem 2.2 shows that, as long as the initial phases are confined inside an arc with geodesic length strictly less than $\pi$ (a half circle), complete phase and frequency synchronization emerges exponentially fast.
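The decay estimate (5) can be probed numerically. The sketch below (ours, for illustration only; the simulate helper from the earlier sketch is restated so the snippet runs on its own) integrates (8) for identical oscillators and compares the measured decay rate of $D(\theta(t))$ with the guaranteed rate $\lambda_1 = \frac{K\cos(D^\infty/2)}{\pi\sqrt{\omega_0+K}}$; the measured rate should be no smaller than the guaranteed one, since the estimate only bounds the contraction speed from below.

```python
# Numerical check of (5) for identical oscillators, zero frustration; assumed parameters.
import numpy as np

def simulate(theta0, omega, K, alpha=0.0, dt=1e-3, steps=20000):
    theta = np.array(theta0, dtype=float)
    N = len(theta)
    diam = []
    for _ in range(steps):
        coupling = (K / N) * np.sin(theta[None, :] - theta[:, None] + alpha).sum(axis=1)
        theta = theta + dt * np.sqrt(np.clip(omega + coupling, 0.0, None))
        diam.append(theta.max() - theta.min())
    return np.array(diam)

omega0, K, dt = 4.0, 2.0, 1e-3
rng = np.random.default_rng(1)
theta0 = rng.uniform(0.0, 3.0, 10)                   # D(theta^0) < pi
D = simulate(theta0, omega0 * np.ones(10), K, alpha=0.0, dt=dt)

D_inf = D[0] + 0.05                                  # some D_inf in (D(theta^0), pi)
lam_guaranteed = K * np.cos(D_inf / 2) / (np.pi * np.sqrt(omega0 + K))
lam_measured = np.log(D[1000] / D[-1]) / (dt * (len(D) - 1001))
print("guaranteed rate:", lam_guaranteed, "  measured rate:", lam_measured)
```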
In this subsection, we prove the main result for nonidentical oscillators with frustration.
Lemma 3.4. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4). Then for any $i, j \in \{1,2,\dots,N\}$,
$$(\dot{\theta}_i - \dot{\theta}_j)(\dot{\theta}_i + \dot{\theta}_j) \le D(\omega) + \frac{K}{N}\sum_{l=1}^{N}\left[\sin(\theta_l - \theta_i + \alpha) - \sin(\theta_l - \theta_j + \alpha)\right].$$
Proof. Since, by (4), $\omega_i - \omega_j \le D(\omega)$ and, for any $i, j$,
$$(\dot{\theta}_i - \dot{\theta}_j)(\dot{\theta}_i + \dot{\theta}_j) = (\dot{\theta}_i)^2 - (\dot{\theta}_j)^2,$$
the result is immediately obtained.
Lemma 3.5. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4) and assume $\underline{\omega} > K$. Then
$$\dot{\theta}_i \in \left[\sqrt{\underline{\omega} - K},\ \sqrt{\bar{\omega} + K}\right], \quad \forall\, i = 1,2,\dots,N.$$
Proof. From (4), we have
$$\underline{\omega} - K \le (\dot{\theta}_i)^2 \le \bar{\omega} + K, \quad \forall\, i = 1,2,\dots,N,$$
and also, because $\dot{\theta}_i > 0$, the desired estimate follows.
Lemma 3.6. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4) under the assumptions of Theorem 2.3, with $\theta^0 \in \mathcal{B}$. Then $D(\theta(t)) \le D_*^\infty - |\alpha|$ for all $t \ge 0$.
Proof. We define the set
$$\mathcal{T} := \left\{T \in [0,+\infty) \mid D(\theta(t)) < D_*^\infty - |\alpha|,\ \forall\, t \in [0,T)\right\}, \qquad T^* := \sup\mathcal{T}.$$
Since $D(\theta^0) < D_*^\infty - |\alpha|$ and $D(\theta(t))$ is continuous, the set $\mathcal{T}$ is nonempty. We claim that
$$T^* = \infty.$$
Suppose to the contrary that $T^* < \infty$. Then
$$D(\theta(t)) < D_*^\infty - |\alpha|,\ t \in [0,T^*), \qquad D(\theta(T^*)) = D_*^\infty - |\alpha|.$$
We use Lemma 3.4 to obtain
$$\frac{1}{2}\frac{d}{dt}D(\theta)^2 = D(\theta)\frac{d}{dt}D(\theta) = D(\theta)(\dot{\theta}_M - \dot{\theta}_m) \le D(\theta)\frac{1}{\dot{\theta}_M + \dot{\theta}_m}\underbrace{\left[D(\omega) + \frac{K}{N}\sum_{l=1}^{N}\big(\sin(\theta_l - \theta_M + \alpha) - \sin(\theta_l - \theta_m + \alpha)\big)\right]}_{=:\,\mathcal{I}}.$$
For the term $\mathcal{I}$, we use the addition formula for the sine to write
$$\mathcal{I} = D(\omega) + \frac{K\cos\alpha}{N}\sum_{l=1}^{N}\left[\sin(\theta_l - \theta_M) - \sin(\theta_l - \theta_m)\right] + \frac{K\sin\alpha}{N}\sum_{l=1}^{N}\left[\cos(\theta_l - \theta_M) - \cos(\theta_l - \theta_m)\right].$$
We now consider two cases according to the sign of $\alpha$.
(1) $\alpha \in [0,\frac{\pi}{4})$. In this case $\sin\alpha \ge 0$, and
$$\begin{aligned}
\mathcal{I} &\le D(\omega) + \frac{K\cos\alpha\sin D(\theta)}{N D(\theta)}\sum_{l=1}^{N}\left[(\theta_l - \theta_M) - (\theta_l - \theta_m)\right] + \frac{K\sin\alpha}{N}\sum_{l=1}^{N}\left[1 - \cos D(\theta)\right]\\
&= D(\omega) - K\left[\sin(D(\theta) + \alpha) - \sin\alpha\right] = D(\omega) - K\left[\sin(D(\theta) + |\alpha|) - \sin|\alpha|\right].
\end{aligned}$$
(2) $\alpha \in (-\frac{\pi}{4},0)$. In this case $\sin\alpha < 0$, and
$$\begin{aligned}
\mathcal{I} &\le D(\omega) + \frac{K\cos\alpha\sin D(\theta)}{N D(\theta)}\sum_{l=1}^{N}\left[(\theta_l - \theta_M) - (\theta_l - \theta_m)\right] + \frac{K\sin\alpha}{N}\sum_{l=1}^{N}\left[\cos D(\theta) - 1\right]\\
&= D(\omega) - K\left[\sin(D(\theta) - \alpha) + \sin\alpha\right] = D(\omega) - K\left[\sin(D(\theta) + |\alpha|) - \sin|\alpha|\right].
\end{aligned}$$
Here we used the relations
$$\frac{\sin(\theta_l - \theta_M)}{\theta_l - \theta_M},\ \frac{\sin(\theta_l - \theta_m)}{\theta_l - \theta_m} \ge \frac{\sin D(\theta)}{D(\theta)},$$
and
$$\cos D(\theta) \le \cos(\theta_l - \theta_M),\ \cos(\theta_l - \theta_m) \le 1, \quad l = 1,2,\dots,N.$$
In both cases we arrive at the same bound. Moreover, since $D(\theta) + |\alpha| < D_*^\infty < \pi$ on $[0,T^*)$ and $\sin x \ge \frac{\sin D_*^\infty}{D_*^\infty}x$ for $x \in [0, D_*^\infty]$, we have
$$\mathcal{I} \le D(\omega) - K\left[\sin(D(\theta) + |\alpha|) - \sin|\alpha|\right] \tag{11}$$
$$\le D(\omega) + K\sin|\alpha| - K\frac{\sin D_*^\infty}{D_*^\infty}\big(D(\theta) + |\alpha|\big). \tag{12}$$
By (12) and Lemma 3.5 we have
$$\begin{aligned}
\frac{1}{2}\frac{d}{dt}D(\theta)^2 &\le D(\theta)\frac{1}{\dot{\theta}_M + \dot{\theta}_m}\left(D(\omega) + K\sin|\alpha| - K\frac{\sin D_*^\infty}{D_*^\infty}\big(D(\theta) + |\alpha|\big)\right)\\
&= \frac{D(\omega) + K\sin|\alpha|}{\dot{\theta}_M + \dot{\theta}_m}D(\theta) - \frac{K\sin D_*^\infty}{D_*^\infty(\dot{\theta}_M + \dot{\theta}_m)}D(\theta)\big(D(\theta) + |\alpha|\big)\\
&\le \frac{D(\omega) + K\sin|\alpha|}{2\sqrt{\underline{\omega} - K}}D(\theta) - \frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}D(\theta)\big(D(\theta) + |\alpha|\big), \quad t \in [0,T^*).
\end{aligned}$$
Then we obtain
$$\frac{d}{dt}D(\theta) \le \frac{D(\omega) + K\sin|\alpha|}{2\sqrt{\underline{\omega} - K}} - \frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}\big(D(\theta) + |\alpha|\big), \quad t \in [0,T^*),$$
i.e.,
$$\frac{d}{dt}\big(D(\theta) + |\alpha|\big) \le \frac{D(\omega) + K\sin|\alpha|}{2\sqrt{\underline{\omega} - K}} - \frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}\big(D(\theta) + |\alpha|\big) = \frac{K\sin D_*^\infty}{2\sqrt{\bar{\omega} + K}} - \frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}\big(D(\theta) + |\alpha|\big), \quad t \in [0,T^*).$$
Here we used the definition of $D_*^\infty$ in the last equality. By Gronwall's inequality, we have
$$D(\theta(t)) + |\alpha| \le D_*^\infty + \big(D(\theta^0) + |\alpha| - D_*^\infty\big)e^{-\frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}t}, \quad t \in [0,T^*).$$
Thus
$$D(\theta(t)) \le \big(D(\theta^0) + |\alpha| - D_*^\infty\big)e^{-\frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}t} + D_*^\infty - |\alpha|, \quad t \in [0,T^*).$$
Letting $t \to T^{*-}$ and using $D(\theta^0) + |\alpha| - D_*^\infty < 0$, we obtain
$$D(\theta(T^*)) \le \big(D(\theta^0) + |\alpha| - D_*^\infty\big)e^{-\frac{K\sin D_*^\infty}{2D_*^\infty\sqrt{\bar{\omega} + K}}T^*} + D_*^\infty - |\alpha| < D_*^\infty - |\alpha|,$$
which is contradictory to $D(\theta(T^*)) = D_*^\infty - |\alpha|$. Therefore,
$$T^* = \infty.$$
That is,
$$D(\theta(t)) \le D_*^\infty - |\alpha|, \quad \forall\, t \ge 0.$$
Lemma 3.7. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4) under the assumptions of Theorem 2.3, with $\theta^0 \in \mathcal{B}$. Then
$$\frac{d}{dt}D(\theta(t)) \le \frac{D(\omega) + K\sin|\alpha|}{2\sqrt{\underline{\omega} - K}} - \frac{K}{2\sqrt{\bar{\omega} + K}}\sin\big(D(\theta) + |\alpha|\big), \quad t \ge 0.$$
Proof. It follows from (11), Lemma 3.5 and Lemma 3.6 that
$$\begin{aligned}
\frac{1}{2}\frac{d}{dt}D(\theta)^2 &= D(\theta)\frac{d}{dt}D(\theta) \le D(\theta)\frac{1}{\dot{\theta}_M + \dot{\theta}_m}\left[D(\omega) - K\big(\sin(D(\theta) + |\alpha|) - \sin|\alpha|\big)\right]\\
&= \frac{D(\omega) + K\sin|\alpha|}{\dot{\theta}_M + \dot{\theta}_m}D(\theta) - \frac{K\sin(D(\theta) + |\alpha|)}{\dot{\theta}_M + \dot{\theta}_m}D(\theta)\\
&\le \frac{D(\omega) + K\sin|\alpha|}{2\sqrt{\underline{\omega} - K}}D(\theta) - \frac{K\sin(D(\theta) + |\alpha|)}{2\sqrt{\bar{\omega} + K}}D(\theta), \quad t \ge 0.
\end{aligned}$$
The proof is completed.
Lemma 3.8. Let $\theta = (\theta_1,\theta_2,\dots,\theta_N)$ be a solution to the system (4) under the assumptions of Theorem 2.3, with $\theta^0 \in \mathcal{B}$. Then for any small $\varepsilon > 0$ there exists $T > 0$ such that
$$D(\theta(t)) < D_1^\infty - |\alpha| + \varepsilon, \quad t \ge T.$$
Proof. Consider the ordinary differential equation:
$$\dot{y} = \frac{D(\omega) + K\sin|\alpha|}{2\sqrt{\underline{\omega} - K}} - \frac{K}{2\sqrt{\bar{\omega} + K}}\sin y, \quad y(0) = y_0 \in [0, D_*^\infty). \tag{13}$$
It is easy to find that $y^* := D_1^\infty$ is the unique equilibrium of (13) in $[0, D_*^\infty)$ and that $y(t) \to y^*$ as $t \to \infty$ for every $y_0 \in [0, D_*^\infty)$. Hence, for any small $\varepsilon > 0$ there exists $T > 0$ such that
$$|y(t) - y^*| < \varepsilon, \quad t \ge T.$$
In particular, choosing $y_0 = D(\theta^0) + |\alpha| \in [0, D_*^\infty)$ and using Lemma 3.7 together with the comparison principle, we have $D(\theta(t)) + |\alpha| \le y(t)$, and hence
$$D(\theta(t)) + |\alpha| < D_1^\infty + \varepsilon, \quad t \ge T,$$
which is the desired result.
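The comparison argument in Lemma 3.8 can be visualized by integrating (13) directly. The sketch below (assumed parameter values, matching the earlier numerical example) confirms that the solution of (13) approaches $y^* = D_1^\infty$ from an initial datum $y_0 \in [0, D_*^\infty)$.

```python
# Integrate the comparison ODE (13) with forward Euler; illustrative parameters only.
import numpy as np

w_bar, w_min, K, alpha = 4.5, 4.0, 3.0, 0.05     # assumed parameters (K > K_c, K < w_min)
D_omega = w_bar - w_min

a = (D_omega + K * np.sin(abs(alpha))) / (2 * np.sqrt(w_min - K))
b = K / (2 * np.sqrt(w_bar + K))
y_star = np.arcsin(a / b)                        # equals D_1^infty whenever a/b < 1

y, dt = 2.2, 1e-3                                # y(0) in [0, D_*^infty)
for _ in range(50000):                           # integrate up to t = 50
    y += dt * (a - b * np.sin(y))
print("y(50) =", y, "   y* = D_1^infty =", y_star)
```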
Remark 4. Since, by the definition of $D_1^\infty$ and $\sqrt{\bar{\omega} + K} \ge \sqrt{\underline{\omega} - K}$,
$$\sin D_1^\infty \ge \frac{D(\omega)}{K} + \sin|\alpha| > \sin|\alpha|,$$
we have $D_1^\infty > |\alpha|$; in particular, the asymptotic bound $D_1^\infty - |\alpha|$ in Lemma 3.8 is strictly positive.
Proof of Theorem 2.3. It follows from Lemma 3.8 that for any small $\varepsilon > 0$ with $D_1^\infty + \varepsilon < \frac{\pi}{2}$, there exists $T > 0$ such that
$$\sup_{t \ge T} D(\theta(t)) < D_1^\infty - |\alpha| + \varepsilon < \frac{\pi}{2}.$$
We differentiate the equation (4) to find
$$\dot{\nu}_i = \frac{K}{2N\nu_i}\sum_{l=1}^{N}\cos(\theta_l - \theta_i + \alpha)(\nu_l - \nu_i), \quad \nu_i > 0.$$
We now consider the temporal evolution of $D(\nu(t))$:
$$\begin{aligned}
\frac{d}{dt}D(\nu) &= \dot{\nu}_M - \dot{\nu}_m = \frac{K}{2N\nu_M}\sum_{l=1}^{N}\cos(\theta_l - \theta_{\nu_M} + \alpha)(\nu_l - \nu_M) - \frac{K}{2N\nu_m}\sum_{l=1}^{N}\cos(\theta_l - \theta_{\nu_m} + \alpha)(\nu_l - \nu_m)\\
&\le \frac{K}{2N\nu_M}\sum_{l=1}^{N}\cos(D_1^\infty + \varepsilon)(\nu_l - \nu_M) - \frac{K}{2N\nu_m}\sum_{l=1}^{N}\cos(D_1^\infty + \varepsilon)(\nu_l - \nu_m)\\
&\le \frac{K\cos(D_1^\infty + \varepsilon)}{2N\sqrt{\bar{\omega} + K}}\sum_{l=1}^{N}(\nu_l - \nu_M - \nu_l + \nu_m)\\
&= -\frac{K\cos(D_1^\infty + \varepsilon)}{2\sqrt{\bar{\omega} + K}}D(\nu), \quad t \ge T,
\end{aligned}$$
where we used
$$\cos(\theta_l - \theta_{\nu_M} + \alpha),\ \cos(\theta_l - \theta_{\nu_m} + \alpha) \ge \cos(D_1^\infty + \varepsilon), \quad \text{and} \quad \nu_M,\ \nu_m \le \sqrt{\bar{\omega} + K}.$$
Thus we obtain
$$D(\nu(t)) \le D(\nu(T))e^{-\frac{K\cos(D_1^\infty + \varepsilon)}{2\sqrt{\bar{\omega} + K}}(t - T)}, \quad t \ge T,$$
which proves (7) with $\lambda_3 = \frac{K\cos(D_1^\infty + \varepsilon)}{2\sqrt{\bar{\omega} + K}}$.
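Finally, estimate (7) can be checked numerically as well. The sketch below (illustrative parameters chosen so that $K > K_c$ and $K < \underline{\omega}$, not taken from the paper) simulates nonidentical oscillators with frustration, records $D(\nu(t))$, and compares its measured decay rate with $\lambda_3$.

```python
# Numerical check of (7): nonidentical frequencies with frustration; assumed parameters.
import numpy as np

rng = np.random.default_rng(2)
N, K, alpha, dt, steps = 10, 3.0, 0.05, 1e-3, 15000
omega = rng.uniform(4.0, 4.5, N)
theta = rng.uniform(0.0, 1.5, N)                 # D(theta^0) < D_*^infty - |alpha|

def velocity(theta):
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None] + alpha).sum(axis=1)
    return np.sqrt(omega + coupling)             # positive branch of (4)

D_nu = []
for _ in range(steps):
    nu = velocity(theta)
    D_nu.append(nu.max() - nu.min())
    theta = theta + dt * nu

w_bar, w_min = omega.max(), omega.min()
s = np.sqrt(w_bar + K) * ((w_bar - w_min) + K * np.sin(abs(alpha))) / (K * np.sqrt(w_min - K))
lam3 = K * np.cos(np.arcsin(s) + 0.01) / (2 * np.sqrt(w_bar + K))   # eps = 0.01
measured = np.log(D_nu[2000] / D_nu[-1]) / (dt * (steps - 2001))
print("guaranteed rate:", lam3, "   measured rate:", measured)
```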
In this paper, we presented synchronization estimates for the Kuramoto-like model. We showed that, for identical oscillators with zero frustration, complete phase and frequency synchronization occurs exponentially fast if the initial phases are confined inside an arc with geodesic length strictly less than $\pi$. For nonidentical oscillators with frustration, we provided a framework under which the phase diameter remains bounded and complete frequency synchronization occurs exponentially fast.
We would like to thank the anonymous referee for his/her comments which helped us to improve this paper.