[1] JIA X, ZHU W G, QU W, et al. Concept of cognitive electronic warfare and its key technologies[J]. Journal of Equipment Academy, 2015, 26(4): 96-100.
[2] LI R. Research on fast synthesizing the deception jamming based on DJS[D]. Xi'an: Xidian University, 2014.
[3] AMURU S, BUEHRER R M. Optimal jamming using delayed learning[C]//Proc. of the IEEE Military Communications Conference, 2014: 1528-1533.
[4] TAO H H, LIAO G S, WANG L. Novel m-sequences waveform design using the hybrid genetic algorithm[J]. Chinese Journal of Radio Science, 2004, 19(3): 253-257.
[5] LIU Y G, HU G P. Research on DRFM jamming technology based on genetic algorithm[J]. Radio Engineering, 2009, 39(6): 31-33.
[6] GOODFELLOW I, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial nets[J]. Communications of the ACM, 2020, 63(11): 139-144. doi: 10.1145/3422622.
[7] WANG K F, GOU C, DUAN Y J, et al. Generative adversarial networks: the state of the art and beyond[J]. Acta Automatica Sinica, 2017, 43(3): 321-332.
[8] LIANG J J, WEI J J, JIANG Z F. Overview of generative adversarial networks (GAN)[J]. Journal of Frontiers of Computer Science and Technology, 2020, 14(1): 1-17.
[9] RADFORD A, METZ L, CHINTALA S. Unsupervised representation learning with deep convolutional generative adversarial networks[C]//Proc. of the 4th International Conference on Learning Representations, 2016.
[10] MIRZA M, OSINDERO S. Conditional generative adversarial nets[EB/OL]. [2022-02-01]. https://arxiv.org/abs/1411.1784.
[11] CHEN X, DUAN Y, HOUTHOOFT R, et al. InfoGAN: interpretable representation learning by information maximizing generative adversarial nets[J]. Advances in Neural Information Processing Systems, 2016, 29(1): 2180-2188.
[12] ODENA A, OLAH C, SHLENS J. Conditional image synthesis with auxiliary classifier GANs[C]//Proc. of the International Conference on Machine Learning, 2017: 2642-2651.
[13] ARJOVSKY M, CHINTALA S, BOTTOU L. Wasserstein generative adversarial networks[C]//Proc. of the International Conference on Machine Learning, 2017: 214-223.
[14] PETZKA H, FISCHER A, LUKOVNICOV D. On the regularization of Wasserstein GANs[C]//Proc. of the 6th International Conference on Learning Representations, 2018.
[15] KARRAS T, AILA T, LAINE S, et al. Progressive growing of GANs for improved quality, stability, and variation[C]//Proc. of the 6th International Conference on Learning Representations, 2018.
[16] KARRAS T, LAINE S, AILA T. A style-based generator architecture for generative adversarial networks[C]//Proc. of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019: 4401-4410.
[17] KARRAS T, LAINE S, AITTALA M, et al. Analyzing and improving the image quality of StyleGAN[C]//Proc. of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020: 8110-8119.
[18] CHOI Y, CHOI M, KIM M, et al. StarGAN: unified generative adversarial networks for multi-domain image-to-image translation[C]//Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 8789-8797.
[19] CHOI Y, UH Y, YOO J, et al. StarGAN v2: diverse image synthesis for multiple domains[C]//Proc. of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020: 8188-8197.
[20] DAUPHIN Y N, PASCANU R, GULCEHRE C, et al. Identifying and attacking the saddle point problem in high-dimensional non-convex optimization[J]. Advances in Neural Information Processing Systems, 2014, 27(2): 2933-2941.
[21] QIAN N. On the momentum term in gradient descent learning algorithms[J]. Neural Networks, 1999, 12(1): 145-151.
[22] NESTEROV Y. A method for unconstrained convex minimization problem with the rate of convergence O(1/k^2)[J]. Doklady AN USSR, 1983, 269: 543-547.
[23] BENGIO Y, BOULANGER-LEWANDOWSKI N, PASCANU R. Advances in optimizing recurrent networks[C]//Proc. of the IEEE International Conference on Acoustics, Speech and Signal Processing, 2013: 8624-8628.
[24] KINGMA D P, BA J. Adam: a method for stochastic optimization[C]//Proc. of the 3rd International Conference on Learning Representations, 2015.
[25] DUCHI J, HAZAN E, SINGER Y. Adaptive subgradient methods for online learning and stochastic optimization[J]. Journal of Machine Learning Research, 2011, 12(7): 2121-2159.
[26] DEAN J, CORRADO G, MONGA R, et al. Large scale distributed deep networks[J]. Advances in Neural Information Processing Systems, 2012, 25(1): 1-11.
[27] ZEILER M D. Adadelta: an adaptive learning rate method[EB/OL]. [2022-02-01]. https://arxiv.org/abs/1212.5701.
[28] DOZAT T. Incorporating Nesterov momentum into Adam[EB/OL]. [2022-02-01]. https://cs229.stanford.edu/proj2015/054_report.pdf.
[29] REDDI S J, KALE S, KUMAR S. On the convergence of Adam and beyond[C]//Proc. of the International Conference on Learning Representations, 2018.
[30] LOSHCHILOV I, HUTTER F. Decoupled weight decay regularization[C]//Proc. of the International Conference on Learning Representations, 2019.
[31] MA J, YARATS D. Quasi-hyperbolic momentum and Adam for deep learning[C]//Proc. of the 7th International Conference on Learning Representations, 2019.
[32] LUCAS J, SUN S Y, ZEMEL R, et al. Aggregated momentum: stability through passive damping[C]//Proc. of the 7th International Conference on Learning Representations, 2019.