Wang Jianjun (王建军)


Curriculum Vitae

 

Name: Wang Jianjun                      Gender: Male

Education: Ph.D.                        Title: Professor (Research Fellow)

Email: wjj@swu.edu.cn; wjjmath@163.com

Research interests: high-dimensional data modeling, machine learning (deep learning), tensor analysis, digital image processing, data mining, compressed sensing, function approximation theory, etc.


Biography

Wang Jianjun, Ph.D., Professor (Grade III), doctoral supervisor; leading talent in innovation and entrepreneurship of the Chongqing Talent Program; Bayu Scholar Distinguished Professor; Chongqing Academic and Technological Leader; reviewer for Mathematical Reviews (American Mathematical Society); council member of the Chinese Mathematical Society; vice president of the Chongqing Society for Industrial and Applied Mathematics; and vice president of the Chongqing Operations Research Society. He received his Ph.D. in applied mathematics from Xi'an Jiaotong University in December 2006, where he was recognized as an outstanding doctoral graduate (advisor: Academician Xu Zongben). He has been on the faculty of Southwest University since December 2006, and from January 2008 to December 2009 he carried out postdoctoral research at the postdoctoral station in mechanics of Xi'an Jiaotong University. He was promoted to research fellow by exceptional appointment in June 2012, and from August 2012 to August 2013 he was a visiting scholar at Texas A&M University, funded by the China Scholarship Council (host: Ronald DeVore, member of the U.S. National Academy of Sciences).

His main research interests are high-dimensional data modeling, machine learning (deep learning), data mining, compressed sensing, tensor analysis, and function approximation theory, with a solid record on the approximation complexity of neural networks (deep learning) and sparse modeling of high-dimensional data. He has been the principal investigator of one project and one sub-project under the National Key Research and Development Program of China, five National Natural Science Foundation of China (NSFC) grants, one Key Scientific and Technological Project of the Ministry of Education, and two key projects of the Chongqing Natural Science Foundation, and a core participant in eight NSFC and National Social Science Foundation projects. He currently leads one National Key R&D Program project, one NSFC general program grant, one National Key R&D Program sub-project, and one Chongqing talent program project, and participates in one national "973" Key Basic Research Program project. He has attended many international and domestic academic conferences and delivered more than 40 invited plenary talks. He has published over 100 papers in journals and conferences including IEEE Transactions on Pattern Analysis and Machine Intelligence (5), IEEE Transactions on Information Theory, IEEE Transactions on Image Processing (2), IEEE Transactions on Neural Networks and Learning Systems (3), Applied and Computational Harmonic Analysis (2), Inverse Problems, AAAI (CCF class A conference), ACM MM (CCF class A conference), Pattern Recognition, Knowledge-Based Systems, Applied Mathematical Modelling, Neural Networks, Signal Processing (3), IEEE Signal Processing Letters (3), Journal of Computational and Applied Mathematics, Journal of Computational Mathematics, Neurocomputing, ICASSP (CCF class B conference, 5), IET Image Processing, IET Signal Processing, Science China (Series A and F) (5), Acta Mathematica Sinica, Chinese Journal of Computers, Acta Electronica Sinica (3), and Chinese Annals of Mathematics, and holds four Chinese invention patents. He serves as a reviewer for IEEE journals, NSR, Signal Processing, Neural Networks, Pattern Recognition, Science China, Chinese Journal of Computers, Acta Electronica Sinica, Acta Mathematica Sinica, and top CCF conferences (as a program committee member). In 2019, the project "Sparse Modeling Methods and Algorithmic Applications for Complex Structured High-Dimensional Data", with him as the first contributor, received the Chongqing Natural Science Award.

 

 

Research

Ø  Funded Research Projects

 

1.  Attribution theory and methods for defect detection based on diffusion models, National Key R&D Program project. Duration: 2024.01–2028.12. (PI)

2.  Low-rank tensor recovery models, theory, and algorithms coupling multiple prior information, NSFC general program. Duration: 2021.01–2024.12. (PI)

3.  Mathematical models, theory, and algorithms for novel lightweight dynamic magnetic resonance imaging, Chongqing Natural Science Foundation Innovation and Development Joint Fund key project. Duration: 2023.10–2026.10. (PI)

4.  Lightweight security protection for the heterogeneous in-vehicle networks of intelligent driving vehicles, National Key R&D Program sub-project. Duration: 2021.01–2024.12.

5.  Sparse recovery theory, models, and high-performance algorithms for dynamic structured tensor data, NSFC general program. Duration: 2020.01–2023.12.

6.  Chongqing Talent Program, leading innovation talent (education field), 2020–2024.

7.  Sample-based nonlinear compressed sensing theory and its applications, NSFC general program. Duration: 2017.01–2020.12. (PI)

8.  Schatten-q regularization theory and algorithms for low-rank matrix recovery, NSFC general program. Duration: 2013.01–2016.12. (PI)

9.  Recoverability theory of compressed sensing based on L1/2 regularization, NSFC young scientists program. Duration: 2011.01–2013.12. (PI)

10.  Research on the structure and essential approximation order of feedforward neural networks, NSFC young scientists program. Duration: 2008.01–2011.12. (PI)

11.  Research on topology selection and approximation order of neural networks, Key Scientific and Technological Project of the Ministry of Education. Duration: 2008.01–2010.12. (PI)

12.  Research on the approximation capability and algorithms of neural networks, ministerial-level general program project. Duration: 2008.06–2010.06. (PI)

13.  Research on the approximation complexity and algorithms of feedforward neural networks, ministerial-level project. Duration: 2009.06–2012.06. (PI)

14.  Compressed sensing theory based on Lq minimization and its applications, Fundamental Research Funds for the Central Universities key project. Duration: 2010.10–2013.10. (PI)

15.  Nonconvex minimization methods and algorithmic applications for block-sparse signal reconstruction, Fundamental Research Funds for the Central Universities major project. Duration: 2015.01–2017.12. (PI)

16.  Research on epidemic dynamical systems on networks, NSFC young scientists program. Duration: 2008.01–2010.12. (PI of one sub-task)

17.  Knowledge acquisition in intuitionistic fuzzy approximation spaces and formal contexts, NSFC young scientists program. Duration: 2012.01–2014.12. (core participant)

18.  Sign-changing solutions of nonlinear operator equations and their applications, NSFC young scientists program. Duration: 2009.01–2009.12. (participant)

 

Ø  Selected Publications

 

1.        Qin W, Wang H, Zhang F, Wang J.(通讯作者) et al., Nonconvex Robust High-Order Tensor Completion Using Randomized Low-Rank Approximation[J]. IEEE Transactions on Image Processing, 2024.

2.        Liu C, Li S, Hu D, Wang J.(通讯作者) et al., Nonlocal Tensor Decomposition With Joint Low Rankness and Smoothness for Spectral CT Image Reconstruction[J]. IEEE Transactions on Computational Imaging, 2024.

3.        Cheng X, Kong W, Luo X, Wang J.(通讯作者) et al., Tensor completion via joint reweighted tensor Q-nuclear norm for visual data recovery[J]. Signal Processing, 2024, 219: 109407.

4.        Huang K, Kong W, Zhou M, Wang J.(通讯作者) et al.,  Enhanced Low-Rank Tensor Recovery Fusing Reweighted Tensor Correlated Total Variation Regularization for Image Denoising[J]. Journal of Scientific Computing, 2024, 99(3): 69.

5.        Feng Q, Zhang F, Kong W, Wang J.(通讯作者) et al., Poisson image deblurring with frame-based nonconvex regularization[J]. Applied Mathematical Modelling, 2024, 132: 109-128.

6.        Yang L, Zhang B, Feng Q, Wang J.(通讯作者) et al.,  Schatten Capped p Regularization for Robust Principle Component Analysis[C]//Computer Graphics International Conference. Cham: Springer Nature Switzerland, 2023: 28-40.

7.        Dou Y, Liu X, Zhou M, Wang J.(通讯作者) et al., Robust principal component analysis via weighted nuclear norm with modified second-order total variation regularization[J]. The Visual Computer, 2023, 39(8): 3495-3505.

8.     Tan H., Zhang F., Wang J.(通讯作者), High-Order Tensor Recovery Coupling Multilayer Subspace Priori With Application in Video Restoration. ACM International Conference on Multimedia, 2023, accepted. (CCF class A conference)

9.     Huang J., Wang W., Wang J.(通讯作者), Zhang F. The Perturbation Analysis of Nonconvex Low-Rank Matrix Robust Recovery, IEEE Transactions on Neural Networks and Learning Systems, 2023, accepted.

10.    Yu P B, Wang J J(通讯作者), Xu C. Matrix recovery using deep generative priors with low-rank deviations[C]. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023: 1-5.

11.    Tan H, Wang J J(通讯作者), Kong W C. Deep Plug-and-play for tensor robust principal component analysis[C]. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023: 1-5.

12.     Wang J., Hou J., Eldar Y. C., "Tensor Robust Principal Component Analysis From Multilevel Quantized Observations," IEEE Transactions on Information Theory, 2023, 69(1): 383-406.

13.    Liu X L, Hou J Y, Peng J J, Wang H L, Meng D Y, Wang J.(通讯作者) Tensor compressive sensing fused low-rankness and local-smoothness[C]. Proceedings of the AAAI Conference on Artificial Intelligence. 2023, 37(7): 8879-8887.

14.    Zhang F, Yang L H, Wang J.(通讯作者), Luo X, Randomized sampling techniques based low-tubal-rank plus sparse tensor recovery[J]. Knowledge-Based Systems, 2023, 261: 110198.

15.     Zhong Y X, Xu C, Zhang B, Hou J Y, Wang J.(通讯作者). One-bit compressed sensing via total variation minimization method[J]. Signal Processing, 2023, 207: 108939.

16.    Kong W C, Zhang F, Qin W J, Wang J.(通讯作者). Low-Tubal-Rank tensor recovery with multilayer subspace prior learning. Pattern Recognition, 2023, 140: 109545.

17.    Zhang F, Wang H L, Qin W J, Zhao X L, Wang J.(通讯作者). Generalized nonconvex regularization for tensor RPCA and its applications in visual inpainting, Applied Intelligence, 2023.

18.    Chen G, Wang J.(通讯作者), Wang H L, Wen J M, Gao Y, Xu Y J. Fluorescence microscopy images denoising via deep convolutional sparse coding, Signal Processing: Image Communication, 2023, 117: 117003.

19.    Zhang F, Yang L H, Wang J.(通讯作者), Luo X, Randomized sampling techniques based low-tubal-rank plus sparse tensor recovery, Knowledge-Based Systems, 2023, 261: 110198.

20.    Hou J., Zhang F., Qiu H., Wang J.(通讯作者), Wang Y., Meng D., Robust Low-tubal-rank Tensor Recovery from Binary Measurements. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(8): 4355-4373.

21.    Zhang F., Wang J.(通讯作者), Wang W., Xu C., Low-tubal-rank plus sparse tensor recovery with prior subspace information. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 43(10): 3492-3507.

22.    Wang W., Zhang F., Wang J.J.(通讯作者), Low-rank matrix recovery via regularized nuclear norm minimization, Applied and Computational Harmonic Analysis, 2021, 54: 1-19.

23.    Wang J.J., Huang J.W., Zhang F., Wang W.D. Group sparse recovery in impulsive noise via alternating direction method of multipliers, Applied and Computational Harmonic Analysis, 2020, 49: 831-862.

24.    Qin W., Wang H., Zhang F., Wang J.(通讯作者), Luo X., Huang T., Low-rank high-order tensor completion with applications in visual data. IEEE Transactions on Image Processing, 2022, 31: 2433-2448.

25.    Hou J., Wang J.(通讯作者), Zhang F., Huang J., One-bit compressed sensing via -minimization method. Inverse Problems, 2021, 35(5): 055005.

26.    Wang H., Zhang F., Wang J.(通讯作者), Huang T., Huang J., Liu X., Generalized non-convex approach for low-tubal-rank tensor recovery. IEEE Transactions on Neural Networks and Learning Systems, 2022, 33(8): 3305-3319.

27.    Wang Z., Liu Y., Luo X., Wang J.(通讯作者), Gao C., Peng D., Chen W., "Large-Scale Affine Matrix Rank Minimization With a Novel Nonconvex Regularizer," IEEE Transactions on Neural Networks and Learning Systems, 2022, 33(9): 4661-4675.

28.    Huang J., Zhang F., Wang J.(通讯作者), Liu X., Jia X., The Perturbation Analysis of Nonconvex Low-Rank Matrix Robust Recovery, IEEE Transactions on Neural Networks and Learning Systems, 2023, accepted.

29.    Kong W., Zhang F., Qin W., Wang J.(通讯作者), Low-tubal-rank tensor recovery with multilayer subspace prior learning, Pattern Recognition, 2023, accepted.

30.    Zhang F., Yang L., Wang J.(通讯作者), Luo X., Randomized sampling techniques based low-tubal-rank plus sparse tensor recovery, Knowledge-Based Systems, 2023, 261: 110198.

31.    Zhong Y., Xu C., Zhang B., Hou J., Wang J.(通讯作者), One-bit compressed sensing via total variation minimization method, Signal Processing, 2023, accepted.

32.     Liu X., Hou J., Wang J.(通讯作者), Robust low-rank matrix recovery fusing local-smoothness,  IEEE Signal Processing Letters, 2022, 29: 2552-2556.

33.    Luo X., Wu H., Wang Z., Wang J., Meng D. A Novel Approach to Large-Scale Directed Network Representation, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 40(12): 9756-9773.

34.    Wang H., Peng J., Qin W., Wang J., Meng D., Guaranteed Tensor Recovery Fused Low-rankness and Smoothness, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(9): 10990-11007.

35.    Peng J., Wang Y., Zhang H., Wang J., Meng D., Exact Decomposition of Joint Low Rankness and Local Smoothness Plus Sparse Matrices, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(5): 5766-5781.

36.     Hou J., Zhang F., Wang J.(通讯作者), One-bit tensor completion via transformed tensor singular value decomposition. Applied Mathematical Modelling, 2021, In Press. DOI: 10.1016/j.apm.2021.02.032.

37.     Wang H., Zhang F., Wang J.(通讯作者), Wang Y., Estimating structural missing values via low-tubal-rank tensor completion. Proceedings of the 45th International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2020: 3297-3301.

38.     Hou J., Zhang F., Wang Y., Wang J.(通讯作者), Low-tubal-rank tensor recovery from one-bit measurements. Proceedings of the 45th International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2020: 3302-3306.

39.     Yang Y., Wang H., Wang J.(通讯作者), Non-convex sparse deviation modeling via generative models, IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2021, 2345-2349  

40.     Xie Y., Wang H., Wang J.(通讯作者), CMCS-net: image compressed sensing with convolutional measurement via DCNN, IET Image Processing, 2021, 1-11.

41.     Wen Z., Wang H., Wang J.(通讯作者), A denoising convolutional neural network inspired via multi-layer convolutional sparse coding , Journal of Electronic Imaging, 2021, 30(2),023007-1-20

42.     Zhang F., Hou J., Wang J.(通讯作者), Wang W., Uniqueness guarantee of solutions of tensor tubal-rank minimization problem. IEEE Signal Processing Letters, 2020, 27: 540-544.

43.    Zhang F., Wang W.D., Huang J.W., Wang J.J(通讯作者), Wang Y., RIP-based performance guarantee for low-tubal-rank tensor recovery[J], Journal of Computational and Applied Mathematics, 2020, 374: 112767.

44.    Zhang F., Wang W.D., Hou J.Y., Wang J.J(通讯作者), Huang J.W., Tensor restricted isometry property analysis for a large class of random measurement ensembles[J], Science China Information Sciences, 2020, In Press.

45.     Wang J.J., Zhang F, Huang J.W., Wang W.D., Yuan C. A nonconvex penalty function with integral convolution approximation for compressed sensing. Signal Processing, 2019, 158: 116–128.

46.    Feng Q, Wang J.J(通讯作者), Zhang F. Block-sparse signal recovery based on truncated l1-minimisation in non-Gaussian noise[J]. IET Communications, 2019, 13(2): 251-258.

47.    Chen G, Wang J.J(通讯作者), Zhang F, et al. Image denoising in impulsive noise via weighted Schatten p-norm regularization[J]. Journal of Electronic Imaging, 2019, 28(1): 013044.

48.    Huang J.W., Wang J.J(通讯作者), Wang W.D., Zhang F. Sharp sufficient condition of block signal recovery via l2/l1-minimization. IET Signal Processing, 2019, http://ietdl.org/t/dj7po

49.     Kong W, Wang J.J(通讯作者), Zhang F, et al. Enhanced Block-Sparse Signal Recovery Performance via Truncated ℓ2/ℓ1−2 Minimization. Journal of Computational Mathematics, 2019, doi:10.4208/jcm.1811-m2017-0275

50.    Wang Z, Wang W, Wang J.J(通讯作者), et al. Fast and efficient algorithm for matrix completion via closed-form 2/3-thresholding operator[J]. Neurocomputing, 2019, 330(1): 212-222.

51.    Huang J.W., Wang J.J(通讯作者). On asymptotic of extremes from generalized Maxwell distribution. Bull. Korean Math. Soc., 2018, 55(3): 679-698.

52.    Wang W.D., Wang J.J(通讯作者), Zhang Z.L. Block-sparse signal recovery via l2/l1-2 minimisation method, IET Signal Processing, 2018, doi: 10.1049/iet-spr.2016.0381

53.    Liu J.Y., Wang J.J(通讯作者), Zhang F. Reconstruction Analysis of Block Sparse Signal via Truncated ℓ2/ℓ1-minimization with Redundant Dictionaries, IET Signal Processing, 2018, 12(8): 1034-1042.

54.    Huang J.W., Wang J.J(通讯作者), Luo G.W., Pu H. Higher-order expansion for moments of extreme for generalized Maxwell distribution, Communications in Statistics - Theory and Methods, 2018, 47(14): 3441-3452.

55.    Huang J.W., Wang J.J(通讯作者). Higher order asymptotic behaviour of partial maxima of random sample from generalized Maxwell distribution under power normalization. Applied Mathematics-A Journal of Chinese Universities, 2018, 33(2): 177-187.

56.    Feng N.C., Wang J.J(通讯作者), Wang W.D. Sparse signal recovery with prior information by iterative reweighted least squares algorithm. Journal of Inverse and Ill-posed Problems, 2018, 26(2): 171-184.

57.    Zhu L., Wang J.J(通讯作者), He X., Zhao Y. An inertial projection neural network for sparse signal reconstruction via l1-2 minimization. Neurocomputing, 2018, 315: 89-95.

58.    Wang W.D., Wang J.J(通讯作者). Enhancing Matrix Completion Using a Modified Second-Order Total Variation. Discrete Dynamics in Nature and Society, 2018, Article ID 2598160.

59.    Wang W.D., Wang J.J(通讯作者), Zhang Z.L. Robust Signal Recovery With Highly Coherent Measurement Matrices. IEEE Signal Processing Letters, 2017, 24(3): 304-308.

60.    Huang J.W., Wang J.J(通讯作者), Luo G.W., He J. Tail properties and approximate distribution and expansion for extreme of LGMD. Journal of Inequalities & Applications, 2017, 2017(1): 1-16.

61.    Huang J.W., Wang J.J(通讯作者), Luo G.W. On the rate of convergence of maxima for the generalized Maxwell distribution, Statistics: A Journal of Theoretical and Applied Statistics, 2017, 51(5): 1105-1117.

62.    Liu C.Y., Wang J.J(通讯作者), Wang W.D., Wang Z. Non-convex block-sparse compressed sensing with redundant dictionaries. IET Signal Processing, 2017, 11(2): 171-180.

63.     Wang Y., Wang J.J(通讯作者), Improved RIP Conditions for Compressed Sensing with Coherent Tight Frames. Discrete Dynamics in Nature and Society, 2017, 2017: 1-8.

64.    He S.Y., Wang Y., Wang J.J(通讯作者), Xu Z.B. Block-sparse compressed sensing with partially known signal support via non-convex minimisation. IET Signal Processing, 2016, 10(7): 717-723.

65.  刘春燕, 王建军(通讯作者), 王文东, 等. 基于非凸极小化的扰动压缩数据分离[J]. 电子学报, 2017, 45(1): 37-45.

66.  王建军, 袁建军, 王尧. 基于混合l2/l1范数极小化方法的块稀疏信号重构条件[J]. 数学学报, 2017, 60(4):619-630.

67.    Nie, F., Wang J.J(通讯作者), Wang, Y., & Jing, J. (2017, July). Nonlinear Compressed Sensing Based on Kernel Sparse Representation. In 2017 IEEE 7th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER) (pp. 943-946). IEEE.

68.  王文东, 王建军(通讯作者), 王尧, 张自力. 基于相干性理论的非凸块稀疏压缩感知. 中国科学: 信息科学, 2016, 46(3): 376-390.

69.     Cai J, Tang Y, Wang J J(通讯作者). Kernel canonical correlation analysis via gradient descent. Neurocomputing, 2016, 182:322-331.

70.     Zhang J., Wang J.J(通讯作者)., Wang W.D. A perturbation analysis of block-sparse compressed sensing via mixed l2/l1 minimization. International Journal of Wavelets Multiresolution & Information Processing, 2016, 14(04): 3122-3127.

71.     Yuan J, Wang J. Perona–Malik Model with a New Diffusion Coefficient for Image Denoising[J]. International Journal of Image & Graphics, 2016, 16(02):1650011.

72.     Zhang F, Wang J J(通讯作者), Jing J, Low rank tensor completion via partial sum minimization of singular values, International Conference on Automatic Control and Information Engineering, Oct. 22-23, 2016, Hong Kong, 64: 16-19.(STP)

73.     Yang C Y, Wang J J, Chou J J, et al. Confirming robustness of fuzzy support vector machine via ξ–α bound. Neurocomputing, 2015, 162:256-266.

74.     Wang J.J., Zhang J., Wang W.D., et al. A perturbation analysis of nonconvex block-sparse compressed sensing. Communications in Nonlinear Science & Numerical Simulation, 2015, 29(1–3): 416–426.

75.  王文东, 王尧, 王建军(通讯作者). 基于迭代重赋权最小二乘算法的块稀疏压缩感知. 电子学报, 2015,43(5):922-928

76.    Wang Y., Wang J.J(通讯作者), Xu Z.B., Restricted p-isometry properties of nonconvex block-sparse compressed sensing, Signal Processing, 2014, 104: 188–196.

77.     Yuan J.J., Wang J J., Active contours driven by local intensity and local gradient fitting energies, International Journal of Pattern Recognition and Artificial Intelligence, 2014,28(3):1455006

78.     Jing J, Wang J J. Recovery of Sparse Signal and Nonconvex Minimization. Applied Mechanics & Materials, 2014, 651-653:2177-2180.

79.    Wang Y., Wang J.J.(通讯作者), Xu Z.B., On recovery of block-sparse signals via mixed l2/lq(0<q<=1) norm minimization, EURASIP Journal on Advances in Signal Processing, 2013, 2013(76): 1-30.

80.    Wang Y., Wang J.J.(通讯作者), Xu Z.B., A note on block-sparse signal recovery with coherent tight frames, Discrete Dynamics in Nature and Society, 2013, 2013: 1-7.

81.     Wang J.J., Yang C. Y., Gu Z.G., Lp Error estimate for minimal norm SBF interpolation, Journal of Inequalities and Applications 2013,2013:510-516.

82.     Wang J J, Guo H F, Jing J. Estimation of Approximation with Jacobi Weights by Multivariate Baskakov Operator. Journal of Function Spaces, 2013, 56(3):377-384.

83.     Wang J.J., Peng Z.X., Duan S.K, Jing J.,Derivatives of multivariate Bernstein operators and smoothness with Jacobi weights, Journal of Applied Mathematics, 2012,2012:1-9.

84.     Wang J.J.  Yang C.Y., Jing J., Estimation of approximating rate for neural networks in L(w,p), Journal of Applied Mathematics, 2012,2012:1-8.

85.    Wang J.J., Xu W.H., Zou B. Constructive estimation of approximation for trigonometric neural networks, International Journal of Wavelets, Multiresolution and Information Processing, 2012, 10(3): 1250021-1–1250021-18.

86.    Wang J.J., Chen B.L., Yang C.Y., Approximation of algebraic and trigonometric polynomials by feedforward neural networks, Neural Computing & Applications, 2012, 21: 73-80.

87.    Gao B.B., Wang J.J.(通讯作者), Huang H., L2-Loss Twin Support Vector Machine for Classification, 5th International Conference on BioMedical Engineering and Informatics (BMEI), IEEE, 2012: 1265–1269.

88.     Yang C Y, Wang J J. Estimator for Fuzzy Support Vector Machine. Advanced Science Letters, 2012, 11(1): 479-484.

89.    Wang J.J., Xu Z.B., Neural networks and the best trigonometric approximation, Journal of Systems Science and Complexity, 2011, 24(2): 401-412.

90.    Wang J.J., Chen B.L., Yang C.Y., Sparse signal recovery based on lq(0<q<=1) minimization, 2011 International Conference on Multimedia and Signal Processing, IEEE Computer Society, 2011: 239-242.

91.    Wang J.J., Yang C.Y., Duan S.K., Approximation order for multivariate Durrmeyer operators with Jacobi weights, Abstract & Applied Analysis, 2011, 2011: 1-12.

92.  彭联勇, 王建军(通讯作者), Bernstein型算子线性组合加Jacobi权逼近及高阶导数的等价定理, 应用数学, 2011, 24(4): 791-797.

93.    Wang J.J., Xu Z.B., New study of neural networks: the essential order of approximation, Neural Networks, 2010, 23: 618-624.

94.     Wang J.J., Han G.D., et al. Derivatives of Bernstein operators and smoothness with Jacobi weights. Taiwanese Journal of Mathematics, 2010, 14(4):1491-1500.

95.  常象宇,徐宗本,张海,王建军,梁勇,稳健Lq(0<q<1)正则化理论:解的渐近分布与变量选择一致性,中国科学:数学,2010,40(10):985-998.

96.     Wang J.J., Xu Z.B., Approximation with Jacobi weights by Baskakov operators. Taiwanese Journal of Mathematics,2009,13(1):157-168.

97.     Yang C.Y., Yang J.S., Wang J.J., Margin calibration in SVM class-imbalanced learning. Neurocomputing,2009,73:397-411.

98.     Wang J.J., Zou B., Chen B.L., How to measure the essential approximation capability of a FNN. 2009 Fifth International Conference on Natural Computation, IEEE Computer Society,394-398.

99.    Wang J.J., Huang H., Luo Z.T., Chen B.L., Estimation of covering number in learning theory, Fifth International Conference on Semantics, Knowledge and Grid, IEEE Computer Society, 2009.10, 388-391.

100.  Xu J., Zou B., Wang J.J., Generalization performance of ERM algorithm with geometrically ergodic Markov chain samples, Fifth International Conference on Natural Computation, IEEE Computer Society, 2009, 154-158.

101.王建军, 徐宗本. 神经网络的加权本质逼近阶. 数学年刊:中文版, 2009, 30(06):741-750.

102.王建军, 徐宗本. 多元多项式函数的三层前向神经网络逼近方法. 计算机学报, 2009, 32(12):2482-2488.

103.王建军, 徐宗本. Baskakov算子线性组合加Jacobi权逼近及高阶导数的正逆定理. 系统科学与数学, 2008, 28(1):30-39.

104.  Wang J.J., Xu Z.B. and Jing J., Constructive approximation method of polynomial by neural networks. International Conference on Cognitive Neurodynamics (2007), Springer Science+Business Media B.V., 2008, 1033-1037.

105.  Yang C.Y., Wang J.J., Yang J.S., Yu G.D., Imbalanced SVM learning with margin compensation. Lecture Notes in Computer Science, Germany: Springer-Verlag, 2008, LNCS 5263, 636-644.

106.  Wang J.J., Xue Y.C., Li F.J., Stechkin-marchaud type inequalities with Jacobi weights for Bernstein operators. Journal of Applied mathematics and computing, 2007,24(1-2):343-355

107.王建军,徐宗本.近似指数型神经网络的本质逼近阶. 中国科学, 2006, 36(6):579-592.

108.  Han G.D., Wang J.J., Multiple positive radial solutions of elliptic equations in an exterior domain. Monatshefte fur mathematik,2006,148(3): 217-228.

109.  Xu Z.B., Wang J.J., and Meng D.Y., Approximation bound of mixture networks in L(w,p) spaces. Lecture Notes in Computer Science, Germany: Springer-Verlag 2006, 3971:60-65.

110.王建军, 薛银川, Baskakov算子加Jacobi权逼近及导数的正逆定理. 数学年刊, 2005, 26A(4): 561-570.

111.王建军,薛银川,Baskakov型算子加权逼近下的Stechkin-Marchand不等式. 数学研究与评论,2004,24(4):710-714.

112.  Wang J.J., Xu Z.B.,and Xu W. J., Approximation bounds by neural networks in L(w, p). Lecture Notes in Computer Science, Germany: Springer-Verlag, 2004, 3173:1-6.


Teaching

Ø  Courses Taught

Approximation theory, advanced mathematics, neural networks, learning theory, numerical analysis, support vector machines, optimization methods, fuzzy mathematics, applied statistics and data analysis, data mining, high-dimensional probability, machine learning, etc.

Ø  Student Supervision

1.  2023: Second prize, first Chongqing "PaddlePaddle Cup" Artificial Intelligence Innovation Competition

2.  2022: Second prize, "Challenge Cup" National Undergraduate Extracurricular Academic Science and Technology Competition

3.  2021: Grand prize, Chongqing division of the "Challenge Cup" National Undergraduate Extracurricular Academic Science and Technology Competition

4.  2019: Second prize, "Higher Education Press Cup" China Undergraduate Mathematical Contest in Modeling (CUMCM)

5.  2018: Two first prizes, Mathematical Contest in Modeling (MCM/ICM, USA)

6.  2018: One Chongqing first prize and one second prize, "Higher Education Press Cup" CUMCM

7.  Supervised a Southwest University student science and technology innovation team selected as a 2017 national Xiaoping Science and Technology Innovation Team

8.  2017: Two second prizes, Mathematical Contest in Modeling (MCM/ICM, USA)

9.  2017: One first prize, National College Student Statistical Modeling Competition

10.  2017: National Undergraduate Innovation and Entrepreneurship Training Program project

11.  2016: One first prize and one second prize, Mathematical Contest in Modeling (MCM/ICM, USA)

12.  2016: Second prize, "Higher Education Press Cup" CUMCM

13.  2016: Three Chongqing second prizes, "Higher Education Press Cup" CUMCM

14.  2015: National Undergraduate Innovation and Entrepreneurship Training Program project

15.  2012: First prize, "Higher Education Press Cup" CUMCM

16.  2010: Second prize, "Higher Education Press Cup" CUMCM

17.  2012: Chongqing first prize, "Higher Education Press Cup" CUMCM

18.  2011: Chongqing first prize, "Higher Education Press Cup" CUMCM

Awards and Honors

1.  "Sparse Modeling Methods and Algorithmic Applications for Complex Structured High-Dimensional Data", Chongqing Natural Science Award (third class), 2018

2.  Third cohort of Chongqing Academic and Technological Leaders, March 2019

3.  First cohort of the Chongqing Talent Program, leading talent in innovation and entrepreneurship, November 2019

4.  Bayu Scholar Distinguished Professor, November 2019

5.  Outstanding Advisor Award, Chongqing division, China Undergraduate Mathematical Contest in Modeling, 2016

6.  Outstanding Advisor Award, Chongqing division, China Undergraduate Mathematical Contest in Modeling, 2012

7.  Outstanding Teacher of Southwest University, 2010–2012 academic years