[1] Poppe R. A survey on vision-based human action recognition[J]. Image and Vision Computing, 2010, 28(6):976-990.
[2] Wang J, Liu Z, Wu Y, et al. Learning actionlet ensemble for 3D human action recognition[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(5):914-927.
[3] Bulling A, Blanke U, Schiele B. A tutorial on human activity recognition using body-worn inertial sensors[J]. ACM Computing Surveys, 2014, 46(3):1-33.
[4] Lara O D, Labrador M A. A survey on human activity recognition using wearable sensors[J]. IEEE Communications Surveys & Tutorials, 2013, 15(3):1192-1209.
[5] Yang J, Wang S, Chen N, et al. Wearable accelerometer based extendable activity recognition system[C]//IEEE International Conference on Robotics and Automation. Piscataway, 2010:3641-3647.
[6] Davis K, Owusu E, Bastani V, et al. Activity recognition based on inertial sensors for ambient assisted living[C]//International Conference on Information Fusion, 2016:371-378.
[7] Su B, Tang Q F, Jiang J, et al. A novel method for short-time human activity recognition based on improved template matching technique[C]//ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry, 2016:233-242.
[8] Li L J, Su H, Lim Y, et al. Object bank: An object-level image representation for high-level visual recognition[J]. International Journal of Computer Vision, 2014, 107(1):20-39.
[9] Sadanand S, Corso J J. Action bank: A high-level representation of activity in video[C]//Computer Vision and Pattern Recognition (CVPR), 2012:1234-1241.
[10] Liu Z J, Lin W, Geng Y L, et al. Intent pattern recognition of lower-limb motion based on mechanical sensors[J]. IEEE/CAA Journal of Automatica Sinica, 2017, 4(4):651-660.
[11] Young A J, Simon A M, Hargrove L J. A training method for locomotion mode prediction using powered lower limb prostheses[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2014, 22(3):671-677.
[12] Young A J, Simon A M, Fey N P, et al. Intent recognition in a powered lower limb prosthesis using time history information[J]. Annals of Biomedical Engineering, 2014, 42(3):631-641.
[13] Yuan K B, Wang Q N, Wang L. Fuzzy-logic-based terrain identification with multisensor fusion for transtibial amputees[J]. IEEE/ASME Transactions on Mechatronics, 2015, 20(2):618-630.
[14] Zheng E H, Wang Q N. Noncontact capacitive sensing-based locomotion transition recognition for amputees with robotic transtibial prostheses[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2016, 25(2):161-170.
[15] Yang A Y, Jafari R, Sastry S S, et al. Distributed recognition of human actions using wearable motion sensor networks[J]. Journal of Ambient Intelligence and Smart Environments, 2009, 1(2):103-115.
[16] Su B, Tang Q, Wang G, et al. Transactions on Edutainment XII: The recognition of human daily actions with wearable motion sensor system[M]. Germany: Springer, 2016:68-77.
[17] He W, Guo Y, Gao C, et al. Recognition of human activities with wearable sensors[J]. EURASIP Journal on Advances in Signal Processing, 2012, 2012(1):1-13.
[18] Xiao L, Li R F, Luo J. Recognition of human activity based on compressed sensing in body sensor networks[J]. Journal of Electronics & Information Technology, 2013, 35(1):119-125.
[19] Li F, Pan J K. Human motion recognition based on triaxial accelerometer[J]. Journal of Computer Research and Development, 2016, 53(3):621-631.
[20] Sheng M, Jiang J, Su B, et al. Short-time activity recognition with wearable sensors using convolutional neural network[C]//Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry, 2016:413-416.