HyperFace: A Deep Multi-Task Learning Framework for Face Detection, Landmark Localization, Pose Estimation, and Gender Recognition
We present an algorithm for simultaneous face detection, landmark localization, pose estimation, and gender recognition using deep convolutional neural networks (CNNs). The proposed method, called HyperFace, fuses the intermediate layers of a deep CNN using a separate CNN, followed by a multi-task learning algorithm that operates on the fused features.
| Published in | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 41, No. 1, pp. 121-135 |
|---|---|
| Main Authors | Ranjan, Rajeev; Patel, Vishal M.; Chellappa, Rama |
| Format | Journal Article |
| Language | English |
| Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2019 |
| Subjects | |
| ISSN | 0162-8828; 1939-3539; 2160-9292 |
| DOI | 10.1109/TPAMI.2017.2781233 |
| Abstract | We present an algorithm for simultaneous face detection, landmark localization, pose estimation, and gender recognition using deep convolutional neural networks (CNNs). The proposed method, called HyperFace, fuses the intermediate layers of a deep CNN using a separate CNN, followed by a multi-task learning algorithm that operates on the fused features. It exploits the synergy among the tasks, which boosts their individual performances. Additionally, we propose two variants of HyperFace: (1) HyperFace-ResNet, which builds on the ResNet-101 model and achieves a significant improvement in performance, and (2) Fast-HyperFace, which uses a high-recall fast face detector to generate region proposals and improve the speed of the algorithm. Extensive experiments show that the proposed models are able to capture both global and local information in faces and perform significantly better than many competitive algorithms on each of these four tasks. |
|---|---|
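The abstract describes the core HyperFace idea: intermediate feature maps from several depths of a CNN are brought to a common spatial size, fused into one shared representation, and fed to four task-specific heads. The toy NumPy sketch below illustrates only that data flow; the map sizes, the strided downsampling used as a stand-in for the paper's pooling/1x1-convolution fusion, and the random linear heads are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for intermediate feature maps taken from three depths of a CNN
# (H x W x C sizes are illustrative, not the paper's exact dimensions).
feat_shallow = rng.standard_normal((27, 27, 96))
feat_mid = rng.standard_normal((13, 13, 256))
feat_deep = rng.standard_normal((6, 6, 256))

def resample_to(fmap, size):
    """Crudely resample an HxWxC map to size x size by index striding;
    a stand-in for the learned pooling/convolution fusion in the paper."""
    h, w, _ = fmap.shape
    ys = np.linspace(0, h - 1, size).astype(int)
    xs = np.linspace(0, w - 1, size).astype(int)
    return fmap[np.ix_(ys, xs)]

# Fusion: bring every map to a common 6x6 grid, concatenate along channels.
fused = np.concatenate(
    [resample_to(f, 6) for f in (feat_shallow, feat_mid, feat_deep)], axis=-1
)
x = fused.reshape(-1)  # flatten the shared features for the task heads

def head(features, out_dim, seed):
    """A random linear layer standing in for a trained fully connected head."""
    w = np.random.default_rng(seed).standard_normal((out_dim, features.size)) * 0.01
    return w @ features

detection = head(x, 2, seed=1)   # face / non-face scores
landmarks = head(x, 42, seed=2)  # 21 (x, y) landmark coordinates
pose = head(x, 3, seed=3)        # roll, pitch, yaw
gender = head(x, 2, seed=4)      # male / female scores
```

Because all four heads read the same fused vector, gradients from every task would flow into the shared features during training, which is the synergy the abstract refers to.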
| Author | Ranjan, Rajeev; Patel, Vishal M.; Chellappa, Rama |
| Author details | 1. Rajeev Ranjan (ORCID 0000-0003-2553-823X), rranjan1@umiacs.umd.edu, Department of Electrical and Computer Engineering, University of Maryland, College Park, MD; 2. Vishal M. Patel, pvishalm@gmail.com, Rutgers University, New Brunswick, NJ; 3. Rama Chellappa, rama@umiacs.umd.edu, Department of Electrical and Computer Engineering, University of Maryland, College Park, MD |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/29990235 (View this record in MEDLINE/PubMed) |
| CODEN | ITPIDJ |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019 |
| DOI | 10.1109/TPAMI.2017.2781233 |
| Discipline | Engineering; Computer Science |
| EISSN | 2160-9292 1939-3539 |
| EndPage | 135 |
| ExternalDocumentID | 29990235 10_1109_TPAMI_2017_2781233 8170321 |
| Genre | orig-research Research Support, U.S. Gov't, Non-P.H.S Journal Article |
| GrantInformation | Intelligence Advanced Research Projects Activity (IARPA); Office of the Director of National Intelligence (ODNI); U.S. Government; IARPA R&D Contract 2014-14071600012 |
| GroupedDBID | --- -DZ -~X .DC 0R~ 29I 4.4 53G 5GY 6IK 97E AAJGR AARMG AASAJ AAWTH ABAZT ABQJQ ABVLG ACGFO ACGFS ACIWK ACNCT AENEX AGQYO AHBIQ AKJIK AKQYR ALMA_UNASSIGNED_HOLDINGS ASUFR ATWAV BEFXN BFFAM BGNUA BKEBE BPEOZ CS3 DU5 E.L EBS EJD F5P HZ~ IEDLZ IFIPE IPLJI JAVBF LAI M43 MS~ O9- OCL P2P PQQKQ RIA RIE RNS RXW TAE TN5 UHB ~02 AAYXX CITATION 5VS 9M8 AAYOK ABFSI ADRHT AETIX AGSQL AI. AIBXA ALLEH CGR CUY CVF ECM EIF FA8 H~9 IBMZZ ICLAB IFJZH NPM PKN RIC RIG RNI RZB VH1 XJT Z5M 7SC 7SP 8FD JQ2 L7M L~C L~D 7X8 |
| ID | FETCH-LOGICAL-c417t-e4d23d14e7b4625b9420da1afce0bed8482c4df4441d1dd2badfd97c8fc4f79d3 |
| IEDL.DBID | RIE |
| ISSN | 0162-8828 1939-3539 |
| IngestDate | Sat Sep 27 22:18:48 EDT 2025 Mon Jun 30 03:28:02 EDT 2025 Wed Feb 19 02:31:36 EST 2025 Thu Apr 24 23:05:32 EDT 2025 Wed Oct 01 03:57:32 EDT 2025 Wed Aug 27 02:50:17 EDT 2025 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 1 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| LinkModel | DirectLink |
| MergedId | FETCHMERGED-LOGICAL-c417t-e4d23d14e7b4625b9420da1afce0bed8482c4df4441d1dd2badfd97c8fc4f79d3 |
| Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
| ORCID | 0000-0003-2553-823X |
| PMID | 29990235 |
| PQID | 2151461222 |
| PQPubID | 85458 |
| PageCount | 15 |
| ParticipantIDs | proquest_journals_2151461222 crossref_primary_10_1109_TPAMI_2017_2781233 pubmed_primary_29990235 crossref_citationtrail_10_1109_TPAMI_2017_2781233 ieee_primary_8170321 proquest_miscellaneous_2068346095 |
| ProviderPackageCode | CITATION AAYXX |
| PublicationCentury | 2000 |
| PublicationDate | 2019-Jan.-1 2019-1-1 2019-01-00 20190101 |
| PublicationDateYYYYMMDD | 2019-01-01 |
| PublicationDate_xml | – month: 01 year: 2019 text: 2019-Jan.-1 day: 01 |
| PublicationDecade | 2010 |
| PublicationPlace | United States |
| PublicationPlace_xml | – name: United States – name: New York |
| PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
| PublicationTitleAbbrev | TPAMI |
| PublicationTitleAlternate | IEEE Trans Pattern Anal Mach Intell |
| PublicationYear | 2019 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| SSID | ssj0014503 |
| Score | 2.7153723 |
| Snippet | We present an algorithm for simultaneous face detection, landmarks localization, pose estimation and gender recognition using deep convolutional neural... |
| SourceID | proquest pubmed crossref ieee |
| SourceType | Aggregation Database Index Database Enrichment Source Publisher |
| StartPage | 121 |
| SubjectTerms | Algorithms Artificial neural networks deep convolutional neural networks Deep Learning Face Face - diagnostic imaging Face detection Face recognition Feature extraction Female Fuses Gender Identity gender recognition head pose estimation Humans Image Processing, Computer-Assisted - methods Landmarks landmarks localization Localization Machine learning Male multi-task learning Pattern Recognition, Automated - methods Pose estimation Posture - physiology |
| Title | HyperFace: A Deep Multi-Task Learning Framework for Face Detection, Landmark Localization, Pose Estimation, and Gender Recognition |
| URI | https://ieeexplore.ieee.org/document/8170321 https://www.ncbi.nlm.nih.gov/pubmed/29990235 https://www.proquest.com/docview/2151461222 https://www.proquest.com/docview/2068346095 |
| Volume | 41 |
| hasFullText | 1 |
| inHoldings | 1 |
| isFullTextHit | |
| isPrint | |
| journalDatabaseRights | – providerCode: PRVIEE databaseName: IEEE Electronic Library (IEL) customDbUrl: eissn: 2160-9292 dateEnd: 99991231 omitProxy: false ssIdentifier: ssj0014503 issn: 0162-8828 databaseCode: RIE dateStart: 19790101 isFulltext: true titleUrlDefault: https://ieeexplore.ieee.org/ providerName: IEEE |
| openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=HyperFace%3A+A+Deep+Multi-Task+Learning+Framework+for+Face+Detection%2C+Landmark+Localization%2C+Pose+Estimation%2C+and+Gender+Recognition&rft.jtitle=IEEE+transactions+on+pattern+analysis+and+machine+intelligence&rft.au=Ranjan%2C+Rajeev&rft.au=Patel%2C+Vishal+M.&rft.au=Chellappa%2C+Rama&rft.date=2019-01-01&rft.issn=0162-8828&rft.eissn=2160-9292&rft.volume=41&rft.issue=1&rft.spage=121&rft.epage=135&rft_id=info:doi/10.1109%2FTPAMI.2017.2781233&rft.externalDBID=n%2Fa&rft.externalDocID=10_1109_TPAMI_2017_2781233 |