GERBIL Experiment

Your experiment is completed

Experiment URI:
Type: D2KB
Matching: Ma - strong annotation match
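
Under the Ma setting (strong annotation match), a system annotation counts as a true positive only when it agrees with a gold annotation exactly, i.e. both the mention boundaries and the linked KB resource match; since the task type is D2KB, the mention spans are given to the annotator, so in practice only the chosen URI can differ. The following minimal Python sketch illustrates such a check; the Annotation tuple and the sample data are illustrative assumptions, not GERBIL's internal representation:

```python
# Strong annotation matching (Ma) treated as exact set intersection:
# an annotation is a true positive only if start offset, end offset,
# and linked URI all agree with the gold standard. The Annotation
# type and the sample data below are hypothetical illustrations.
from typing import NamedTuple

class Annotation(NamedTuple):
    start: int  # character offset where the mention begins
    end: int    # character offset where the mention ends
    uri: str    # KB resource the mention is linked to

def count_matches(gold: set, system: set) -> tuple:
    tp = len(gold & system)   # exact (start, end, uri) agreement
    fp = len(system - gold)   # system annotations without a gold twin
    fn = len(gold - system)   # gold annotations the system missed
    return tp, fp, fn

gold = {Annotation(0, 6, "http://dbpedia.org/resource/Berlin")}
system = {Annotation(0, 6, "http://dbpedia.org/resource/Berlin"),
          Annotation(10, 17, "http://dbpedia.org/resource/Germany")}
print(count_matches(gold, system))  # -> (1, 1, 0)
```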
All rows were produced by the annotator AG_IMP_NEWSF_CONTEXT_ACR_2016 (NIF WS).

| Dataset | Micro F1 | Micro P | Micro R | Macro F1 | Macro P | Macro R | InKB Macro F1 | InKB Macro P | InKB Macro R | InKB Micro F1 | InKB Micro P | InKB Micro R | EE Macro F1 | EE Macro P | EE Macro R | EE Micro F1 | EE Micro P | EE Micro R | Avg ms/doc | Confidence threshold | GSInKB Macro F1 | GSInKB Macro P | GSInKB Macro R | GSInKB Micro F1 | GSInKB Micro P | Errors | Timestamp | GERBIL version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ACE2004 | 0.6536 | 0.6536 | 0.6536 | 0.7762 | 0.7762 | 0.7762 | 0.7863 | 0.779 | 0.8037 | 0.6791 | 0.6586 | 0.7009 | 0.7054 | 0.731 | 0.702 | 0.5581 | 0.6316 | 0.5 | 8,561.3158 | 0 | 0.8037 | 0.8037 | 0.8037 | 0.7009 | 0.7009 | 0 | 2017-01-22 19:47:07 | 1.2.4 |
| AIDA/CoNLL-Complete | 0.5945 | 0.5945 | 0.5945 | 0.5892 | 0.5892 | 0.5892 | 0.5736 | 0.5533 | 0.6153 | 0.5885 | 0.5651 | 0.614 | 0.5708 | 0.6902 | 0.5444 | 0.6236 | 0.7828 | 0.5183 | 3,620.6059 | 0 | 0.6182 | 0.6182 | 0.6182 | 0.614 | 0.614 | 0 | 2017-01-22 21:03:22 | 1.2.4 |
| AIDA/CoNLL-Test A | 0.5863 | 0.5863 | 0.5863 | 0.5678 | 0.5678 | 0.5678 | 0.5513 | 0.5315 | 0.5858 | 0.5779 | 0.5567 | 0.6009 | 0.5473 | 0.6763 | 0.5094 | 0.6305 | 0.7904 | 0.5244 | 8,295.4861 | 0 | 0.5858 | 0.5858 | 0.5858 | 0.6009 | 0.6009 | 0 | 2017-01-22 20:09:11 | 1.2.4 |
| AIDA/CoNLL-Test B | 0.5693 | 0.5693 | 0.5693 | 0.5835 | 0.5835 | 0.5835 | 0.5722 | 0.5542 | 0.6083 | 0.563 | 0.5424 | 0.5852 | 0.5882 | 0.6924 | 0.5758 | 0.5998 | 0.7359 | 0.5062 | 7,850.9048 | 0 | 0.6083 | 0.6083 | 0.6083 | 0.5852 | 0.5852 | 0 | 2017-01-22 20:10:24 | 1.2.4 |
| AIDA/CoNLL-Training | 0.6026 | 0.6026 | 0.6026 | 0.5955 | 0.5955 | 0.5955 | 0.579 | 0.558 | 0.6238 | 0.5974 | 0.5727 | 0.6244 | 0.5719 | 0.6928 | 0.5447 | 0.6277 | 0.7925 | 0.5197 | 4,313.4841 | 0 | 0.628 | 0.628 | 0.628 | 0.6244 | 0.6244 | 0 | 2017-01-22 20:48:10 | 1.2.4 |
| AQUAINT | 0.6795 | 0.6795 | 0.6795 | 0.6732 | 0.6732 | 0.6732 | 0.7034 | 0.7111 | 0.7011 | 0.7116 | 0.7168 | 0.7064 | 0.406 | 0.3983 | 0.43 | 0.1818 | 0.1633 | 0.2051 | 9,812.32 | 0 | 0.7011 | 0.7011 | 0.7011 | 0.7064 | 0.7064 | 0 | 2017-01-22 19:47:10 | 1.2.4 |
| DBpediaSpotlight | 0.681 | 0.6928 | 0.6697 | 0.7 | 0.7039 | 0.6967 | 0.7243 | 0.765 | 0.6967 | 0.7061 | 0.7466 | 0.6697 | 0.6724 | 0.6724 | 0.6724 | 0 | 0 | 0 | 2,354.1579 | 0 | 0.7 | 0.7039 | 0.6967 | 0.681 | 0.6928 | 1 | 2017-01-22 20:26:08 | 1.2.4 |
| IITB | 0.4151 | 0.4165 | 0.4136 | 0.4169 | 0.4169 | 0.4168 | 0.5055 | 0.4156 | 0.6567 | 0.5088 | 0.4148 | 0.6578 | 0.0761 | 0.4782 | 0.0434 | 0.0757 | 0.4643 | 0.0412 | 13,667.8738 | 0 | 0.6663 | 0.6664 | 0.6663 | 0.6599 | 0.662 | 1 | 2017-01-22 20:02:56 | 1.2.4 |
| KORE50 | 0.2917 | 0.2917 | 0.2917 | 0.2607 | 0.2607 | 0.2607 | 0.2607 | 0.2607 | 0.2607 | 0.2947 | 0.2979 | 0.2917 | 0.96 | 0.96 | 0.96 | 0 | 0 | 0 | 8,066.94 | 0 | 0.2607 | 0.2607 | 0.2607 | 0.2917 | 0.2917 | 0 | 2017-01-22 19:45:42 | 1.2.4 |
| MSNBC | 0.7802 | 0.7812 | 0.7791 | 0.7811 | 0.7821 | 0.7802 | 0.7737 | 0.7818 | 0.7724 | 0.7761 | 0.784 | 0.7684 | 0.7808 | 0.822 | 0.7744 | 0.806 | 0.7642 | 0.8526 | 10,686.7 | 0 | 0.7734 | 0.7744 | 0.7724 | 0.7696 | 0.7708 | 0 | 2017-01-22 19:42:33 | 1.2.4 |
| Microposts2013-Test | 0.0921 | 0.093 | 0.0912 | 0.3388 | 0.3388 | 0.3388 | 0.3317 | 0.3317 | 0.3317 | 0 | 0 | 0 | 0.3444 | 0.3579 | 0.3388 | 0.1672 | 1 | 0.0912 | 2,735.4784 | 0 | – | – | – | – | – | 12 | 2017-01-22 20:44:43 | 1.2.4 |
| Microposts2013-Train | 0.0925 | 0.0936 | 0.0914 | 0.3208 | 0.3208 | 0.3208 | 0.3108 | 0.3108 | 0.3108 | 0 | 0 | 0 | 0.3281 | 0.3456 | 0.3208 | 0.1675 | 1 | 0.0914 | 1,869.5824 | 0 | – | – | – | – | – | 30 | 2017-01-22 21:06:33 | 1.2.4 |
| Microposts2014-Test | 0.4809 | 0.4809 | 0.4809 | 0.6866 | 0.6866 | 0.6866 | 0.6975 | 0.7191 | 0.6866 | 0.5347 | 0.6022 | 0.4809 | 0.8341 | 0.8341 | 0.8341 | 0 | 0 | 0 | 3,092.8749 | 0 | 0.6866 | 0.6866 | 0.6866 | 0.4809 | 0.4809 | 0 | 2017-01-22 20:37:34 | 1.2.4 |
| Microposts2014-Train | 0.5068 | 0.5068 | 0.5068 | 0.6638 | 0.6638 | 0.6638 | 0.6799 | 0.7109 | 0.6639 | 0.5614 | 0.6288 | 0.5071 | 0.7748 | 0.7748 | 0.7748 | 0.0027 | 0.0013 | 0.25 | 2,157.8944 | 0 | 0.6643 | 0.6643 | 0.6643 | 0.5071 | 0.5071 | 0 | 2017-01-22 21:04:38 | 1.2.4 |
| N3-RSS-500 | 0.6346 | 0.6353 | 0.634 | 0.634 | 0.634 | 0.634 | 0.535 | 0.53 | 0.555 | 0.5353 | 0.4809 | 0.6034 | 0.76 | 0.8 | 0.743 | 0.7771 | 0.93 | 0.6674 | 4,562.4208 | 0 | 0.711 | 0.711 | 0.711 | 0.6034 | 0.6034 | 1 | 2017-01-22 20:16:56 | 1.2.4 |
| N3-Reuters-128 | 0.6989 | 0.6989 | 0.6989 | 0.7209 | 0.7209 | 0.7209 | 0.6881 | 0.7327 | 0.6751 | 0.6908 | 0.7199 | 0.664 | 0.6808 | 0.6929 | 0.7146 | 0.7166 | 0.6577 | 0.7871 | 6,885.9297 | 0 | 0.7298 | 0.7298 | 0.7298 | 0.664 | 0.664 | 0 | 2017-01-22 19:53:40 | 1.2.4 |
| OKE 2015 Task 1 evaluation dataset | 0.6024 | 0.6024 | 0.6024 | 0.6105 | 0.6105 | 0.6105 | 0.6299 | 0.6031 | 0.6873 | 0.6294 | 0.5987 | 0.6636 | 0.5659 | 0.5949 | 0.5607 | 0.45 | 0.6338 | 0.3488 | 8,182.3762 | 0 | 0.6873 | 0.6873 | 0.6873 | 0.6636 | 0.6636 | 0 | 2017-01-22 19:52:51 | 1.2.4 |
| OKE 2015 Task 1 example set | 0.6667 | 0.6667 | 0.6667 | 0.75 | 0.75 | 0.75 | 0.8222 | 0.7222 | 1 | 0.75 | 0.6 | 1 | 0.5556 | 0.6667 | 0.5 | 0.5 | 1 | 0.3333 | 3,290 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 2017-01-22 19:39:08 | 1.2.4 |
| OKE 2015 Task 1 gold standard sample | 0.7092 | 0.7113 | 0.7071 | 0.7322 | 0.7337 | 0.731 | 0.7502 | 0.7498 | 0.7659 | 0.7459 | 0.7411 | 0.7508 | 0.716 | 0.724 | 0.7118 | 0.3333 | 0.3704 | 0.303 | 7,658.5104 | 0 | 0.7879 | 0.7895 | 0.7867 | 0.7533 | 0.7558 | 0 | 2017-01-22 19:51:16 | 1.2.4 |

The run on the Derczynski IPM NEL dataset produced no scores; GERBIL reported "The annotator caused too many single errors." (2017-01-22 19:43:59, GERBIL 1.2.4).
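
The Micro and Macro column pairs above (including their InKB, EE, and GSInKB variants) differ only in how per-document results are aggregated: micro-averaging pools the true positives, false positives, and false negatives of all documents before computing precision, recall, and F1 once, while macro-averaging computes the three measures per document and averages them. A minimal sketch of the difference, using invented per-document counts rather than data from this experiment:

```python
# Micro vs. macro averaging of precision/recall/F1. The per-document
# (tp, fp, fn) counts below are invented for illustration only.

def prf(tp, fp, fn):
    """Precision, recall, and F1 from raw counts, guarding against 0/0."""
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

docs = [(8, 2, 1), (1, 3, 4), (5, 0, 5)]  # (tp, fp, fn) per document

# Micro: sum the counts over all documents, then score once.
micro = prf(sum(d[0] for d in docs),
            sum(d[1] for d in docs),
            sum(d[2] for d in docs))

# Macro: score each document separately, then average the scores.
per_doc = [prf(*d) for d in docs]
macro = tuple(sum(v) / len(v) for v in zip(*per_doc))

print("micro (P, R, F1):", micro)
print("macro (P, R, F1):", macro)
```

Because micro-averaging weights each annotation equally while macro-averaging weights each document equally, the two families can diverge sharply on corpora of many short documents, as in the Microposts2013 rows above.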