GERBIL Experiment

Your experiments are completed.

Experiment URI:
Type: D2KB (disambiguation to knowledge base: the mentions are given, and only the linking of each mention to a knowledge-base entity is evaluated)
Matching: Ma - strong annotation match
All rows use the same annotator: AG_IMP_NEWSF_CONTEXT_ACR_POPULAR_2016 (NIF WS). In the table, P = precision and R = recall; InKB scores cover entities contained in the knowledge base, EE scores cover emerging entities (those not in the KB), and GSInKB scores restrict the evaluation to gold-standard entities in the KB. The exported header contains no GSInKB Micro Recall column, and cells left blank were blank on the results page.

| Dataset | Micro F1 | Micro P | Micro R | Macro F1 | Macro P | Macro R | InKB Macro F1 | InKB Macro P | InKB Macro R | InKB Micro F1 | InKB Micro P | InKB Micro R | EE Macro F1 | EE Macro P | EE Macro R | EE Micro F1 | EE Micro P | EE Micro R | avg ms/doc | Conf. threshold | GSInKB Macro F1 | GSInKB Macro P | GSInKB Macro R | GSInKB Micro F1 | GSInKB Micro P | Errors | Timestamp | GERBIL version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ACE2004 | 0.6928 | 0.6928 | 0.6928 | 0.7903 | 0.7903 | 0.7903 | 0.7911 | 0.7975 | 0.7934 | 0.7056 | 0.7149 | 0.6966 | 0.7708 | 0.7884 | 0.7896 | 0.6533 | 0.6282 | 0.6806 | 22,367.8246 | 0 | 0.7934 | 0.7934 | 0.7934 | 0.6966 | 0.6966 | 0 | 2017-01-23 18:10:15 | 1.2.4 |
| AIDA/CoNLL-Complete | 0.5932 | 0.5932 | 0.5932 | 0.5903 | 0.5903 | 0.5903 | 0.5933 | 0.6207 | 0.596 | 0.6055 | 0.6193 | 0.5923 | 0.471 | 0.4958 | 0.538 | 0.5499 | 0.5098 | 0.5968 | 7,122.0668 | 0 | 0.5989 | 0.5989 | 0.5989 | 0.5923 | 0.5923 | 0 | 2017-01-23 20:34:40 | 1.2.4 |
| AIDA/CoNLL-Test A | 0.59 | 0.59 | 0.59 | 0.5724 | 0.5724 | 0.5724 | 0.5736 | 0.6001 | 0.5685 | 0.5952 | 0.6098 | 0.5812 | 0.4731 | 0.4892 | 0.537 | 0.5705 | 0.5233 | 0.6271 | 17,524.4398 | 0 | 0.5685 | 0.5685 | 0.5685 | 0.5812 | 0.5812 | 0 | 2017-01-23 18:52:10 | 1.2.4 |
| AIDA/CoNLL-Test B | 0.5662 | 0.5662 | 0.5662 | 0.5838 | 0.5838 | 0.5838 | 0.5865 | 0.6092 | 0.588 | 0.5767 | 0.5888 | 0.5651 | 0.4872 | 0.5098 | 0.5646 | 0.5286 | 0.4924 | 0.5705 | 16,412.4502 | 0 | 0.588 | 0.588 | 0.588 | 0.5651 | 0.5651 | 0 | 2017-01-23 18:52:19 | 1.2.4 |
| AIDA/CoNLL-Training | 0.6004 | 0.6004 | 0.6004 | 0.596 | 0.596 | 0.596 | 0.5995 | 0.6283 | 0.6042 | 0.6151 | 0.6291 | 0.6017 | 0.4665 | 0.4939 | 0.5317 | 0.55 | 0.5107 | 0.5958 | 8,290.2981 | 0 | 0.6085 | 0.6085 | 0.6085 | 0.6017 | 0.6017 | 0 | 2017-01-23 20:02:21 | 1.2.4 |
| AQUAINT | 0.6657 | 0.6657 | 0.6657 | 0.6587 | 0.6587 | 0.6587 | 0.712 | 0.7503 | 0.6857 | 0.7245 | 0.7604 | 0.6919 | 0.2203 | 0.2037 | 0.27 | 0.1143 | 0.0792 | 0.2051 | 24,992.26 | 0 | 0.6857 | 0.6857 | 0.6857 | 0.6919 | 0.6919 | 0 | 2017-01-23 18:09:49 | 1.2.4 |
| DBpediaSpotlight | 0.6502 | 0.6614 | 0.6394 | 0.668 | 0.6717 | 0.6648 | 0.7054 | 0.7667 | 0.6648 | 0.7081 | 0.7932 | 0.6394 | 0.5 | 0.5 | 0.5 | 0 | 0 | 0 | 26,289.9825 | 0 | 0.668 | 0.6717 | 0.6648 | 0.6502 | 0.6614 | 1 | 2017-01-23 18:14:00 | 1.2.4 |
| IITB | 0.5188 | 0.5207 | 0.517 | 0.5164 | 0.5164 | 0.5164 | 0.5514 | 0.4889 | 0.6405 | 0.558 | 0.4927 | 0.6433 | 0.4069 | 0.6078 | 0.3149 | 0.4281 | 0.6288 | 0.3245 | 33,682.4951 | 0 | 0.6501 | 0.6502 | 0.6501 | 0.6453 | 0.6474 | 1 | 2017-01-23 18:49:44 | 1.2.4 |
| KORE50 | 0.2361 | 0.2361 | 0.2361 | 0.2057 | 0.2057 | 0.2057 | 0.231 | 0.2817 | 0.2057 | 0.2881 | 0.3696 | 0.2361 | 0.3 | 0.3 | 0.3 | 0 | 0 | 0 | 19,694.88 | 0 | 0.2057 | 0.2057 | 0.2057 | 0.2361 | 0.2361 | 0 | 2017-01-23 18:05:34 | 1.2.4 |
| MSNBC | 0.7895 | 0.7906 | 0.7885 | 0.791 | 0.792 | 0.7901 | 0.7994 | 0.833 | 0.7731 | 0.7997 | 0.8336 | 0.7684 | 0.5859 | 0.5345 | 0.7079 | 0.7364 | 0.6111 | 0.9263 | 25,174.5 | 0 | 0.774 | 0.7751 | 0.7731 | 0.7696 | 0.7708 | 0 | 2017-01-23 17:57:23 | 1.2.4 |
| Microposts2013-Test | 0.1842 | 0.186 | 0.1824 | 0.4023 | 0.4023 | 0.4023 | 0.3779 | 0.3779 | 0.3779 | 0 | 0 | 0 | 0.4137 | 0.44 | 0.4023 | 0.3085 | 1 | 0.1824 | 5,529.6745 | 0 | | | | | | 12 | 2017-01-23 20:01:31 | 1.2.4 |
| Microposts2013-Train | 0.204 | 0.2064 | 0.2016 | 0.399 | 0.399 | 0.399 | 0.3684 | 0.3684 | 0.3684 | 0 | 0 | 0 | 0.4132 | 0.4458 | 0.399 | 0.3355 | 1 | 0.2016 | 3,508.6503 | 0 | | | | | | 30 | 2017-01-23 20:32:14 | 1.2.4 |
| Microposts2014-Test | 0.4514 | 0.4514 | 0.4514 | 0.6714 | 0.6714 | 0.6714 | 0.6866 | 0.7204 | 0.6714 | 0.5423 | 0.679 | 0.4514 | 0.7299 | 0.7299 | 0.7299 | 0 | 0 | 0 | 6,637.219 | 0 | 0.6714 | 0.6714 | 0.6714 | 0.4514 | 0.4514 | 0 | 2017-01-23 19:45:46 | 1.2.4 |
| Microposts2014-Train | 0.4874 | 0.4874 | 0.4874 | 0.6534 | 0.6534 | 0.6534 | 0.6773 | 0.7262 | 0.6535 | 0.5743 | 0.6984 | 0.4877 | 0.6705 | 0.6705 | 0.6705 | 0.0017 | 0.0009 | 0.25 | 4,104.9312 | 0 | 0.6539 | 0.6539 | 0.6539 | 0.4877 | 0.4877 | 0 | 2017-01-23 20:29:09 | 1.2.4 |
| N3-RSS-500 | 0.6857 | 0.6864 | 0.685 | 0.685 | 0.685 | 0.685 | 0.5897 | 0.602 | 0.594 | 0.5807 | 0.579 | 0.5824 | 0.766 | 0.785 | 0.773 | 0.8013 | 0.8055 | 0.7971 | 9,616.5631 | 0 | 0.696 | 0.696 | 0.696 | 0.5824 | 0.5824 | 1 | 2017-01-23 19:09:03 | 1.2.4 |
| N3-Reuters-128 | 0.6898 | 0.6898 | 0.6898 | 0.7166 | 0.7166 | 0.7166 | 0.6984 | 0.7826 | 0.671 | 0.6944 | 0.7678 | 0.6339 | 0.6466 | 0.6388 | 0.7074 | 0.6809 | 0.5766 | 0.8313 | 17,789.8906 | 0 | 0.7101 | 0.7101 | 0.7101 | 0.6339 | 0.6339 | 0 | 2017-01-23 18:26:57 | 1.2.4 |
| OKE 2015 Task 1 evaluation dataset | 0.5783 | 0.5783 | 0.5783 | 0.5877 | 0.5877 | 0.5877 | 0.6299 | 0.6642 | 0.6414 | 0.6322 | 0.6483 | 0.6168 | 0.3995 | 0.4047 | 0.448 | 0.3803 | 0.3484 | 0.4186 | 19,584.5347 | 0 | 0.6414 | 0.6414 | 0.6414 | 0.6168 | 0.6168 | 0 | 2017-01-23 18:22:05 | 1.2.4 |
| OKE 2015 Task 1 example set | 0.6667 | 0.6667 | 0.6667 | 0.75 | 0.75 | 0.75 | 0.8222 | 0.7222 | 1 | 0.75 | 0.6 | 1 | 0.5556 | 0.6667 | 0.5 | 0.5 | 1 | 0.3333 | 29,029 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 2017-01-23 17:50:27 | 1.2.4 |
| OKE 2015 Task 1 gold standard sample | 0.7181 | 0.7202 | 0.716 | 0.7406 | 0.7421 | 0.7394 | 0.7563 | 0.7851 | 0.7468 | 0.7648 | 0.7902 | 0.741 | 0.6719 | 0.6719 | 0.684 | 0.3855 | 0.32 | 0.4848 | 18,977.7708 | 0 | 0.7688 | 0.7704 | 0.7676 | 0.7434 | 0.7459 | 0 | 2017-01-23 18:19:22 | 1.2.4 |

The Derczynski IPM NEL dataset produced no scores: the run failed with "The annotator caused too many single errors." (2017-01-23 17:57:14, GERBIL 1.2.4).
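The table reports each score twice, micro- and macro-averaged. The distinction matters when interpreting the numbers: micro scores pool every linking decision across all documents before computing precision/recall, while macro scores compute them per document and then average, so macro weights small documents as heavily as large ones. A minimal sketch of the two averaging schemes, using invented per-document counts (not taken from any dataset above, and ignoring GERBIL's matching rules):

```python
# Micro- vs. macro-averaged precision/recall/F1, with made-up counts.

def prf(tp, fp, fn):
    """Precision, recall, F1 from raw counts (0 when undefined)."""
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

# Hypothetical (true positive, false positive, false negative) per document.
docs = [(8, 2, 1), (3, 3, 4), (0, 1, 2)]

# Micro: pool all decisions across documents, then score once.
tp, fp, fn = (sum(c) for c in zip(*docs))
micro_p, micro_r, micro_f1 = prf(tp, fp, fn)

# Macro: score each document separately, then average the scores.
per_doc = [prf(*d) for d in docs]
macro_p, macro_r, macro_f1 = (sum(v) / len(per_doc) for v in zip(*per_doc))

print(f"micro P/R/F1: {micro_p:.4f} {micro_r:.4f} {micro_f1:.4f}")
print(f"macro P/R/F1: {macro_p:.4f} {macro_r:.4f} {macro_f1:.4f}")
```

The third document, where the annotator gets nothing right, drags the macro F1 down by a full third of its weight while barely moving the micro F1, which is why the two columns can diverge noticeably on datasets with many short documents (e.g. the Microposts rows above).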