{ "v1_Abstract": "Citation metrics and h indices differ using different bibliometric databases. We compiled the number of publications, number of citations, h index and year since the first publication from 340 soil researchers from all over the world. On average, Google Scholar has the highest h index, number of publications and citations per researcher, and the Web of Science the lowest. The number of papers in Google Scholar is on average 2.3 times higher and the number of citations is 1.9 times higher compared to the data in the Web of Science. Scopus metrics are slightly higher than that of the Web of Science. The h index in Google Scholar is on average 1.4 times larger than Web of Science, and the h index in Scopus is on average 1.1 times larger than Web of Science. Over time, the metrics increase in all three databases but fastest in Google Scholar. The h index of an individual soil scientist is about 0.7 times the number of years since the first publication. There is a large difference between the number of citations, number of publications and the h index using the three databases. From this analysis it can be concluded that the choice of the database affects widely used citation and evaluation metrics but that bibliometric transfer functions exist to relate the metrics from these three databases. We also investigated the relationship between journal\u2019s impact factor and Google Scholar\u2019s h5-index. The h5-index is a better measure of a journal\u2019s citation than the 2 or 5 year window impact factor.", "v1_col_introduction": "introduction : Scientific impact measures are increasingly being used for academic promotions, grant evaluations and evaluation of job vacancy candidates. They are also being used for the evaluations of university departments and research centres. 
Traditionally, the impact factor of a journal has been used \u2013 a metric developed by Garfield (1955) whereby the number of citations is divided by the number of papers published over a given period. For most journals it shows considerable inter-annual fluctuation, and it provides no information on individual papers or individual authors. Since 2005, the h index has been used as an index for quantifying the scientific productivity of scientists based on their publication record (Hirsch, 2005). It is a personal index and provides information on the number of publications of an author and the number of citations: a scholar with an index of h has published h papers with at least h citations each. The h index can also be calculated for journals, departments, universities or countries.\nThe three widely used bibliometric databases for analysis and evaluation of citations and the h index are the Web of Science (Thomson Reuters), Scopus (Elsevier), and Google Scholar. Some papers have compared citations between these three databases. Although Google Scholar and Scopus seem to provide higher numbers of citations (Falagas et al., 2008), there is mixed information on the h index. For example, Bar-Ilan (2008) compared the h index for 47 highly-cited Israeli researchers across the three databases and concluded that the results from Google Scholar are considerably different from those of the Web of Science and Scopus. Mingers and Lipitakis (2010) looked at 4,600 publications from three UK Business Schools, and found that the Web of Science poorly covers the management discipline compared to Google Scholar. De Groote and Raszewski (2012) examined 31 faculty members from a nursing faculty in the Midwestern USA, and concluded that more than one database should be used to calculate the h index. 
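As an aside for readers who want to compute it themselves, the h index definition above maps directly onto a few lines of code. The sketch below (the function name is ours, not from the paper) takes a list of per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have at least `rank` citations
        else:
            break
    return h

# e.g. five papers cited [10, 8, 5, 4, 3] times give h = 4
```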
They further recommended that since the h index rankings differ among databases, comparisons between researchers should be done only within a specified database.\nThe differences between the three databases are thus fairly well established: the three databases will yield different citation counts and h indices. As far as we know, however, the relationships between the three databases have not been investigated or derived. The aims of this paper are therefore: (i) to compare citations and h index across the three databases, (ii) to derive transfer functions to convert metrics from one database to the others, and (iii) to compare impact factors for journals and the h index. To this end, we compared the data from 340 researchers and 31 journals. Since we are all three soil scientists, we have used only soil researchers and journals in this study.\nPeerJ reviewing PDF | (v2013:09:794:1:1:NEW 1 Oct 2013)\nSoil science is the study of soil as a natural phenomenon and resource (Brevik and Hartemink, 2010). It is a relatively small discipline in terms of number of researchers, number of papers per annum, and citations. The IUSS (International Union of Soil Sciences) database lists about 50,000 soil scientists worldwide, but only a fraction of these are in research and actively publish, with a guesstimate of 5,000 to 10,000 publishing researchers. The \u201csoil\u201d topic has a lower number of papers and citations when compared to other natural-resource subjects such as \u201cair\u201d and \u201cwater\u201d (Minasny et al., 2007). The number of papers published in 2011 according to Scopus with \u201csoil\u201d in the abstract and keywords is 39,504, with a rate of increase of about 2,000 papers per year. 
In comparison, the number of papers published in 2011 on \u201cair\u201d is 1.4 times larger and the number of papers on \u201cwater\u201d is 3.5 times larger. The h index ratios for water, air, and soil (for the papers published in 2011) are 1.7, 1.3, and 1.0. Nevertheless, soil is becoming more important, with strong links to the global issues of food security, biodiversity, land use change, and climate change (McBratney et al., 2014). While this study used only soil researchers, the bibliometric results are illustrative for other agricultural, environmental, earth science and biology disciplines, and for small scientific disciplines in general.", "v2_Abstract": "Citation metrics and h indices differ across bibliometric databases. We compiled the number of publications, number of citations, h index and years since first publication for 340 soil researchers from all over the world. On average, Google Scholar has the highest h index, number of publications and citations per researcher, and the Web of Science the lowest. The number of papers in Google Scholar is on average 2.3 times larger and the number of citations is 1.9 times larger compared with the data in the Web of Science. Scopus metrics are slightly larger than those of the Web of Science. Over time, the metrics increase in all three databases, but fastest in Google Scholar. The h index of an individual soil scientist is about 0.7 times the number of years since the first publication. About 10% of the h index is caused by self-citation, but that may be higher for younger authors. There is a large difference between the number of citations, number of publications and the h index across the three databases. We also compared the journal impact factor and the h5-index from Google Scholar for 31 soil science journals. The h5-index is a better measure of a journal\u2019s citation impact than the 2- or 5-year window impact factor. 
From this analysis it can be concluded that the choice of database affects widely used citation and evaluation metrics, but that pedobibliometric transfer functions exist to relate the metrics from these three databases.", "v2_col_introduction": "introduction : Scientific impact measures are increasingly being used for academic promotions, grant evaluations and evaluation of job vacancy candidates. They are also being used for the evaluations of university departments and research centres. Traditionally, the impact factor of a journal has been used \u2013 a metric developed by Garfield (1955) whereby the number of citations is divided by the number of papers published over a given period (usually 2 years). For most journals it shows considerable inter-annual fluctuation, and it provides no information on individual papers or individual authors. Since 2005, the h index has been used as an index for quantifying the scientific productivity of scientists based on their publication record (Hirsch, 2005). It is a personal index and provides information on the number of publications of an author and the number of citations: a scholar with an index of h has published h papers with at least h citations each. The h index can also be calculated for journals, departments, universities or countries.\nThe three widely used bibliometric databases for analysis and evaluation of citations and the h index are the Web of Science (Thomson Reuters), Scopus (Elsevier), and Google Scholar. Some papers have compared citations between these three databases. Although Google Scholar and Scopus seem to provide higher numbers of citations (Falagas et al., 2008), there is mixed information on the h index. For example, Bar-Ilan (2008) compared the h index for 47 highly-cited Israeli researchers across the three databases and concluded that the results from Google Scholar are considerably different from those of the Web of Science and Scopus. 
Mingers and Lipitakis (2010) looked at 4,600 publications from three UK Business Schools, and found that the Web of Science poorly covers the management discipline compared to Google Scholar. De Groote and Raszewski (2012) examined 31 faculty members from a nursing faculty in the Midwestern USA, and concluded that more than one database should be used to calculate the h index. They further recommended that since the h index rankings differ among databases, comparisons between researchers should be done only within a specified database.\nThe differences between the three databases are thus fairly well established: the three databases will yield different citation counts and h indices. As far as we know, however, the relationships between the three databases have not been investigated or derived. The aims of this paper are therefore: (i) to compare citations and h index across the three databases, (ii) to derive transfer functions to convert metrics from one database to the others, and (iii) to compare impact factors for journals and the h index. To this end, we compared the data from 340 researchers and 31 journals. Since we are all three soil scientists, we have used only soil researchers and journals in this study.\nPeerJ reviewing PDF | (v2013:09:794:0:1:NEW 9 Sep 2013)\nData and Methods\nGoogle Scholar (GS) is a bibliographic database freely available from Google. It was introduced in 2004 and contains scholarly works across many disciplines and sources, including theses, books, reports, abstracts, peer-reviewed and non-reviewed articles, and web pages that are deemed scholarly. Google Scholar lists these automatically from its search engine activities (Harzing and van der Wal, 2009; Vine, 2006). Individual Google Scholar profile pages were introduced in 2012, allowing a researcher to create a webpage listing fields of interest. 
Google Scholar automatically searches for and populates the individual\u2019s publications, and calculates and displays the individual's total number of citations, h index, and i10 index. Scopus, or SciVerse Scopus, is a bibliographic database from Elsevier which contains abstracts and citations for academic journal articles, conference papers, and book chapters. Inclusion in the database is through the Scopus Content Selection and Advisory Board. Although its records go back as early as 1823, its citation counts are reliable only after 1995. The Web of Science is a bibliographic database from Thomson Reuters which contains only abstracts and citations for articles listed in Web of Science-indexed journals, with coverage since 1900 (Harzing and van der Wal, 2009).\nData from researchers with the following areas of interest: \u201csoil science\u201d, \u201csoil\u201d, \u201cpedology\u201d, \u201csoil physics\u201d, \u201csoil biology\u201d, \u201csoil chemistry\u201d, \u201csoil fertility\u201d, \u201csoil erosion\u201d, \u201csoil ecology\u201d, and \u201csoil carbon\u201d were retrieved from Google Scholar\u2019s author pages. The same researchers were then located in Scopus and the Web of Science. In Scopus, the \u2018Author Identifier\u2019 tool was used to locate each researcher. In the Web of Science, the author\u2019s surname and first initial were used, together with \u201csoil\u201d in the search subject. When the name and publication record were inconsistent across the three databases, the researcher was not included in our analysis. In the end, we collected data from 340 researchers, including: the total number of citations, h index, number of papers, and year of first publication. These data were obtained for each researcher from each of the three databases. 
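The cross-database matching rule described above (drop any researcher whose records disagree) can be sketched as a simple filter. The record fields and function name below are hypothetical illustrations, not the authors' actual implementation:

```python
def keep_researcher(records):
    """records: one dict per database, e.g. {"name": ..., "first_pub_year": ...}.
    Keep the researcher only if the name and year of first publication agree
    across all database records. (Hypothetical fields, for illustration only.)"""
    names = {r["name"] for r in records}
    first_years = {r["first_pub_year"] for r in records}
    return len(names) == 1 and len(first_years) == 1
```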
Publications and citations were counted up to June 2013.", "v1_text": "results and discussion : Number of papers, citations and h index\nTable 1 shows the statistics of the h index, number of publications, number of citations, and year of first paper for the 340 soil researchers in the three databases. Our data encompass a wide range of researchers, from early-career to well-established and highly-cited researchers. The database is much larger and more diverse than in previous studies, where a small and focussed group of researchers was used to compare citation metrics between the databases (e.g. Franceschet, 2010; Meho and Rogers 2008; Patel et al., 2013). The median number of papers for the 340 soil researchers ranged from 23 (Web of Science) to 79 (Google Scholar), with Scopus having intermediate values. The number of citations is also highest in Google Scholar, with a median of 866 citations per author, whereas it is 291 in the Web of Science. The h index and its annual increase are lowest in the Web of Science. This pattern holds for all of the metrics presented here: Google Scholar has the highest numbers and the Web of Science the lowest, whereas the Scopus numbers are in between. Part of this may be due to the different types of publications included; the periods of time covered by the three databases also differ slightly. A simple linear regression without intercept was performed between the citation indices of the three databases (Table 2). Google Scholar has on average 2.3 times more articles and 1.9 times more citations than the Web of Science. 
The Scopus database (all years) has 1.1 times more papers than the Web of Science, but a similar number of citations. Since the citations are more complete and correct after 1995, the relationship was re-fitted for post-1995 authors; it shows that Scopus has about 1.2 times more citations than the Web of Science. The 20% higher citation counts are consistent with the findings of Falagas et al. (2008) in the field of medicine. Similarly, for articles in medical journals, Kulkarni et al. (2009) found that Google Scholar and Scopus retrieved more citations than the Web of Science (1.22 and 1.20 times, respectively).
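The transfer functions reported here come from a linear regression without intercept, whose least-squares slope for the model y = b * x has the closed form b = sum(x*y) / sum(x**2). A minimal sketch of that fit (the data values are illustrative, not the study's fitted coefficients):

```python
def slope_through_origin(x, y):
    """Least-squares slope b for the no-intercept model y = b * x:
    b = sum(x_i * y_i) / sum(x_i ** 2)."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / sxx

# Illustrative only: paper counts in the Web of Science vs. Google Scholar
wos_papers = [10, 20, 30]
gs_papers = [23, 46, 69]
b = slope_through_origin(wos_papers, gs_papers)  # a ratio close to 2.3
```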