Monday, 25 August 2014

SSRN Top Downloads For Management Educator: Courses, Cases & Teaching eJournal

Equality of Google Scholar with Web of Science Citations: Case of Malaysian Engineering Highly Cited Papers

Nader Ale Ebrahim,

Hadi Salehi,

Mohamed Amin Embi,

Mahmoud Danaee,

Marjan Mohammadjafari,

Azam Zavvari,

Masoud Shakiba and

Masoomeh Shahbazi-Moghadam

University of Malaya (UM) - Department of Engineering Design and
Manufacture, Faculty of Engineering; University of Malaya (UM) -
Research Support Unit, Centre of Research Services, Institute of
Research Management and Monitoring (IPPP); Islamic Azad University,
Najafabad Branch; Universiti Kebangsaan Malaysia - Faculty of
Education; Islamic Azad University (IAU) - Department of Agriculture;
University of Malaya (UM); National University of Malaysia; National
University of Malaysia; and Universiti Teknologi Malaysia (UTM)

Date posted to database: 9 Aug 2014

Last Revised: 10 Aug 2014


Sunday, 17 August 2014

Equality of Google Scholar with Web of Science Citations: Case of Malaysian Engineering Highly Cited Papers - E-LIS repository

Equality of Google Scholar with Web of Science Citations: Case of Malaysian Engineering Highly Cited Papers

Ale Ebrahim, Nader and Salehi, Hadi and Embi, Mohamed Amin and Danaee, Mahmoud and Mohammadjafari, Marjan and Zavvari, Azam and Shakiba, Masoud and Shahbazi-Moghadam, Masoomeh
Equality of Google Scholar with Web of Science Citations: Case of Malaysian Engineering Highly Cited Papers.
Modern Applied Science, 2014, vol. 8, n. 5, pp. 63-69.

[Journal article (Print/Paginated)]




English abstract

This study uses citation analysis from two citation-tracking databases,
Google Scholar (GS) and ISI Web of Science, to test the correlation
between them and to examine the effect of the number of paper versions
on citations. The data were retrieved from the Essential Science
Indicators and Google Scholar for 101 highly cited papers from Malaysia
in the field of engineering. An equation for estimating ISI citations
from Google Scholar citations is offered. The results show a significant
positive relationship between the number of versions and citations in
both Google Scholar and ISI Web of Science. This relationship is
stronger between versions and ISI citations (r = 0.395, p < 0.01) than
between versions and Google Scholar citations (r = 0.315, p < 0.01).
Because Google Scholar data are freely available and correlate with
costly ISI citation counts, tenure committees, funding agencies, and
other science-policy bodies can count citations and analyze scholars'
performance more transparently and precisely.
Item type:
Journal article (Print/Paginated)

Keywords: Bibliometrics, Citation analysis,
Evaluations, Equivalence, Google Scholar, High cited, ISI Web of
Science, Research tools, H-index
Subjects: C. Users, literacy and reading. > CD. User training, promotion, activities, education.
G. Industry, profession and education. > GH. Education.
Depositing user:

Dr. Nader Ale Ebrahim

Date deposited: 16 Aug 2014 00:14
Last modified: 16 Aug 2014 00:14




Tuesday, 12 August 2014

Assessing Scholarly Productivity: The Numbers Game



To evaluate the work of scholars
objectively, funding agencies and tenure committees may attempt to
quantify both its quality and impact. Quantifying scholarly work is
fraught with danger, but the current emphasis on assessment in academe
suggests that such measures can only become more important. There are a
number of descriptive statistics associated with scholarly productivity.
These fall broadly into two categories: those that describe individual
researchers and those that describe journals.

Rating Researchers

Raw Citation Counts

One way to measure the impact of a paper is to simply count how many
times it has been cited by others. This can be accomplished by finding
the paper in Google Scholar and
noting the "Cited by" value beneath the citation. Such numbers may be
added together, or perhaps averaged over a period of years, to provide
an informal assessment of scholarly productivity. Better yet, use Google Scholar Citations
to keep a running list of your publications and their "cited by"
numbers. For more information on determining where, by whom, and how
often an article has been cited, see IC Library's guide on Cited Reference Searching.


The h-index

The h-index, created by Jorge E. Hirsch of the University of California, San Diego, is described by its creator as follows:

A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np - h) papers have no more than h citations each.1

In other words, an h-index of 5 means that the scholar's five
most-cited papers have each been cited five or more times. This can be
visualized by a graph on which each point represents a paper. The
scholar's papers are ranked along the x-axis by decreasing
number of citing papers, while the actual number of citing papers is
shown by the point's position along the y-axis. The grey line
represents equality of paper rank and number of citing articles.
The h-index is equal to the number of points above the grey line.
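Hirsch's definition translates directly into code. A minimal sketch (Python), using an invented citation list for illustration:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h of the papers
    have at least h citations each."""
    # Rank papers by decreasing citation count (the x-axis of the graph).
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this point lies on or above the equality line
            h = rank
        else:
            break
    return h

# A scholar whose five most-cited papers each have at least 5 citations:
print(h_index([10, 8, 6, 5, 5, 2, 1]))  # → 5
```

The loop mirrors the graphical description above: h is the number of points that sit on or above the grey equality line.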

The value of h will depend on the database used to calculate it.2
Thomson Reuters' Web of Science and Elsevier's Scopus (neither is
available at IC) offer automated tools for calculating this value. In
November 2011, Google Scholar Citations became generally available; it calculates h from the Google Scholar database. An add-on for Firefox called the Scholar H-Index Calculator is also based on Google Scholar data.

Comparisons of h are only valid within a discipline, since
standards of productivity vary widely between fields. Researchers in the
life sciences, for instance, will generally have higher h values than those in physics.1

A large number of modifications to the h-index have been proposed, many attempting to correct for factors such as length of career and co-authorship.

ImpactStory (currently in beta)
is a service that attempts to show the impact of research not only
through citations but also through social media (e.g., how often an article
has been tweeted about, saved to social bookmarking services, etc.).

Rating Journals

Rightly or wrongly, the quality of a paper is sometimes judged by the
reputation of the journal in which it is published. Various metrics
have been devised to describe the importance of a journal.

Impact Factor

The Impact Factor (IF) is a proprietary measure calculated annually by Thomson Reuters
(formerly by ISI). This figure is based on how often papers published
in a given journal in the preceding two years are cited during the
current year. This number is divided by the number of "citable items"
published by that journal during the preceding two years to arrive at
the IF. Weaknesses of this metric include sensitivity to inflation
caused by extensive self-citation within a journal and by single,
highly-cited articles. For more information about the IF, see the essays of Dr. Eugene Garfield,
founder of ISI. Determining a journal's IF requires access to Thomson
Reuters Journal Citation Reports, not available at IC Library.
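The two-year IF is simple arithmetic once the counts are in hand. A sketch with hypothetical numbers:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year Impact Factor: citations received this year by items the
    journal published in the preceding two years, divided by the number
    of citable items it published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 420 citations this year to the 150 citable items
# it published over the preceding two years.
print(impact_factor(420, 150))  # → 2.8
```

The sensitivity noted above is visible here: a single article cited hundreds of times, or heavy journal self-citation, inflates the numerator and hence the IF.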


Eigenfactor

The eigenfactor is a more recent, freely available metric, devised
at the University of Washington by Jevin West and Carl Bergstrom.3
Where the IF counts all citations to a given article as equal,
the eigenfactor weights citations based on the impact of the citing
journal. Its creators assert that it can be viewed as "a rough estimate
of how often a journal will be used by scholars." Eigenfactor values are
freely available online.

SCImago Journal Rank Indicator

The SCImago Journal Rank indicator (SJR) is another openly available metric.4
It uses an algorithm similar to Google's PageRank. Currently, this
metric is only available for those journals covered in Elsevier's Scopus
database. Values may be found on the SCImago website.
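The PageRank-style idea behind the eigenfactor and SJR, namely that a citation from a prestigious journal counts for more than one from an obscure journal, can be illustrated with a toy example. This is a generic damped power iteration over an invented citation matrix, not the actual eigenfactor or SJR algorithm (both add further normalizations):

```python
import numpy as np

# Toy inter-journal citation matrix (hypothetical numbers): C[i, j] is
# the count of citations from journal i to journal j. The diagonal is
# zero because journal self-citations are excluded.
C = np.array([[0, 3, 1],
              [2, 0, 4],
              [1, 1, 0]], dtype=float)

# Column-stochastic transfer matrix: each journal splits its outgoing
# citations into fractional "votes" for the journals it cites.
M = C.T / C.sum(axis=1)

# Damped power iteration, as in PageRank: prestige flows along citations,
# so a vote from a high-prestige journal is worth more.
d, n = 0.85, C.shape[0]
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - d) / n + d * M @ rank

print(rank)  # per-journal prestige weights, summing to 1
```

Under this weighting, two journals with identical raw citation counts can end up with quite different prestige scores, which is exactly the distinction these metrics draw against the raw-count IF.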


Hirsch, J.E. An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America 102, 16569-16572 (2005).

Meho, L.I. & Yang, K. Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology 58, 2105-2125 (2007).

Bergstrom, C. Eigenfactor: Measuring the value and prestige of scholarly journals. College & Research Libraries News 68, (2007).

González-Pereira, B., Guerrero-Bote, V.P. & Moya-Anegón, F. The SJR indicator: A new indicator of journals' scientific prestige.

Assessing Scholarly Productivity: The Numbers Game - Ithaca College Library

How to make your paper more accessible through self-archiving | Editage Insights


How to make your paper more accessible through self-archiving

So you’ve completed your study and published your paper in the journal of
your choice. Now you want to make your paper accessible to more
readers within and outside the scientific community so as to increase
its impact. One way to increase the visibility of your paper is through
self-archiving.

What is self-archiving?
Self-archiving is the practice of placing digital versions of scientific literature
online. When you self-archive your research, you make it freely
available to anyone on the Internet. In other words, self-archiving
makes your research widely “visible, accessible, harvestable,
searchable, and useable,”1 thus increasing its reach and impact, and possibly the number of citations it receives.
Benefits of Self-Archiving

Where does self-archiving fit in the publishing process?
As the figure below shows, you can self-archive different versions of your
research paper: (a) the version before peer review, called the
“pre-print,” and (b) the version that has been peer reviewed and
accepted for publication, called the “refereed post-print.” All versions
of papers made available online are referred to as “e-prints.”

Figure: Stages when manuscripts may be self-archived (adapted5)

Where can you self-archive? 
Research articles can be self-archived in repositories, which are electronic archives, or on personal servers.6
Institutional repositories: Many
universities and research institutions own repositories where all their
members can deposit their research papers. This enables researchers
from that institution to view each other’s work and gives anyone
interested a broad view of all work being conducted at that institution.
Subject-based repositories: Some archives are subject-area specific and tend to be very popular in their respective disciplines, for example, PubMed Central for biomedical studies; RePEc for economics; and arXiv, most popularly, for physics, mathematics, and computer science.
Personal servers or profile pages: Researchers can upload their work onto their own web pages. Further, some social networking sites for researchers, like ResearchGate, have sections dedicated to uploaded publications.
Copyright issues related to self-archiving

  • Self-archiving
    the pre-print version of your article does not infringe any copyright
    agreement since it is done prior to submission to the publisher. Hence,
    it is not a legal matter. Sometimes, though increasingly rarely,
    journals might disallow self-archiving pre-prints, which is a matter of
    journal policy and not copyright. There tend to be some discipline-based
    differences here, with self-archiving being a common and accepted
    practice in the physical sciences (physics, computer science, etc.), but
    not so much so in the biomedical sciences.
  • On
    the other hand, the copyrights of refereed post-prints usually belong
    to the journal, and self-archiving these can lead to a legal breach if
    the journal’s policy is not followed. Journals/publishers have different
    copyright policies with regard to self-archiving post-prints. The table
    below shows the differing policies of some popular publishers with
    respect to self-archiving.7
     Most publishers allow self-archiving of some sort, but remember to check your journal's/publisher's policy before self-archiving.

Why is self-archiving not widely done? 

If self-archiving carries such benefits, why is it not widely prevalent?
Here are some of the reasons, along with counterarguments in support
of self-archiving:

  1. Lack of awareness of its benefits: A
    large proportion of authors are unaware of the option of self-archiving
    and its benefits. Therefore, even if the authors’ institutions have
    repositories, authors themselves don’t bother with self-archiving unless
    their institutions mandate it.

  2. Concern about the quality of self-archived articles: In
    some fields of study, such as computer science, pre-prints are archived
    much more than post-prints. Self-archiving pre-prints allows for
    research to be scrutinized by the larger scientific community before it
    goes through peer review. Further, in all archiving repositories,
    pre-prints are clearly marked as such. As for post-prints, their quality
    need not be questioned because they are merely a copy of the journal’s
    peer-reviewed published version.

  3. Fear of infringing the journal’s copyright policies: Most
    journals, in their instructions for authors, clearly state their
    copyright policies with regard to self-archiving. As long as you read
    and understand these policies, most of which allow authors to
    self-archive, you do not risk infringing any agreements.

  4. Perception that self-archiving is time consuming and cumbersome: Contrary to this belief, self-archiving takes only about 10 minutes9 for the first paper when you have to create a profile/account, and only a small percentage of people find it “very difficult.”8 For all subsequent papers, the process is even easier and faster.
  5. Fear of disrupting the current scholarly publishing model: Institutions
    may refrain from creating repositories for fear that such archives may
    be seen as a substitute for journals. However, in a previous study8,
    two major publishers in physics—the American Physical Society (APS) and
    the Institute of Physics Publishing Ltd. (IOPP)—confirmed that the
    physics pre-print server arXiv did not in any way threaten their own
    business models. Thus, publishers and self-archiving servers may well be
    able to coexist peacefully.

The role of self-archiving in open access 

Self-archiving constitutes what is called the “green route” to open access. This means
that authors can make their research papers available and readers can
access them—both free of cost. This is different from publishing in an
open access journal, such as Public Library of Science (PLOS)
publications, where authors pay the journal a publication fee, after
which the published study is made available to the public for free—a
model known as the “gold route” to open access. It is important to note
that self-archiving “is not an alternative to publishing in learned
journals, but an adjunct, a complementary activity where an author
publishes his or her article in whatever journal s/he chooses and simply
self-archives a copy.”5

Future of article access 

Papers published in subscription journals are usually accessible only to
researchers whose institutional libraries have subscriptions to those
journals. Researchers affiliated with smaller institutions that cannot
afford extensive journal subscriptions would not be able to access these
papers. Moreover, the chances of such research reaching a wider
audience of lay people and experts in unrelated fields are slim. Today,
the world is 
moving toward a system where the intellectual output of the research
community can be freely disseminated to the world at large. New journals
adopting the gold route of open access, that is, with an
article-processing charge for authors, are emerging, and even
traditional publishers that work with a subscription fee-based model are
offering more open access options. Funding bodies are encouraging
scientists to embrace the concept of allowing free access to published
literature. For example, Research Councils UK (RCUK) has recently
announced a policy stating that from April 2013, all science papers that
have received funding from grant agencies affiliated to RCUK must be
made freely available within six months of publication.10 Public
institutions like the US National Institutes of Health (NIH) mandate
that all articles arising from NIH funds be archived in PubMed Central
upon acceptance for publication. For a substantial number of journals, the
NIH public access policy requires that the final published version of
all NIH-funded research articles be made available on PubMed Central not
later than 12 months after publication.11 Newer models of open access are also being explored. 



In sum, self-archiving is free, easy, and immensely beneficial. Moreover,
it is in line with the evolving trend of freely disseminating
research findings for the rapid global advancement of
science. So go ahead and consider self-archiving a viable option to
contribute to the progress of science and to increase your own research
impact by making your work more accessible.



Dear all,

With much pleasure, the Academic Development Centre (ADeC) and the Research Support Unit (RSU) would like to invite all academicians to participate in the following workshop:


Details of the workshop are as follows:
Date            : 2 - 5 September 2014 (Tuesday to Friday)
Venue           : Interactive Learning Room (ILR), Level 14, Wisma R&D
Time            : 9.00 am - 5.00 pm (4 days = 24 CPD Points)
Facilitator     : Dr. Nader Ale Ebrahim (Research Support Unit, IPPP, University of Malaya)
About the workshop:
"Do you have problems getting your papers accepted by ISI journals?
Does it take you a long time to produce an article?"
This two-part workshop aims to provide UM academic staff with useful and proven resources to help you write papers faster and stand a better chance of getting your papers accepted by high-tier, competitive journals.
During the workshop, you will learn how to:
- keep up to date with current developments in your chosen field,
- find high-quality relevant literature related to your field,
- access important writing and editing, paraphrasing and summarizing tools,
- evaluate other researchers and journals,
- publish in relevant journals,
- increase the visibility of your publications.
Tentative Program:
This workshop is divided into two parts, as follows:
(kindly refer to attachment for details)

2 - 3 September 2014 Part 1 : Effective Use of Research Tools 
4 - 5 September 2014 Part 2 : How to do Literature Review and Write a Review Paper *

*Participants may choose to attend both parts of the workshop or only one part. However, if you choose to attend only Part 2, please ensure that you meet the following prerequisites:
·  Familiarity with academic/scientific databases (e.g. Elsevier, Web of Knowledge, EBSCOhost, Google Scholar)
·  Familiarity with reference management software (preferably EndNote)
·  Familiarity with journal paper submission procedures
·  Familiarity with academic social networks

To register for this workshop, please log in to your UMMAIL account and fill in the online registration form.
·  Please bring your own laptop (computers are not provided).
·  Parking is not provided at Wisma R&D due to limited space (parking is reserved for occupants of the building only). We will provide transport services (JPPHB van) to and from Wisma R&D for registered participants.
For further inquiries, please call 03-22463358 (Ummu Saadah) or email us.

Thank you.

Warm regards,

Academic Development Centre (ADeC)
Level 14, Wisma R & D
University of Malaya
Jalan Pantai Baru
59990 Kuala Lumpur
Tel: +603-2246 3349
Fax : +603-2246 3352