altmetrics: a manifesto

No one can read everything. We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.

As the volume of academic literature explodes, scholars rely on filters to select the most relevant and significant sources from the rest. Unfortunately, scholarship’s three main filters for importance are failing:

  • Peer review has served scholarship well, but is beginning to show its age. It is slow, encourages conventionality, and fails to hold reviewers accountable. Moreover, given that most papers are eventually published somewhere, peer review fails to limit the volume of research.
  • Citation counting measures are useful, but not sufficient. Metrics like the h-index are even slower than peer review: a work’s first citation can take years. Citation measures are also narrow: they overlook influential but uncited work, neglect impact outside the academy, and ignore the context and reasons for citation.
  • The journal impact factor (JIF), which measures journals’ average citations per article, is often incorrectly used to assess the impact of individual articles. It’s troubling that the exact details of the JIF are a trade secret, and that significant gaming is relatively easy.

Tomorrow’s filters: altmetrics

In growing numbers, scholars are moving their everyday work to the web. Online reference managers Zotero and Mendeley each claim to store over 40 million articles (making them substantially larger than PubMed); as many as a third of scholars are on Twitter, and a growing number tend scholarly blogs.

These new forms reflect and transmit scholarly impact: that dog-eared (but uncited) article that used to live on a shelf now lives in Mendeley, CiteULike, or Zotero–where we can see and count it. That hallway conversation about a recent finding has moved to blogs and social networks–now, we can listen in. The local genomics dataset has moved to an online repository–now, we can track it. This diverse group of activities forms a composite trace of impact far richer than any available before. We call the elements of this trace altmetrics.

Altmetrics expand our view of what impact looks like, but also of what’s making the impact. This matters because expressions of scholarship are becoming more diverse. Articles are increasingly joined by:

  • The sharing of “raw science” like datasets, code, and experimental designs
  • Semantic publishing or “nanopublication,” where the citeable unit is an argument or passage rather than an entire article.
  • Widespread self-publishing via blogging, microblogging, and comments or annotations on existing work.

Because altmetrics are themselves diverse, they’re great for measuring impact in this diverse scholarly ecosystem. In fact, altmetrics will be essential to sift these new forms, since they’re outside the scope of traditional filters. This diversity can also help in measuring the aggregate impact of the research enterprise itself.

Altmetrics are fast, using public APIs to gather data in days or weeks. They’re open–not just the data, but the scripts and algorithms that collect and interpret it. Altmetrics look beyond counting and emphasize semantic content like usernames, timestamps, and tags. Altmetrics aren’t citations, nor are they webometrics; although these latter approaches are related to altmetrics, they are relatively slow, unstructured, and closed.
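The "composite trace" the manifesto describes is, concretely, a stream of timestamped, attributed events. A minimal sketch of what working with such a trace might look like, with an invented event format and made-up data (no real API is queried here):

```python
from collections import Counter
from datetime import date

# Hypothetical altmetrics trace for one article: each event records where it
# was mentioned, by whom, and when -- the semantic content (usernames,
# timestamps) the manifesto emphasizes, not just a bare count.
events = [
    {"source": "twitter",  "user": "alice", "date": date(2010, 10, 27)},
    {"source": "twitter",  "user": "bob",   "date": date(2010, 10, 28)},
    {"source": "mendeley", "user": "carol", "date": date(2010, 10, 28)},
    {"source": "blog",     "user": "dave",  "date": date(2010, 11, 30)},
]

def recent_counts(events, since):
    """Count events per source since a given date: impact in days, not years."""
    return Counter(e["source"] for e in events if e["date"] >= since)

counts = recent_counts(events, since=date(2010, 10, 1))
```

Because each event keeps its timestamp, the same trace answers both "how much attention?" and "how fast?", which a single citation count cannot.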

How can altmetrics improve existing filters?

With altmetrics, we can crowdsource peer-review. Instead of waiting months for two opinions, an article’s impact might be assessed by thousands of conversations and bookmarks in a week. In the short term, this is likely to supplement traditional peer-review, perhaps augmenting rapid review in journals like PLoS ONE, BMC Research Notes, or BMJ Open. In the future, greater participation and better systems for identifying expert contributors may allow peer review to be performed entirely from altmetrics. Unlike the JIF, altmetrics reflect the impact of the article itself, not its venue. Unlike citation metrics, altmetrics will track impact outside the academy, impact of influential but uncited work, and impact from sources that aren’t peer-reviewed. Some have suggested altmetrics would be too easy to game; we argue the opposite. The JIF is appallingly open to manipulation; mature altmetrics systems could be more robust, leveraging the diversity of altmetrics and the statistical power of big data to algorithmically detect and correct for fraudulent activity. This approach already works for online advertisers, social news sites, Wikipedia, and search engines.
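The diversity argument against gaming can be made concrete: genuine impact tends to register across several sources at once, while manipulation usually inflates just one. A crude illustrative stand-in for the detection systems the manifesto envisions (the function, threshold, and numbers are all invented for illustration):

```python
from statistics import median

def flag_suspect_sources(counts, factor=10):
    """Flag sources whose counts dwarf the article's median across sources.

    A deliberately simple heuristic: mature systems would use far richer
    signals (account age, timing bursts, network structure), but the core
    idea is the same -- cross-source diversity makes single-channel
    inflation stand out.
    """
    m = median(counts.values())
    return {s for s, c in counts.items() if c > factor * (m + 1)}

# A bot-driven Twitter spike stands out against bookmarks and blog mentions.
suspects = flag_suspect_sources({"twitter": 5000, "mendeley": 12, "blogs": 3})
```

A profile that is merely popular everywhere triggers nothing; only a count wildly out of line with the article's other traces is flagged.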

The speed of altmetrics presents the opportunity to create real-time recommendation and collaborative filtering systems: instead of subscribing to dozens of tables-of-contents, a researcher could get a feed of this week’s most significant work in her field. This becomes especially powerful when combined with quick “alt-publications” like blogs or preprint servers, shrinking the communication cycle from years to weeks or days. Faster, broader impact metrics could also play a role in funding and promotion decisions.

Road map for altmetrics

Speculation regarding altmetrics (Taraborelli, 2008; Neylon and Wu, 2009; Priem and Hemminger, 2010) is beginning to yield to empirical investigation and working tools. Priem and Costello (2010) and Groth and Gurney (2010) find citations on Twitter and blogs, respectively. ReaderMeter computes impact indicators from readership in reference management systems. DataCite promotes metrics for datasets. Future work must continue along these lines.

Researchers must ask if altmetrics really reflect impact, or just empty buzz. Work should correlate altmetrics with existing measures, predict citations from altmetrics, and compare altmetrics with expert evaluation. Application designers should continue to build systems to display altmetrics, develop methods to detect and repair gaming, and create metrics for use and reuse of data. Ultimately, our tools should use the rich semantic data from altmetrics to ask “how and why?” as well as “how many?”
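Correlating altmetrics with existing measures is, at its simplest, a rank-correlation study. A minimal sketch with invented per-article numbers (no real dataset), computing Spearman's rho from scratch:

```python
def rank(xs):
    """Assign 1-based ranks, averaging over ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over a run of tied values, then give each the average rank.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical early bookmark counts vs. eventual citation counts
# for five articles.
bookmarks = [120, 15, 48, 3, 90]
citations = [45, 2, 20, 1, 60]
rho = spearman(bookmarks, citations)
```

A strong positive rho would suggest the altmetric tracks something citations later confirm; a weak one would be evidence of "empty buzz", which is exactly the empirical question the manifesto poses.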

Altmetrics are in their early stages; many questions are unanswered. But given the crisis facing existing filters and the rapid evolution of scholarly communication, the speed, richness, and breadth of altmetrics make them worth investing in.

Jason Priem, University of North Carolina-Chapel Hill (@jasonpriem)
Dario Taraborelli, Wikimedia Foundation (@readermeter)
Paul Groth, VU University Amsterdam (@pgroth)
Cameron Neylon, Science and Technology Facilities Council (@cameronneylon)

Creative Commons License

v 1.0 – October 26, 2010
v 1.01 – September 28, 2011: removed dash in alt-metrics

15 Comments

  1. Posted October 27, 2010 at 2:28 am | Permalink

    Great ideas – but with respect to divorcing a metric from the publication venue, I’m skeptical that it’s possible. After all, the Matthew Effect became the long tail in web talk.
    Also, it might be useful to contrast altmetrics with usage metrics, which are also being proposed as alternatives to traditional citation-based metrics.

  2. Posted October 28, 2010 at 9:41 am | Permalink

    Hi Christina, that’s a good point, but author-level metrics (and for what matters any aggregate institution-level measures) are already divorced from individual publication outlets, aren’t they? I discuss what I believe to be the main differences between usage metrics and metrics based on richer usage patterns (such as personal bookmarking/annotation) in my COOP ’08 paper linked above. The bottom line is: usage metrics are the equivalent (in terms of robustness) of 1st generation ranking algorithms based on click-through rates.

  3. Steve Hitchcock
    Posted October 29, 2010 at 12:02 pm | Permalink

    Nice ideas. Do you mean something like scintilla.nature.com? You end by imploring researchers to invest in alt-metrics, but have not yet answered your own questions on the validity of the new metrics. “Researchers must ask if alt-metrics really reflect impact, or just empty buzz.” That should be the other way round: first show the effect on impact then try to convince researchers. “Work should correlate between alt-metrics and existing measures, predict citations from alt-metrics, and compare alt-metrics with expert evaluation.” This seems to be the way to go.

  4. Jason Priem
    Posted October 29, 2010 at 5:57 pm | Permalink

    Hi Steve. As I understand, Scintilla filters news and blog posts, but it does it with keywords rather than measuring impact from a variety of sources. So while it’s a cool project, I wouldn’t call it an alt-metrics tool. The early data suggest alt-metrics measure “real” impact (scare quotes because even the citation people will tell you this is tricky to define). Moreover, it’s encouraging that alt-metrics, like citations, are built around native scholarly processes (saving, linking, etc); we’re not asking for popularity votes, we’re observing the ways scholars naturally interact with their work. So alt-metrics show a lot of promise, and we’d like to see more work in this area. We’re not arguing that alt-metrics is ready for prime-time yet. When we suggest it’s time to invest in alt-metrics, we mean just that: let’s start building systems and doing research and see if alt-metrics live up to their promise. We think they will.

  5. Grove Patel
    Posted January 18, 2011 at 1:55 pm | Permalink

    It would have been more honest of you to mention Thomson Reuters’ response to the Rockefeller University Press article…
    http://community.thomsonreuters.com/t5/Citation-Impact-Center/Thomson-Scientific-Corrects-Inaccuracies-In-Editorial/ba-p/717/message-uid/717

  6. Posted January 19, 2011 at 9:52 pm | Permalink

    Hi Grove,
    Thanks for the link to Thomson’s rejoinder to the Rossner, Van Epps and Hill (2007) article “Show me the data,” which we link to above. It’s always nice to have a full perspective on an issue, and there are certainly good arguments for the value of the JIF.

    However, I disagree that it’s dishonest for us not to link to it, any more than it was dishonest for you to not link to Rossner et al.’s reply to the reply, “Irreproducible results: a response to Thomson Scientific.” Our goal with the manifesto isn’t to put the Journal Impact Factor on trial; that’s been done enough.

    Our point, rather, is to present a better, fuller alternative to the way we measure impact now. I think the JIF, as used today, has deep flaws…but even if it didn’t, why not look into ways to understand and track impact even better?

  7. Julian Newman
    Posted June 19, 2011 at 9:12 pm | Permalink

    It would be kind of helpful to let new readers know what ALT in “alt-metrics” stands for. Is it an acronym for something, or does it just stand for “alternative”? If it only means “alternative” why should we believe the various assertions that are made about alt-metrics? Are we really sure that we want ANY metrics at all, or are these just something to enable bureaucrats to get under our feet?

  8. jason
    Posted June 19, 2011 at 10:31 pm | Permalink

    Good thought, Julian. The “alt” does indeed stand for “alternative,” and that should be more evident. We’ll change it in the next version of the manifesto (along with losing the hyphen, which has already disappeared from most of our other altmetrics stuff online).

    I don’t think the term, though, has much to do with the value (or lack thereof) of our assertions. Regardless of what “alt” means, I think it’s increasingly accepted that evaluators (some of whom are “bureaucrats,” but some of whom are fellow scholars on hiring, tenure, and grant committees) and working researchers alike are pretty overwhelmed by the quantity of knowledge being produced. We need ways to get a handle on what’s out there, and what’s good.

    Traditional metrics, while useful, have let us down because they only give us a few ways to measure “good.” But eschewing all measurement of science is throwing the baby out with the bathwater. As scientists, we measure things all the time; shouldn’t we examine our own work as closely? Rather than getting rid of metrics (which, let’s face it, are not going away), we should be adding metrics, so that we get a messier but richer picture of what’s going on in science. “Impact,” like lots of things, is hard to define and measure. But certainly using new communication technologies to build our understanding of research impact is a better plan than just throwing up our hands and going home.

  9. Dana Roth
    Posted February 14, 2012 at 7:44 pm | Permalink

    re: “As the volume of academic literature explodes, scholars rely on filters to select the most relevant and significant sources from the rest.” Isn’t another filter the journal in which ‘relevant and significant sources’ are published? Most serious researchers combine perusing journal contents pages with literature searching.

  10. jason
    Posted February 14, 2012 at 8:01 pm | Permalink

    @Dana, absolutely the journal is a filtering mechanism. I’d maintain it’s a broken one, because it requires lots of expensive manual curation, hides valuable research in peer review for a year or more, permits only binary yes/no filtering, and only supports one judgement per article (since you can’t publish in multiple journals). These were all unavoidable bugs in a system built on paper. But there’s no need to suffer them if we build a system on the Web.

  12. Posted December 24, 2012 at 5:59 pm | Permalink

    I very much like the general thrust here, especially the two ideas that (1) metrics should be specific to a particular paper rather than tied to the journal that published it and (2) that the internet allows effective community review in many cases. However, I have a concern that an important feature of peer review, which has been under threat for some time, is further endangered.

    What peer review provides, but which most of the alternatives do not, is the assurance that someone with expertise has read the paper very carefully looking for errors. Assessment “by thousands of conversations and bookmarks in a week” may not involve *anyone* actually taking the time to read the paper carefully. While prominent and important publications that are read (or skimmed) by many will no doubt be read carefully by some, and social media allows us to point to those readers (and contact them if necessary), the vast majority of publications will never be read in detail by anyone (including some of the authors).

    Of course, even under peer review, a careful reading does not always happen. A careful review takes much more time than a typical reading. Reviewers under pressure to review quickly (and under other pressures as well) are increasingly likely to be less than completely thorough. However, despite this tendency, I know from reading the comments of other reviewers that most papers are in fact carefully evaluated during peer review (still).

    Put simply, peer review continues to provide an important check that prevents the formation of scientific consensus around suspect data, or around an argument that has not been thoroughly checked by anyone.

  13. Luiz Felipe Franco Belussi
    Posted January 12, 2013 at 12:00 pm | Permalink

    The idea of altmetrics is great, period.

    But there are some big challenges: one concern is that the media used to gather information and assess impact (especially blog entries and tweets) are much more susceptible to artificial manipulation (by spam bots and the like).

    That’s one more difficulty in the journey to assessing the “real” impact of research.

  14. Ali
    Posted February 4, 2013 at 5:05 pm | Permalink

    Publication and reputation conventions vary by field. In my field (CS), journal articles, while important as a repository for established results, are not what people pay attention to. Rather the focus is on conferences and workshops: peer-reviewed yet reasonably fast dissemination of research. The h-index may appear slow, but there’s also the g-index. I’d much rather have peers who understand the field be a first filter. The claim that this somehow excludes “new” ideas is laughable. New ideas show up all the time, how else would there be any visible progress in science?

  15. Posted April 2, 2014 at 6:56 pm | Permalink

    Resurrect the Readermeter !!

310 Trackbacks

  1. [...] posts, etc. The incoming citations are of course very helpful for discovery, and the basis for alternative metrics. This entry was posted in Thoughts and tagged nlm-dtd, pdf. Bookmark the permalink. [...]

  2. [...] out the alt.metrics manifesto he recently [...]

  3. [...] Björn Brembs to Björn's feed, The Life Scientists, Science 2.0 Alt-metrics: a manifesto – altmetrics.org – http://altmetrics.org/manifes… [...]

  4. [...] speaking of peer review, here’s a new attempt to measure scholarly impact: Alt-metrics: A Manifesto: These new forms reflect and transmit scholarly impact: that dog-eared (but uncited) article that [...]

  5. By My open access conversion « Anne Peattie on December 13, 2010 at 6:23 am

    [...] not. Is there a better way of defining the impact of your research? Definitely. Start with the Alt-Metrics Manifesto if you don’t believe [...]

  6. By Quora on December 13, 2010 at 10:29 pm

    How do you determine how influential/important a scientific paper is?…

    Since we’re just starting to ask this question seriously, it’s the kind of thing that can only be answered retrospectively for now.  As we collect more rich activity and attention data of the kind that PLoS and Mendeley are gathering, we’ll learn mo…

  7. By January 2011 Topic on December 16, 2010 at 8:38 pm

    [...] and building prototypes to support this approach.  His recent alt-metrics publications include the alt-metrics manifesto, Scientometrics 2.0: Toward new metrics of scholarly impact on the social Web, and How and why [...]

  8. [...] I collected data on PLoS comments as part of a larger investigation of alt-metrics. As evident from the graphic, the number articles with comments has held more or less steady as the [...]

  9. [...] a frontend for our crawler–giving working scholars and funders the opportunity to try out alt-metrics for [...]

  10. [...] broached the idea of altmetrics on several occasions already, our next session dove into the details of how we might construct [...]

  11. [...] is “What problems are we trying to solve?”  I am very familiar with the criticisms of the impact factor, but I’m interested in returning to the basic [...]

  12. [...] altmetrics home Has Journal Commenting Failed Twitter Survey Report (Sept 2010) on scribd More stats for the PLoS [...]

  13. By Link, don’t pass around files | Book of Trogool on January 25, 2011 at 3:09 pm

    [...] there’s an impact question to consider. As alternative impact metrics take hold in journal publishing, view and download numbers take on new importance for authors. If [...]

  14. [...] Publikationsworkflows: “ Originäre Web-Werkzeuge und -Konzepte wie HTML, Wikis, Weblogs, Alternative Metriken etc. sind grundsätzlich besser dazu geeignet, die Potentiale des Webs für das wissenschaftliche [...]

  15. By The impact factor game | Science Library on February 17, 2011 at 4:43 am

    [...] factor of journals should not be used for evaluating research • The misused impact factor • alt-metrics: a manifesto • The mismeasurement of science • Impact factor wars: Episode V–The Empire Strikes [...]

  16. [...] Alt-metrics Manifesto (altmetrics.org) [...]

  17. [...] alternative impact metrics (some that PLoS now provides). He cited people such as Jason Priem (see alt-metrics: a manifesto) and commented that changing the focus from Journal to article, would change the publication [...]

  18. [...] research” possible.  With the growing popularity of #altmetrics (or less twitter-like: alt-metrics) it is also starting to make inroads into measuring (academic) research impact (N.B. for those in [...]

  19. By Quora on April 14, 2011 at 10:01 am

    What are some good alternatives to Google Scholar?…

    Are you interested in conducting search? It’s not there yet: in the category of “free”, google scholar, used together with Harzing’s POP application (http://www.harzing.com/pop.htm), is unbeatable, IMO. But check out current developments under #alt…

  20. [...] The DataCite consortium are addressing the challenges of making data sets accessible and visible. Alt-Metrics have emerged to suggest alternative views of impact which move away from the more traditional [...]

  21. [...] out by (among others) Jason Priem, PhD student in Information and Library Science on the project Alt-metrics which is about tracking scholarly impact on the social web. Preem and his colleagues wants to track [...]

  22. By Joe Paz » Tweets on May 19, 2011 at 2:42 am

    [...] [...]

  23. [...] their SLideshare presentations (that should have ORCIDs for scientists!), and their “AltMetrics“ 4) Aggregating all RSC articles, new and old (with some work on the archive!) under the [...]

  24. [...] who are fully embracing the possibilities of Web 2.0.  This has called for new methods of metrics (altmetrics), which better reflect today’s research practices and take advantage of the use of current social [...]

  25. By Jason Priem, alt-metrics - IRISC on June 30, 2011 at 6:52 am

    [...] – from http://altmetrics.org/manifesto/ [...]

  26. By Random Hacks on July 2, 2011 at 12:55 pm

    [...] “Altmetrics” Manifesto: http://altmetrics.org/manifesto/ [...]

  27. [...] number of projects on altmetrics including: Julie M. Birkholz and Shenghui Wang (2011) Who are we talking about?: the validity of [...]

  28. [...] community has recently emerged in an effort to achieve this. Complete with a manifesto – at altmetrics.org - this community is striving to understand and measure the products and practices of scholarly [...]

  29. By Altmetrics | juliembirkholz on July 29, 2011 at 12:06 pm

    [...] of online metrics for evaluating science; piggy backing on other discussions in the field such as alt-metrics (which Gamble also [...]

  30. [...] authors have begun to call for investigation of “altmetrics”. [...]

  31. By Inundata – DataCite 2011, recap on August 26, 2011 at 3:39 am

    [...] Related: Altmetrics Manifesto [...]

  32. [...] are also showing interest in the possibilities of a well-configured identity service. The altmetrics movement is essentially predicated on being able to append various signifiers of scholarly output [...]

  33. [...] aren’t what’s important in science and aren’t the best way to measure impact. The Alt-Metrics project and many other initiatives have sprung up over the last few years looking for better ways [...]

  34. By Evaluating Research By the Numbers on October 3, 2011 at 9:28 pm

    [...] then turned to a brief discussion about some of the alternative metrics now being proposed by various journals and publishers. Some of the simplest measures in this [...]

  35. [...] At ACRLog, Bonnie Swoger of SUNY Geneseo tells us why some students are interested in impact factors, h-indexes, etc., and points to a manifesto on altmetrics. [...]

  36. By Graham Steel – Publish or Parish » PhD2Published on October 27, 2011 at 8:32 am

    [...] Article Level Metrics (ALM) which is a much needed alternative to IF.  Also see the likes of  http://altmetrics.org/manifesto/, http://beyond-impact.org/ and [...]

  37. [...] collecting and displaying Article-Level Metrics for its articles.  Jason Priem and others have articulated the promise of altmetrics and begun digging into what these metrics [...]

  38. By more about total-Impact « Research Remix on October 31, 2011 at 6:34 pm

    [...] The Altmetrics Manifesto is a good, easily-readable introduction to this literature, while the proceedings of the recentaltmetrics11 workshop goes into more detail. You can check out the shared altmetrics library on Mendeley for more even relevant research. Finally, the poster Uncovering impacts: CitedIn and total-Impact, two new tools for gathering altmetrics, recently submitted to the 2012 iConference, describes a case study using total-Impact to evaluate a set of research papers funded by NESCent; it has some brief statistical analysis and some visualisations of the results. [...]

  39. By Mendeley Binary Battle Top 40 | is it just me on November 4, 2011 at 10:03 am

    [...] I especially like the following projects as they could really help making science more open, using alternative metrics or innovative approaches of approaching the whole science [...]

  40. [...] I especially like the following projects as they could really help making science more open, using alternative metrics or innovative approaches of approaching the whole science [...]

  41. [...] scholarly communication. One of the major adherents of this view is Jason Priem, co-founder of the altmetrics project, whose website states: In the 17th century, scholar-publishers created the first scientific [...]

  42. [...] Total-Impact fulfills an unmet need for how researchers can collect and display a variety of altmetrics in one place. The app’s contributors (including PLoS authors Heather Piwowar and Egon [...]

  43. [...] Sciences on academics’ use of social media and understanding these sources to inform “altmetrics” (alternative metrics of impact). Share this:TwitterFacebookLike this:LikeBe the first to [...]

  44. [...] Priem, D. Taraborelli, P. Groth, C. Neylon (2010), Alt-metrics: A manifesto, (v.1.0), 26 October 2010. [...]

  45. By Daily post 11/28/2011 : DrAlb on November 28, 2011 at 1:30 pm

    [...] altmetrics: a manifesto – altmetrics.org [...]

  46. [...] efforts to understand and use these new data sources to inform alternative metrics of impact, or “altmetrics.” Altmetrics could be used in evaluating scholars or institutions, complementing unidimensional [...]

  47. [...] altmetrics Web site provides access to altmetrics: a manifesto which describes how “the growth of new, online scholarly tools allows us to make new filters; [...]

  48. By Mendeley Binary Battle Top 40 | Wolfgang Reinhardt on December 13, 2011 at 10:26 am

    [...] I especially like the following projects as they could really help making science more open, using alternative metrics or innovative approaches of approaching the whole science [...]

  49. [...] thinker in open science, open access and open data. He is one of the original authors of the Altmetrics manifesto, co-author of the Panton Principles for open data in science, and founding Editor-in-Chief of the [...]

  50. [...] strengths and weaknesses of such analytic tools may be helpful if the altmetrics initiative which, in its manifesto, describes how “the growth of new, online scholarly tools allows us to make new filters; these [...]

  51. [...] Jüngst wies Manuela Schulz im medinfo-Weblog auf die Entwicklungslinie der Altmetrics hin (http://medinfo.netbib.de/archives/2011/12/02/3944), die soziale Netzwerkeffekte – z.B. vernetzte Literaturorganisationssysteme wie Mendeley oder Research Gate, aber auch Twitter – für die Messung eines Impacts nutzen wollen: Die präzise Vernetzbarkeit auch von Dokumententeilen mit konkreten Akteuren lassen feinkörnigere Messverfahren als die Zitationszählung auf Artikelebene zu. Im Altmetrics-Manifesto findet sich dies so angesprochen: „Semantic publishing or “nanopublication,” where the citeable unit is an argument or passage rather than entire article.” (http://altmetrics.org/manifesto/) [...]

  52. [...] can be seen from the altmetrics manifesto the research community has strong interests in developing metrics which can help to identify [...]

  53. [...] de cara a la evaluacion de la investigación, como Academic Search, de Microsoft o las propuestas alt-metrics. También se ofrece una perspectiva de las novedades introducidas por las principales empresas: ISI [...]

  54. [...] articles, and replacing them with a list of articles in an appendix. Jason Priem (co-founder of the altmetrics project) commented on Davis’s post, describing the change as “a lovely example of how [...]

  55. [...] BMJ Group was interested because the Eysenbach paper had caused a stir in the Altmetrics community, a project set up to discuss the post-peer review environment. Peer-review has served scholarship [...]

  56. [...] publication to the web, and publish earlier, the web offers a better way to filter science or as Altmetrics (project set up to discuss the post-peer review environment) puts it: “Instead of waiting months [...]

  57. [...] traditional measures of impact (i.e. the number of citations), as well as new measurements such as altmetrics, researchers get a greater level of information about the impact and reach of their [...]

  58. [...] traditional measures of impact (i.e. the number of citations), as well as new measurements such as altmetrics, researchers get a greater level of information about the impact and reach of their [...]

  59. [...] using both traditional measures of impact (i.e. the number of citations) alongside new ones such as altmetrics, Figshare gives researchers a greater level of information, and realtime measurements, of the true [...]

  60. By Journal News « sharmanedit on January 20, 2012 at 3:35 pm

    [...] as “an attempted improvement that makes things worse than they already were”. Altmetrics may be on the rise, but it looks like this one won’t be taking [...]

  61. [...] Images created by these artists deserve citations like papers. The conference exposed me to the altmetrics advances in this area which was eye opening but would be nice if I could track my images too on [...]

  62. [...] Research Blogging Network. Hopefully these commentraies will be of use to some and should add to Altmetrics profiles for these papers, using systems like Total [...]

  63. [...] thread I learned about after the main ideas above formed involves new ways to measure science and altmetrics (and thanks to Jonathan Stray the heads-up on the almetrics [...]

  64. By Altmetrics « News from JURN.org on January 30, 2012 at 3:46 pm

    [...] Altmetrics: a manifesto… [...]

  65. [...] can now be accessed almost instantaneously via social media. The proponents of altmetrics have a manifesto which asks the question I was thinking while reading the article: How much does the conversation [...]

  66. [...] more information about altmetrics, read”Altmetrics: A Manifesto” and follow the discussion on [...]

  67. By Altmetrics: a manifesto « my memex on January 31, 2012 at 1:12 pm

    [...] the Altmetrics manifesto created by UNC graudate student Jason Priem (see more in the Chronicle of Higher Education [...]

  68. [...] blogged about or bookmarked”. The article talks about Jason Priem (who helped write the altmetrics manifesto) and a new project called Total Impact that, although in its infant stages, is a way to search the [...]

  69. By It Must Be Measured: #Scio12 #Altmetrics | Whizbang on January 31, 2012 at 4:31 pm

    [...] Science Online I attended a discussion of Alternative Metrics or altmetrics: As the volume of academic literature explodes, scholars rely on filters to select the most [...]

  70. [...] to how many people or visits or clicks or downloads a given online resource is getting. So-called "altmetrics" and the more-established webometrics or statistical cybermetrics seek to recognise the need of [...]

  71. [...] a few different techniques and he explained how they could be applied to LIS, including using altmetrics instead of/as well as traditional citation index searching, for a number of reasons, including the [...]

  72. [...] There’s been a lot of debate about the validity of impact factors over the years (and there have been many attempts to measure impact, but none wholly accurate).  Just this week on Twitter, the discussion took off again after the publication of an article by Jennifer Howard entitled “Scholars seek better ways to track online impact” in The Chronicle of Higher Education (January 29th, 2012), which highlights the work on “alternative metrics” done by Jason Priem (a graduate student in library sciences at the University of North Carolina) who helped write a manifesto on “altmetrics” (see: http://altmetrics.org/manifesto/). [...]

  73. [...] Priem helped write a manifesto, posted on the Web site altmetrics.org, which articulates the problems with traditional evaluation [...]

  74. [...] information is made available, we will need ways to evaluate the impact of that research. Altmetrics.org is a good place to start if you are interested in learning [...]

  75. [...] I’m pleased that I still have my Delicious account and will be interested to see how the service becomes embedded within Delicious. It will also be interesting to see if the resource sharing capabilities provided by Twitter, and the ways in which such sharing can now be analysed, will have a role to play in the development of altmetrics. As described in the altmetrics manifesto: [...]

  77. By Impactos alternativos on February 10, 2012 at 8:39 am

    [...] what is generically called altmetrics or alternative metrics, which even have their own manifesto, although they vary considerably among [...]

  78. [...] Better Ways to Track Impact Online”) about the discussions on this topic that are taking place in the context of the altmetrics manifesto. [...]

  79. By Investigación y OpenData « train2manage on February 13, 2012 at 6:43 am

    [...] researchers’ web presence is growing (with initiatives that measure impact on the web, such as Altmetrics) and, fortunately, there seems to be no turning back [...]

  81. [...] demanding shorter slots. Three workshops on the Web Science Curriculum, Health Web Science and Altmetrics, preceded the conference as a whole, and a lively poster session demonstrated not only how many [...]

  82. [...] J. Priem, D. Taraborelli, P. Groth, C. Neylon (2010), Alt-metrics: A manifesto, (v.1.0), 26 October … [...]

  83. [...] dataset has moved to an online repository, and now, we can track it,” it is written in Altmetrics Manifesto. “Altmetrics are fast, using public APIs to gather data in days or weeks. They’re [...]

  84. [...] rely on filters to select the most relevant and significant sources from the rest,” the altmetrics manifesto argues. “Unfortunately, scholarship’s three main filters for importance are [...]

  85. [...] research that will produce abundant literature in the coming months within the field known as altmetrics. The study does not go on to assess other issues, such as the number of followers of the [...]

  86. [...] and ask the entire world for help, or talk about their research plans and get critiqued. Meanwhile, altmetrics are being generated in real time to assess the validity of data, and scientists peer review on [...]

  87. [...] to libraries with bibliometrics and social media, and on the other hand with Altmetrics, http://altmetrics.org/manifesto/ ; although in this case it refers to scientific output, to science, it nevertheless takes into [...]

  88. [...] are used, they are often implicit ones extractable from the code repository itself, like Ohloh. Altmetrics are a solution to this [...]

  89. By Proposal | related-work.net blog on March 12, 2012 at 3:52 am

    [...] http://altmetrics.org/manifesto/ as an emerging trend from the web-science trust community. Their goal is to revolutionize the review process and create better filters for scientific publications making use of link structures and public discussions. (Might be interesting for us). [...]

  91. By Papers aren’t just for people | Mendeley Blog on March 14, 2012 at 11:39 pm

    [...] the manufacturing plants of the industrial revolution, both grant funders and researchers want this revolution to happen. So why isn’t it happening? It’s not happening because long ago we signed away [...]

  92. [...] of open peer review, community-based publication, socially networked reader/writing strategies, altmetrical analytics, and open-source publishing platforms, particularly as they inform or relate to [...]

  93. By Quora on March 15, 2012 at 11:04 pm

    What would happen to science if Elsevier went down?…

    It’s not so much Elsevier’s efficiency that’s at question, but their sustainability. They’ve been able to reap huge profits from academic libraries, but academic library budgets are doing the opposite of going up, so I agree the effects will be neg…

  94. [...] one of the best representatives of this body of work is the Altmetrics Manifesto (Priem, Taraborelli, Groth, & Neylon, 2010). The manifesto notes that traditional forms of [...]

  95. [...] “Altmetrics: A Manifesto.”  (http://altmetrics.org/manifesto/)  (Viewed March 30, 2012) [...]

  96. By The Future of Metrics in Science | DCXL on April 6, 2012 at 3:01 pm

    [...] a graduate student at UNC’s School of Information and Library Science, coined the term “altmetrics” rather recently, and the idea has taken off like wildfire. Altmetrics is the creation and [...]

  97. [...] patterns are much more varied and diffuse than co-authorship. By incorporating measures such as altmetrics (e.g., downloads, mentions, favorites, shares, likes) and social connections between humanists[5], [...]

  98. [...] be able to monitor “in real time” how a publication reverberates in the communication system. The Altmetrics Manifesto (Priem, Taraborelli, Groth, & Neylon, 2010) even advocates the use of “real-time [...]

  99. [...] Because we are a publishing support service and not a publisher, we aren’t involved in the selection process for vetting what actually gets published.  What we do suggest, however, is that scholars can put pressure on publishers to offer them access to their “value analytics.” While the number of citations an article gets is usually held up as the gold standard for determining its “impact,” particularly in the sciences, increasing numbers of people are getting interested in alternative forms of measuring impact, also known as “altmetrics.” [...]

  100. [...] the scholarly blogosphere and new trends in measuring and evaluating scholarly activity on the web (altmetrics). Perhaps this is a good introduction to the specific trainings that CITTRU, after all, organizes in [...]

  101. [...] leaves the search for new metrics (“altmetrics“) as perhaps the greatest hope for near-term improvement in our post-publication [...]

  102. [...] Altmetrics.org was mentioned as one different approach. As their site says: Altmetrics expand our view of what impact looks like, but also of what’s making the impact. [...]

  103. [...] danger is all the more real because of the rise of Altmetrics.  A few years back when arXiv was establishing itself, journal impact factor was about the only [...]

  104. [...] is hitting its stride: 30 months after the Altmetrics manifesto [1], there are 6 tools listed. This is great [...]

  105. [...] Altmetrics, a service which maps the reputation of scientists by monitoring how people use their papers on [...]

  106. [...] these apply to you, then you likely have an opportunity to help academia rise above publication-based metrics of academic impact, even if just an inch at a [...]

  107. [...] who wrote an altmetrics manifesto and recently co-authored a paper on altmetrics to be presented at the 17th International Conference [...]

  108. [...] Going further: as these various digital tools have come to be used in research, this has developed into the idea that the impact of individual publications could be measured in real time, which is how the concept of altmetrics came about. This is one of the fields I am personally most interested in, and I forcibly tied it into my own presentation. Twenty years on from December 1990, when Tim Berners-Lee developed the World Wide Web to share papers easily with colleagues, it feels like “with the rise and spread of social media, the time has finally come for the web to transform science” has become a frequently used refrain in this field (see this, this, and this). [...]

  109. [...] Altmetrics - tracking system that attempts to note not just the electronic article usage in digital forms like Twitter or CiteULike, but also other information resources like datasets or blogs.  This is tough to tackle but the various tools below are starting to develop some interesting methodologies [...]

  110. [...] I have a strong interest in Altmetrics and watch this information daily. [...]

  111. By June 2012 on June 15, 2012 at 12:14 pm

    [...] Altmetrics Manifesto “No one can read everything.  We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.” [...]

  112. [...] metrics that could reliably estimate the impact of an author’s research. The emerging field of altmetrics seeks to change that [...]

  113. [...] my opinion, altmetrics is the key to innovating OA relations. PLoS is the most important contributor to altmetrics [...]

  115. By Impact Factor Boxing 2012 « O'Really? on June 29, 2012 at 6:29 am

    [...] was the Finch report on Open Access. And if that wasn’t enough fun, there’s been the Altmetrics movement gathering pace [2], alongside a hint that the impact factor may be losing its grip on the [...]

  116. By The Future Article | Reading eBooks in London on July 2, 2012 at 8:47 pm

    [...] Taylor brought up new measures for impact, like altmetrics that look at weblinks, mass media, tweets and usage counts. But do academic publishers look at this [...]

  117. [...] infrastructure to recognize the value of outreach in non-traditional publications such as blogging. Altmetrics are being gathered and used as further ways to measure impact of researcher’s output and [...]

  118. [...] vision is summarized in: J. Priem, D. Taraborelli, P. Groth, C. Neylon (2010), Altmetrics: A manifesto, (v.1.0), 26 October 2010. http://altmetrics.org/manifesto. These scholars plainly see as [...]

  119. [...] of North Carolina-Chapel Hill, who coined the term “altmetrics.” In his post, “Altmetrics: a Manifesto,” Jason noted the limitations and slowness of peer review and citations. He suggests that the [...]

  120. [...] analysing personal influence, and the approaches they use may be of interest to those involved in alt.metrics work. As described in a paper on Altmetrics in the Wild: Using Social Media to Explore Scholarly [...]

  121. [...] scholarship. Their vision is summarized in: J. Priem, D. Taraborelli, P. Groth, C. Neylon (2010), Altmetrics: A manifesto, (v.1.0), 26 October 2010. http://altmetrics.org/manifesto via about – [...]

  122. [...] altmetrics movement. If this conversation is not yet on your radar, I recommend beginning with the Altmetrics Manifesto. Beyond the old ideal of engaging our colleagues’ work closely, I am not endorsing any one approach, [...]

  123. [...] Thus, thanks to (or because of?) Research Gate (or Mendeley, or Academia.edu [5]…) you can work on your online reputation and improve your citation index. This trend goes hand in hand with the appearance of “altmetrics”. [...]

  124. [...] to attempt to formulate broader systems for measuring impact, such as Altmetrics or the article-level metrics adopted by the Public Library of Science. Both combine a [...]

  125. [...] many of the new areas of study – from statistical cybermetrics to the increasingly popular altmetrics – focus on how links shared affect [...]

  126. [...] good metric for the quality of the work is doubtful, nevertheless I cannot ignore that metrics and alt-metrics are (rightly or wrongly) used to assess researchers. I’m happy to see that blogs and social media [...]

  127. By Get visible or vanish « phd with kids on September 4, 2012 at 4:47 am

    [...] Lamp got up then and spoke about altmetrics, about finding ratings that make you sound good and unashamedly using them, about getting work out [...]

  129. By Altmetrics por todas partes « Primer cuartil (Q1) on September 9, 2012 at 7:43 pm

    [...] fortune in a short span of time. In just a few months since the term Altmetrics was coined, companies (Altmetric, Plum Analytics), projects (total-impact), and all kinds of papers have emerged [...]

  130. [...] impact of the article itself, not its venue, that needs to be assessed. Alternative metrics (‘altmetrics‘) are under [...]

  131. By Altmetrics | Biblioteksbloggen on September 25, 2012 at 12:45 pm

    [...] Read more about Altmetrics on their website, and why not test your impact via altmetrics. [...]

  132. [...] metrics for online reputation (i.e. services such as Klout) and assessment of research impact (e.g. alt.metrics); in both of these areas the potential benefits of metrics have been identified, but their [...]

  133. [...] the application and have just launched ImpactStory (more on that next week). Priem, who wrote the altmetrics manifesto, welcomed the appearance of Plum Analytics. “Looks to me like they’d be pretty direct [...]

  134. By Social media = academic impact « another rambler on September 29, 2012 at 12:20 pm

    [...] in content is illuminating in highlighting the problems of using social media to judge impact. Altmetrics needs to move yet further away from measuring numbers of interaction to the content and agents of [...]

  135. [...] altmetrics [...]

  136. [...] remember that ‘conversations in corridors’ will still take place. Despite the rise of altmetrics and the increasingly advanced analysis of online data on articles and citations, no algorithms can [...]

  137. By British Library Data Citation on October 29, 2012 at 9:21 am

    [...] the need to incorporate other, less traditional, measures of esteem. One view is expressed at altmetrics.org. It is not for us to say whether they are right or wrong but it is important both to understand how [...]

  138. By More Caught My Eye | Against-the-Grain.com on October 29, 2012 at 12:52 pm

    [...] a recent public talk and workshop led by Jason Priem, a co-author of the “well-regarded AltMetrics Manifesto” and a founder of the Web tool called ImpactStory.   While Mr. Priem gave ample attention to [...]

  139. [...] may change as new, more open measurements of scholarly impact become more mainstream. Measuring and evaluating the impact and quality of publicly-funded research [...]

  140. [...] are the #altmetrics that I want to see for individual research [...]

  141. By ReRank.it | rerank on November 3, 2012 at 9:09 pm

    [...] ranking is based on data provided by the ImpactStory API. ImpactStory aggregates altmetrics: diverse impacts from your articles, datasets, blog posts, and more. The source code for ReRank can [...]

  142. By ReRank.it | What is it? on November 3, 2012 at 9:20 pm

    [...] ranking is based on data provided by the ImpactStory API. ImpactStory aggregates altmetrics: diverse impacts from your articles, datasets, blog posts, and more. The source code for ReRank can [...]

  143. [...] of journals, there is an emerging recognition that there are other tools available. For example, altmetrics  tracks how an article is shared and saved in the social media world. A new BioMed Central article [...]

  144. [...] J. Priem, D. Taraborelli, P. Groth, C. Neylon (2010), Altmetrics: A manifesto [...]

  145. [...] be more relevant to today’s fast-moving digital environment, which are known as altmetrics. The altmetrics manifesto explains [...]

  146. By Inundata – PLOS Altmetrics workshop on November 8, 2012 at 11:49 pm

    [...] still not be such a good indicator of real impact. A recent news piece in Science as well as the original manifesto written by Jason, Paul, and Dario is also worth reading. Pedro Beltrao also posted a summary of the [...]

  147. [...] it provides additional avenues to calculate impact metrics – similar to those observed by AltMetrics.org and [...]

  148. [...] Altmetrics – probably will be incorporated to enhance the recommender service. [...]

  149. [...] citations and usage statistics. The PLOS Article-Level Metrics project was started in 2008. The altmetrics manifesto was published in October 2010 and described the fundamental ideas. By October 2011 we had a number [...]

  150. [...] Altmetrics movement, helping to open a space for HSS academics to articulate unique values and practices that [...]

  151. [...] as a container is an important value metric and one that needs to continue, the rapidly evolving alternative metrics (altmetrics) movement is concerned with more than replacing traditional journal assessment [...]

  152. [...] Now the sad thing is that tenure committees probably would not factor this in, but imagine being able to put something in your review packet that says: I did this experiment, wrote a paper, and over one million people learned about my research. Talk about alt metrics… [...]

  153. [...] ImpactStory was developed by two specialists in metrics of academic research: Heather Piwowar, a postdoctoral fellow at Duke University and the University of British Columbia studying “research data availability and data reuse”, and Jason Priem, a PhD student in information science at the University of North Carolina-Chapel Hill. Jason is credited with putting the term altmetrics out there and is an author of the altmetrics manifesto. [...]

  154. [...] article, and they may be taken into consideration when making hiring or tenure decisions. The altmetrics manifesto argues that new forms of scholarly and popular communication (e.g. social media) require a rethink [...]

  155. [...] image source: altmetrics [...]

  156. [...] altmetrics.org/manifesto [...]

  157. [...] For the first time ever, there was an entire session devoted entirely to a discussion about the burgeoning field. The session, called “Altmetrics beyond the numbers”, was run by Sarah Venis (Médecins Sans [...]

  158. [...] by my thesis advisors, they were cited by Dr. Henning – we’re simply working a lot on alt-metrics at the [...]

  159. [...] This area also ties into altmetrics (research evaluation metrics that make use of social media), which has become a hot topic recently, and it is a part that really excites me. I was reminded that Mendeley is becoming indispensable to altmetrics (and that altmetrics is likewise indispensable to Mendeley). If Mendeley’s popularity keeps accelerating, attention will inevitably gather around altmetrics. I listened while imagining that by next year’s SPARC Japan seminar there will surely be a session on altmetrics, with a leading figure like Jason coming to Japan. [...]

  160. [...] “Librarians are in an important position to support researchers’ knowledge of, and interest in, impact evaluation.” Recently a new research evaluation metric called altmetrics has begun to attract attention. Altmetrics uses social media to measure the impact of research outputs in real time at the article level, and is expected to complement traditional research evaluation metrics. The greatest feature and appeal of the Mendeley institutional edition that Victor will introduce shortly is that it surveys and visualizes how scholarly information circulates within an institution. Underlying this is the concept of altmetrics, which will likely be regarded as very important from now on, especially by librarians. I suspect that is the meaning behind the opening quotation. [...]

  161. By Open Access 2012 Utrecht « Mediatheekfcj’s Blog on November 28, 2012 at 10:43 am

    [...] Library. Traditional metrics are limited. Is peer review ‘broken’? The Alt Metrics Manifesto http://altmetrics.org/manifesto/ gives solutions to the current problems. Some of the online tools mentioned by Bianca Kramer [...]

  162. [...] http://altmetrics.org/manifesto/  [...]

  163. [...] have the opportunity to see usage data from users around the world to use in the development of alt.metrics. Mendeley was chosen for this blog because it is targeted at working in teams: not only authors of the [...]

  164. [...] systems that can contribute to Alternative Metrics  – Already people are developing platforms, such as Impact Story. CSSP presents the perfect [...]

  165. [...] info on Altmetrics for Scopus and the Altmetric manifesto. This entry was posted in Bibliometrics, Citation metrics, Publishing and tagged Bibliometrics, [...]

  166. [...] Altmetrics: a manifesto; [...]

  167. By Infobib » Altmetrics in VuFind on December 18, 2012 at 1:13 pm

    [...] and for anyone who doesn’t know what this is all about: here is the Altmetrics manifesto. The configuration is: class=’altmetric-embed’ [...]

  168. [...] Reuters’s InCites, though of course there are alternative approaches being developed, such as AltMetrics. The question I wish to ask is whether Bradford’s Law is any longer sufficient in a world of [...]

  169. [...] led to the formulation of broader metrics that also took in the web, such as Altmetrics or the measurement criteria of the Public Library of Science, which combine a range of data, including [...]

  170. [...] of citing data in countable ways or Data Citation isn’t explicitly mentioned once. Nor altmetrics for that [...]

  171. [...] this again has a lesson, and it is one that will become increasingly important to take on board as AltMetrics become more important to judging academic success. The issue is what makes up a figure – the 113 [...]

  172. [...] metrics expert and “altmetrics” leader, Jason Priem, explored and quantitatively estimated scholarly Twitter use in his Nov [...]

  173. [...] author, commenter, or reviewer), the diversification of journal impact factor into a multitude of altmetrics (new filters for quantifying and understanding scholarly contributions), and especially, the [...]

  174. [...] upgraded version 4 of Library OneSearch supports the inclusion of an Altmetrics plug-in for LOS developed by Ex [...]

  175. By Inundata – Altmetrics as a discovery tool on January 24, 2013 at 12:10 am

    [...] Altmetrics is all the rage these days in the scientometrics world. One rationale for developing these metrics [...]

  176. [...] to this interesting post from Altmetrics.org, conventional scholarly content filtering using Peer-Review, Citation and the [...]

  177. By Mendeley – vakliteratuur 2.0 « wetenschapper20 on February 7, 2013 at 10:03 am

    [...] Mendeley offers a form of Altmetrics: a way to determine the impact of an article other than through traditional [...]

  178. [...] innovate on everything about the publishing process, from open peer review, to the integration of altmetrics, to the simple idea of publishing articles as they come in (like a blog) rather than in separate [...]

  179. [...] new friends! Of course, hanging out is not the only purpose. OAI8 features some sessions about altmetrics, which I am particularly interested in, and that makes me look forward [...]

  180. [...] normalized down). Pat also provided a good introduction to alternative metrics or alt-metrics (this is another good introduction) and the alt-metrics bookmarklet, which provides article level [...]

  182. [...] a publishing world where open access, altmetrics and great shifts in technology related to internet search engines are availing way more visibility, [...]

  183. [...] also drew attention to ‘altmetrics’, an attempt to devise and use alternative means by which to recognise academic output. By [...]

  184. [...] Altmetrics is still in early days, Carpenter said. It’s a valuable system that focuses not just on the journal, but also on the researcher who contributed. To find out more, visit altmetrics.org. [...]

  185. [...] is a “manifesto” outlining the details behind altmetrics which discusses the bottlenecks currently occurring in the status quo of peer [...]

  186. [...] measuring method. Due to the lag time required to publish, citation counts can take years to form. Altmetrics take note of mentions in social media, such as tweeting and re-tweeting on Twitter, blog mentions, [...]

  187. [...] altmetrics was the most significant thing Matthew pointed me toward, a movement started a couple of years ago. Their work hinges around a manifesto, and broadly speaking this movement encompasses all of what I’ve been thinking about. The very fact they’ve termed it a manifesto is indicative of the size of the problem. They couldn’t just write a normal paper; the altmetrics people, and I, are both hinting that a wholesale change is necessary to resolve the ingrained issues in the way academic literature is handled (and some associated problems). While it’s always reassuring to find somebody has had the same kind of thoughts as yourself, it’s also daunting and worrying to understand quite how large the scale of the issue is. Going down the altmetrics rabbit hole, there is no sign of the depth abating. There’s a lot of stuff down there, mostly juicy, the occasional dropping. The occasional juicy dropping. [...]

  188. [...] traditional publication models seem bleak. A few weeks ago, I did a short blog post on the topic of altmetrics, which aims to provide new mechanisms of measuring an individual output’s impact and [...]

  189. [...] can pre-publish manuscripts and data to receive feedback from the scientific community and Altmetrics, which is attempting to redefine the traditional impact factor by considering other types of [...]

  190. By New metrics need fresh data | Think Links on March 15, 2013 at 3:52 pm

    [...] of the ideas in the altmetrics manifesto was that altmetrics allow a diversity of metrics. With colleagues in the VU University [...]

  192. [...] with other resources. This session analyzed different tools and indicators using Altmetrics, noting their potential and future developments, but also their limitations and weaknesses, [...]

  193. [...] for their output that may be outside of the traditional publication system, alternative metrics or altmetrics are being developed to serve that. Figshare can let you assess how many people have seen your items, [...]

  194. [...] alternatives to the impact factor of journals or to an author’s Hirsch index, such as “Altmetrics” and other approaches to the effect, the impact of [...]

  195. By Altmetrics | Forskningsrelaterat on April 4, 2013 at 1:22 pm

    [...] Read more about Altmetrics on their website, and why not test your impact via ImpactStory. [...]

  196. [...] in some cases, provide information on research impact not based on the Impact Factor. Amongst these alternative or article-level metrics tools is F1000Prime. F1000Prime adds expert commentary and context to the raw numbers – social [...]

  197. [...] and users can pay for additional storage space or more collaboration features. Mendeley embraced alternative metrics, a hallmark of open access publications like PLOS ONE. Mendeley released an incredibly useful Open [...]

  198. [...] http://altmetrics.org/manifesto/ [...]

  199. [...] solution to both problems is a system of “alternative metrics” (altmetrics) of scholarly influence that seeks to replace or amend the established standards of peer review, [...]

  200. [...] Piwowar spoke on ImpactStory, an open source tool that utilizes altmetrics to describe the broader “impact flavor” and re-use of research data and other [...]

  201. By Infobib » Plum Analytics (PlumX) on April 19, 2013 at 12:24 pm

    [...] “Research Impact” on its banner. So it is a YAAP (Yet Another Altmetrics Project), behind which are two Primo minds, Andrea Michalek and Mike Buschman [...]

  202. [...] From Altmetrics: A Manifesto: [...]

  203. By TUHH Library: The future of publishing on April 30, 2013 at 7:14 am

    [...] alternatives to the impact factor of journals or to an author’s Hirsch index, such as “Altmetrics” and other approaches to the effect, the impact of [...]

  204. [...] have in more unreviewed formats like blog posts or pre-print repositories. Alternative metrics (altmetrics) are a big factor in this reliability, as high  volume of traffic, downloads and online [...]

  205. [...] better ways to attribute and praise individuals for discrete chunks of research. This is where altmetrics are expected to extend citation-based metrics to detail the full range of impact that research (not [...]

  206. [...] Altmetrics is one of the hotly debated topics in the Open Science movement today. In summary, the idea is that traditional bibliometric measures (citation counts, impact factors, h factors, …) are too limited because they miss all the scientific activity that happens outside of the traditional journals. That includes the production of scientific contributions that are not traditional papers (i.e. datasets, software, blog posts, etc.) and the references to scientific contributions that are not in the citation list of a traditional paper (blogs, social networks, etc.). Note that the altmetrics manifesto describes altmetrics as a tool to help scientists find publications worth reading. I find it hard to believe that its authors have not thought of applications in the evaluation of researchers and institutions, which will inevitably happen if altmetrics ever takes off. [...]

  207. [...] The signs are good that ORCID will take off. I hope so, particularly so that innovative third party services can come in and offer new approaches. I am a big fan of the idea of ImpactStory, a beta service that uses ORCID to drive a whole digital footprint approach to tracing the web metrics and social shares of academic online outputs, alongside citations. This broadened attention is fundamental to the altmetrics manifesto. [...]

  208. By Impact Story | Sexy Statistics on May 13, 2013 at 4:27 pm

    [...] aggregates altmetrics: diverse impacts from your articles, datasets, blog posts, and [...]

  209. By Article-Level Metrics | librarythings@uow on May 14, 2013 at 3:32 am

    [...] the way it draws together traditional metrics (terms like citation, impact factor or h-index) and altmetrics - at journal, personal and article level. Rather than presenting emerging data streams like [...]

  210. [...] efforts to get at more sophisticated measures are already underway, including (to name just a few) Alternative Metrics for Science, Data Citation Principles, Improving Future Research Communication and e-Scholarship, [...]

  211. By Just say no to impact factors | 1tourism.com on May 18, 2013 at 2:17 pm

    [...] be considered”, not just publications. One way to achieve this may be through greater use of altmetrics, which offer new insights into the impact of research. But even here we need to be conscious of the [...]

  212. By What is impact? | Naturally Selected on May 20, 2013 at 12:52 pm

    [...] These usage measures are encapsulated in the growing ‘altmetrics’ landscape (for a summary see). F1000Prime recommendations, which provide a machine-readable star rating of papers along with a [...]

  213. [...] Altmetrics (or alternative metrics) was a term aptly coined in a tweet by Jason Priem (co-founder of ImpactStory). Altmetrics measure the number of times a research output gets cited, tweeted about, liked, shared, bookmarked, viewed, downloaded, mentioned, favourited, reviewed, or discussed. These counts are harvested from a wide variety of open web services that record such instances, including open access journal platforms, scholarly citation databases, web-based research sharing services, and social media. [...]

  214. [...] research is based on journal prestige, but some scientists and startups are beginning to use alternative metrics in an effort to refocus on the science itself (rather than the publishing [...]

  215. [...] A good introduction to the ambitions of altmetrics may be found at altmetrics.org/manifesto (2) Thelwall, M., Haustein, S., Larivière, V., Sugimoto, C.R. (2013) “Do altmetrics work? [...]

  216. [...] been looking into some altmetrics stuff recently (measuring and aggregating social commentary around academic articles) and I thought [...]

  217. By Impact Factor’s flaws, in 200 words - sMemo on June 6, 2013 at 8:44 am

    [...] Altmetrics.org [...]

  218. By New metrics need fresh data - Web & Media on June 6, 2013 at 9:35 am

    [...] of the ideas in the altmetrics manifesto was that altmetrics allow a diversity of metrics. With colleagues in the VU University [...]

  219. By Tech Roundup | LibraryTechTalk on June 7, 2013 at 8:46 pm

    [...] from a variety of sources and measure the impact that their scholarly output has had using altmetrics like “number of times bookmarked on CiteULike” or “number of readers in Mendeley”. [...]

  220. [...] Altmetrics is a term that has come to mean the broadening of what we count as scholarship and how we value it. I would describe services like figshare, PeerJ and Mendeley as cool social scholarship. What the ORCID ecosystem does is enable established currency to be brought alongside the newer social media currencies, and those cool social scholarship services therefore come into their own. Then, layered across all of that, are altmetrics-focused services like ImpactStory and Plum Analytics. [...]

  221. [...] are numerous apps, websites, and tools working to provide this type of data. Altmetrics.org has a manifesto describing the terms of the terrain, but even more helpfully, they provide a tools link collecting [...]

  222. [...] As has now been widely reported, NISO have a $200K grant from the Alfred P Sloan Foundation to develop standards for AltMetrics. [...]

  223. By Blogging and Tenure | lauren's library blog on June 25, 2013 at 8:45 pm

    [...] The obvious answer: it doesn’t count. But there is an emerging question: what is the role of altmetrics in [...]

  224. [...] example, Jeff Jarvis (another CUNY colleague) has 123,667 Twitter followers. That’s a kind of “altmetric” – a measure of his reach and influence. Increasingly, book publishers, even some employers, [...]

  225. By IWMW 2013 (1) | The shape of things on July 1, 2013 at 9:09 am

    [...] Altmetrics manifesto: http://altmetrics.org/manifesto/ [...]

  226. [...] emergent alternative to traditional citations as an impact measure is altmetrics. By combining information about how often an article is downloaded, shared, blogged, cited, [...]

  227. [...] Lozano points out that impact factors were developed in the early 20th century to help American university libraries with their journal purchasing decisions.  Of course, throughout the last century, printed, bound journals were the main way in which scholarly research was distributed. All that’s changing. [...]

  228. [...] altmetrics is a relatively new concept – altmetrics: a manifesto was first published (v [...]

  229. [...] to assessment that start to pop up. And what about alternative means for accounting impact such as altmetrics and online environments such as ImpactStory? There seems to be a sweep of possibilities if we have [...]

  230. [...] There’s a heated debate going on about impact factors: their meaning, use and mis-use, etc.  Science has an editorial discussing impact factor distortions.  One academic association, the American Society for Cell Biology, has put together a declaration (with 8500+ signers so far)–San Francisco Declaration on Research Assessment (DORA)–highlighting the problems caused by the abuse of journal impact factors and related measures. Problems with impact factors have in turn led to alternative metrics, for example see altmetrics. [...]

  231. [...] real time, the spread of scientific and other articles in the academic world. So argues the Altmetrics manifesto, which measures the impact of publications outside the academic world, research [...]

  232. By Altmetrics: A Primer | SFObound on July 17, 2013 at 6:22 pm

    [...] The altmetrics manifesto [...]

  233. [...] the misuse of these metrics. Some even suggest more reliable ranking systems, like altmetrics or the h-index, to better suss out the value of an individual researcher or paper.  Australia’s [...]

  234. [...] to be found in Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon’s (2010) “Altmetrics: a manifesto,” which proclaims that the entire peer-review process is “slow, encourages conventionality, and [...]

  235. [...] This proposal for alternative metrics seems to me a bit more [...]

  236. [...] is one of a range of products that makes use of Altmetrics data to tell a story about the impact of scholarly research. In an increasingly social online [...]

  237. [...] theme is echoed by Cameron Neylon, one of the co-authors of the Altmetrics Manifesto.  Neylon tentatively points out that altmetrics have a greater role to play in Africa, as African [...]

  238. [...] 6. Altmetrics: a manifesto. Altmetrics. [viewed 08 August 2013]. Available from: http://altmetrics.org/manifesto [...]

  239. [...] I started the MyOpenArchive project and became interested in open access (2007), fell in love with Mendeley while pondering the potential of sharing research outputs via social media (2010), and, as my interest shifted toward the use of social media in scholarly communication and the resulting evaluation of research impact, drifted along until I finally arrived at altmetrics (2011) – such are my interests. [...]

  240. [...] there is a new term which tries to capture online mentions of research articles called “altmetrics” originally defined by Jason Priem.[2, 3] Publishers such as Elsevier and PLoS are developing new [...]

  241. By Professional Development – A belated ALA report on August 22, 2013 at 8:52 pm

    [...] same, but with Highwire “under the hood.” The MUSE folks said they are also looking at altmetrics and trying to find ways to measure the “impact” of humanities content. Project MUSE has [...]

  242. [...] research, and debate. As scholars increasingly move their work to the web, and with an estimated third of all scholars now active on Twitter, conversations that previously took place within campus walls are now open for the world to pitch [...]

  243. By Defining social media reach, impact, and virality. on September 2, 2013 at 7:18 pm

    [...] there is a new term which tries to capture online mentions of research articles called “altmetrics” originally defined by Jason Priem.[2, 3] Publishers such as Elsevier and PLoS are developing new [...]

  244. [...] “Librarians are in an important position to support researchers’ knowledge of, and interest in, impact assessment.” Recently, a new research evaluation metric called altmetrics has begun to attract attention. Altmetrics uses social media to measure the influence of research outputs at the article level, in real time, and is expected to complement traditional research evaluation indicators. The greatest feature and appeal of the Mendeley Institutional Edition that Victor will introduce shortly is, I think, that it surveys and visualizes how scholarly information circulates within an institution. Underlying this is the concept of altmetrics, which will likely come to be seen as very important, especially for librarians... I suspect that is the meaning behind the opening quotation. [...]

  245. [...] This area also ties into altmetrics (research evaluation metrics based on social media), which has recently become a hot topic, and it is a part that really gets me excited. It reconfirmed for me that Mendeley is becoming indispensable to altmetrics (and altmetrics indispensable to Mendeley). If Mendeley’s popularity keeps growing, attention will inevitably turn to altmetrics... I listened in, daydreaming that around next year’s SPARC Japan seminar there will surely be a session on altmetrics, with a leading figure like Jason visiting Japan... [...]

  246. [...] scholarship. Their vision is summarized in: J. Priem, D. Taraborelli, P. Groth, C. Neylon (2010), Altmetrics: A manifesto, (v.1.0), 26 October 2010. http://altmetrics.org/manifesto via about – [...]

  247. [...] I have a strong interest in altmetrics and keep an eye on this kind of information daily. [...]

  248. By Is Altmetric for me | Hazman Labs, Inc on September 7, 2013 at 12:07 am

    [...] is interesting when I start to hear about Altmetric from the website. Basically, altmetrics is the creation and study of new metrics based on the social web for [...]

  249. [...] http://altmetrics.org/manifesto/ http://www.wooga.com/2012/07/ [...]

  250. [...] The Altmetrics Manifesto http://altmetrics.org/manifesto/ [...]

  251. [...] This kind of elitism has its share of problems: with few participants and few evaluators, data are scarce, evaluation criteria become fixed within a closed circle, and the system is vulnerable to manipulation.[18] To remedy these ills, altmetrics, which reflect the assessments of the general public online, have been proposed. “Altmetrics” means alternative metrics. Concretely, a researcher’s work is evaluated using data such as reputation on social media, usage in reference management software like Mendeley, social bookmarks, and links online. [...]

  252. [...] peers, and vulnerability to manipulation.[18] To prevent such harmful effects, some propose altmetrics, alternative metrics of academic works by means of the reputation of social media, storage at [...]

  253. [...] increasingly subjected to metric assessments based on their success in using social media (via altmetrics) and the reluctance of some to take up new activities in an already very demanding working life. [...]

  254. [...] “Some people say, ‘I don’t care about popular science; I only care about quality science. The only measure we have [of science quality] is the consensus of the scientific community. One could call that popularity; one could call it expert consensus.”—Information scientist Jason Priem,  University of North Carolina, Chapel Hill, and author of altmetrics: a manifesto [...]

  255. [...] data. Altmetrics are indicators of scholarly activity and impact on the web. Have a look at the altmetrics manifesto for a thorough [...]

  256. [...] Priem, J.; Taraborelli, D.; Groth, P.; Neylon, C. altmetrics: a manifesto, (v.1.0). 2010-10-26. [2] 坂東 慶太.Altmetrics の可能性 [...]

  257. [...] potentially makes the tools available to researchers more homogeneous and ignores niches. As the alt metrics manifesto suggests, the traditional “filters” in scholarly communication of peer review, [...]

  258. [...] Groth is an assistant professor at the Vrije Universiteit in Amsterdam. He is an initiator of the altmetrics manifesto. Paul also contributed to a short film about social media and open access in 2012. See [...]

  259. [...] data. Altmetrics are indicators of scholarly activity and impact on the web. Have a look at the altmetrics manifesto for a thorough [...]

  260. By The Future of Metrics in Science | Data Pub on October 14, 2013 at 9:12 pm

    [...] a graduate student at UNC’s School of Information and Library Science, coined the term “altmetrics” rather recently, and the idea has taken off like [...]

  261. [...] ~ J. Priem, D. Taraborelli, P. Groth, & C. Neylon, altmetrics.org [...]

  262. By Defining social media terms | Heidi Allen on October 23, 2013 at 3:38 am

    [...] there is a new term which tries to capture online mentions of research articles called “altmetrics” originally defined by Jason Priem.[2, 3] Publishers such as Elsevier and PLoS are developing new [...]

  263. [...] of semantic description of citing–cited relations, or even compared with certain altmetrics indicators, but which has the merit of being easy to analyse with effective tools used by the [...]

  264. By Dantalus on October 28, 2013 at 10:24 am

    [...] these apply to you, then you likely have an opportunity to help academia rise above publication-based metrics of academic impact, even if just an inch at a [...]

  265. By Exploring Altmetrics - THATCamp Virginia 2013 on October 30, 2013 at 3:14 pm

    […] You can read the classic altmetrics manifesto here. […]

  266. […] on the quality of the article, PLOS calls it ‘Article-Level Metric’ and there is also Altmetrics and others have their own ideas. But how do we measure the success of this paper in a timely manner […]

  267. By Defining social media terms | Heidi Allen on November 4, 2013 at 12:10 am

    […] there is a new term which tries to capture online mentions of research articles called “altmetrics” originally defined by Jason Priem.[2, 3] Publishers such as Elsevier and PLoS are developing new […]

  268. […] only on counting citations. It is worth reading the altmetrics manifesto, which is available here. Its authors emphasize that scholarship has always needed various kinds of filters, because […]

  269. By Evaluating Impact: What’s your number? | PLOS Tech on November 6, 2013 at 10:51 pm

    […] trends have been summarized with the terms Article-level Metrics and altmetrics and will be the focus of the panel next Saturday. As altmetrics is still a young discipline, more […]

  270. […] publication to the web, and publish earlier, the web offers a better way to filter science or as Altmetrics (project set up to discuss the post-peer review environment) puts it: “Instead of waiting months […]

  271. […] publication to the web, and publish earlier, the web offers a better way to filter science or as Altmetrics (project set up to discuss the post-peer review environment) puts it: “Instead of waiting months […]

  272. […] time to update our measures of visibility and alternative metrics must be part of any modern system for quantifying research […]

  273. […] Altmetrics has also been seen as the web-native generation’s counterstrike against a research culture that has had its day. Accordingly, in 2010 a manifesto, the “Altmetrics manifesto”, was proclaimed online. […]

  274. By Blessay | Brooklyn Chick, Mere's Blog on November 19, 2013 at 10:46 pm

    […] Altmetrics was introduced as a tool for scholars and librarians. For a complete explanation go to http://altmetrics.org/manifesto/. The main concept behind Altmetrics is that its system can easily tell you which articles are the […]

  275. […] more senior posts. It’s simple arithmetic. So we shouldn’t expect reform of publishing, or alt-metrics, to save people from perishing. These reforms could certainly make the system fairer and better, […]

  276. […] article, and they may be taken into consideration when making hiring or tenure decisions. The altmetrics manifesto argues that new forms of scholarly and popular communication (e.g. social media) require a rethink […]

  277. […] academic world evolving at high speed on social networks. On this subject, note a study at http://altmetrics.org/manifesto/ (an international group of researchers) observing that researchers […]

  278. […] Altmetrics Manifesto […]

  279. […] and in social media are considered as well. The altmetrics promise, as laid out in the excellent manifesto, is that they assess impact quicker and on a broader […]

  280. […] are more metrics (even Alt-metrics) really the solution to the perverse incentives embodied by our existing metrics? The much derided […]

  281. […] with a terrible name [Note: "Alt" ('alternative') to what? Not really "metrics" either]. The Altmetrics.org manifesto clearly lays out the ambitions for what those active in the field envision altmetrics to be: acting […]

  282. By Twitter Open Access Report – 4 Sep 2013 | on December 12, 2013 at 11:16 am

    […] in journals, conferences and social media. (A good starting point for learning about altmetrics is http://altmetrics.org/manifesto.) Data collected by altmetric platforms come from many sources, ranging from PDF downloads on […]

  283. By Altmetrics en el Contexto | Universo Abierto on December 15, 2013 at 8:06 pm

    […] research, including slide presentations, datasets, and articles. “Altmetrics: a manifesto” offers a good introduction to how altmetrics can further enrich traditional thinking […]

  284. By Unbundling Academia—It's Not Just for Cable Anymore on December 23, 2013 at 10:18 pm

    […] aggregating informal assessment are already flourishing.” These measuring systems, such as "altmetrics," in part use traffic and engagement statistics that wouldn’t be unfamiliar to any […]

  285. […] ; Groth, Paul; Neylon, Cameron (2011): Altmetrics: A manifesto. Version 1.01, 06.12.2013. URL: http://altmetrics.org/manifesto. P. Dong, M. Loh, A. Mondry (2005) – The “impact factor” revisited. Biomedical Digital […]

  286. […] research is based on journal prestige, but some scientists and startups are beginning to use alternative metrics in an effort to refocus on the science itself (rather than the publishing […]

  287. […] how many people are talking about it, their opinions and whether your work is important to them. Altmetrics gives you the answer, as well as an opportunity to find out which articles are widely disputed in […]

  288. By Twitter Open Access Report – 14 Jan 2014 | on January 14, 2014 at 11:42 am

    […] how many people are talking about it, their opinions and whether your work is important to them. Altmetrics gives you the answer, as well as an opportunity to find out which articles are widely disputed in […]

  289. By CMPO Viewpoint on January 14, 2014 at 3:06 pm

    […] is likely to change dramatically over the next few years, as open access, self-archiving, altmetrics and other technology-driven innovations become increasingly common. This provides an opportunity to […]

  290. […] field itself isn’t much older. One of its formative texts, “Altmetrics: A Manifesto,” written by ImpactStory founder Jason Priem and others, went online in October […]

  291. […] science (the kind that ought to be considered for tenure) which operates on the timescale of years. Priem also says “researchers must ask if altmetrics really reflect impact”. Even he […]

  292. […] as a container is an important value metric and one that needs to continue, the rapidly evolving alternative metrics (altmetrics) movement is concerned with more than replacing traditional journal assessment […]

  293. […] Priem, J., Taraborelli, D., Groth, P., and Neylon, C. (2010a). Altmetrics: a manifesto. http://altmetrics.org/manifesto/ […]

  294. […] there is a new term which tries to capture online mentions of research articles called “altmetrics” originally defined by Jason Priem.[2, 3] Publishers such as Elsevier and PLoS are developing new […]

  295. By The end of the paywall | carsten.io on February 4, 2014 at 3:23 pm

    […] of self-archiving, a growing market for open access publishers, tools such as #icanhazpdf, and new impact measures, I think it is getting ever harder for the publishers to justify their steep subscription […]

  296. […] are more metrics (even Alt-metrics) really the solution to the perverse incentives embodied by our existing metrics? The much derided […]

  297. […] Altmetrics is the idea that scientific publications should be judged (perhaps primarily) on the impact they have in the general media, including on social media. This is an alternative to looking at either citations or journal impact factors. […]

  298. […] altmetrics is a relatively new concept – altmetrics: a manifesto was first published (v […]

  299. By Twitter Open Access Report – 11 Feb 2014 | on February 11, 2014 at 11:41 am

    […] has been measured in a few ways, usually through narrow citations counts or through peer review. Article level metrics (altmetrics) are becoming the new currency to measure research impact. They measure reach through article […]

  300. […] Article level metrics (altmetrics) are becoming the new currency to measure research impact. They measure reach through article views, downloads, traditional media or mentions in social media. […]

  301. By The Ongoing Evolution of Universities into Newsrooms on February 11, 2014 at 7:36 pm

    […] Article level metrics (altmetrics) are becoming the new currency to measure research impact. They measure reach through article views, downloads, traditional media or mentions in social media. […]

  302. By Un train peut en cacher un autre…. | BibliOpen on February 14, 2014 at 11:07 am

    […] scientific. And indeed, almost everywhere, efforts are under way to propose other metrics, other […]

  303. […] attention to things like the tyranny of top-tier journals, the rise of open access journals, and alt metrics. These revolutions in how we appraise a scientist’s worth are happening alongside the […]

  304. […] think some of the altmetrics strategies could come to support this part of the problem, but I still haven’t seen the real […]

  305. […] blogs, Twitter, or Facebook. New companies are launching in order to measure this response, and to create an alternative to the traditional ways of measuring the impact of a paper. Instead of looking at the number of […]

  306. […] Web,” have been purposely constructed to be alternatives to the JIF. Since the drafting of the altmetrics manifesto, there has been a special issue, a PLoS collection, a Mendeley group, several annual workshops, an […]

  307. […] Web,” have been purposely constructed to be alternatives to the JIF. Since the drafting of the altmetrics manifesto, there has been a special issue, a PLOS collection, a Mendeley group, […]

  308. […] in 20 appointment procedures or in internal funding allocation, explicitly also usage metrics, altmetrics, or metrics such as WikiTrust for measuring individual contributions to collaboratively created […]

  309. […] This discussion is particularly important in a community acutely aware of the need to credit the electronic impact of work published outside traditional journals in the portfolios of academic clinicians. Felix Ankel (@felixankel) circulated an email on the issue a few days ago at the start of the CORD Academic Assembly; the highlight of that email was the Altmetrics Manifesto. […]

  310. […] ways they engage with your research. Social media platforms leave footprints on the web. These ”altmetrics” can be captured and aggregated at the article […]
