4 things every librarian should do with altmetrics

Researchers are starting to use altmetrics to understand and promote their academic contributions. At the same time, administrators and funders are exploring them to evaluate researchers’ impact.

In light of these changes, how can you, as a librarian, stay relevant by supporting researchers’ and administrators’ fast-changing altmetrics needs?

In this post, we’ll give you four ways to stay relevant: staying up-to-date with the latest altmetrics research, experimenting with altmetrics tools, engaging in early altmetrics education and outreach, and defining what altmetrics mean to you as a librarian.

1. Know the literature

Faculty won’t come to you for help navigating the altmetrics landscape if they can tell you don’t know the area very well, will they?

To get familiar with discussions around altmetrics, start with the recent SPARC report on article-level metrics, this excellent overview that appeared in Serials Review (paywall), and the recent ASIS&T Bulletin special issue on altmetrics.

Then, check out this list of “17 Essential Altmetrics Resources” aimed at librarians, this recent article on collection development and altmetrics from Against the Grain, and presentations from Heather and Stacy on why it’s important for librarians to be involved in altmetrics discussions on their campuses.

There’s also a growing body of peer-reviewed research on altmetrics. One important concept from this literature is the idea of “impact flavors”–a way to understand distinctive patterns in the impacts of scholarly products.

For example, an article featured in mainstream media stories, blogged about, and downloaded by the public has a very different flavor of impact than a dataset heavily saved and discussed by scholars, which is in turn different from software that’s highly cited in research papers. Altmetrics can help researchers, funders, and administrators optimize for the mix of flavors that best fits their particular goals.

There have also been many studies on correlations (or lack thereof) between altmetrics and traditional citations. Some have shown that selected altmetrics sources (Mendeley in particular) are significantly correlated with citations (1, 2, 3), while other sources, like Facebook bookmarks, have only slight correlations with citations. These studies suggest that different types of altmetrics capture different types of impact, beyond just scholarly impact.

Other early touchstones include studies exploring the predictive potential of altmetrics, growing adoption of social media tools that inform altmetrics, and insights from article readership patterns.

But these are far from the only studies to be aware of! Stay abreast of new research by reading through the PLOS Altmetrics Collection, joining the Altmetrics Mendeley group, and following the #altmetrics hashtag on Twitter.

2. Know the tools

There are now several tools that allow scholars to collect and share the broad impact of their research portfolios.

In the same way you’d experiment with new features added to Web of Science, you can play around with altmetrics tools and add them to your bibliographic instruction repertoire (more on that in the following section). Familiarity will enable you to do easy demonstrations, discuss strengths and weaknesses, contribute to product development, and serve as a resource for campus scholars and administration.

Here are some of the most popular altmetrics tools:

Impactstory

If you’re reading this post, chances are that you’re already familiar with Impactstory, a nonprofit Web application supported by the Alfred P. Sloan Foundation and NSF.

If you’re a newcomer, here’s the scoop: scholars create a free Impactstory profile and then upload their articles, datasets, software, and other products using Google Scholar, ORCID, or lists of permanent identifiers like DOIs, PubMed IDs, and so on. Impactstory then gathers and reports altmetrics and traditional citations for each product. Metrics are displayed as percentiles relative to similar products. Profile data can be exported for further analysis, and users can receive alerts about new impacts.

Impactstory is built on open-source code, offers open data, and is free to use. Our robust community of users helps us think up new features and prioritize development via our Feedback forum; once you’re familiar with our site, we encourage you to sign up and start contributing, too!

PlumX

PlumX is another web application that displays metrics for a wide range of scholarly outputs. The metrics can be viewed and analyzed at any user-defined level, including at the researcher, department, institution, journal, grant, and research topic levels. PlumX reports some metrics that are unique from other altmetrics services, like WorldCat holdings and downloads and pageviews from some publishers, institutional repositories, and EBSCO databases. PlumX is developed and marketed by Plum Analytics, an EBSCO company.

The service is available via a subscription. Individuals who are curious can experiment with the free demo version.

Altmetric

The third tool that librarians should know about is Altmetric.com. Originally developed to provide altmetrics for publishers, the tool primarily tracks journal articles and ArXiv.org preprints. In recent years, the service has expanded to include a subscription-based institutional edition, aimed at university administrators.

Altmetric.com offers unique features, including the Altmetric score (a single-number summary of the attention an article has received online) and the Altmetric bookmarklet (a browser widget that allows you to look up altmetrics for any journal article or ArXiv.org preprint with a unique identifier). Sources tracked for mentions of articles include social and traditional media outlets from around the world, post-publication peer-review sites, reference managers like Mendeley, and public policy documents.
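Altmetric also exposes these numbers programmatically: its free v1 details API returns attention data for a DOI as JSON (heavier or commercial use may require an API key). Here’s a minimal, stdlib-only Python sketch; the exact response fields beyond the overall score are best confirmed against Altmetric’s API documentation.

```python
import json
import urllib.error
import urllib.request

BASE = "https://api.altmetric.com/v1"

def details_url(doi: str) -> str:
    # Pure helper: builds the Altmetric v1 "details by DOI" endpoint URL
    return f"{BASE}/doi/{doi}"

def altmetric_score(doi: str):
    # Fetch the Altmetric score for a DOI; the API answers 404 for
    # outputs it isn't tracking, which we map to None here.
    try:
        with urllib.request.urlopen(details_url(doi)) as resp:
            return json.load(resp).get("score")
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None
        raise
```

For a quick demo in a workshop, you could call `altmetric_score("10.1038/480426a")` and compare the result with what the bookmarklet shows for the same article.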

Librarians can get free access to the Altmetric Explorer and free services for institutional repositories. You can also request trial access to Altmetric for Institutions.

3. Integrate altmetrics into library outreach and education

Librarians are often asked to describe Open Access publishing choices to both faculty and students and teach how to gather evidence of impact for hiring, promotion, and tenure. These opportunities–whether one on one or in group settings like faculty meetings–can allow librarians to introduce altmetrics.

Discussing altmetrics in the context of Open Access publishing helps “sell” the benefits of OA. Altmetrics like the download counts that appear in PLOS journals and institutional repositories make those benefits tangible. They can also demonstrate that “impact” is tied more closely to an individual’s scholarship than to a journal’s impact factor.

Similarly, researchers often use an author’s h-index for hiring, tenure, and promotion, conflating the h-index with the quality of an individual’s work. Librarians are often asked to teach and provide assistance calculating an h-index within various databases (Web of Science, SCOPUS, etc.). Integrating altmetrics into these instruction sessions is akin to providing researchers with additional primary resource choices on a research project. Librarians need to make researchers aware of many tools they can use to evaluate the impact of scholarship, and of the relevant research–including benefits of and drawbacks to different altmetrics.

So, what does altmetrics outreach look like on the ground? To start, check out these great presentations that librarians around the world have given on the benefits of using altmetrics (and particular altmetrics tools) in research and promotion.

Another great way to support researchers is to find and recommend to your grad students and faculty readings on ways they can use altmetrics in their careers, like this post from our blog on the benefits of including altmetrics on your CV.

4. Discover the benefits that altmetrics offer librarians

There are reasons to learn about altmetrics beyond serving faculty and students. A major one is that many librarians are scholars themselves, and can use altmetrics to better understand the diverse impact of their articles, presentations, and white papers. Consider putting altmetrics on your own CV, and advocating the use of altmetrics among library faculty who are assembling tenure and promotion packages.

Librarians also produce and support terabytes’ worth of scholarly content that’s intended for others’ use, usually in the form of digital special collections and institutional repository holdings. Altmetrics can help librarians understand the impacts of these non-traditional scholarly outputs, and provide hard evidence of their use beyond ‘hits’ and downloads–evidence that’s especially useful when making arguments for increased budgetary and administrative support.

It’s important that librarians explore the unique ways they can apply altmetrics to their own research and jobs, especially in light of recent initiatives to create recommended practices for the collection and use of altmetrics. What is useful to a computational biologist may not be useful for a librarian (and vice versa). Get to know the research and tools and figure out ways to use them to your own ends.

There’s a lot happening right now in the altmetrics space, and it can sometimes be overwhelming for librarians to keep up with and understand. By following the steps outlined above, you’ll be well positioned to inform and support researchers, administrators, and library decision makers in their use. And in doing so, you’ll be indispensable in this new era of web-native research.

Are you a librarian that’s using altmetrics? Share your experiences in the comments below!

This post has been adapted from the 2013 C&RL News article, “Riding the crest of the altmetrics wave: How librarians can help prepare faculty for the next generation of research impact metrics” by Lapinski, Piwowar, and Priem.

Ten reasons you should put altmetrics on your CV right now

If you don’t include altmetrics on your CV, you’re missing out in a big way.

There are many benefits to scholars and scholarship when altmetrics are embedded in a CV.

Altmetrics can:

  1. provide additional information;
  2. de-emphasize inappropriate metrics;
  3. uncover the impact of just-published work;
  4. legitimize all types of scholarly products;
  5. recognize diverse impact flavors;
  6. reward effective efforts to facilitate reuse;
  7. encourage a focus on public engagement;
  8. facilitate qualitative exploration;
  9. empower publication choice; and
  10. spur innovation in research evaluation.

In this post, we’ll detail why these benefits are important to your career, and also recommend the ways you should–and shouldn’t–include altmetrics in your CV.

1. Altmetrics provide additional information

The most obvious benefit of including altmetrics on a CV is that you’re providing more information than your CV’s readers already have. Readers can still assess the CV items just as they’ve always done: based on title, journal, and author list, and maybe–if they’re motivated–by reading or reviewing the research product itself. Altmetrics add the ability to dig into the post-publication impact of your work.

2. Altmetrics de-emphasize inappropriate metrics

It’s generally regarded as poor form to evaluate an article based on its journal’s title or impact factor. Why? Because impact factors vary across fields, and an article often receives more or less attention than its journal container suggests.

But what else are readers of a CV to do? Most of us don’t have enough domain expertise to dig into each item and assess its merits based on a careful reading, even if we did have time. We need help, but traditional CVs don’t provide enough information to assess the work on anything but journal title.

Providing article-level citations and altmetrics in a CV gives readers more information, thereby de-emphasizing evaluation based on journal rank.

3. Altmetrics uncover the impact of just-published work

Why not suggest that we include citation counts in CVs, and leave it at that? Why go so far as altmetrics? The reason is that altmetrics have benefits that complement the weaknesses of a citation-based solution.

Timeliness is the most obvious benefit of altmetrics. Citations take years to accrue, which is a problem for graduate students applying for jobs soon after publishing their first papers, and for promotion candidates whose most profound work is published only shortly before review.

Multiple research studies have found that counts of downloads, bookmarks and tweets correlate with citations, yet accrue much more quickly, often in weeks or months rather than years. Using timely metrics allows researchers to showcase the impact of their most recent work.

4. Altmetrics legitimize all types of scholarly products

How can readers of a CV know if your included dataset, software project, or technical report is any good?

You can’t judge its quality and impact based on the reputation of the journal that published it, since datasets and software aren’t published in journals. And even if they were, we wouldn’t want to promote the poor practice of judging the impact of an item by the impact of its container.

How, then, can alternative scholarly products be more than just space-filler on a CV?

The answer is product-level metrics. Like article-level metrics do for journal articles, product-level metrics provide the needed evidence to convince evaluators that a dataset or software package or white paper has made a difference. These types of products often make impacts in ways that aren’t captured by standard attribution mechanisms like citations. Altmetrics are key to communicating the full picture of how a product has influenced a field.

5. Altmetrics recognize diverse impact flavors

The impact of a research paper has a flavor. There are scholarly flavors (a great methods section bookmarked for later reference, or controversial claims that change a field), public flavors (“sexy” research that captures the imagination or data from a paper that’s used in the classroom), and flavors that fall into the area in between (research that informs public policy or a paper that’s widely used in clinical practice).

We don’t yet know how many flavors of impact there are, but it would be a safe bet that scholarship and society need them all. The goal isn’t to compare flavors: one flavor isn’t objectively better than another. They each have to be appreciated on their own merits for the needs they meet.

To appreciate the impact flavor of items on a CV, we need to be able to tell the flavors apart. (Citations alone can’t fully inform what kind of difference a research paper has made on the world. They are important, but not enough.) This is where altmetrics come in. By analyzing patterns in what people are reading, bookmarking, sharing, discussing and citing online we can start to figure out what kind – what flavor – of impact a research output is making.

More research is needed to understand the flavor palette, how to classify impact flavor and what it means. In the meantime, exposing raw information about downloads, shares, bookmarks and the like starts to give a peek into impact flavor beyond just citations.

6. Altmetrics reward efforts to facilitate reuse

Reusing research – for replication, follow-up studies and entirely new purposes – reduces waste and spurs innovation. But it does take a bit of work to make your research reusable, and that work should be recognized using altmetrics.

There are a number of ways authors can make their research easier to reuse. They can make article text available for free with broad reuse rights. They can choose to publish in venues with liberal text-mining policies–venues that invest in disseminating machine-friendly versions of articles and figures.

Authors can write detailed descriptions of their methods, materials, datasets and software and make them openly available for reuse. They can even go further, experimenting with executable papers, versioned papers, open peer review, semantic markup and so on.

When these additional steps result in increased reuse, it will likely be reflected in downloads, bookmarks, discussions and possibly citations. Including altmetrics in CVs will reward investigators who have invested their time to make their research reusable, and will encourage others to do so in the future.

7. Altmetrics can encourage a focus on public engagement

The research community, as well as society as a whole, benefits when research results are discussed outside the Ivory Tower. Engaging the public is essential for future funding, recruitment and accountability.

Today, however, researchers have little incentive to engage in outreach or make their research accessible to the public. By highlighting evidence of public engagement like tweets, blog posts and mainstream media coverage, altmetrics on a CV can reward researchers who choose to invest in public engagement activities.

8. Altmetrics facilitate qualitative exploration

Including altmetrics in a CV isn’t all about the numbers! Just as we hope many people who skim our CVs will stop to read our papers and explore our software packages, so too we can hope that interested parties will click through to explore the details of altmetrics engagement for themselves.

Who is discussing an article? What are they saying? Who has bookmarked a dataset? What are they using it for? As we discuss at the end of this post, including provenance information is crucial for trustworthy altmetrics. It also provides great information that helps CV readers move beyond the numbers and jump into qualitative exploration of impact.

9. Altmetrics empower publication choice

Publishing in a new or innovative journal can be risky. Many authors are hesitant to publish their best work somewhere new or with a relatively low impact factor. Altmetrics can remedy this by highlighting work based on its post-publication impact, rather than the title of the journal it was published in. Authors will be empowered to choose the publication venues they feel are most appropriate, leveling the playing field for what might otherwise be considered risky choices.

Successful publishing innovators will also benefit. New journals won’t have to wait two years to get an impact factor before they can compete. Publishing venues that increase access and reuse will be particularly attractive. This change will spur innovation and support the many publishing options that have recently debuted, such as eLife, PeerJ, F1000 Research and others.

10. Altmetrics spur innovation in research evaluation

Finally, including altmetrics on CVs will engage researchers directly in research evaluation. Researchers are evaluated all the time, but often behind closed doors, using data and tools they don’t have access to. Encouraging researchers to tell their own impact stories on their CVs, using broad sources of data, will help spur a much-needed conversation about how research evaluation is done and should be done in the future.

OK, so how can you do it right?

There can be risks to including altmetrics data on a CV, particularly if the data is presented or interpreted without due care or common sense.

Altmetrics data should be presented in a way that is accurate, auditable and meaningful:

  • Accurate data is up-to-date, well-described, and filtered to remove attempts at deceitful gaming.
  • Auditable data implies completely open and transparent calculation formulas for aggregation, navigable links to original sources and access by anyone without a subscription.
  • Meaningful data needs context and reference. Categorizing online activity into an engagement framework helps readers understand the metrics without becoming overwhelmed. Reference is also crucial. How many tweets is a lot? What percentage of papers are cited in Wikipedia? Representing raw counts as statistically rigorous percentiles, localized to domain or type of product, makes it easy to interpret the data responsibly.
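To make the “statistically rigorous percentiles” idea concrete, here is a small Python sketch of a percentile rank: the share of a reference sample at or below a given count. The numbers are invented for illustration (tweet counts for a paper and ten hypothetical similar papers), not real altmetrics data.

```python
def percentile_rank(value, reference):
    """Percent of the reference sample at or below `value`."""
    if not reference:
        raise ValueError("reference sample must not be empty")
    at_or_below = sum(1 for v in reference if v <= value)
    return 100.0 * at_or_below / len(reference)

# Invented example: a paper with 40 tweets, compared against
# tweet counts for ten similar papers in the same field
similar_papers = [0, 0, 1, 2, 3, 5, 8, 12, 20, 55]
print(percentile_rank(40, similar_papers))  # prints 90.0
```

Saying “more tweeted than 90% of similar papers” is far more interpretable to a CV reader than the raw count of 40 on its own.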

Assuming these presentation requirements are met, how should the data be interpreted? We strongly recommend that altmetrics be considered not as a replacement for careful expert evaluation but as a supplement. Because they are still in their infancy, we should view altmetrics as a way to ground subjective assessment in real data–a way to start conversations, not end them.

Given this approach, at least three varieties of interpretation are appropriate: signaling, highlighting and discovery. A CV with altmetrics clearly signals that a scholar is abreast of innovations in scholarly communication and serious about communicating the impact of scholarship in meaningful ways. Altmetrics can also be used to highlight research products that might otherwise go unnoticed: a highly downloaded dataset or a track record of F1000-reviewed papers suggests work worthy of a second look. Finally, as we described above, auditable altmetrics data can be used by evaluators as a jumping off point for discovery about who is interested in the research, what they are doing with it, and how they are using it.

How to Get Started

How can you add altmetrics to your own CV or, if you are a librarian, empower scholars to add altmetrics to theirs?

Start by experimenting with altmetrics for yourself. Play with the tools, explore and suggest improvements. Librarians can also spread the word on their campuses and beyond through writing, teaching and outreach. Finally, if you’re in a position to hire, promote, or review grant applications, explicitly welcome diverse evidence of impact when you solicit CVs.

What are your thoughts on using altmetrics on a CV? Would you welcome them as a reviewer, or choose to ignore them? Tell us in the comments section below.

This post has been adapted from “The Power of Altmetrics on a CV,” which appeared in the April/May 2013 issue of ASIS&T Bulletin.

Impactstory Advisor of the Month: Jon Tennant (June 2014)

Jon Tennant (blog, Twitter), a PhD candidate studying tetrapod biodiversity and extinction at Imperial College London, was one of the first scientists to join our recently launched Advisor program.

Within minutes of receiving his acceptance into the program, Jon was pounding the virtual pavement to let others know about Impactstory and the benefits it brings to scientists. For this reason–and the fact that Jon has done some cool stuff in addition to his research, like write a children’s book!–Jon’s our first Impactstory Advisor of the Month.

We chatted with Jon to learn more about how he uses Impactstory, what it’s like being an Advisor, and what he’s doing in other areas of his professional life.

Why did you initially decide to create an Impactstory profile?

A couple of years ago, I immersed myself into social media and the whole concept of ‘Web 2.0’. It was clear that the internet was capable of changing many aspects of the way in which we practice, communicate, and assess scientific research. There were so many tools though, and so much diversity, it was all a bit daunting, especially as someone so junior in their career. Although I guess that’s one of the advantages of being at this stage – I wasn’t tied down to any particular way of ‘doing science’ yet, and free to experiment.

Having followed the discussions on alternative and article-level metrics, when ImpactStory was released it seemed like a tool that could really make a difference for myself and the broader research community. At the time, it made no sense to me how the outputs of research were assessed – the name or the impact factor of a journal was given far too much meaning, and did nothing to really encapsulate the diversity of ways in which quality or impact, or putative pathways to impact, could be measured. ImpactStory seemed to offer a decent alternative, and hey look – it does! Actually, it’s not an alternative, but complementary tool for a range of methods in assessing how research is used.

Why did you decide to become an Advisor?

Pretty much for the reasons above! One thing I’m learning as a young scientist is that it’s easy to be part of an echo chamber on social media, advocating altmetrics and all the jazzy new aspects of research, but many scientists aren’t online. Getting those people involved in conversations, and alerting them to cool new tools is made a lot easier as an Advisor.

I reckon this type of community engagement is pretty important, especially in what appears to be such a crucial transitional phase for researchers, including things like open access and data, and the way in which research is assessed (e.g., through the REF here in the UK). ImpactStory obviously has a role in making this much easier for academics.

How have you been spreading the word about Impactstory in your first month as an Advisor?

Mostly sharing stickers! They actually work really well in getting people’s attention. They’re even more doubly useful when people ask things like “What’s a h-index”, so you can actually use them as a basis for further discussion. But yeah, I don’t really go out of my way to preach to people about altmetrics and ImpactStory – academics really don’t like being told what they should be doing and things, especially at my university. I prefer to kind of hang back, wait for discussions, and inject that things like altmetrics exist, and could be really useful when combined with things like a social media presence, or an ORCID, and that they are one of an integrated set of tools that can be really useful for assessing how your research is being used, as well as a kind of personal tracking device. I’d love to hold an ImpactStory/altmetrics Q and A or workshop at some point in the future.

You just wrote a children’s book about dinosaurs–tell us about it!

Let it be known that you brought this up, not me 😉

So, pretty much just by having a social media presence (mostly through blogging), I was asked to write a kids’ book about dinosaurs! Of course I said yes, and along with a talented artist, we created a book with pop-out dinosaurs that you can reconstruct into your very own little models! You can pre-order it here.* I think it’s out in October in the UK and USA. Is there an ImpactStory bit for that…? [ed: Not yet! Perhaps add it as a feature request on our Feedback forum? :)]

* (I don’t get royalties, so it’s not as bad promoting it…)

What’s the best part about your current gig as a PhD student at Imperial College London?

The freedom. I have an excellent supervisor who is happy to let me blog, tweet, attend science communication conferences and a whole range of activities that are complementary to my PhD, as long as the research gets done. So there’s a real diversity of things to do, and being in London there’s always something science-related going on, and there’s a great community vibe too, with people who work within the broader scope of science always coming together and interacting. Of course, the research itself is amazing – I work with a completely open database called the Palaeobiology Database/Fossilworks, where even the methods are open so anyone can play with science if they wish!

Thanks, Jon!

Jon is just one of a growing community of Impactstory Advisors. Want to join the ranks of some of the Web’s most cutting-edge researchers and librarians? Apply to be an Advisor today!

The ultimate guide for staying up-to-date on your data, software, white papers, slide decks and conference posters’ impact

Getting impact alerts for your papers was pretty simple to set up, but what about tracking real-time citations, downloads, and social media activity for your other research outputs?

There are so many types of outputs to track–datasets, software, slide decks, and more. Plus, there seems to be dozens of websites for hosting them! How can you easily keep track of your diverse impacts, as they happen?

Don’t worry–it’s literally our job to stay on top of this stuff! Below, we’ve compiled the very best services that send impact alerts for your research data, software, slide decks, conference posters, technical reports, and white papers.

Research data

Some data repositories gather and display metrics on use. Here, we go into detail on the metrics offered by GitHub, Figshare, and Dryad, and then explain how you can track citations via the Data Citation Index.

GitHub

If you use the collaborative coding website GitHub to store and work with research data, you can enable email alerts for certain types of activities. That way, you’re notified any time someone comments on your data or wants to modify it using a “pull request.”

First, you’ll need to “watch” whatever repositories you want to get notifications for. To do that, visit the repository page for the dataset you want to track, and then click the “Watch” button in the upper right-hand corner and select “Watching” from the drop-down list, so you’ll get a notification when changes are made.

Then, you need to enable notification emails. To do that, log into GitHub and click the “Account Settings” icon in the upper right-hand corner. Then, go to “Notification center” on the left-hand navigation bar. Under “Watching,” make sure the “Email” box is ticked.
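If you manage many repositories, the same “Watching” switch can be flipped programmatically using GitHub’s REST API (“set a repository subscription” endpoint). The sketch below uses only the Python standard library; the owner/repo names and token are placeholders, and you’d supply a real personal access token.

```python
import json
import urllib.request

API_ROOT = "https://api.github.com/repos"

def subscription_url(owner: str, repo: str) -> str:
    # Pure helper: builds the "set a repository subscription" endpoint URL
    return f"{API_ROOT}/{owner}/{repo}/subscription"

def watch_repo(owner: str, repo: str, token: str) -> dict:
    # PUT {"subscribed": true} marks the repo as "Watching", so
    # comments and pull requests trigger notification emails.
    req = urllib.request.Request(
        subscription_url(owner, repo),
        data=json.dumps({"subscribed": True, "ignored": False}).encode(),
        headers={
            "Authorization": f"token {token}",
            "Content-Type": "application/json",
        },
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # "mylab" / "field-data" and the token are hypothetical placeholders
    watch_repo("mylab", "field-data", "YOUR_PERSONAL_ACCESS_TOKEN")
```

Looping `watch_repo` over a list of datasets is an easy way to make sure nothing you’ve deposited slips through unnoticed.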

Other GitHub metrics are also useful to researchers: “stars” tell you whether others have bookmarked your repository, and “forks”–a precursor to a pull request–indicate whether others have adapted some of your code for their own uses. Impactstory notification emails (covered in more detail below) include both of these metrics.

GitHub, Dryad and Figshare metrics via Impactstory

Dryad data repository and Figshare both display download information on their web sites, but they don’t send notification emails when new downloads happen. And GitHub tracks stars and forks, but doesn’t include them in their alert emails. Luckily, Impactstory alerts notify you when your data stored on these sites receives the following types of new metrics:

| Metric | Dryad | Figshare | GitHub |
| --- | :---: | :---: | :---: |
| pageviews | X | X | |
| downloads | X | X | |
| shares | | X | |
| stars (bookmarks) | | | X |
| forks (adaptations) | | | X |

Types of data metrics reported by Impactstory

To set up alerts, create an Impactstory profile and connect your profile to ORCID, Figshare, and GitHub using the “Import from accounts” button at the top of your profile. (If you already have an Impactstory profile, this button will appear as a blue “Connect more accounts” button instead.) This will allow you to auto-import many of your datasets. If any of your datasets are missing, you can add them one by one by clicking the “Import individual products” icon and providing links and DOIs. Once your profile is set up, you’ll start to receive a notification email once every 1-2 weeks.

Data Citation Index

If you’ve deposited your data into a repository that assigns a DOI, the Data Citation Index (DCI) is often the best way to learn if your dataset has been cited in the literature.

To create an alert, you’ll need a subscription to the service, so check with your institution to see if you have access. If you do, you can set up an alert by first creating a personal registration with the Data Citation Index; click the “Sign In” button at the top right of the screen, then select “Register”. (If you’re already registered with Web of Knowledge to get citation alerts for your articles, there’s no need to set up a separate registration.)

Then, set your preferred database to the Data Citation Index by clicking the orange arrow next to “All Databases” to the right of “Search” in the top-left corner. You’ll get a drop-down list of databases; select “Data Citation Index.”

Now you’re ready to create an alert. On the Basic Search screen, search for your dataset by its title. Click on the appropriate title to get to the dataset’s item record. In the upper right hand corner of the record, you’ll find the Citation Network box. Click “Create citation alert.” Let the Data Citation Index know your preferred email address, then save your alert.

Software

The same GitHub metrics you can track for data can be used to track software impact, too. To receive alerts about comments on your code and pull requests, follow the notification sign-up instructions outlined under Research Data > GitHub, above. To receive alerts when your software gets stars or forks, sign up for Impactstory alerts according to the instructions under Research Data > GitHub, Dryad, and Figshare.
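If you'd rather check star and fork counts on your own schedule, GitHub's public REST API exposes both on the repository endpoint. Here's a minimal Python sketch (the repository name and the sample payload below are illustrative; a live call would fetch the same fields over HTTPS):

```python
import json

def repo_api_url(owner, repo):
    # GitHub's public REST endpoint for repository metadata
    return f"https://api.github.com/repos/{owner}/{repo}"

# A live request would be urllib.request.urlopen(repo_api_url(...)).read();
# here we parse a sample payload showing the two fields of interest.
payload = json.loads('{"stargazers_count": 42, "forks_count": 7}')
stars = payload["stargazers_count"]   # stars (bookmarks)
forks = payload["forks_count"]        # forks (adaptations)
```

The same endpoint also reports watcher and open-issue counts, if you want a fuller picture of activity around your code.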

Impactstory and others are working on ways to track software impact better–stay tuned!

Technical reports, working papers, conference slides & posters

Slideshare sends alerts for metrics your slide decks and posters receive, and Impactstory includes some of these Slideshare metrics in its alert emails. Impactstory alerts also include metrics for technical reports, working papers, conference slides, and posters hosted on Figshare.

Slideshare


Though Slideshare is best known for allowing users to view and share slide decks, some researchers also use it to share conference posters. The platform sends users detailed weekly alert emails about new metrics their slide decks and posters have received, including the number of total views, downloads, comments, favorites, tweets, and Facebook likes.

To receive notification emails, go to Slideshare.net and click the profile icon in the upper right-hand corner of the page. Then, click “Email” in the left-hand navigation bar, and check the “With the statistics of my content” box to start receiving your weekly notification emails.

Figshare and Slideshare metrics via Impactstory

You can use Impactstory to receive notifications for downloads, shares, and views for anything you’ve uploaded to Figshare, and for the downloads, comments, favorites, and views for slide decks and posters uploaded to Slideshare.

First, create an Impactstory profile and connect your profile to Figshare and Slideshare using the “Import from accounts” button at the top of your profile. (If you already have an Impactstory profile, this button will appear as a “Connect more accounts” button instead.) For both services, click the appropriate button, then provide your profile URL when prompted. Your content will then auto-import.

If any Figshare or Slideshare uploads are missing–which might be the case if your collaborators have uploaded content on your behalf–you can add them one by one by clicking the “Import stuff” icon at the upper right-hand corner of your profile, clicking the “Import individual products” link, and then providing the Figshare DOIs and Slideshare URLs. Once your profile is set up, you’ll start to receive a notification email once every 1-2 weeks.

Videos

Vimeo and Youtube both provide a solid suite of statistics for videos hosted on their sites, and you can use those metrics to track the impact of your video research outputs. To get alerts for these metrics, though, you’ll need to sign up for Impactstory alerts.

Vimeo and Youtube metrics via Impactstory

Vimeo tracks likes, comments, and plays for videos hosted on their platform; Youtube reports the same, plus dislikes and favorites. To get metrics notifications for your videos hosted on either of these sites, you’ll need to add links to your videos to your Impactstory profile.

Once you’ve signed up for an Impactstory profile, click the “Import stuff” icon at the upper right-hand corner of your profile, then click the “Import individual products” link. There, add URLs for each of your videos and click “Import”. Once they’re imported to your profile, you’ll start to receive notifications for new video metrics once every 1-2 weeks.
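If you'd rather pull video statistics directly, YouTube's Data API (v3) reports them per video. A hedged sketch (the video ID and API key are placeholders, and the sample payload stands in for a live response):

```python
import json
from urllib.parse import urlencode

def youtube_stats_url(video_id, api_key):
    # YouTube Data API v3: request the "statistics" part for one video.
    # You'd supply your own Google API key here.
    query = urlencode({"part": "statistics", "id": video_id, "key": api_key})
    return f"https://www.googleapis.com/youtube/v3/videos?{query}"

# Shape of the "statistics" object in the response (sample, not live data);
# note the API returns counts as strings.
stats = json.loads('{"viewCount": "1200", "likeCount": "34", "commentCount": "5"}')
views = int(stats["viewCount"])
```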

Are we missing anything? We’ve managed to cover the most popular platforms in this post, but we’d love to get your tips on niche data repositories, video platforms, and coding sites that keep you up to date on your impact by sending alerts. Leave them in the comments below!

Bookmark this guide. This post–and our other Ultimate Guide for articles–will be updated over time, as services change.

Open Science & Altmetrics Monthly Roundup (May 2014)

Don’t have time to stay on top of the most important Open Science and Altmetrics news? We’ve gathered the very best of the month in this post. Read on!

GitHub & co. continue working to incentivize open science software

This month, collaborative coding site GitHub updated the public on their work with Figshare, Zenodo, and Mozilla Science to create citable code for academic software. Now, you can make any GitHub repository more citable–and accessible over time–by minting a DOI for it.

Researchers at the SciForge project responded to the announcement with a list of “10 non-trivial things GitHub & friends can do for science.” In their post, they pointed out that minting DOIs for software code is just the tip of the iceberg. Other challenges include reconciling GitHub’s commercial interests with what’s best for the scientific community, maintaining metadata quality for metadata submitted to DOI registries via Figshare and Zenodo, and optimizing how DOIs are issued for software that has multiple versions.

Of course, not everyone uses GitHub to manage their research software to begin with. If you’re a GitHub beginner, check out Carly Strasser’s “GitHub: a primer for researchers” and the GitHub guide to getting started.

Originator of Open Notebook Science, Jean-Claude Bradley, Dies

Chemist and Open Science advocate Jean-Claude Bradley passed away this month. Bradley is most famous for coining the term Open Notebook Science, which he used to describe his practice of “making all your research freely available to the public, and in real time”. His lab did its work this way for years. The Open Science community has lost a giant. Jean-Claude will be greatly missed.

How many scholarly documents are on the Web?

According to research published this month in PLOS ONE, “the [lower bound] number of scholarly documents, published in English, available on the web is roughly 114 million.”

Why is this important? Well, with the large number of scholarly documents on the web, we can text- and data-mine at scale–so long as these documents are all Open Access. But as @openscience pointed out on Twitter, 3 in 4 scholarly documents on the Web aren’t Open Access–which brings us to our next news item.
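The back-of-the-envelope arithmetic, using the study's lower-bound estimate and that rough 3-in-4 figure:

```python
total_docs = 114_000_000   # lower-bound estimate from the PLOS ONE study
oa_share = 1 - 3 / 4       # roughly 3 in 4 documents are not Open Access
minable = total_docs * oa_share
print(f"about {minable / 1_000_000:.1f} million documents open to full-scale mining")
```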

Are most researchers Open Access poseurs?

A recent publisher survey of Canadian authors found that while 83% agreed that Open Access to scholarship is important, less than 10% of authors considered OA when deciding where to publish. And a recently tweeted JASIST article from 2013 shows that only around 36% of European authors are taking advantage of publishers’ permissions to post OA copies of otherwise paywalled scholarship.

Why the disconnect between beliefs and practice? It’s not clear from these sources, but we hope that the numbers continue to increase over time, so we end up in a fully Open Access future.

Other recent altmetrics news

  • PeerJ makes peer-reviews more citable: the publisher now issues DOIs for open peer-reviews of its articles, making it possible to cite peer reviews using a permanent identifier. In doing so, peer-review contributions will remain accessible over time, even as URLs change, and reviewers will now be able to more easily track citations to their reviews (thereby incentivizing open peer-review).

  • Altmetrics-themed workshop at SSP 2014 Meeting: some of the area’s brightest minds–including Euan Adie (Altmetric.com) and William Gunn (Mendeley.com)–participated yesterday in the “21st Century Research Assessment” panel at this year’s Society for Scholarly Publishing annual meeting. As you might expect, the event was highly tweeted: check out the #sspboston hashtag on Twitter to witness the debate.

  • Australian and New Zealander librarians sought for altmetrics survey: a team of researchers seeks participants for a survey on support for altmetrics at Australian and New Zealand academic libraries. Respond to the survey on SurveyMonkey before it closes on June 7, 2014.

  • Impactstory launches notification emails, Advisors program: Now, you no longer have to visit impactstory.org to find out when your research has received new citations, downloads, or tweets. Instead, we’ll send you an email alert. We’re really excited about this new feature and also about another big launch that happened this month: our Advisors program!

    Impactstory users have been asking us for months how they can help spread the word. So, in addition to launching a Spread the Word resources page, we’ve started an Advisors program, so motivated advocates can better host Impactstory workshops, help us understand their needs, and advocate for altmetrics at their institution.  To learn more–and apply!–visit our website.

Upcoming events you can’t miss

Two great events are happening in June: the Altmetrics14 workshop in Bloomington, Indiana and the Special Library Association 2014 Annual Meeting in Vancouver, British Columbia. Heather will appear on an altmetrics panel and at the closing session of SLA ‘14, and Stacy will be in attendance at Altmetrics14. We hope to see you at both events! But if you can’t make ‘em, follow along on Twitter at #sla2014 and #altmetrics14.

Stay connected

We share altmetrics and Open Science news as-it-happens on our Twitter, Google+, Facebook, or LinkedIn pages. And if you don’t want to miss next month’s news roundup, remember that you can sign up to get these posts and other Impactstory news delivered straight to your inbox.

The ultimate guide to staying up-to-date on your articles’ impact

You published a paper–congrats!  Has anyone read it?  Cited it?  Talked about it on Twitter?  How can you find out–as it happens?

Automated alerts!  Email updates that matter come right to you.

We’ve compiled a two-part primer on the services that deliver essential research impact metrics straight to your inbox, so you can stay up to date without having to do a lot of work.

In this post, we’ll share tips for how to automagically track citations, altmetrics and downloads for your publications; in our next post, we’ll share strategies for tracking similar metrics for your data, code, slides, and social media outreach.

Citations

Let’s start with citations: the “coin of the realm” to track scholarly impact. You can get citation alerts in two main ways: from Google Scholar or from traditional citation indices.

Google Scholar Citations alerts

Google Scholar citations track any citations to your work that occur on the scholarly web. These citations can appear in any type of scholarly document (white papers, slide decks, and of course journal articles are all fair game) and in documents of any language. Naturally, this means that your citation count on Google Scholar may be larger than on other citation services.

To get Google Scholar alerts, first sign up for a Google Scholar Citations account and add all the documents you want to track citations for. Then, visit your profile page and click the blue “Follow” button at the top of your profile. You’ll see a drop-down like this:

Screenshot of a Google Scholar profile, showing the blue “Follow” button

Enter your preferred email address in the box that appears, then click “Create alert.” You’ll now get an alert anytime you’ve received a citation.

Citation alerts via Scopus & Web of Knowledge

Traditional citation indices like Scopus and Web of Knowledge are another good way to get citation alerts delivered to your inbox. These services are more selective in scope, so you’ll be notified only when your work is cited by vetted, peer-reviewed publications. However, they only track citations for select journal articles and book chapters–a far cry from the diverse citations that are available from Google Scholar. Another drawback: you have to have subscription access to set alerts.

Web of Knowledge

Web of Knowledge offers article-level citation alerts. To create an alert, you first have to register with Web of Knowledge by clicking the “Sign In” button at the top right of the screen, then selecting “Register”.


Then, set your preferred database to the Web of Science Core Collection (alerts cannot be set up across all databases at once). To do that, click the orange arrow next to “All Databases” to the right of “Search” in the top-left corner. You’ll get a drop-down list of databases, from which you should select “Web of Science Core Collection.”

Now you’re ready to create an alert. On the Basic Search screen, search for your article by its title. Click on the appropriate title to get to the article page. In the upper right hand corner of the record, you’ll find the Citation Network box. Click “Create citation alert.” Let Web of Knowledge know your preferred email address, then save your alert.

Scopus

In Scopus, you can set up alerts for both articles and authors. To create an alert for an article, search for it, then click on the title in your search results. Once you’re on the Article Abstract screen, you will see a list of papers that cite your article on the right-hand side. To set your alert, click “Set alert” under “Inform me when this document is cited in Scopus.”

To set an author-level alert, click the Author Search tab on the Scopus homepage and run a search for your name. If multiple results are returned, check the author affiliation and subjects listed to find your correct author profile. Next, click on your author profile link. On your author details page, follow the “Get citation alerts” link, then name your saved alert, set an email address, and select your preferred frequency of alerts. Once you’re finished, save your alert.

With alerts set for all three of these services, you’ll now be notified when your work is cited in virtually any publication in the world! But citations only capture a very specific form of scholarly impact. How do we learn about other uses of your articles?

Tracking article pageviews & downloads

How many people are reading your work? While you can’t be certain that article pageviews and full-text downloads mean people are reading your articles,  many scientists still find these measures to be a good proxy. A number of services can send you this information via email notifications for content hosted on their sites. Impactstory can send you pageview and download information for some content hosted elsewhere.

Publisher notifications

Publishers like PeerJ and Frontiers send notification emails as a service to their authors.

If you’re a PeerJ author, you should receive notification emails by default once your article is published. But if you want to check if your notifications are enabled, sign into PeerJ.com, and click your name in the upper right hand corner. Select “Settings.” Choose “Notification Settings” on the left nav bar, and then select the “Summary” tab. You can then choose to receive daily or weekly summary emails for articles you’re following.

In Frontiers journals, it works like this: once logged in, click the arrow next to your name on the upper left-hand side and select “Settings.” On the left-hand nav bar, choose “Messages,” and under the “Other emails” section, check the box next to “Frontiers monthly impact digest.”

Both publishers aggregate activity for all of the publications you’ve published with them, so no need to worry about multiple emails crowding your inbox at once.

Not a PeerJ or Frontiers author? Contact your publisher to find out if they offer notifications for metrics related to articles you’ve published. If they do, let us know by leaving a comment below, and we’ll update this guide!

ResearchGate & Academia.edu


Some places where you upload free-to-read versions of your papers, like ResearchGate and Academia.edu, will report how many people have viewed your paper on their site.

You can turn on email notifications for pageviews, downloads, comments, bookmarks, and citations by other papers on ResearchGate, and for pageviews, downloads, and bookmarks on Academia.edu. On both sites, click the triangle in the upper right-hand corner of your screen and select “Settings,” then click the “Notifications” tab in the sidebar menu and check off the types of emails you want to receive. On Academia.edu, the relevant options are under “Analytics” and “Papers”; on ResearchGate, they’re under “Your publications” and “Scheduled updates”.

PLOS article metrics via Impactstory

Impactstory now offers alerts, so you’re notified any time your articles get new metrics, including pageviews and downloads. However, we currently only offer these metrics for articles published in PLOS journals. (If you’d like to see us add similar notifications for other publishers, submit an idea to our Feedback site!) We describe how to get Impactstory notifications for the articles that matter to you in the Social Media section below.

Post-publication peer review

Some articles garner comments as a form of post-publication peer review. PeerJ authors are notified any time their articles get a comment, and any work that’s uploaded to ResearchGate can be commented upon, too. Reviews can also be tracked via Altmetric.com alerts.

PeerJ

To make sure you’re notified when you receive new PeerJ comments, log in to PeerJ and go to “Settings” > “Notification Settings”, then click on the “Email” tab. There, check the box next to “Someone posts feedback on an article I wrote.”

ResearchGate

To set your ResearchGate notifications, login to the site and navigate to “Settings” > “Notifications.” Check the boxes next to “One of my publications is rated, bookmarked or commented on” and “Someone reviews my publication”.

Altmetric.com

Post-publication peer reviews from Publons and PubPeer are included in Altmetric.com notification emails, and will be included in Impactstory emails in the near future. Instructions for signing up for Altmetric and Impactstory notifications can be found below.

PubChase

Article recommendation platform PubChase can also be used to set up notifications for PubPeer comments and reviews that your articles receive. To set it up, first add your articles to your PubChase library (either by searching and adding papers one-by-one, or by syncing PubChase with your Mendeley account). Then, hover over the Account icon in the upper-right hand corner, and select “My Account.” Click “Email Settings” on the left-hand navigation bar, and then check the box next to “PubPeer comments” to get your alerts.

Social media metrics

What are other researchers saying about your articles around the water cooler? It used to be that we couldn’t track these informal conversations, but now we’re able to listen in using social media sites like Twitter and on blogs. Here’s how.

Social media metrics via Altmetric.com

Altmetric.com allows you to track altmetrics and receive notifications for any article that you have published, no matter the publisher.


First, install the Altmetric.com browser bookmarklet (visit this page and drag the “Altmetric It!” button into your browser menu bar). Then, find your article on the publisher’s website and click the “Altmetric it!” button. The altmetrics for your article will appear in a pop-up box in the upper right-hand corner of your browser window.

Next, follow the “Click for more details” link in the Altmetric pop-up. You’ll be taken to a drill-down view of the metrics. At the bottom left-hand corner of the page, you can sign up to receive notifications whenever someone mentions your article online.

The only drawback of these notification emails is that you have to sign up to track each of your articles individually, which can cause inbox mayhem if you are tracking many publications.
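If the per-article emails get unwieldy, the same numbers can be pulled in bulk: Altmetric offers a free public API keyed by DOI. A minimal sketch (the DOI is a placeholder and the payload is a sample, not live data; field names follow the public API's response format):

```python
import json

def altmetric_url(doi):
    # Altmetric's free public API endpoint for looking up one DOI
    return f"https://api.altmetric.com/v1/doi/{doi}"

# Illustrative slice of a response:
payload = json.loads('{"score": 12.5, "cited_by_tweeters_count": 18, "cited_by_feeds_count": 2}')
tweets = payload["cited_by_tweeters_count"]
```

Looping a script like this over a list of your DOIs turns many per-article alerts into one report you control.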

Social media metrics via Impactstory


Here at Impactstory, we recently launched similar notification emails. Our emails differ in that they alert you to new social media metrics, bookmarks, and citations for all of your articles, aggregated into a single report.

To get started, create an Impactstory profile and connect your profile to ORCID, Google Scholar, and other third-party services. This will allow you to auto-import your articles. If a few of your articles are missing, you can add them one by one by clicking the “Import stuff” icon, clicking the “Import individual products” link on the next page, and then providing links and DOIs. Once your profile is set up, you’ll start to receive your notification emails once every 1-2 weeks.

When you get your first email, take a look at your “cards”. Each card highlights something unique about your new metrics for that week or month: for example, that you’re in a top percentile relative to other papers published that year, or that your PLOS paper has topped 1,000 views or gotten new Mendeley readers. You’ll get a card for each type of new metric one of your articles receives.

Note that Impactstory notification emails also contain alerts for metrics that your other types of outputs–including data, code and slide decks–receive, but we’ll cover that in more detail in our next post.

Now you’ve got more time for the things that matter

No more wasting your days scouring 10+ websites for evidence of your articles’ impact; it’s now delivered to your inbox, as new impacts accumulate.

Do you have more types of research outputs, beyond journal articles? In our next post, we’ll tell you how to set up similar notifications to track the impact of your data, software, and more.

Updates:
12/17/2014: Updates to describe the revamped Impactstory interface and new notification options for ResearchGate and Academia.edu
5/27/2014: Added information about PubChase notification emails.

Do you have what it takes to be an Impactstory Advisor?

Help us spread the word! (Photo licensed CC-BY-SA by Vacant Fever)


You’ve been asking for an opportunity to help spread the word about Impactstory. Here it is.

We’re recruiting a select group of researchers and librarians to become Impactstory Advisors!

Our advisors will:

  • Invite friends and colleagues to try out Impactstory

  • Give us feedback on features and report bugs

  • Host brown bag lunches and presentations on Impactstory at their school or library

  • Spread the word locally by hanging up our (soon to be released) cool new posters

  • Connect Impactstory to the rest of your online life–link to your profile from your Twitter bio, Facebook page, lab website, and anywhere else you can!

In return, we’ll foot the pizza bill for Impactstory workshops, give our Advisors access to Impactstory Premium (details coming soon!), send awesome swag, and share hot off the press news on planned features and other company developments.

The best benefit of all? Our community of like-minded, cutting edge Advisors will get the satisfaction of knowing they’re helping to change research evaluation for the better.

Think you have what it takes? Apply to be an Impactstory Advisor today!

Open Science & Altmetrics Monthly Roundup (April 2014)

Don’t have time to stay on top of the most important Open Science and Altmetrics news? We’ve gathered the very best of the month in this post. Read on!

Funding agencies denying payments to scientists in violation of Open Access mandates

Want to actually get paid from those grants you won? If you haven’t made publications about your grant-funded research Open Access, it’s possible you could be in violation of funders’ public access mandates–and may lose funding because of it.

Richard Van Noorden of Nature News reports,

The London-based Wellcome Trust says that it has withheld grant payments on 63 occasions in the past year because papers resulting from the funding were not open access. And the NIH…says that it has delayed some continuing grant awards since July 2013 because of non-compliance with open-access policies, although the agency does not know the exact numbers.

Post-enforcement, compliance rates increased 14% at the Wellcome Trust and 7% at the NIH. However, both funders are still a ways from full compliance with their mandates.

And that’s not the only shakeup happening in the UK: the higher ed funding bodies warned researchers that any article or conference paper accepted after April 1, 2016 that doesn’t comply with their Open Access policy can’t be used for the UK Research Excellence Framework, by which universities’ worthiness to receive funding is determined.

That means institutions now have a big incentive to make sure their researchers are following the rules–if their researchers are found out of compliance, the institutions’ funding will be in jeopardy.

Post-publication peer review getting a lot of comments

Post-publication peer review via social media was the topic of Dr. Zen Faulkes’ “The Vacuum Shouts Back” editorial, published in Neuron earlier this month. In it, he points out:

Postpublication peer review can’t do the entire job of filtering the scientific literature right now; it’s too far from being a standard practice….[it’s] an extraordinarily valuable addition to, not a substitute for, the familiar peer review process that journals use before publication. My model is one of continuous evaluation: “filter, publish, and keep filtering.”

So what does that filtering look like? Comments on journal and funder websites, publisher-hosted social networks, and post-pub peer review websites, to start with. But Faulkes argues that “none of these efforts to formalize and centralize postpublication peer review have come close to the effectiveness of social media.” To learn why, check out his article on Neuron’s website.

New evidence supports Faulkes’ claim that post-publication peer review via social media can be very effective. A study by Paul S. Brookes, published this month in PeerJ, found that post-publication peer review on blogs makes corrections to the literature an astounding eight times more likely than when problems are reported to journal editors in the traditional (private) manner.

For more on post-publication peer review, check out this classic Frontiers in Computational Neuroscience special issue, Tim Gowers’s influential blog post, “How might we get to a new model of mathematical publishing?,” or Faculty of 1000 Prime, the highly respected post-pub peer review platform.

Recent altmetrics-related studies of interest

  • Scholarly blog mentions relate to later citations: A recent study published in JASIST (green OA version here) found that mentions of articles on scholarly blogs correlate to later citations.

  • What disciplines have the highest presence of altmetrics? Hint: it’s not the ones you think. Turns out, a higher percentage of humanities and social science articles have altmetrics than articles in the biomedical and life sciences. Researchers also found that only 7% of all papers found in Web of Science had Altmetric.com data.

  • Video abstracts lead to more readers: For articles in the New Journal of Physics, video abstract views correlate to increased article usage counts, according to a study published this month in the Journal of Librarianship and Scholarly Communication.

New data sources available for Impactstory & Altmetric.com

New data sources include post-publication peer review sites Publons and PubPeer, and microblogging site Sina Weibo (the “Chinese Twitter”). Since we get data from Altmetric, that means Impactstory will be reporting this data soon, too!

And another highly-demanded data source will be opening up in the near future: Zotero. The Sloan Foundation has backed research and development for the open source reference management software that will eventually help Zotero build “a preliminary public API that returns anonymous readership counts when fed universal identifiers (e.g. ISBN, DOI).” So, some day soon, we’ll be able to report Zotero readership information alongside Mendeley stats in your profile–a feature that many of you have been asking us about for a long time.

Altmetric.com offering new badges

Altmetric.com founder Euan Adie announced that for those who want to de-emphasize numeric scores on content, the famous “donut” badges will now be available sans Altmetric score–a move heralded by many in the altmetrics research community as being a good move away from “one score to rule them all.”

Must-read blog posts about ORCID and megajournals

We’ve been on a tear publishing about innovations in Open Science and altmetrics on the Impactstory blog. Here are two of our most popular posts for the month:

Stay connected

Do you blog on altmetrics or Open Science and want to share your posts with us? Let us know on our Twitter, Google+, Facebook, or LinkedIn pages. We might just feature your work in next month’s roundup!

And if you don’t want to miss next month’s news, remember that you can sign up to get these posts and other Impactstory news delivered straight to your inbox.

“Share your impact story” contest winner announced!

Last week, we asked you to share how Impactstory has helped your career. Today, we’re announcing the contest winner: Dr. Emilio Bruna!

Dr. Emilio Bruna, our contest winnerEmilio is a Professor with the Department of Wildlife Ecology & Conservation at the University of Florida and an Open Science advocate. Here’s his impact story:

I included Impactstory data in my portfolios for 1) promotion to full professor and  2) selection to UF’s Academy of Distinguished Teaching Scholars,  a campus-wide faculty award.  Both were successful.  

But perhaps more importantly, I included Impactstory in my workshop on scientific publishing for graduate students, where in one of the sessions all the participants set up ORCID IDs, Researcher IDs, and Impactstory Profiles – check it out. Students get it.

Emilio’s story echoes many others we’ve heard since founding Impactstory: you’re using our service to uncover all the ways in which your research makes an impact, and you’re using that data when going up for tenure & promotion, applying for grants and awards, and teaching the next generation of scientists what it means to be an influential scholar.

For having the best story, Emilio wins an Impactstory t-shirt of his choice. Congrats, Emilio!

And thanks to all of our contest participants!

How to become an academic networking pro on LinkedIn

You now have a solid LinkedIn profile, but you don’t quite know what to do with it.

After all, it’s difficult for scientists to self-promote. To many, it just feels unnatural. Plus, your contacts are out of date, and LinkedIn features like Endorsements don’t seem quite right for you as an academic.

Given that, how exactly are you supposed to use LinkedIn appropriately to connect with other scientists and find job opportunities?

You’re in luck. On top of the tips we compiled for our last post, we’ve found the best strategies for using LinkedIn to network in academia.

In this post, we’ll tell you the keys to networking for academics on LinkedIn: how to find and sustain a professional relationship with colleagues and experts in your field, get others to Endorse and Recommend you in the right ways, and connect LinkedIn to the rest of your professional life.

1. Get connected to your existing web of co-workers and advisors


It’s surprisingly easy to find people you already know and add them to your network on LinkedIn.

Use the Add Connections tab in the top right corner of your profile to connect LinkedIn to your email account.

LinkedIn then suggests Connections based on your contacts. A rule for LinkedIn, as opposed to Twitter and Facebook, is that you should only select Connections you actually know and feel comfortable keeping in touch with (former collaborators, co-workers, and friends are good choices).

When Connecting, it’s a nice touch to send a message saying hello. Networking is all about building meaningful relationships, not how many people you have in your virtual Rolodex.

2. Request introductions to new contacts

If you want a good way to meet potential collaborators or get an “in” for a job, Connecting with strangers can be useful.

But how do you get around the awkwardness of asking strangers to Connect? The answer: ask a current contact for an introduction.

Here’s an example of how that would work: I’m not currently Connected to genomics researcher Mike Eisen on LinkedIn, but let’s say I want to collaborate with him to do some research on a great idea I have.


The first thing I need to do to connect with him is find a contact that we have in common. So, I visit Mike’s profile. On the left-hand side is a “How You’re Connected” graphic. I can scroll through the list of contacts we have in common to find a suitable middleman–Mendeley’s William Gunn.

Next, I would click on the “Ask William about Mike” link. In the dialog box that appears, I’d write my request for an introduction and send it to William. The request should follow three key rules:

Be specific

William might take 10 minutes out of his day to write an introduction for me, so I shouldn’t waste his time. That means telling him exactly why I want to meet Mike: what Mike does that interests me (he’s a genomics researcher), and what I’m looking to get out of an introduction (an opportunity to tell him about my great research idea: widgets for genomics researchers).

Include a “pitch” as to why an introduction would be valuable

Likewise, I should make it clear what Mike would get out of meeting me. What do I bring to the table? In this case, it’d be the chance to learn about a well-received new widget, and a future NSF grant opportunity.

Show appreciation, and also provide William with an “easy out”

William’s time is valuable, so I should make it clear that I’m thankful that he’s considering writing an Introduction. A good way to do that in addition to saying thanks is to give him a way to beg off without feeling too guilty.

Two additional rules for special scenarios are: 1) If we didn’t know each other well, I’d want to remind William how we met, and 2) If William does introduce Mike and me, I should follow up with an update and thanks.

Using these rules, here’s how my request for an Introduction reads:

Hi William,

I’m writing to ask if you’d be kind enough to introduce me to Mike (if, of course, you feel you know him well enough to do so). As you know, I’ve been toying with a new idea for widgets for genetics researchers. The prototype has been very well received by our initial user group; I think it has the potential to be a success, with the right stewardship.

It’s for that reason I want to connect with Mike. Being a well-known genomicist, Mike might be interested in the widget and, eventually, collaborating with me to go after a round of NSF funding. I hear there’s an upcoming “Dear Colleagues” letter that may be specifically related to genetics research widget design.

Thanks very much for taking the time to read this and consider my request. Feel free to decline if you don’t have the bandwidth to make the Introduction right now; I completely understand.

Best,
Stacy

One final note: keep your requests for introductions to “2nd degree connections”–that is, friends of friends–because your chances of getting a meaningful introduction to a stranger through a friend of a friend of a friend depend on too many variables to be successful.

3. “Cold call” people you want to get to know

This strategy is one of the riskiest, but it can also be rewarding if it helps you move beyond your existing network and break into new areas–especially important for those seeking jobs.

You can use LinkedIn messaging to send a short note introducing yourself to individuals who have a job similar to the one you’re aiming for and asking their advice, or to get in touch with recruiters (if you’re looking for a job in industry). You might also consider writing messages to people you don’t know who have viewed your profile, if you think it’d have a payoff (i.e. a connection or, better yet, a lead on a job).

4. Boost your discoverability with the help of your network


Let’s be clear: Endorsements can be totally useless when not done right. In the past, I’ve been endorsed for “Library”. And I’ve seen Endorsements on others’ profiles for even more mundane things.

But Endorsements can be useful for academics, if done with care. The more people Endorse you for a skill or knowledge area (like “Grant writing”), the more you are associated with that skill by LinkedIn and search engines–thereby improving your ranking in search results and surfacing you to potential collaborators or future employers.

Here’s how to keep from getting Endorsed for something too vague to be useful. You can control what others are able to Endorse you for by editing the Skills & Endorsements section of your profile. Delete any skills that don’t apply or aren’t relevant. You can also reorder how those skills appear on your profile–helpful for breaking out of a feedback loop in which the skills listed first keep attracting the most Endorsements.

If you choose to Endorse others, be sure to only do so for people you know, and for skills you actually think they possess. Otherwise, it comes off as spammy.

5. Land at least one Recommendation


Recommendations can help you network passively using your profile. Having at least one Recommendation on your profile makes it clear what type of an employee or collaborator you are, which builds trust in your personal brand.

Asking others to write Recommendations for you doesn’t have to be awkward. Offer to write a Recommendation for them, and let them know you’d welcome a Recommendation in return. Just be sure to make it clear that reciprocation is by no means required.

When writing a Recommendation, make it clear how you know the person. Did you serve as co-chairs for a professional society? Did she supervise you at your last job? Give specifics about what makes the person a solid co-worker, and let the reader know what types of jobs you think he or she could excel at.

6. Let others know you’re here and ready to dance

Now it’s time to connect your LinkedIn presence to the rest of your professional life.

Make new LinkedIn Connections in your offline life by advertising that you are on the network. One way to do that is to create a memorable LinkedIn URL and include that URL on your business card. You can also put your custom URL or a LinkedIn badge prominently on your professional website or blog.

LinkedIn should be just one piece of your online identity. Academia.edu, Mendeley, and Impactstory all have functionalities that LinkedIn lacks; use those sites to host your publications, find new collaborators, and track impact metrics for your work.

7. Boost the signals and cut the noise from LinkedIn Notifications

LinkedIn’s Notification emails can be both a blessing and a curse.

Notifications about your Connections–which include information about their new jobs, promotions, and requests for Recommendations–can be a nice way to stay abreast of what your colleagues are up to, and a reminder to check in with former coworkers to say hello.

However, all the Notifications can sometimes be too much. (Do you really need to know about your LinkedIn Connections’ work anniversaries?) You can reduce the “noise” if you are sure to only connect with people you know, and review your Communications settings to make sure you’re getting the types of email you’d prefer to see.

You’ll also want to pay close attention to what sort of Notifications you’re sending out. Job seekers especially should make sure their “Activity broadcasts” are set up correctly (go to Privacy & Settings > “Turn on/off your activity broadcasts”), so current employers don’t get emails letting them know you’re on the job hunt.

Are you ready to rumble?

By now, you’ve reconnected with coworkers and friends to build a meaningful network. And you’ve learned how to hack some of LinkedIn’s more annoying features–Endorsements and Notifications chief among them–to build your brand as a scientist, making new contacts and uncovering professional opportunities along the way.

Do you have other tips for networking using LinkedIn? Want to share a story about a time you triumphed–or failed–at making new Connections or getting a Recommendation on the site? Leave it in the comments section below!