Measuring Usage: A Comprehensive Analysis of a Social Work Journal Collection

Abstract
This study examines what can be learned about a library’s electronic social work journal collection from usage statistics, survey data, faculty publications, and an examination of open access (OA) availability. A collections analysis was completed using data from two sources: a custom report by 1Science and results of a faculty survey on top journals for teaching. After creating a list of journals important to social work, top journals were identified by article downloads, faculty-authored publications, and references to faculty-authored papers. A publications analysis using faculty websites and author searches in Web of Science was also completed, to provide local, contextual data. SHERPA/RoMEO was used to determine the journals’ OA level and archiving policy. Library coverage for the journals was also included in the analysis. Results show that the McGill University Library has access to almost all of the journals identified as important to social work. Nearly one-third of publications authored by the McGill University School of Social Work since 2006 are OA, and more than half of the faculty in the school have at least one article published in an OA journal. While this is a good start for librarians who want to help faculty and students understand OA publishing and access, there is room for outreach in this area. While these results will aid librarians supporting faculty, students, and practitioners in the field of social work, a secondary aim of the study is to demonstrate a method that can be used by librarians undertaking similar analyses in other fields.


Introduction
Academic librarians who support schools of social work can play a key role in helping faculty and students understand which journals have the most impact, i.e., are the most downloaded, cited, and published in, as well as which titles are open access (OA) or offer options to make the content freely available, such as self-archiving. Gaining deeper insight into social work researchers' journal usage and publication behaviors has the potential to help liaison librarians support faculty and students in a variety of ways. One is to educate faculty and students about journal metrics, which have implications for research impact, tenure and promotion, and information retrieval and use. Another is to inform thoughtful collection development activities, including purchasing, renewing, and de-selection of journals. Additionally, it can be argued that OA publishing is critical in social work research, making it a behavior worth tracking and analyzing. Evidence-based practice (EBP) is widely considered a way of bridging the research-practice gap among social work practitioners. In the field, however, finding online EBP repositories and "gold standard" evidence can be challenging, as these resources differ significantly from the subscription databases found in academic libraries (Bingham, Wirjapranata, & Chinnery, 2016). This means that publishing high-quality articles and reviews in OA journals or archiving in disciplinary or subject repositories can play an important role in improving EBP in social work agency settings. While these are familiar topics for many social work liaison librarians, they are not always well understood by students and faculty. A collection analysis is one way to get started on understanding the journals available in a given field and the type of access they offer.
To better support the McGill University School of Social Work, librarians at the McGill University Library asked: "What can be learned about McGill's Social Work researchers and the Library's collection from combining usage statistics, survey data, and faculty publication and citation data?" Specifically, to answer this question, this study examined the following: Does the Library have access to the journals that support the School of Social Work, and what are the gaps in the electronic journal collection? Does the faculty publish in open access journals? Is there room for enhanced outreach with this faculty to inform and educate regarding open access publishing?
This study aims to add to the body of literature meant to help social work liaison librarians enhance collections activities and outreach initiatives with students and faculty. Furthermore, the methods used in this study can be applied to other subject areas, so a secondary aim is to help librarians in other fields who may want to undertake a similar analysis of the electronic journals collections in their libraries.

Collection analysis
Local citation analysis has long been a common practice in academia and in libraries. It is considered a "valid measure of journal use" (Duy & Vaughan, 2006, p. 516) and a standard (McDonald, 2007, p. 39). Additionally, Ke and Bronicki found it to be a useful measurement because of its objectivity (2015, p. 166). Literature reviews covering the history of citation analysis within librarianship are easily found, including Miller's (2011) discussion covering the practice back to 1927. These differ from articles discussing the history of methods and merits of creating "core" lists of journals, in which journals are often ranked according to various criteria (Nisonger, 2007). Citation analysis can instead be thought of as "taking stock" of a collection's use rather than a ranking of the journals within it. Rathemacher's (2010) literature review covers more recent history and focuses on practical aspects of collection analysis, such as its use in determining which journal subscriptions to retain.
Both multidisciplinary and single-discipline citation analyses have merit, and libraries can learn from the many variations. For example, Chew, Schoenborn, Stemper, and Lilyard's (2016) study of 12 disciplines and Wical and Vandenbark's (2015) analysis of nursing, chemistry, biology, and mathematics demonstrate the complexity of projects that include multiple subject areas. Some prefer Spearman's rho rank correlation when a ranked list of journals is required (Black, 2013). The literature also contains numerous citation analysis studies on single subjects, including Miller's (2011) and Barnett-Ellis and Tang's (2016) analyses of biology theses, Dewland's (2011) citation analysis of a business school faculty, and Kimball, Stephens, Hubbard, and Pickett's (2013) study of the Department of Atmospheric Sciences faculty. These types of projects demonstrate that simplicity can be just as useful and provide practical advice that can be applied in other disciplines.
As electronic resources have become a constant in libraries, the tools and methods used to complete citation analysis have changed over time and vary from study to study. Some librarians manually pull citation data from indexing databases like Web of Science or Scopus and analyze it themselves (De Groote, Blecic, & Martin, 2013; Kimball et al., 2013; Wical & Vandenbark, 2015). Others rely on purchasing data from indexing companies such as Thomson Reuters (now Clarivate Analytics) (Chew et al., 2016). Researchers often combine different measurements with their citation analyses to add depth and to create richer pictures of how their collections are being used. For example, De Groote et al. (2013) combined local citation data with COUNTER usage data and data from their link resolver. The California Digital Library's (CDL) Weighted Value Algorithm uses a series of mathematical calculations to measure value in three categories: utility, quality, and cost-effectiveness (Jurczyk & Jacobs, 2014; Wilson & Li, 2012). Other libraries have used this method for their own analyses (Chew et al., 2016). Open access has become of greater interest to libraries in recent years, and Mercer (2011) combined citation analysis with knowledge of OA journals to determine the types of journals librarians were publishing in.

Open access
One of the key values guiding the ethical practice of social workers is the pursuit of social justice, including equitable access to all resources (Canadian Association of Social Workers, 2005). When providing social work services, which include a wide range of health, mental health, and social service options, social workers are "obligated to ensure that the interventions they employ are informed by the available research evidence" (Roberts & Yeager, 2006, p. 3), but in some settings, social workers have limited access to reliable empirical research. The importance placed on evidence-based practice in the field of social work is one of the many reasons why advocacy for OA publishing and conversations about changing traditional scholarly communication models are particularly relevant for social work faculty, as well as for students who will later become practitioners. Librarians working in the field of social work can aid their faculty and students by helping them discover and understand the OA landscape and identify resources in their field.
Understanding academics' OA publishing behaviors can be extremely informative for librarians who want to gain insight into whether researchers publish in OA journals (gold OA) or in venues with OA policies that give the author the right to self-archive in an open access repository or on a personal or commercial website (green OA). Previous studies have investigated whether academics in certain disciplinary fields or from certain institutions practice OA publishing, why or why not, and how they share their research using OA channels (Hughes, 2008; Mercer, 2011; Nichols & Twidale, 2017; Zhu, 2017). These studies differ from the prolific literature on faculty and researcher attitudes toward OA and beliefs about the citation impact of publishing in OA journals in that they focus instead on OA publishing behaviors. The methods used in these studies range from citation and publication analysis to survey research to publication venue analysis. Nichols and Twidale (2017) summarized and critiqued existing measures of openness used in studies published between 2005 and 2014. The studies aimed to quantify the open availability of research articles within certain disciplines or by certain groups of researchers. The methods employed by these studies consisted primarily of publication analysis: choosing a domain or selection of publications and developing a search method to determine the OA availability of individual articles. Mercer's (2011) analysis assessed the self-archiving behaviors of academic librarians in the United States. Using the database Library and Information Science Abstracts (LISA), Mercer identified articles published in 2008 where the primary author was an academic librarian and then determined if those articles were freely available online. This study helped answer the question of whether or not "their behaviors reflect a commitment to OA because of their increased exposure to scholarly communication issues" (pp. 443-444).
Laakso and Polonioli (2018) also conducted a publication analysis, looking at the depositing behavior of ethics researchers as well as any instances of copyright infringement. They asked: Are ethicists prone to copyright infringement, and how much do they share their research for free online? After identifying a group of ethicists, their research outputs from 2010 to 2015 were recorded by searching websites, profiles, and Google Scholar. Searching each article title, the authors found that just over half were freely available online and that ethicists were prone to copyright infringement. Zhu (2017) surveyed over 1,800 academics based in the UK, from a range of disciplines, to determine the extent to which they practice OA publishing and what factors might influence their publishing behavior. The study asked whether differences in university, discipline, age, gender, seniority, and OA attitudes and awareness influence use of OA publishing. To gather data about their OA publishing behaviors, the respondents were asked if they had ever published in an OA journal or self-archived their publications in an OA repository.
Finally, Hughes (2008) analyzed the publication venues chosen by those who signed the PLoS "open letter to scientific publishers," along with those venues' OA policies. This method differs from the studies discussed at the beginning of this section in that the chosen journals of the signers are analyzed for openness, not the individual articles each signer has published. The method discussed in the Hughes (2008) article aligns closely with one of the methods used in this study to assess the OA publishing practices of McGill's social work faculty.
Understanding research and results completed on OA publishing behaviors is a critical first step for librarians who want to gain insight into this area. Learning about measures of "openness" (Nichols & Twidale, 2017) and the behaviors of academic librarians and professors (Laakso & Polonioli, 2018; Mercer, 2011; Zhu, 2017) helps librarians become familiar with the OA landscape. Knowledge of a faculty's OA publishing practices can inform a librarian's outreach and education initiatives and help to determine the levels of communication, advocacy, and promotion required to increase faculty publication in OA journals or self-archiving.

Collection analysis
To collect information on usage, data from two main sources were used: a custom collections analysis report created by 1Science and results of a faculty survey on their top journal choices for teaching. Both tools also use COUNTER JR1 usage data to capture article downloads. The JR1 data are one standard for collecting and reporting on usage of online journal articles (https://www.projectcounter.org/code-of-practice-sections/usage-reports/). Each tool offers different types of data, and only portions of data were used for this study. Additionally, a brief examination of faculty publications taken from faculty websites and author searches in Web of Science was conducted. SHERPA/RoMEO was used to determine the OA level and archiving policies.
1Science report
The 1Science report uses the Library's ejournal subscription information and COUNTER statistics combined with data pulled from Web of Science covering 2006 to 2015. 1Science has its own algorithm for measuring "usage" that combines many factors. This study uses only three traditional measurements from the report: papers written in peer-reviewed journals indexed in Web of Science with at least one McGill University author; references made by McGill University authors to articles; and downloads based on COUNTER JR1 data from both aggregators and publishers.

Faculty survey
Researchers at the Canadian Research Knowledge Network (CRKN), a national library consortium, administered an online survey to teaching faculty across Canada that asked them to identify their "top" journals for teaching. The survey was dependent on the ethics boards of each institution and was therefore sent out at different times in 2017. At McGill, it was sent to faculty by email in January 2017, and it ran for four weeks. Participants were asked to write in their top journal choices; they did not select from a list or have a guide to help choose or identify journals. When a journal was cited by a participant, it was referred to in the survey results as a "mention." For example, when two faculty members wrote down the same journal in their responses, that journal had two "mentions." Participants were not asked to identify their department or faculty; instead they were asked to select from four high-level categories: Arts and Humanities, Biomedical, Natural Sciences and Engineering, and Social Sciences and Humanities. Granular data were not tied back to these categories, so while the representation of each domain is displayed, it is not possible to see which journals the social work faculty "mentioned" in their results. Results pertaining to McGill University faculty were sent to the Library, and the report contained additional usage data, including references made by McGill University researchers in peer-reviewed articles indexed in Web of Science in 2015 and downloads for 2015 based on COUNTER JR1 data from both aggregators and publishers.

Faculty publication lists
The 1Science report and the report provided with the faculty survey included publications by McGill University authors and were not broken down by faculty or department, so the authors examined publication lists and curricula vitae (CVs) on the McGill University School of Social Work website. As faculty members do not always keep these pages up to date, each faculty member was also searched as an author in Web of Science, and additional publications were noted. Faculty members who have not published articles since 2006 were excluded. This year was chosen as a reference year because the 1Science faculty publication data covered 2006-2015. Each journal the faculty published in was noted, and those determined to be within the scope of social work, or of the public health fields that intersect with social work, were included in the analysis. As the focus of the study was the McGill University School of Social Work and the collection that supports it, it was important to have a good understanding of where the faculty are actually publishing.

Creating a list of social work journals
Before analyzing data, a list of journals identified as being within the field of social work was required. The 1Science report included a category for social work, so a list of 29 journals was immediately available. The faculty survey data did not include categories on a granular level, so data for the 29 social work journals identified in the 1Science report were pulled from the faculty survey results. Additionally, in both data sets, journals with the phrase "social work" in the title were added to the list. This added 59 titles, bringing the list to 88 journals. The faculty's publications were then analyzed, and social work journals in which they published were added to the list. Finally, the liaison librarian for the School of Social Work compiled a list of journals important to the faculty over time. Typically, these were journals in which the faculty had published or with which they were affiliated. Many of these journals were already included in the list through the first two criteria; the remaining 44 were added, and data for these titles were pulled from the 1Science report and the faculty survey data. Most of the additions were journals whose subject area was broader than social work, such as the fields of public health, psychology, or gerontology. These are journals that remain important to the faculty and are journals in which the faculty publish and recommend to their students. Librarians doing collections work or liaison work might want to know the full spectrum of publications, and so these broader titles were included. While former titles were included in the 1Science report and the faculty survey data, they were removed from this list, leaving only current titles. Likewise, journals that have ceased publication were removed, resulting in a list of 132 current journals for the field of social work. Of the 132 titles, 102 are indexed in Web of Science.
The remaining 30 journals that were not in Web of Science consisted mostly of open access titles, niche journals, or journals published by smaller societies. The full list of journals, with an indication of which data set each title is included in, can be found in the Appendix.
The list of journals was compiled in Microsoft Excel. In addition to the results captured from the 1Science report and the faculty survey (papers, references, downloads, "mentions"), the Library's coverage of each journal was included. While the Library has at least partial access to all of the titles in the 1Science report (as the report is based on subscription), the Library may not have access to all of the journals mentioned in the faculty survey. This may also highlight journals that the faculty believe are important or that are downloaded frequently but to which only partial access is provided by the Library.
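Although the study's list was compiled in Microsoft Excel, librarians replicating this step may prefer to script the merge-and-deduplicate work. The sketch below is illustrative only: the function name, source labels, and journal titles are hypothetical stand-ins, not the study's actual data.

```python
# Illustrative sketch: merging journal titles from several sources into one
# deduplicated list, while recording which data set(s) each title came from.
# Source labels and titles here are hypothetical placeholders.

def build_journal_list(sources):
    """sources: dict mapping a source label to an iterable of journal titles.
    Returns a dict mapping each normalized title to the set of sources
    in which it appears."""
    journals = {}
    for source_name, titles in sources.items():
        for title in titles:
            key = title.strip().lower()  # normalize for deduplication
            journals.setdefault(key, set()).add(source_name)
    return journals

sources = {
    "1science_category": ["Journal A", "Journal B"],
    "title_keyword": ["Journal B", "Journal C"],  # "social work" in title
    "liaison_list": ["Journal C", "Journal D"],
}
merged = build_journal_list(sources)
print(len(merged))                  # 4 unique titles
print(sorted(merged["journal b"]))  # ['1science_category', 'title_keyword']
```

Tracking the source set per title also makes it trivial to produce the kind of appendix the study describes, indicating which data set each journal is included in.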

Open Access
To collect information on OA policies, each journal was searched in SHERPA/RoMEO (http://www.sherpa.ac.uk/romeo). The site's color-coded level of open access was noted in the Excel worksheet mentioned previously, as was the article version permitted to be archived in an institutional repository (preprint, postprint, or publisher's version). Seven journals from the journal list were not indexed in SHERPA/RoMEO. For each of these journals, the editors were contacted by email to confirm OA policies and the versions of articles that could be placed in institutional repositories.

Collection analysis
This study uses four measurements to determine "usage": article downloads, publications written by McGill University researchers, references made by McGill University authors, and top journals as identified by faculty for teaching.
The first result pertains to the 1Science report's category of Social Work journals. Within that category, Children and Youth Services Review was the top downloaded journal. See Table 1 for the top 10 journals by downloads in 2015, as per the 1Science report.
The remainder of the results considers the entire list of 132 journals. Of this larger list, the top downloaded journal was American Journal of Public Health. See Table 2 for the top 10 journals downloaded in 2015.
Of the publications authored by McGill University researchers from 2006 to 2015, Canadian Journal of Public Health was the top journal, with 62 papers. Note that these publications are by McGill University authors from all disciplines, as the reports cannot be filtered by a specific faculty or department. For example, it is possible that a researcher in the Faculty of Medicine contributed to the Canadian Journal of Public Health. See Table 3 for the top 10 journals by publications.
The number of references made by researchers at McGill University to articles from a given journal was counted by the 1Science report and included in the report accompanying the faculty survey from CRKN. Both reports used the same reference period and Web of Science data. However, the number of references differed between the reports for 43% of the journals. For example, a journal in the 1Science report might show three references, while the same title in the CRKN data showed five. Most of the time, the number of references differed by a count of one or two, but for five titles, the count was off by more than 50. The order of the journals did not change, however, and in these cases, the higher of the two numbers was included in the results. Representatives from both organizations were contacted, but it remains unclear why the counts differ. American Journal of Public Health had the greatest number of references within the list of journals. See Table 4 for the list of top 10 journals by references. Table 5 includes all of the social work journals that were "mentioned" by McGill University faculty in the survey as being important for teaching and research. From the list of 132 journals, the following two journals appeared in all of the top 10 lists, indicating that they are top journals for teaching, for publishing, and for downloading: American Journal of Public Health and Child Abuse & Neglect.

Faculty survey
The following journals appeared on three of the four top 10 lists (top downloads, publications, and references): Patient Education and Counseling; Canadian Journal of Public Health; Health & Place; and Addictive Behaviors. Even though they were not in the top 10 for faculty "mentions," they should be considered top journals as well.
The McGill University Library has at least partial access to all titles in the journal list except for two: the library does not have access to Journal of Social Work in Disability & Rehabilitation or International Journal of Migration and Border Studies. For 100 journals (75.8%), the library has full coverage of the journal under its current title. In most cases, for these journals, coverage extends to the former titles as well. For eight journals (6.1%), partial access is provided directly through publishers, such as via a JSTOR backfile. For 22 journals (16.7%), partial access is provided through aggregators.

Open access
Of the 132 titles in the journal list, 19 are open access, as confirmed by policy statements on the journals' web pages. See Table 6 for a list of OA journals for the field of social work. Several other journals appeared to be OA, as content was freely available, but they did not explicitly state that they were OA journals and thus are excluded from this list. Note: Aotearoa New Zealand Social Work Review is OA from 2008 to present.
The majority of journals, 89 (67.4%), have a green status in SHERPA/RoMEO, and 17 (12.9%) have a yellow status. The remaining 26 journals (19.7%) were ungraded or not indexed in SHERPA/RoMEO and were contacted by email to inquire about their self-archiving policies; two editors responded. Journal of Baccalaureate Social Work allows the preprint or postprint to be included in institutional repositories. Social Work and Christianity allows the publisher's version to be included in an institutional repository after a 12-month embargo. The other journals did not respond during the course of this project.
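The percentages above follow directly from the tallies. A minimal sketch of the computation, using the study's own counts (89 green, 17 yellow, and 26 ungraded or unindexed, out of 132 journals), might look like this; the function name and data structure are illustrative choices, not part of the study's method.

```python
# Illustrative sketch: tallying per-journal SHERPA/RoMEO statuses and
# expressing each count as a percentage of the full journal list.
from collections import Counter

def status_breakdown(statuses):
    """statuses: list of per-journal status strings.
    Returns {status: (count, percent_of_total)}, percent rounded to 1 dp."""
    counts = Counter(statuses)
    total = len(statuses)
    return {s: (n, round(100 * n / total, 1)) for s, n in counts.items()}

# Counts taken from the study's results (132 journals in total).
statuses = ["green"] * 89 + ["yellow"] * 17 + ["ungraded"] * 26
breakdown = status_breakdown(statuses)
print(breakdown["green"])     # (89, 67.4)
print(breakdown["yellow"])    # (17, 12.9)
print(breakdown["ungraded"])  # (26, 19.7)
```

In practice, the status column of the Excel worksheet described earlier would supply the `statuses` list.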
SHERPA/RoMEO also indicates whether paid OA options are available for a given journal. This is sometimes referred to as paying an article processing charge, wherein an author pays the publisher a fee to ensure the article becomes OA even if the platform itself is not freely accessible. Within the list of 132 journals, 98 (74.2%) offer paid OA options at the article level.
The analysis of publications from faculty websites and the author searches within Web of Science resulted in 270 peer-reviewed articles published since 2006 by 21 faculty members. Of the 270 faculty publications, 79 articles (29%), written by 16 faculty members, were published in OA social work journals. Three articles by a single faculty member appear to have been published using APCs, as they are marked as OA but appear in subscription journals. Eleven articles by six authors are marked as "free" but not "open access" on publisher platforms for subscription journals. Without conducting interviews, it cannot be assumed that these will remain freely available permanently; publishers sometimes make current articles, or articles on topical subjects, freely available so that media can read them. The publishers of these articles were contacted but did not respond.

Collection analysis
The results of the study provide insightful information for both collection development work and liaison librarianship, with tangible lessons that can be applied to assessments beyond the field of social work. As "faculty often view the library as a purchasing agent" (Wical & Vandenbark, 2015, p. 39), it is important that librarians have a strong understanding of the journals purchased by the library and of the titles considered important by the faculty in the discipline(s) they support, to make sure they are retaining the right subscriptions to meet the needs of their patrons (Rathemacher, 2010).
It was particularly surprising to learn that each tool and usage measurement provided a different set of "top" journals, with minimal overlap between them. This is especially evident in the comparison of the 1Science data with the faculty survey download data. Journals in the 1Science report are assigned only one category, despite some being multidisciplinary, and this category excluded many important journals in the field. Both used Web of Science data from 2015, and yet 47% of the titles show differences in the number of downloads: some by just one, and a few titles by more than 1,000 downloads. This discrepancy exemplifies how using different methods to extract data from the same source can produce different results. For example, had the 1Science report been the sole source of usage data, the top three journals would have been Children and Youth Services Review, British Journal of Social Work, and Child & Family Social Work. When data for the top downloaded journals from the faculty survey report are added, the top three journals become American Journal of Public Health, Child Abuse & Neglect, and Patient Education and Counseling. Other researchers undertaking a similar exercise would be wise to use multiple tools, with differing points of view, if possible. If that is not possible, the limitations of the chosen tool must at least be understood.
The first research question led to the examination of the library's electronic journal holdings and its possible gaps. Creating a list of journals in a given field is a vital step in understanding the collections that support a particular discipline. As liaison librarians have multiple roles to play and diverse responsibilities, including teaching, research, and reference tasks, carrying out a comprehensive exercise to learn the journals within a collection may seem daunting. It is likely that important journals are collected and curated over time, as the liaison's expertise in the subject develops, but this is often a piecemeal exercise. It is well worth the effort to take the time to create a list. Knowing that the McGill University Library currently has at least partial access to all but two titles in the journal list is reassuring, helpful during collections conversations, and can become a talking point with faculty. In times of budgetary constraints, knowledge about which journals are held and used becomes essential in decision making. On the flip side, knowing which journals are missing, or only partially covered, is equally valuable.

Table 6. Open access journals with social work content.
1. Aotearoa New Zealand Social Work Review
2. Canadian Journal of Children's Rights
4. Child and Adolescent Psychiatry and Mental Health
5. Critical Social Work
7. Drug and Alcohol Dependence
9. Frontiers in Public Health
11. International Indigenous Policy Journal
14. International Journal for Equity in Health
15. International Journal of Child and Adolescent Resilience
16. Intervention

It is important to underline that usage data are captured across the university, not usage by social work students and faculty only. A publication categorized as a "social work" journal may have high usage because of students in another discipline; likewise, journals that are important to the field may not register as being in the "top 10" by these measures. This shows how a comprehensive analysis is needed before collections decisions can be acted upon.

Open access
The second and third research questions relate to open access. The social work list of journals contains many journals that are fully OA, while others are only partially open. For example, Canadian Journal of Public Health is not fully OA, yet articles are freely available on the publisher platform after a six-month embargo, and Canadian Social Work Review has the most recent issues, dating back to 2015, available for free online. It is important to understand which journals offer partially open access when speaking to faculty, who may want to ensure they publish in places with shorter embargoes if the OA options are not suitable. It is also important when teaching students who will become social workers-in practice, in some cases, they may not have resources for subscriptions, so having an understanding of which journals offer articles online for free will be of the utmost importance after graduation.
More than half of the journals (67.4%) are considered "green" in SHERPA/RoMEO. Typically, this means a preprint or postprint can be archived in an institutional repository; sometimes a 12- or 24-month delay is required before the article can be archived. This information lends itself to two areas of outreach for liaison librarians: knowledge sharing with faculty about the institutional repository and its benefits, particularly in this field, and, again, educating students who may use the institutional repository when working as social workers after graduation. While the evaluation of institutional repositories compared with academic social media sites like academia.edu and researchgate.net is outside the scope of this project, "certain resources such as institutional repositories or libraries are likely to last longer than others" (Nichols & Twidale, 2017, p. 1053), so it is important for faculty and students to learn about them.
A surprising discovery made during this study is how difficult it is to determine when authors have paid APCs to make their articles open access. Articles are not typically marked with a symbol or footnote indicating that the author paid an APC. Publishers can choose to make individual articles in subscription journals open for a number of reasons: the article may cover a topical or trendy subject the publisher wishes to promote, particularly in the media, or the author may have earned credits for working for the journal without pay, which waive the APC. An "open access" symbol on an article within a subscription journal is therefore not sufficient proof that the author paid an APC. Additionally, many publisher platforms label single articles "free" as well as "open access," further complicating the issue. Patrons can filter or search for free articles in addition to OA articles, which may also be a key talking point with students. To determine definitively who paid APCs, further qualitative analysis will need to be undertaken through discussions with faculty. There is value in knowing which faculty members pay APCs, since libraries often broker deals with certain publishers, or through consortia, to lessen the cost of APCs. Additionally, part of OA advocacy and promotion is offering kudos to OA champions, including those who pay APCs to make their articles accessible to everyone.
According to the publication analysis data from the faculty website, nearly a third of the articles published since 2006 appeared in OA journals, and more than half of the McGill University School of Social Work faculty have at least one OA journal article. Undertaking this type of analysis can help librarians know where to focus their outreach activities. It is possible that faculty are unaware that the journals in which they published are OA and that they chose those journals for other reasons. With a good understanding of which journals offer OA publishing, or green OA archiving, faculty can make informed decisions about where they publish. For example, if liaison librarians find themselves in an advisory role, helping a faculty member or researcher choose or vet a journal, knowing the OA policy is key. Similarly, communications pertaining to OA policies, guidelines, and government regulations, whether delivered in person, to groups, or online, can be enhanced by highlighting faculty who already publish in green OA journals and encouraging them to continue this publishing behavior for many of the reasons already cited in this discussion. As funders change requirements for receiving grants, obliging some grant recipients to make articles OA within 12 months of publication, this information may become increasingly important to faculty. For students who plan on becoming social work practitioners, librarians can use these results to guide them in finding freely available resources to keep current in the field.
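The two headline figures in this analysis (the share of articles in OA journals, and the share of faculty with at least one OA article) can be computed from a simple publication list. The sketch below uses hypothetical faculty names, journal titles, and OA designations purely to show the shape of the calculation.

```python
# Illustrative publication records: each entry pairs a (hypothetical)
# faculty member with the journal that published the article.
publications = [
    {"faculty": "A", "journal": "OA Journal 1"},
    {"faculty": "A", "journal": "Subscription Journal 1"},
    {"faculty": "B", "journal": "OA Journal 2"},
    {"faculty": "C", "journal": "Subscription Journal 2"},
]

# Hypothetical set of journal titles known to be fully OA.
oa_journals = {"OA Journal 1", "OA Journal 2"}

def oa_summary(pubs, oa_titles):
    """Return (share of articles in OA journals,
               share of faculty with at least one OA article)."""
    oa_articles = [p for p in pubs if p["journal"] in oa_titles]
    all_faculty = {p["faculty"] for p in pubs}
    oa_faculty = {p["faculty"] for p in oa_articles}
    return len(oa_articles) / len(pubs), len(oa_faculty) / len(all_faculty)

article_share, faculty_share = oa_summary(publications, oa_journals)
print(f"OA articles: {article_share:.0%}; faculty with an OA article: {faculty_share:.0%}")
```

Applied to the School of Social Work publication lists and the OA designations gathered from SHERPA/RoMEO, this calculation yields the "nearly a third" and "more than half" figures reported above.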

Limitations
One limitation of this study is the set of tools used. Narrowing the data sources to a 1Science report, a faculty survey, Web of Science data, and faculty publication lists means the data are only as broad as those tools can provide. A further analysis would benefit from data from other sources, such as Scopus, and from qualitative data gathered by interviewing faculty and students in the School of Social Work. Additionally, for data sources that aggregate usage from all patrons and do not break down usage by faculty or department, deeper analysis could be done to determine usage attributable directly to the School of Social Work. For example, popular journals in the field could be examined to determine the number of articles written by School of Social Work faculty members rather than by any faculty at McGill. This would provide more insightful knowledge of the School, which could be helpful for the librarian performing outreach activities.
A second limitation is that a large portion of the download data comes from a single year: 2015. Pulling data for additional years from all tools would result in a more comprehensive study. The 1Science report and the faculty survey data were produced outside the Library, so there was no control over which year was pulled.

Conclusion
Fortunately for McGill University faculty and students, the library has access to nearly all of the social work journals and has full access to the titles identified as "top" journals. It appears that one faculty member has paid APCs to ensure that their content is freely available, and more than half of the faculty have at least one article in an OA journal. This offers an inroad for librarians who want to help faculty and students learn more about open access publishing and its implications, not only for equitable access but also for research impact and for finding open access material.
This study helped identify a current list of social work journals that can be useful for librarians doing liaison work or collection development in this field. Pulling data from several sources to determine the journals considered "top" by McGill University faculty and used by the McGill University community has further applications, as other librarians may learn from these methods and use them to conduct their own collection analyses. In particular, using multiple tools to pull data is crucial to creating a full picture of the collection, particularly where decisions about de-selection or subscription cancellation are concerned. Completing this exercise is the first step in understanding and better managing the ejournal collection, and it provides essential data needed to create appropriate and useful outreach programs related to publishing and open access in the field of social work.