Chapter 6. Sideshows and Leftovers

Does DOAJ represent the universe of OA journals? Not entirely. There are certainly more than 1,000 OA journals that are not in DOAJ, and more than 7,000 journal names, real journals or not, that aren’t represented there.

OA journals that I encountered but that aren’t in DOAJ may be missing for one or more of several reasons:

  • They’re brand-new, and the publisher is waiting until a couple of issues are published before submitting them to DOAJ.
  • They don’t meet DOAJ criteria for inclusion—a situation that’s much more likely in the future, given tighter criteria for inclusion.
  • They’re not actual OA journals publishing actual peer-reviewed scholarly articles at all: they’re something else, most commonly “journals.” I define “journals” with scare quotes as web pages that purport to identify and describe journals, where there is no operational journal behind the web page.
  • The publisher chose not to submit them to DOAJ.
  • The publishing body isn’t aware that DOAJ exists.

The title of this chapter suggests two ways to look at non-DOAJ gold open-access journals: as sideshows—things that aren’t serious OA journals at all—and as leftovers—journals that aren’t or aren’t yet part of DOAJ.

My sense is that there are, at most, a few hundred leftovers, most of which are likely to show up in DOAJ unless they disappear. The examples here are the 401 journal names from OASPA members that, as of May 7, 2014, either weren’t in DOAJ or couldn’t be identified as being in DOAJ, and the 8,000 or more entities, journals and “journals” alike, that are either on Jeffrey Beall’s list of “predatory” journals or published by one of the publishers on his long list of “predatory” publishers. I think of the OASPA group as leftovers and of Beall’s lists, and most of the entities in them, as sideshows.

OASPA Leftovers

As of the spring of 2014, OASPA member sites listed 1,531 journals. Of these, all but 401 are in DOAJ and are included in the discussion so far. Here’s what I found among the other 401:

  • Almost New: 112 began in 2013, but have had so few articles to date that the publishers may not yet have submitted them to DOAJ.
  • Empty: 69, most of them explicitly ceased.
  • New: 66 began in 2014 and will probably show up in DOAJ later.
  • New or Empty: 41 journals in a single series of similarly named journals either started in 2014 or are essentially empty (in some cases explicitly ceased).
  • Sparse: 30 began before 2013 but have never achieved five articles in any year; the publishers may not have submitted them (and they wouldn’t be eligible under current criteria).
  • Unworkable: Nine couldn’t be evaluated, one because it yielded 404 errors, eight because the archives appear to be random.
  • Ceased: Two others have explicitly ceased.

That leaves 73 journals, all of which are in grades A, B, DE (erratic), or DS (sparse). Four of those are miscellaneous. Of the others:

  • Biomed includes 35 journals (9 percent free) with 3,694 articles in 2013 (1 percent free).
  • STEM includes 8 journals (50 percent free) with 557 articles in 2013 (22 percent free).
  • HSS includes 26 journals (73 percent free) with 274 articles in 2013 (75 percent free).

Inclusion of these journals would add almost nothing to STEM or HSS and would add only 1.7 percent more journals and 2.9 percent more articles to Biomed. I’d assume most of these will disappear or be added to DOAJ. I don’t think they’d change the picture very much.

Beall’s Lists Sideshow

Before I began looking at the full range of open-access journals, I investigated the 2014 versions of Jeffrey Beall’s list of “potential, possible, or probable predatory scholarly open-access publishers” and his list of “potential, possible, or probable predatory scholarly open-access journals” that aren’t from those publishers.

Beall’s 2014 lists

http://scholarlyoa.com/2014/01/02/list-of-predatory-publishers-2014

The results of that investigation were published as “Journals, ‘Journals’ and Wannabes: Investigating the List,” in the July 2014 issue of Cites & Insights. Expanded to the level of individual journals, the lists came to 9,219 “journals,” but thousands of those “journals” deserved the scare quotes: more than 2,800 had never published a single article, and more than 500 weren’t reachable at all. You’ll find more about these journals and “journals” in the October/November 2014 Cites & Insights, a follow-up of sorts to the July issue.

Cites & Insights, July 2014

http://citesandinsights.info/civ14i7.pdf

Cites & Insights, October/November 2014

http://citesandinsights.info/civ14i10.pdf

After reviewing more of Jeffrey Beall’s writings on serials and open access, I conclude that Beall’s list is not a meaningful resource. It is a subjective sideshow maintained by somebody who’s made it clear that he’s opposed to open access in general. Rather than link to particular articles, I’ll suggest the April 2014 issue of Cites & Insights, specifically the first fourteen pages: “Ethics and Access 1: The Sad Case of Jeffrey Beall.” That essay refers to and links to Beall’s article “The Open-Access Movement Is Not Really about Open Access,” and you should also read “Reactionary Rhetoric against Open Access Publishing” by Wayne Bivens-Tatum, a direct response to Beall’s article, published in the same journal, tripleC.

Cites & Insights, April 2014

http://citesandinsights.info/civ14i4.pdf

“Reactionary Rhetoric against Open Access Publishing”

http://triple-c.at/index.php/tripleC/article/view/617

Less than 10 percent of the “journals” from Beall’s lists were also in DOAJ as of mid-2014—and less than 10 percent of DOAJ entries were on Beall’s set of questionable publishers and journals. I have no doubt there are some good-quality journals and publishers in Beall’s set—just as I have no doubt there are questionable journals not only in Beall’s set but among subscription journals.

Realistically, your best bet—for authors, readers, and librarians—is to begin with DOAJ and assume that any OA journal not included there is somewhat questionable, with exceptions noted in chapter 7.

Just Not Much There

Once you eliminate from the Beall subset journals that aren’t reachable, journals that have never published anything, journals that aren’t open access at all, journals that are dying or dead, and the large numbers of journals that are obviously questionable to an intelligent author or reader—those with grade C—there’s just not much left.

A few key figures:

  • Of journals checked in DOAJ, 70 percent are plausible prospects (grades A, A$, and B). Of journals checked in the Beall set that are not also in DOAJ, 14 percent are plausible prospects.
  • Looking at journals with decent grades that have managed to publish 20 or more articles in at least one recent year—not a terribly high bar—you’ll find 3,714 such journals in the portion of DOAJ I investigated—and 474 in the Beall set. That’s a 7.8 to 1 ratio.

Including journals with grades A, A$, and B but with fewer articles, we arrive at figures for journal count and 2013 article counts (and the percentage of free journals and articles in those journals) shown in table 6.1.

The Ratio row shows the result of dividing the DOAJ figure by the Beall figure. In other words, there are 4.4 times as many A, A$, and B journals in the tested subset of DOAJ as in the Beall set (excluding overlap)—and 8.6 times as many 2013 articles.
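
For readers who want to verify the Ratio row, here is a minimal Python sketch (the variable names are my own, not part of the report’s data) that recomputes both ratios from the counts in table 6.1:

    # Journal and 2013 article counts from table 6.1 (grades A, A$, and B only).
    doaj = {"journals": 5_123, "articles_2013": 330_924}
    beall = {"journals": 1_153, "articles_2013": 38_673}  # Beall set, excluding DOAJ overlap

    for measure in doaj:
        print(f"{measure}: {doaj[measure] / beall[measure]:.1f}")
    # Prints roughly 4.4 for journals and 8.6 for articles, matching the Ratio row.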

A Few Other Facts and Figures

In my full examination of OA journals, with detailed article counts and including 2011 and the first half of 2014, I visited 6,498 journals and “journals” in Beall’s set that weren’t also in DOAJ, skipping more than a thousand that yielded 404 errors on the first try or were too difficult to retry (mostly because publishers didn’t offer downloadable lists with hyperlinks). Of those 6,498:

  • 11 percent (753) were unreachable.
  • 6 percent (413) didn’t meet my definition of OA.
  • 3 percent (263) were hybrid journals with no apparent OA articles.
  • 30 percent (2,045) were just names with no published articles whatsoever.
  • 279 were too opaque to analyze.

The rest of these notes are based on the remaining 3,275 journals, of which I found 1,206 in D subcategories, 916 obviously questionable (C), 874 that require further checking (B), and 279 that appear to be good (A and A$). Table 2.1 and the preceding text offer the closest comparison, but you may also find tables 6.3 and 6.4 later in this chapter useful.

By area, that group includes 1,135 Biomed journals (3 percent free) publishing 22,325 articles in 2013 (1 percent free); 1,489 STEM journals (6 percent free) publishing 38,953 articles in 2013 (3 percent free); and 632 HSS journals (3 percent free) publishing 12,080 articles in 2013 (1 percent free). There were also 19 miscellaneous journals. Compare that with table 1.1 for DOAJ.

Looking at peak article volume, 10 journals in the Beall set published 1,000 or more articles in their best recent year (accounting for 11,771 articles in 2013); 49 published 200 to 999 articles (17,318 in 2013); 219 published 60 to 199 articles (17,759 in 2013); 661 published 20 to 59 articles (16,953 in 2013); and 1,336 published fewer than 20 articles (11,952 in 2013). Table 2.4 is comparable.

Table 6.2 can be compared directly to table 3.2 and shows dramatic differences. Beall journals in Biomed and STEM mostly charge low fees ($201–$600)—and although the Beall HSS journals number less than one-third of the DOAJ group, there are actually more fee-charging HSS journals in the Beall set. (There are Beall journals with high APCs—more than 100 of them—but they’re all either grade C or in a D subgrade with very few articles.)

Just as almost all journals in this set charge fees, most of them appear to be recent parts of the gold rush. Whereas the number of DOAJ journals starting in 2012–2013 is less than half the number for 2010–2011, more than half of all journals in the Beall set (grades A–D) appear to have started in 2012 and 2013: 1,883 journals, nearly three times as many as started in 2010–2011.

To the extent that the Beall set includes actual journals, they are mostly APC-charging journals begun during the gold rush with relatively low fees and relatively few articles, and there aren’t that many that sensible authors would consider submitting articles to, blacklist or no blacklist.

Comparing Major Areas

Tables 6.3 and 6.4 compare journals and articles with grades A, A$, and B in the DOAJ and Beall sets in each of the broad areas. The /DOAJ suffix indicates the DOAJ numbers; the Beall line follows in each case, with the ratio (DOAJ divided by Beall) below that.

Ratios in these two tables are shown to one decimal place because at some APC levels the ratio drops below 1; that is, there are actually more plausible Beall journals than DOAJ journals at those levels, even though overall there are several times as many plausible DOAJ journals.

There are no cases in which more articles appeared in plausible Beall journals than in DOAJ journals—and some of the ratios are fairly astonishing, such as the 727-to-1 ratio for articles in no-fee STEM journals.
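
As a worked example, the short Python sketch below (my own illustration, not part of the report’s methodology) recomputes a few cell ratios from the published counts in tables 6.3 and 6.4, including the no-fee STEM article ratio just mentioned:

    # Selected (DOAJ, Beall) cell pairs taken from tables 6.3 and 6.4.
    cells = {
        "Biomed no-fee journals (table 6.3)": (824, 11),
        "HSS medium-fee journals (table 6.3)": (26, 2),
        "Biomed no-fee articles (table 6.4)": (41_224, 60),
        "STEM no-fee articles (table 6.4)": (43_623, 60),
    }

    for label, (doaj_count, beall_count) in cells.items():
        print(f"{label}: {doaj_count / beall_count:.1f}")
    # Yields about 74.9, 13.0, 687.1, and 727; the last matches the 727-to-1 ratio cited above.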

Exiting the Sideshow

I don’t think the sideshow deserves more attention. To the extent that Beall-set journals are worthy places for authors and readers, they will almost certainly show up in DOAJ. Showing up in DOAJ is, of course, not automatically proof of high quality. DOAJ lacks the resources to ensure that each issue of each journal listed actually meets all ethical and editorial standards. It is no more able to provide a reliable whitelist than one librarian with an admitted disdain for OA in general is able to provide a reliable blacklist.

I’ve already listed sources for much more thorough coverage of the Beall set—that is, the July and October/November 2014 issues of Cites & Insights, with some additional coverage in December 2014 and January 2015. As with the DOAJ subset, data (but not publishers, journal names, or notes) for the Beall set is available as an anonymized spreadsheet if you wish to do your own analysis. See chapter 8 for details.
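
If you do retrieve the anonymized spreadsheet, a starting point for your own analysis might look like the sketch below. The file name and the column names ("grade" and "articles_2013") are placeholders of my own invention; adjust them to whatever the actual file described in chapter 8 uses.

    # Hypothetical sketch: the file name and column names are placeholders,
    # not the spreadsheet's actual layout.
    import pandas as pd

    df = pd.read_excel("beall_set_anonymized.xlsx")

    # How many journals fall into each grade?
    print(df["grade"].value_counts())

    # Total 2013 articles in the plausible grades (A, A$, and B).
    plausible = df[df["grade"].isin(["A", "A$", "B"])]
    print(plausible["articles_2013"].sum())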

Table 6.1. A, A$, and B journals in DOAJ and Beall

Group    Journals    % No-Fee    Articles    % No-Fee
DOAJ     5,123       67%         330,924     37%
Beall    1,153       6%          38,673      2%
Ratio    4.4                     8.6

Table 6.2. Fee ranges by subject areas, Beall set A, A$, and B

Area               No Fee    Nominal    Low      Medium    High
Biomed journals    11        98         196      52
Biomed articles    60        5,165      3,802    933
STEM journals      11        98         255      52
STEM articles      60        5,165      3,802    933
HSS journals       8         115        128      2
HSS articles       102       5,030      2,750    156

Table 6.3. DOAJ and Beall A–B journals by area

Area           No Fee    Nominal    Low     Medium    High    Total
Biomed/DOAJ    824       114        140     165       397     1,640
Beall          11        98         196     52                357
Ratio          74.9      1.2        0.7     3.2               4.6
STEM/DOAJ      1,068     242        230     149       40      1,729
Beall          11        98         255     52                416
Ratio          97.1      2.5        0.9     2.9               4.2
HSS/DOAJ       1,482     105        71      26        4       1,688
Beall          8         115        128     2                 253
Ratio          185.3     0.9        0.6     13.0              6.7

Table 6.4. DOAJ and Beall A–B articles by area

Area           No Fee    Nominal    Low       Medium    High      Total
Biomed/DOAJ    41,224    9,897      8,869     11,962    44,153    116,105
Beall          60        5,165      3,802     933                 9,960
Ratio          687.1     1.9        2.3       12.8                11.7
STEM/DOAJ      43,623    26,050     23,020    20,915    11,934    125,542
Beall          60        5,165      3,802     933                 9,960
Ratio          727.1     5.0        6.1       22.4                12.6
HSS/DOAJ       34,911    7,065      4,278     1,197     1,155     48,606
Beall          102       5,030      2,750     156                 8,038
Ratio          342.3     1.4        1.6       7.7                 6.0


