Timeless tips for authors from an experienced journal editor and academic trainer
What happens when you meet a researcher who is passionate about sharing his expertise and instructing young researchers and faculty in the best practices of academic publishing? You get a treasure trove of information and advice about all aspects of research – from planning a career to publishing your paper in a journal! This conversation with Dr. Caven Mcloughlin – Ph.D., Professor, Kent State University, Ohio USA – is full of advice for researchers at all stages of their academic career.
A qualified school psychologist, Caven works at Ohio’s largest school psychology preparation program and instructs students in early childhood school psychology. He has been a special education classroom teacher and administrator as well as a school counselor. For over 25 years, he has been conducting federally funded training programs for interdisciplinary leadership personnel who work with infants, toddlers, and newborns. Caven is also a Fulbright Specialist, and as part of the program, travels to different parts of the world to conduct informative and instructive workshops on academic publishing. He particularly travels to the BRICS nations and instructs authors and faculty from these nations in the best publishing practices. Caven is a prolific researcher and has published over 100 research papers and chapters as well as written, edited, or contributed to ten books. He is also the Editor of the journal School Psychology International.
Caven’s rich and varied experience, as well as his passion for sharing tips with researchers, makes him a fine go-to person for all sorts of advice related to academic publishing. His experience working with authors from different countries also enhances his understanding of the struggles researchers face in different parts of the world. In this interview, Caven shares his views on some of the most common challenges faced by researchers, especially those in developing countries, as well as how they can overcome these obstacles. He also gives away some timeless advice about writing and publishing a research paper in an academic journal.
Could you tell us more about your work as a Fulbright Specialist?
Certainly! There has never been an academic year for me in the last three decades where I haven’t made overseas presentations to faculty colleagues. It was natural for me to become associated with the Fulbright organization. I’ve been a Fulbright Specialist for just three years, and in that time I have visited two different universities in South Africa, each on two occasions, as well as a university in South India, to which I will be returning very soon for the third time. My role has always been to assist faculty in getting their work published in prestigious, international, high-impact, English-language journals by providing them with insider tips based largely on my 20+ years of experience as a journal editor, and of course, as an academician in my own fields.
Over those years I’ve come to the conclusion that while university faculty are expected to become prolifically published authors, they are largely untutored and unsupported in the basic steps required both in designing their research so that it will be eventually publishable and in articulating their findings in ways that make the Results and Implications valuable. I target my presentations to authors in under-resourced countries, mainly in the BRICS nations: Brazil, Russia, India, China, and South Africa, plus South Korea (however, increasingly China is thought of as ineligible to be part of this group).
Your work requires you to travel and interact with researchers from across the globe, especially in third world and developing countries. Do you think there is a sort of East-West gap among researchers in terms of their awareness of best publication practices?
In the past two years, in addition to making presentations in South Africa and India, I’ve spoken to faculty groups in Turkey, South Korea, and China as well as in my home country, the USA, while also serving as a journal editor and as a program administrator at my home university. Thus, I have had several opportunities to understand the influence of different academic cultures, expectations, and styles of academic preparation and training on prospective authors across several continents.
I’m unconvinced that Western authors are inherently smarter. Rather, I believe that non-Western academics suffer from six different hurdles:
- There are very few prestigious journals in any discipline edited by non-Western personnel and so there are relatively fewer role models and tutors in the art of publication in developing countries.
- There is almost no specific instruction at the undergraduate and graduate levels in the steps for designing research investigations that have a high chance of attracting the interest of a journal editorial team.
- Authors in BRICS countries don’t always know how to frame their Results and the consequent Conclusions so as to emphasize the socio-economic or person-enhancing implications of their investigations. Editors want more than an exercise in admiring the data. They want to see a rationale demonstrating why it was significant to collect those particular data in the first place.
- It’s sad to say, but the culture of cut-and-paste from others’ work (i.e., duplication-of-content or plagiarism) has earned all investigators from several non-Western countries a reputation such that their work is viewed as suspect by editors.
- The academic culture supporting promotion at most BRICS universities incentivizes quantity over quality. The publication of insignificant, scarcely valuable, and practically irrelevant articles in large numbers is mindlessly valued over the development of high-quality, relevant, impactful research.
- It is possible to accumulate a good number of publications that are of high quality; but to do so requires thoughtful career planning – another element missing for many BRICS faculty.
In summary, YES, there is a big gap between researchers in the resourced versus the under-resourced locations of this world. However, my experience working with indigenous faculty across the globe has taught me that there are lessons that can quickly and easily be learned from the sharing of insider tips, which is, incidentally, something I love to do!
You’ve also played an advisory role in university-level tenure-related decisions. (as part of Kent State University’s Advisory Board for its promotion and tenure committee). A majority of our readers are early-career researchers who would like to consider tenure as a natural career progression. For their benefit, how are decisions on tenure made? Also, do you have any tips for researchers who might be interested in moving up via the tenure route?
Whenever I consult with non-Western faculty, I always pose a question about career planning, with emphasis on how far into the future young academics are planning for their own professional development. Routinely, I get a blank stare, and eventually, a comment signaling “maybe a few months.” That’s not the case for most comparable Western academics, who generally have a discernible horizon many years out.
I generally urge non-Western academics to follow the pattern of my junior faculty colleagues and prepare an annual ‘Contextual Statement.’ This serves as a ‘career-plan’ statement that (a) predicts what the next year’s research products will include (objectively stated as “goals”); (b) offers predictions on the research trajectory that is being planned for at least the next three years (“what’s in the works?”); and (c) defines the intellectual space for those researchers, displaying how their proposed research products align with the priorities valued by their discipline (“where does their research fit in the discipline?”).
I’ve observed that mentorship is another element that is more valued in the West. Most Western academics can identify primary and secondary mentors and guides, sometimes even in different dimensions of their work (e.g., discipline content, methodology, technical writing). But this has not been the case with most of the researchers I have interacted with during my visits abroad.
All Western academics understand that tenure, which brings the option of lifelong employment, is earned as a result of research, training, and service. Each of these dimensions needs to meet or exceed an ‘Adequate’ evaluation, and at least one (preferably research) needs to be ‘Exemplary.’ Research is generally the most misunderstood element in this trio. In the Western universities that I visit, faculty are valued for being all-rounders with a particular research focus or expertise.
It startles me when I come across authors — and I must be candid and say that this next issue is a particular problem in the developing world — who knowingly attempt to buy their way into the hallowed halls of academia by paying for publications in dubious, look-alike, fake journals. Everyone in the administration of universities everywhere I have traveled knows that this is a problem, but most don’t know how to address it. This is fairly serious: not only does it encourage predatory publishing, but it also calls the credibility of a researcher’s work into question.
Having a Curriculum Vitae (CV) tainted by the inclusion of publications in predatory journals is what I call the “kiss of death” to building considerable international recognition as a researcher. It’s toxic to an academic reputation. It’s what colleagues will chortle about behind your back. It is also a certain path to being relegated to the lowest ranks in the university system. Practitioners of this sort of professional misconduct seem to forget that their CV items will continue to be reviewed long into the future, perhaps to determine eligibility for full professor. What will people think when they see that the prime publishing years in a researcher’s life have been contaminated by fake entries? There is a price to pay for publishing in predatory outlets. And it’s far more than the cost of the money transfer to a counterfeit journal’s bank account.
Recently, replication and reproducibility have become a topic of discussion, especially in psychology. As a journal editor, what are your views on how serious the problem is, and do you have any suggestions/ideas to improve the situation?
Let’s be blunt. When you’re asking about replication and reproducibility, what you’re really asking about is plagiarism. Replication is, in fact, an honorable and dignified activity when what a scholar is attempting is to reproduce, in different circumstances and with a different sample, a finding that has achieved eminence in a field of study. Basically, a replication study seeks to support (or alternatively to debunk) a seminal idea. That is something that journal editors want to see. However, what they do not want to see is plagiarism.
I am dumbfounded by the number of authors who affirm, at the initial stage of the evaluation process, that they understand their work will be scrutinized for plagiarism or duplication-of-content, yet who don’t seem to believe that it will actually happen.
Every prestigious, high-impact journal from commercial publishers uses a plagiarism-screening tool. In my own case, before I can actually view a submitted article at the online portal, it has already been scrutinized by iThenticate, a plagiarism detection program. This extraordinarily sophisticated piece of software, powered by Boolean analytics, compares words and phrases with every published article, dissertation, and online academic entry since the beginning of the last century. What is most appealing to an editor is that this software can even sniff out the plagiarism of ideas (such as when an author paraphrases text from an existing publication through the careful injection of synonyms to attempt to cloud the fact that it is copied). As an editor, I get line-by-line, color-coded documentation signaling every location where particular phrases/ideas were previously published. There is no fooling this software! Those who engage in duplication are very unlikely to overcome this initial hurdle.
The proliferation of plagiarism is the major concern for the editor colleagues with whom I correspond. And yes, we do share with one another when we see patterns of submissions from settings or individuals where plagiarism appears commonplace. Many EFL (English-as-a-foreign-language) or ESL (English-as-a-second-language) authors lean on other writers’ explanations – even using the original wording – because they find it difficult to articulate their own ideas clearly. Unfortunately, such duplication alone can be the reason why an author’s work gets declined.
Here, let me share something that is most likely not widely known. Most editors don’t want to get into extended correspondence with authors who have engaged in ethical misconduct: authors who will attempt to justify, or offer to remedy, their falsehood. Frankly, it is difficult to write a letter declining an article stating that the author is a cheat! So the editor finds an unrelated issue on which to pin the blame. As a consequence, the author turns around and sends the same tainted text to another journal, and so the cycle of rejection is repeated numerous times.
Authors should studiously avoid duplicating others’ words, phrases, and ideas. I suggest you always test your own work with whatever plagiarism software you can locate, especially for co-authored reports, prior to journal submission. When I address this issue with groups, I bluntly remind authors that they may never have a second chance to make a good first impression.
To what extent are the fields of Psychology and Education affected/influenced by the Impact Factor?
As we are all aware, a journal’s Impact Factor is measured by the number of times, on average, that a journal’s articles are cited by others in a two-year window following the year of publication. The presumption is that the most valuable journals will include articles that are cited most often, immediately following publication. In the physical sciences, where discovery research is more cumulative than in education and psychology, journals generally have higher Impact Factors than in the social sciences and humanities.
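The two-year calculation described above can be made concrete with a small sketch. This is a minimal illustration of the standard formula, with made-up numbers for a hypothetical journal:

```python
def impact_factor(citations, citable_items):
    """Two-year Impact Factor: citations received in year Y to articles
    published in years Y-1 and Y-2, divided by the number of citable
    items the journal published in those two years."""
    return citations / citable_items

# Hypothetical example: 300 citations in 2024 to articles the journal
# published in 2022-2023, out of 120 citable items in those two years.
print(impact_factor(300, 120))  # → 2.5
```

So a journal whose recent articles attract many citations quickly will score high, which is why cumulative fields tend to outrank the social sciences on this metric.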
So, many authors and their employers forget that an Impact Factor is a measure of the credibility of the journal, and not the credibility of the individual papers it contains or the authors who prepared each paper. There is a separate index that appraises the credibility of scholars, and it is called the h-index. This metric measures both the productivity and the citation impact of a scholar’s publications. The h-index is based on the author’s most cited papers and the total number of citations that he/she has received in other publications. It serves as a scholar-to-scholar comparator, rather than a journal-to-journal matchup.
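The scholar-level metric works differently from the journal-level one: an author has h-index h if h of their papers have at least h citations each. A minimal sketch, with invented citation counts:

```python
def h_index(citation_counts):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Hypothetical author with five papers cited 10, 8, 5, 4, and 3 times:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Note how the metric rewards a sustained body of cited work: one blockbuster paper alone cannot push the h-index above 1.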
At the point of promotion, what becomes more significant than a journal’s Impact Factor is whether your senior colleagues recognize and value the journals in which your work was disseminated. Therefore, rather than worrying about the Impact Factor, I suggest to junior faculty that in conversation with their senior colleagues (and others who will appraise their eventual promotion), they should raise the topic of which journals are primary for dissemination in their collective field. The appraisal committee will not care about a journal with an allegedly high Impact Factor if they themselves have never heard of it, or would never consider publishing in that obscure title (which is perhaps just another counterfeit journal with a fictitious Impact Factor metric).
Another question specific to Psychology and Education: To what extent have these fields embraced the offshoots of the open science concepts – open access, open data, and data sharing?
It sometimes seems that every author wants his or her work to be uncritically accepted without delay, published without any costs involved, disseminated instantly, and made available without limitation to the entire world. But high quality anything has a cost.
Open Access (OA) is part of the answer. But keep in mind that OA publications are never truly ‘free.’ Someone has to pay for the behind-the-scenes publication costs, and those costs can be enormous. Getting work through the steps required by peer-review standards, technical and copy editing, legal review, typesetting the text, and mounting it on the internet and sometimes into paper journals that require delivery — all this involves costs. Someone, somewhere must pay for this service, because publishing houses are commercial enterprises and not charities!
There is no easy way to strike a balance between authors’ desires to publish their work and succeed in their careers and the need to subsidize the considerable cost of this service on the publishing side.
You also have considerable experience training faculty members and publishing professionals. In your view, what is the most critical area of training for academic faculty?
Methodology, methodology, methodology. Shall I say it again? Methodology!
No study can ever have Results that are superior to the quality of the methodology and statistical analysis that guided the selection of the sample, the gathering of the data, and the analysis of the patterns in the data. If the research design is rudimentary, unsystematic, simplistic, or naive, then that’s the ultimate ceiling for the quality and usefulness of the Results. To achieve considerable international recognition, it is crucial for researchers to become conversant with modern, cutting-edge research tools. My advice is to take a workshop in research methodology or advanced statistics rather than go to a conference to learn more about your content area.
Top-level journals are no longer willing to accept articles that are based on ordinary descriptive data displays, correlational rather than causal analyses, or undergraduate-level statistical analysis. And it never works to bring in a statistician when the data have already been collected and ask: “Can you tell me what these data mean?” There are no statistical data-manipulations that can resurrect an inappropriately collected or ill-designed data-set. In that case, it’s never a solution to ask, “Well, what other analyses can be conducted?” The time to bring in a research design consultant/statistician is BEFORE the vital elements of sample selection and data collection have been initiated. Training research faculty in the art of designing the right methodology for research would help solve a lot of problems.
From your experience as a journal editor, what are the top submission mistakes authors make? How can they avoid them?
Let me approach this positively and attempt to answer a slightly different question: “What four elements should an aspiring researcher focus on when framing their reports for submission to a strong journal?”
- The quality of your writing and the organization of your manuscript will determine whether the ideas/content of your article will be taken seriously. Since no top-ranking journal sends every submission out for review, and instead relies on a filter conducted by the editorial staff to determine which submissions get reviewed, you MUST catch the eye of the editor. If you don’t pay attention to the textual elegance and accuracy of your writing as well as the organization of your manuscript, you will put yourself ‘out of the competition’ for getting an acceptance letter. Most authors overemphasize content and underemphasize polish and organization.
Focus on HOW your scientific ideas are packaged, and don’t simply list the scientific details within those ideas. To qualify your manuscript as relevant for their journal, you must attend to what it is that editors focus on and value. Only infrequently can authors step back from their final version and evaluate for themselves whether they’re getting their ideas across elegantly and accurately. This is because by that stage they’re too close to see the holes in the text or identify instances of ambiguity and duplication. An independent editor who has mastery over technical English and experience in the publication process can make the difference between acceptance and the dreaded letter declining the chance for review.
- Most authors spend a great deal of time making sure that the text of their article is in formal English. But those same authors will then dash off a paltry submission letter in questionable English, entirely forgetting to include the elements and assurances an editor needs to see. BRICS authors particularly seem to have a hard time preparing a confident, convincing, persuasive Letter-to-the-Editor sharing the good news about their submitted research article. It’s crucial to understand that your submission letter is your only sales pitch for getting the editor to send your work out for full peer review, rather than declining it with a cursory desk decision. Unless you promote the value of your work, you miss a vital chance to boost the probability of acceptance.
- What matters most in scientific writing is clarity. Strong scientists avoid fanciful and ornamental language. Rather, the focus should be on explaining yourself clearly, and the best way to do so is by using short, simple sentences. Don’t bury your central thesis in a mass of detail that hides your main messages. Be explicit, direct, and straightforward. Don’t attempt to write and edit at the same time. These are separate tasks requiring different abilities. Ask your most critical colleagues to check your work. Engage a native-speaker editor when you’re not writing in your mother tongue. The question is NOT whether you have mastery over conversational spoken English. It is: Can you write in technically accurate and unambiguous English? Seek help if that step isn’t your forte.
- Always follow the journal’s prescribed submission procedures diligently, fully, and without complaint! The journal will have specific guidance posted as Instructions-for-Authors, or some such title. Read them carefully. They were prepared to help authors increase the chances of getting their submissions accepted.
Following the journal’s instructions is the price of entry into the publishing competition. Failing to follow the formatting/organizational/procedural guidelines is itself sufficient reason to earn a rejection letter from many editors. All top-tier journals get far more high-quality submissions than they can ever accept. So quite naturally, one powerfully weighted filter influencing acceptance/rejection is the degree to which the manuscript deviates from the journal’s house style. Remember: If all else fails, FOLLOW THE INSTRUCTIONS!
Thanks, Caven! There’s a lot of priceless advice in this interview. I hope our readers find this useful!
Caven: Jayashree, thanks for this opportunity to share some publication-related information. At this point in my career, I look for every opportunity to give back and share what I’ve learned about scholarly publication.