Research methodology vs. research methods
The research methodology or design is the overall strategy and rationale you use to carry out the research, whereas research methods are the specific tools and processes you use to gather and analyze the data you need to test your hypothesis.
To further understand research methodology, let’s explore some examples:
a. Qualitative research methodology example: A study exploring the impact of author branding on author popularity might utilize in-depth interviews to gather personal experiences and perspectives.
b. Quantitative research methodology example: A research project investigating the effects of a book promotion technique on book sales could employ a statistical analysis of profit margins and sales before and after the implementation of the method.
c. Mixed-Methods research methodology example: A study examining the relationship between social media use and academic performance might combine both qualitative and quantitative approaches. It could include surveys to quantitatively assess the frequency of social media usage and its correlation with grades, alongside focus groups or interviews to qualitatively explore students’ perceptions and experiences regarding how social media affects their study habits and academic engagement.
These examples highlight the meaning of methodology in research and how it guides the research process, from data collection to analysis, ensuring the study’s objectives are met efficiently.
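A before/after comparison like the book-promotion example above is often analyzed with a paired t-test. The sketch below is only an illustration with made-up sales figures (the numbers and function name are not from any actual study), using just the Python standard library:

```python
import math
from statistics import mean, stdev

def paired_t_statistic(before, after):
    """Paired t-statistic for before/after measurements on the same units."""
    diffs = [a - b for a, b in zip(after, before)]  # per-title change
    n = len(diffs)
    # t = mean difference / standard error of the differences
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical monthly sales (copies) for five titles, before and after a promotion
before = [120, 95, 140, 110, 130]
after = [150, 115, 160, 125, 155]

t = paired_t_statistic(before, after)  # t ≈ 8.63 with 4 degrees of freedom
```

A large positive t here would suggest the promotion coincided with a real increase in sales; in practice you would compare it against a t-distribution with n − 1 degrees of freedom to obtain a p-value.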
When it comes to writing your study, the methodology in research papers or a dissertation plays a pivotal role. A well-crafted methodology section of a research paper or thesis not only enhances the credibility of your research but also provides a roadmap for others to replicate or build upon your work.
Wondering how to write the research methodology section? Follow these steps to create a strong methods chapter:
At the start of a research paper, you would have provided the background of your research and stated your hypothesis or research problem. In this section, you will elaborate on your research strategy.
Begin by restating your research question and proceed to explain what type of research you opted for to test it. Depending on your research, here are some questions you can consider:
a. Did you use qualitative or quantitative data to test the hypothesis?
b. Did you perform an experiment where you collected data or are you writing a dissertation that is descriptive/theoretical without data collection?
c. Did you use primary data that you collected or analyze secondary research data or existing data as part of your study?
These questions will help you establish the rationale for your study on a broader level, which you will follow by elaborating on the specific methods you used to collect and understand your data.
Now that you have told your reader what type of research you’ve undertaken for the dissertation, it’s time to dig into specifics. State what specific methods you used and explain the conditions and variables involved. Explain what the theoretical framework behind the method was, what samples you used for testing it, and what tools and materials you used to collect the data.
Once you have explained the data collection process, explain how you analyzed and studied the data. Here, your focus is simply to explain the methods of analysis rather than the results of the study.
Here are some questions you can answer at this stage:
a. What tools or software did you use to analyze your results?
b. What parameters or variables did you consider while understanding and studying the data you’ve collected?
c. Was your analysis based on a theoretical framework?
Your mode of analysis will change depending on whether you used a quantitative or qualitative research methodology in your study. If you’re working in the hard or physical sciences, you are likely to use a quantitative research methodology (relying on numbers and hard data). If you’re doing a qualitative study in the social sciences or humanities, your analysis may rely on understanding language and the socio-political contexts around your topic. This is why it’s important to establish what kind of study you’re undertaking at the outset.
Now that you have gone through your research process in detail, you’ll also have to make a case for it. Justify your choice of methodology and methods, explaining why it is the best choice for your research question. This is especially important if you have chosen an unconventional approach or you’ve simply chosen to study an existing research problem from a different perspective. Compare it with other methodologies, especially ones attempted by previous researchers, and discuss the contributions your methodology makes.
No matter how thorough a methodology is, it doesn’t come without its hurdles. This is a natural part of scientific research that is important to document so that your peers and future researchers are aware of it. Writing in a research paper about this aspect of your research process also tells your evaluator that you have actively worked to overcome the pitfalls that came your way and you have refined the research process.
1. Remember who you are writing for. Keeping sight of the reader/evaluator will help you know what to elaborate on and what information they are already likely to have. You’re condensing months of research work into just a few pages, so you should omit basic definitions and information about general phenomena people already know.
2. Do not give an overly elaborate explanation of every single condition in your study.
3. Skip details and findings irrelevant to the results.
4. Cite references that back your claim and choice of methodology.
5. Consistently emphasize the relationship between your research question and the methodology you adopted to study it.
To sum it up, what is methodology in research? It’s the blueprint of your research, essential for ensuring that your study is systematic, rigorous, and credible. Whether your focus is on qualitative research methodology, quantitative research methodology, or a combination of both, understanding and clearly defining your methodology is key to the success of your research.
Once you write the research methodology and complete writing the entire research paper, the next step is to edit your paper. As experts in research paper editing and proofreading services, we’d love to help you perfect your paper!
Research methodology involves a systematic and well-structured approach to conducting scholarly or scientific inquiries. Knowing the significance of research methodology and its different components is crucial as it serves as the basis for any study.
Typically, your research topic will start as a broad idea you want to investigate more thoroughly. Once you’ve identified a research problem and created research questions, you must choose the appropriate methodology and frameworks to address those questions effectively.
Research methodology is the process or the way you intend to execute your study. The methodology section of a research paper outlines how you plan to conduct your study. It covers various steps such as data collection, statistical analysis, participant observation, and other procedures involved in the research process.
The methods section should describe the process that will convert your idea into a study. Additionally, the outcomes of that process must yield valid and reliable results that align with the aims and objectives of your research. This rule of thumb holds regardless of whether your paper leans qualitative or quantitative.
Studying research methods used in related studies can provide helpful insights and direction for your own research. Now you can easily discover papers related to your topic on SciSpace and use our AI research assistant, Copilot, to quickly review the methodologies applied in different papers.
When deciding on your research approach, the reasons and factors you weighed in choosing a particular problem and formulating a research topic need to be validated and explained. A research methodology helps you do exactly that. Moreover, a good research methodology lets you build your argument and validate the research you performed through various data collection methods, analytical methods, and other essential steps.
Just imagine it as a strategy documented to provide an overview of what you intend to do.
While writing up or performing your research, you may drift toward tangents of little importance. In such cases, a research methodology helps you return to your outlined plan of work.
A research methodology helps in keeping you accountable for your work. Additionally, it can help you evaluate whether your work is in sync with your original aims and objectives or not. Besides, a good research methodology enables you to navigate your research process smoothly and swiftly while providing effective planning to achieve your desired results.
Usually, you should include the following aspects when deciding on the basic structure of your research methodology:
Explain what research methods you’re going to use. Whether you intend to proceed with a quantitative, qualitative, or mixed approach, state that explicitly. The choice among the three depends on your research’s aim, objectives, and scope.
Based on logic and reason, let your readers know why you have chosen said research methodologies. Additionally, you have to build strong arguments supporting why your chosen research method is the best way to achieve the desired outcome.
The mechanism encompasses the research methods or instruments you will use to develop your research methodology. It usually refers to your data collection methods. Among the many available mechanisms, you can use interviews, surveys, physical questionnaires, and so on as research instruments. The data collection method is determined by the type of research and whether the data is quantitative (numerical) or qualitative (perceptions, morale, etc.). Moreover, you need to give logical reasoning behind choosing a particular instrument.
The results will be available once you have finished experimenting. However, you should also explain how you plan to use the data to interpret the findings. This section also aids in understanding the problem from within, breaking it down into pieces, and viewing the research problem from various perspectives.
Anything that you feel needs to be explained to readers and focus groups for better understanding must be included and described in detail. You should not just specify your research methodology on the assumption that the reader is already aware of the topic.
All the relevant information that explains and simplifies your research paper must be included in the methodology section. If you are conducting your research in a non-traditional manner, give a logical justification and list its benefits.
Include information about the sample and sample space in the methodology section. The term "sample" refers to a smaller set of data that a researcher selects or chooses from a larger group of people or focus groups using a predetermined selection method. Let your readers know how you are going to distinguish between relevant and non-relevant samples. How you arrived at those exact numbers to back your research methodology, i.e., the sample size for each instrument, must be discussed thoroughly.
For example, if you are going to conduct a survey or interview, explain by what procedure you will select the interviewees (or determine the sample size, in the case of surveys), and how exactly the interview or survey will be conducted.
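One common way to justify a survey sample size (an assumption here — the article does not prescribe a formula, and your field may use other approaches) is Cochran's formula for estimating a proportion, n = z²p(1−p)/e². A minimal Python sketch:

```python
import math

def cochran_sample_size(z=1.96, p=0.5, e=0.05):
    """Minimum sample size for estimating a proportion:
    n = z^2 * p * (1 - p) / e^2 (Cochran's formula).
    z: z-score for the confidence level, p: expected proportion
    (0.5 is the most conservative choice), e: margin of error."""
    return math.ceil((z ** 2) * p * (1 - p) / (e ** 2))

# 95% confidence, maximum variability, 5% margin of error
n = cochran_sample_size()  # 385 respondents
```

Reporting the inputs (confidence level, expected proportion, margin of error) alongside the resulting number is exactly the kind of justification the methodology section calls for.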
This part, which is frequently assumed to be unnecessary, is actually very important. The challenges and limitations that your chosen strategy inherently possesses must be specified, whatever type of research you are conducting.
You must have observed that all research papers, dissertations, and theses carry a chapter entirely dedicated to research methodology. This section helps establish your credibility as an interpreter of results rather than a manipulator.
A good research methodology always explains the procedure, data collection methods and techniques, and the aim and scope of the research. It leads to a well-organized, rationality-based study, while a paper lacking one often comes across as messy or disorganized.
You should pay special attention to justifying your chosen research methodology. This becomes extremely important if you select an unconventional or distinctive method of execution.
Curating and developing a strong, effective research methodology can assist you in addressing a variety of situations, such as:
As a researcher, you must choose the tools or data collection methods that best fit the needs of your research. This decision has to be made wisely.
There are many research tools you can use to carry out your research process. These are classified as:
An interview aimed at obtaining your desired research outcomes can be conducted in many different ways. For example, you can design your interview as structured, semi-structured, or unstructured. What sets them apart is the degree of formality in the questions. In a group interview, on the other hand, your aim should be to collect opinions and group perceptions from the focus groups on a certain topic rather than seeking formal answers.
In surveys, you are in better control if you specifically draft the questions you seek responses to. For example, you may choose to include free-style questions that can be answered descriptively, or you may provide multiple-choice responses. You can also combine both approaches, depending on what suits your research process and purpose better.
Similar to group interviews, here you select a group of individuals and assign them a topic to discuss or invite them to freely express their opinions on it. You can note down the responses as they come and later organize them, deciding on the relevance of each response.
If your research domain is the humanities or sociology, observation is a well-proven method on which to base your research methodology. You can study the spontaneous responses of participants to a situation, or conduct the observation in a more structured manner. A structured observation means placing the participants in a predetermined situation at a previously decided time and then studying their responses.
Of all the tools described above, it is up to you to wisely choose the instruments that best fit your research. Do not restrict yourself from using multiple methods or a combination of instruments if that is appropriate for drafting a good research methodology.
A research methodology exists in various forms. Depending upon their approach, whether centered around words, numbers, or both, methodologies are distinguished as qualitative, quantitative, or an amalgamation of both.
When a research methodology primarily focuses on words and textual data, then it is generally referred to as qualitative research methodology. This type is usually preferred among researchers when the aim and scope of the research are mainly theoretical and explanatory.
The instruments used are observations, interviews, and sample groups. You can use this methodology if you are trying to study human behavior or response in some situations. Generally, qualitative research methodology is widely used in sociology, psychology, and other related domains.
If your research is mainly centered on data, figures, and statistics, then the analysis of this numerical data is referred to as quantitative research methodology. You can use quantitative research methodology if your research requires you to validate or justify the obtained results.
In quantitative methods, surveys, tests, experiments, and evaluations of existing databases can be used to advantage as instruments. If your research involves testing a hypothesis, use this methodology.
As the name suggests, the amalgam methodology uses both quantitative and qualitative approaches. This methodology is used when a part of the research requires you to verify the facts and figures, whereas the other part demands you to discover the theoretical and explanatory nature of the research question.
The instruments for the amalgam methodology require you to conduct interviews and surveys, including tests and experiments. The outcome of this methodology can be insightful and valuable as it provides precise test results in line with theoretical explanations and reasoning.
The amalgam method makes your work both factual and rational at the same time.
If you have stayed attentive to the aims and scope of your research, you should by now have an idea of which research methodology suits your work best.
Before deciding which research methodology answers your research question, you must invest significant time in reading and doing your homework. Consulting references that yield relevant results should be your first step in establishing a research methodology.
Moreover, you should never refrain from exploring other options. Before setting your work in stone, try all the available options, as doing so explains why the research methodology you finally choose is more appropriate than the alternatives.
You should always go for a quantitative research methodology if your research requires gathering large amounts of data, figures, and statistics. This research methodology will provide you with results if your research paper involves the validation of some hypothesis.
Whereas, if you are looking for more explanations, reasons, opinions, and public perceptions around a theory, you should use qualitative research methodology. The choice of an appropriate research methodology ultimately depends on what you want to achieve through your research.
1. How to write a research methodology?
You can provide a separate section for research methodology, in which you specify details about the methods and instruments used during the research, discuss the result analysis, provide insights into the background information, and convey the research limitations.
There are generally four types of research methodology, i.e.:
The set of techniques or procedures followed to discover and analyze the information gathered to validate or justify a research outcome is generally called research methodology.
Your research methodology directly reflects the validity of your research outcomes and how well-informed your research work is. Moreover, it can help future researchers cite or refer to your research if they plan to use a similar research methodology.
Lawrence Mbuagbaw
1 Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON Canada
2 Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario L8N 4A6 Canada
3 Centre for the Development of Best Practices in Health, Yaoundé, Cameroon
Livia Puljak
4 Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000 Zagreb, Croatia
5 Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN 47405 USA
6 Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON Canada
7 Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON Canada
8 Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON Canada
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.
We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?
Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.
The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 – 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 – 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).
In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig. 1 .
Fig. 1 Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed.
The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.
The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.
Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 – 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.
Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.
Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as pre-cursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items of Systematic reviews and Meta-Analyses (PRISMA) guidelines were preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.
These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].
There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.
Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p-values in baseline tables in randomized trials published in high impact journals [ 26 ]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [ 27 ]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been at the cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines including the highly cited CONSORT statement [ 5 ].
Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.
In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.
Q: How should I select research reports for my methodological study?
A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].
The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case, a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used, and we encourage researchers to justify their selected approach based on the study objective.
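A simple random draw can underrepresent the smaller group, as noted above; stratifying fixes the number drawn per group. The sketch below is a minimal illustration of that idea, with hypothetical article identifiers and group labels (it is not the sampling code of any cited study):

```python
import random

random.seed(7)  # fix the seed so the draw is reproducible

# Hypothetical sampling frame: far fewer Cochrane than non-Cochrane reviews
frame = [(f"C{i:03d}", "Cochrane") for i in range(40)] + \
        [(f"N{i:03d}", "non-Cochrane") for i in range(160)]

def stratified_sample(frame, per_group):
    """Draw an equal-sized simple random sample from each stratum."""
    strata = {}
    for article_id, group in frame:
        strata.setdefault(group, []).append(article_id)
    return {group: random.sample(ids, per_group) for group, ids in strata.items()}

sample = stratified_sample(frame, per_group=30)  # 30 articles per group
```

A simple random sample of 60 from this frame would, in expectation, contain only about 12 Cochrane reviews; the stratified draw guarantees 30 from each group.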
Q: How many databases should I search?
A: The number of databases one should search depends on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or in drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches to searching electronic databases (i.e. at least two databases with a replicable and time-stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.
Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters, so the search can be narrowed down to a certain period, or study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.
Q: Should I publish a protocol for my methodological study?
A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [ 35 ]. Protocols for systematic reviews are encouraged, but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and help avoid duplication of effort [ 36 ]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [ 9 ]. In the Cochrane Library, there are 15 protocols for methodological reviews (as of 21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.
Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study and easy retrieval by searching databases such as PubMed. The disadvantages include delays associated with manuscript handling and peer review, as well as costs: few journals publish study protocols, and those that do mostly charge article-processing fees [ 37 ]. Authors who would like to make their protocol publicly available without publishing it in a scholarly journal can deposit it in a publicly available repository, such as the Open Science Framework ( https://osf.io/ ).
Q: How should I appraise the quality of a methodological study?
A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study can be considered a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These include selection bias, lack of comparability between groups, and errors in the ascertainment of exposures or outcomes. In other words, to generate a representative sample, a comprehensive, reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in the assessment of exposures or outcomes.
Q: Should I justify a sample size?
A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:
For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
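As a rough illustration of the confidence interval approach, the standard formula for estimating a single proportion to a given precision can be sketched as below. This is a generic textbook calculation, not necessarily the exact computation used by El Dib et al.; the expected prevalence and margin are hypothetical:

```python
import math

def n_for_proportion(p, margin, z=1.96):
    """Articles needed to estimate a prevalence p to within +/- margin (95% CI by default)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# e.g. expecting ~30% of trials to report a given item, estimated to within 5 percentage points
n = n_for_proportion(p=0.30, margin=0.05)  # 323 articles
```

When no prior estimate of the prevalence is available, p = 0.5 gives the most conservative (largest) sample size.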
Q: What should I call my study?
A: Other terms that have been used to describe or label methodological studies include “methodological review”, “methodological survey”, “meta-epidemiological study”, “systematic review”, “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Nomenclature that should be avoided includes “systematic review”, as this will likely be confused with a systematic review of a clinical question. “Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. used a preplanned methodology) or used “systematic” sampling (i.e. a sampling approach that selects units at specific intervals) [ 32 ]. Any of these meanings of “systematic” may be true for methodological studies and could be potentially misleading. “Meta-epidemiological study” is ideal for indexing but not very informative, as it describes an entire field. The term “review” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [ 41 , 42 ]. The term “survey” is in line with the approaches used in many methodological studies [ 9 ] and would be indicative of their sampling procedures. However, in the absence of guidelines on nomenclature, the term “methodological study” is broad enough to capture most of these scenarios.
Q: Should I account for clustering in my methodological study?
A: Data from methodological studies are often clustered. For example, articles from a specific source (e.g. the Cochrane Library) may share reporting standards. Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section “What variables are relevant to methodological studies?”
A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [ 44 ]. For example, Kosa et al. used generalized estimating equations to account for the correlation of articles within journals [ 15 ]. Not accounting for clustering could lead to incorrect p-values, unduly narrow confidence intervals, and biased estimates [ 45 ].
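The classic design-effect approximation gives an intuition for why ignoring clustering narrows confidence intervals unduly: correlated articles carry less information than independent ones. The sketch below is an illustration only (it is not the generalized estimating equations approach used by Kosa et al.), and the intraclass correlation value is hypothetical:

```python
def design_effect(mean_cluster_size, icc):
    """Variance inflation when units (articles) are correlated within clusters (journals)."""
    return 1 + (mean_cluster_size - 1) * icc

def effective_sample_size(n, mean_cluster_size, icc):
    """Number of independent articles the clustered sample is effectively worth."""
    return n / design_effect(mean_cluster_size, icc)

# 300 articles drawn from journals contributing ~10 articles each, hypothetical ICC of 0.05
n_eff = effective_sample_size(300, mean_cluster_size=10, icc=0.05)  # ~207 articles
```

An analysis that treats all 300 articles as independent would behave as though it had roughly 45% more information than it actually does.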
Q: Should I extract data in duplicate?
A: Yes. Duplicate data extraction takes more time but results in fewer errors [ 19 ]. Data extraction errors in turn affect effect estimates [ 46 ] and should therefore be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, this area will likely see rapid advances in machine learning and natural language processing technologies to support researchers with screening and data extraction [ 47 , 48 ]. However, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [ 46 , 49 ].
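In practice, duplicate extraction is followed by a comparison of the two extractors' records so that disagreements can be resolved by consensus. A minimal sketch of that comparison step (the field names and values are hypothetical):

```python
def discrepancies(extraction_a, extraction_b):
    """Return the items on which two independent extractions disagree."""
    items = set(extraction_a) | set(extraction_b)
    return {item: (extraction_a.get(item), extraction_b.get(item))
            for item in items
            if extraction_a.get(item) != extraction_b.get(item)}

# Two independent extractions of the same trial report
first = {"blinding_reported": "yes", "sample_size": 120, "funding": "industry"}
second = {"blinding_reported": "yes", "sample_size": 210, "funding": "industry"}
to_resolve = discrepancies(first, second)  # {'sample_size': (120, 210)}
```

Only the disagreeing items need adjudication, which keeps the consensus step manageable even for long extraction forms.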
Q: Should I assess the risk of bias of research reports included in my methodological study?
A: Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but its intrinsic value in methodological studies is not obvious. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [ 50 ], and Abraha et al. investigated the application of intention-to-treat analyses in systematic reviews and trials [ 51 ].
Q: What variables are relevant to methodological studies?
A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:
Q: Should I focus only on high impact journals?
A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards there are higher. However, restricting the sample by journal impact factor (JIF) may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.
Q: Can I conduct a methodological study of qualitative research?
A: Yes. Even though most methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research, including the Cumulative Index to Nursing & Allied Health Literature (CINAHL), have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.
Q: What reporting guidelines should I use for my methodological study?
A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. In the absence of formal guidance, the general requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.
Q: What are the potential threats to validity and how can I avoid them?
A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection bias and confounding. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing only high-impact journals would be misleading.
Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for whether journals endorse the relevant reporting guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators, who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p-values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In a methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
With regard to external validity, researchers conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be made explicit. For example, findings from methodological studies of trials published in high impact cardiology journals cannot be assumed to apply to trials in other fields. Investigators must therefore ensure that their sample truly represents the target population, either by (a) conducting a comprehensive and exhaustive search, or (b) using an appropriate and justified, randomly selected sample of research reports.
Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine ( n = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM ( n = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM ( n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.
Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.
To inform discussions about methodological studies and the development of guidance on what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:
A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or on how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction is important is the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to the choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [ 82 ].
Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. reported on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate the reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].
Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].
In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].
Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].
Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].
Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].
In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.
Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
Some methodological studies are analytical, wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease” [ 89 ]. In the case of methodological studies, all of these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
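A comparison of proportions like the one just described can be sketched as a two-proportion z-test with a pooled standard error. The counts below are hypothetical, for illustration only (they are not Tricco et al.'s data):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 60/100 non-Cochrane vs 45/100 Cochrane reviews with positive conclusions
z = two_proportion_z(60, 100, 45, 100)  # |z| > 1.96 rejects H0 at the 5% level
```

If the reviews are clustered (e.g. within journals), the standard error would need the kind of adjustment discussed under the clustering question above, and a regression framework would allow covariate adjustment as well.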
Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n = 103) [ 30 ].
Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, or in journals with a certain ranking or on a topic. Systematic sampling can also be used when random sampling may be challenging to implement.
Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.
Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
This framework is outlined in Fig. 2.
Fig. 2 A proposed framework for methodological studies
Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.
In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.
Abbreviations

CONSORT: Consolidated Standards of Reporting Trials
EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
PICOT: Participants, Intervention, Comparison, Outcome, Timeframe
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
SWAR: Studies Within a Review
SWAT: Studies Within a Trial
LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.
This work did not receive any dedicated funding.
Ethics approval and consent to participate.
Not applicable.
Competing interests.
DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
According to Dawson (2019), a research methodology is the primary principle that guides your research. It becomes the general approach to conducting research on your topic and determines what research method you will use. A research methodology is different from a research method because research methods are the tools you use to gather your data (Dawson, 2019). You must consider several issues when selecting the most appropriate methodology for your topic, such as research limitations and ethical dilemmas that might impact the quality of your research. Descriptions of each type of methodology are included below.
Quantitative research methodologies are meant to produce numeric statistics, typically by using survey research to gather data (Dawson, 2019). This approach tends to reach a larger number of people in a shorter amount of time. According to Labaree (2020), three parts make up a quantitative research methodology:
Once you decide on a methodology, you can choose the specific methods you will use to carry it out.
Qualitative research methodologies examine the behaviors, opinions, and experiences of individuals through methods of examination (Dawson, 2019). This type of approach typically requires fewer participants, but more time with each participant. It gives research subjects the opportunity to provide their own opinions on a certain topic.
Examples of Qualitative Research Methodologies
A mixed methodology allows you to implement the strengths of both qualitative and quantitative research methods. In some cases, you may find that your research project would benefit from this. This approach is beneficial because it allows each methodology to counteract the weaknesses of the other (Dawson, 2019). You should consider this option carefully, as it can make your research complicated if not planned correctly.
What should you do to decide on a research methodology? The most logical way to determine your methodology is to decide whether you plan on conducting qualitative or quantitative research. You also have the option to implement a mixed methods approach. Looking back on Dawson's (2019) five "W's" on the previous page may help you with this process. You should also look for key words that indicate a specific type of research methodology in your hypothesis or proposal. Some words may lean more towards one methodology over another.
Quantitative Research Key Words
Qualitative Research Key Words
BMC Medical Research Methodology volume 20, Article number: 226 (2020)
Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.
We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?
Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.
The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 , 2 , 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 , 7 , 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).
In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig. 1 .
Fig. 1 Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed
The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.
The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.
Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 , 13 , 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.
Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [19], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [20]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research as a potentially useful resource for further reading on these types of experimental studies [21]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling, for which some guidance already exists [22], or studies on the development of methods using consensus-based approaches.
Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [23, 24]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.
These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].
There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.
Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise the quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [25]; Knol et al. investigated the reporting of p-values in baseline tables in randomized trials published in high impact journals [26]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [27]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [28]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines, including the highly cited CONSORT statement [5].
Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.
In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.
Q: How should I select research reports for my methodological study?
A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].
The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
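To make the sampling options above concrete, the sketch below draws a simple random sample and a stratified sample from a hypothetical sampling frame of retrieved records; the frame, strata and sample sizes are illustrative assumptions, not data from the cited studies.

```python
import random

def simple_random_sample(records, n, seed=0):
    """Draw a simple random sample of n records from the sampling frame."""
    rng = random.Random(seed)  # fixed seed keeps the selection reproducible
    return rng.sample(records, n)

def stratified_sample(records, strata_of, n_per_stratum, seed=0):
    """Draw an equal-sized random sample from each stratum,
    e.g. to avoid underrepresenting the smaller group."""
    rng = random.Random(seed)
    strata = {}
    for record in records:
        strata.setdefault(strata_of(record), []).append(record)
    return {s: rng.sample(members, n_per_stratum) for s, members in strata.items()}

# Hypothetical frame: (record id, review type); Cochrane reviews are the minority
frame = [(i, "Cochrane" if i % 4 == 0 else "non-Cochrane") for i in range(1, 201)]

srs = simple_random_sample(frame, 20)
strat = stratified_sample(frame, strata_of=lambda r: r[1], n_per_stratum=20)
```

With simple random sampling, the expected number of Cochrane reviews in the sample mirrors their share of the frame (here one quarter), whereas the stratified draw guarantees 20 per group.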
Q: How many databases should I search?
A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.
Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters, so the search can be narrowed down to a certain period, or study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.
Q: Should I publish a protocol for my methodological study?
A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [35]. Protocols for systematic reviews are encouraged, but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and help to avoid duplication of effort [36]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [9]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.
Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study and easy retrieval by searching databases such as PubMed. The disadvantages include delays associated with manuscript handling and peer review, as well as costs, since few journals publish study protocols and those that do often charge article-processing fees [37]. Authors who would like to make their protocol publicly available without publishing it in a scholarly journal can deposit it in a publicly available repository, such as the Open Science Framework (https://osf.io/).
Q: How should I appraise the quality of a methodological study?
A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [38]. These include selection bias, poor comparability of groups, and errors in the ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in the assessment of exposures or outcomes.
Q: Should I justify a sample size?
A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:
Comparing two groups
Determining a proportion, mean or another quantifier
Determining factors associated with an outcome using regression-based analyses
For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
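As a sketch of the confidence-interval approach to sample size justification, the function below computes the number of articles needed to estimate a single proportion with a given precision, using the normal approximation; the expected proportion and margin are hypothetical.

```python
import math
from statistics import NormalDist

def n_for_proportion(p, margin, confidence=0.95):
    """Articles needed to estimate a proportion p to within +/- margin
    (normal approximation to the binomial)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided critical value
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# Hypothetical: we expect ~30% of trials to report a subgroup analysis and
# want to estimate that proportion to within +/- 5 percentage points.
n = n_for_proportion(p=0.30, margin=0.05)  # 323 articles
```

When nothing is known about the expected proportion, using p = 0.5 gives the most conservative (largest) sample size.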
Q: What should I call my study?
A: Other terms that have been used to describe or label methodological studies include “methodological review”, “methodological survey”, “meta-epidemiological study”, “systematic review”, “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Nomenclature that should be avoided includes “systematic review”, as this will likely be confused with a systematic review of a clinical question. “Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or used “systematic” sampling (i.e. a sampling approach using specific intervals to determine who is selected) [32]. Any of these meanings of the word “systematic” may be true for methodological studies, and the term could therefore be misleading. “Meta-epidemiological study” is ideal for indexing, but not very informative, as it describes an entire field. The term “review” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [41, 42]. The term “survey” is also in line with the approaches used in many methodological studies [9] and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term “methodological study” is broad enough to capture most of these scenarios.
Q: Should I account for clustering in my methodological study?
A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [43]. Some cluster variables are described in the section “What variables are relevant to methodological studies?”
A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [44]. For example, Kosa et al. used generalized estimating equations to account for correlation of articles within journals [15]. Not accounting for clustering could lead to incorrect p-values, unduly narrow confidence intervals, and biased estimates [45].
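A simpler alternative to the regression approaches above, useful at the design or appraisal stage, is the design effect, which inflates a naive standard error by a factor of 1 + (m - 1) x ICC for an average cluster size m; the cluster size, intraclass correlation and naive standard error below are hypothetical.

```python
import math

def design_effect(avg_cluster_size, icc):
    """Variance inflation due to clustering of articles within journals."""
    return 1 + (avg_cluster_size - 1) * icc

def cluster_adjusted_se(naive_se, avg_cluster_size, icc):
    """Inflate a naive standard error to account for within-cluster correlation."""
    return naive_se * math.sqrt(design_effect(avg_cluster_size, icc))

# Hypothetical: 10 articles per journal and an intraclass correlation (ICC)
# of 0.05 for the reporting outcome.
se = cluster_adjusted_se(naive_se=0.02, avg_cluster_size=10, icc=0.05)
```

Even a modest ICC noticeably widens confidence intervals, which is why ignoring clustering tends to make them unduly narrow.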
Q: Should I extract data in duplicate?
A: Yes. Duplicate data extraction takes more time but results in fewer errors [19]. Data extraction errors in turn affect the effect estimate [46] and should therefore be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors, although, much like systematic reviews, this area will likely see rapid advances in machine learning and natural language processing technologies to support researchers with screening and data extraction [47, 48]. Nonetheless, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [46, 49].
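The error-catching logic of duplicate extraction can be sketched as a field-by-field comparison of two independent extraction forms, with disagreements flagged for adjudication by a third reviewer; all field names and values below are hypothetical.

```python
def discrepancies(form_a, form_b):
    """Return the fields on which two independent extractions disagree."""
    flagged = {}
    for field in form_a.keys() | form_b.keys():
        a, b = form_a.get(field), form_b.get(field)
        if a != b:
            flagged[field] = (a, b)
    return flagged

# Hypothetical extraction forms for the same trial report
extractor_1 = {"blinded": "yes", "sample_size": 120, "registered": "yes"}
extractor_2 = {"blinded": "yes", "sample_size": 210, "registered": "no"}

to_adjudicate = discrepancies(extractor_1, extractor_2)
# flags 'sample_size' and 'registered' for adjudication
```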
Q: Should I assess the risk of bias of research reports included in my methodological study?
A: Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [50], and Abraha et al. investigated the application of intention-to-treat analyses in systematic reviews and trials [51].
Q: What variables are relevant to methodological studies?
A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:
Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.
Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].
Source of funding and conflicts of interest: Some studies have found that funded studies are better reported [56, 57], while others have not [53, 58]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [59]. Thomas et al. looked at reporting quality of long-term weight loss trials and found that industry-funded studies were better reported [60]. Kan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [61]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [62]
Journal characteristics: Certain journals’ characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [ 63 , 64 ], and Journal Impact Factor (JIF) have been shown to be associated with reporting [ 63 , 65 , 66 , 67 ].
Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].
Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].
Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].
Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.
Q: Should I focus only on high impact journals?
A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards will be higher there. However, selecting journals by JIF may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.
Q: Can I conduct a methodological study of qualitative research?
A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.
Q: What reporting guidelines should I use for my methodological study?
A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [71]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [72]. However, in the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.
Q: What are the potential threats to validity and how can I avoid them?
A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing high-impact journals would be misleading.
Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [73]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [73]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p-values in baseline tables of high impact journals [26]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [74]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [16].
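The stratified-adjustment idea can be sketched with a Mantel-Haenszel pooled odds ratio, which combines stratum-specific 2x2 tables instead of collapsing them; the counts below (completeness of reporting by funding, stratified by journal endorsement of a guideline) are hypothetical.

```python
def mantel_haenszel_or(strata):
    """Pooled odds ratio across strata; each stratum is a 2x2 table (a, b, c, d):
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    numerator = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    denominator = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return numerator / denominator

# Hypothetical: funded vs unfunded reports, complete vs incomplete reporting,
# stratified by whether the journal endorses the reporting guideline.
strata = [
    (30, 10, 20, 20),  # endorsing journals
    (15, 25, 10, 30),  # non-endorsing journals
]
or_mh = mantel_haenszel_or(strata)
```

Comparing this pooled estimate with the crude odds ratio from the collapsed table indicates how much the stratifying variable confounds the association.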
With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies of trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. Moreover, investigators must ensure that their sample truly represents the target population, either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate, justified, randomly selected sample of research reports.
Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine (n = 6) [75]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM (n = 4) [76]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM (n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [77]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [78]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.
Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.
To inform discussions about methodological studies and the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:
What is the aim?
Methodological studies that investigate bias
A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or on how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction is important is a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [80]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [81]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [82].
Methodological studies that investigate quality (or completeness) of reporting
Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].
Methodological studies that investigate the consistency of reporting
Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].
Methodological studies that investigate factors associated with reporting
In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].
Methodological studies that investigate methods
Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].
Methodological studies that summarize other methodological studies
Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].
Methodological studies that investigate nomenclature and terminology
Some methodological studies may investigate the use of names and terms in health research. For example, Krnic Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].
Other types of methodological studies
In addition to the experimental methodological studies mentioned previously, other types of methodological studies may exist that are not captured here.
What is the design?
Methodological studies that are descriptive
Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
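The descriptive summaries named above (counts with percentages, means with standard deviations, medians with interquartile ranges) can be sketched in a few lines. The data below are hypothetical, standing in for items extracted from a set of included reviews, not figures from any of the cited studies:

```python
import statistics

# Hypothetical extracted data: one record per included review, with a
# yes/no item (e.g. were research recommendations reported?) and a count
# (e.g. number of included trials). Values are illustrative only.
reported_recommendations = [True, False, True, True, False, True, False, True]
n_included_trials = [12, 5, 33, 8, 21, 4, 17, 9]

# Count (percent) for the binary item.
n = len(reported_recommendations)
count = sum(reported_recommendations)
percent = 100 * count / n

# Mean (standard deviation) and median (interquartile range) for the count.
mean = statistics.mean(n_included_trials)
sd = statistics.stdev(n_included_trials)
median = statistics.median(n_included_trials)
q1, _, q3 = statistics.quantiles(n_included_trials, n=4)

print(f"Reported recommendations: {count}/{n} ({percent:.0f}%)")
print(f"Included trials: mean {mean:.1f} (SD {sd:.1f}); median {median} (IQR {q1}-{q3})")
```

In practice these summaries would be produced for each extracted item and presented in a table of study characteristics.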
Methodological studies that are analytical
Some methodological studies are analytical, wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies, all of these investigations are possible. For example, Kosa et al. investigated the association between agreement on the primary outcome between trial registry entries and published manuscripts and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
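A comparison of proportions like the Cochrane versus non-Cochrane example above can be carried out with a two-proportion z-test. The sketch below uses the standard pooled-variance form; the counts are made up for illustration and are not the numbers from Tricco et al.:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test of H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: 70/100 non-Cochrane vs 50/100 Cochrane reviews
# with positive conclusion statements.
z, p = two_proportion_z_test(70, 100, 50, 100)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value would lead to rejecting the null hypothesis that the two proportions are equal; in a real analysis a chi-squared test or logistic regression adjusting for study covariates would serve the same purpose.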
What is the sampling strategy?
Methodological studies that include the target population
Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n = 103) [ 30 ].
Methodological studies that include a sample of the target population
Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, in journals with a certain ranking, or on a certain topic. Systematic sampling can also be used when random sampling is challenging to implement.
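The sampling strategies above differ in how reports are drawn from the sampling frame. A minimal sketch, using a hypothetical frame of 200 eligible reports:

```python
import random

# Hypothetical sampling frame, e.g. all eligible trial publications.
reports = [f"report_{i:03d}" for i in range(1, 201)]

random.seed(42)  # fixed seed so the draw is reproducible

# Simple random sample of 20 reports.
random_sample = random.sample(reports, k=20)

# Systematic sample: every k-th report after a random start,
# where k = frame size // sample size.
k = len(reports) // 20
start = random.randrange(k)
systematic_sample = reports[start::k]

print(len(random_sample), len(systematic_sample))
```

Systematic sampling with a random start is often easier to apply by hand (e.g. every 10th record in a database export), at the cost of bias if the frame has a periodic ordering.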
What is the unit of analysis?
Methodological studies with a research report as the unit of analysis
Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.
Methodological studies with a design, analysis or reporting item as the unit of analysis
Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
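This distinction matters for analysis, because items nested within the same report are not independent. A minimal sketch with hypothetical records shows how the same data can be summarised at the item level or re-aggregated to the report (review) level:

```python
from collections import defaultdict

# Hypothetical item-level records: one row per planned subgroup analysis,
# nested within reviews R1-R3. Not data from Paquette et al.
items = [
    {"review": "R1", "subgroup_reported": True},
    {"review": "R1", "subgroup_reported": False},
    {"review": "R1", "subgroup_reported": True},
    {"review": "R2", "subgroup_reported": True},
    {"review": "R3", "subgroup_reported": False},
    {"review": "R3", "subgroup_reported": False},
]

# Item-level summary: proportion of all planned analyses that were reported.
item_level = sum(r["subgroup_reported"] for r in items) / len(items)

# Review-level summary: per-review proportion, which could then be
# summarised across reviews.
by_review = defaultdict(list)
for r in items:
    by_review[r["review"]].append(r["subgroup_reported"])
review_level = {rev: sum(v) / len(v) for rev, v in by_review.items()}

print(f"Item-level: {item_level:.2f}")
print(f"Review-level: {review_level}")
```

Reviews contributing many items weigh more heavily in the item-level summary, which is why analyses of such data may need to account for clustering.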
This framework is outlined in Fig. 2 .
A proposed framework for methodological studies
Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.
In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Abbreviations
CONSORT: Consolidated Standards of Reporting Trials
EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
PICOT: Participants, Intervention, Comparison, Outcome, Timeframe
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
SWAR: Studies Within a Review
SWAT: Studies Within a Trial
Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.
Chan AW, Song F, Vickers A, Jefferson T, Dickersin K, Gotzsche PC, Krumholz HM, Ghersi D, van der Worp HB. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.
Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.
Higgins JP, Altman DG, Gotzsche PC, Juni P, Moher D, Oxman AD, Savovic J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.
Moher D, Schulz KF, Altman DG. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet. 2001;357.
Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.
Shea BJ, Hamel C, Wells GA, Bouter LM, Kristjansson E, Grimshaw J, Henry DA, Boers M. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013–20.
Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017;358:j4008.
Lawson DO, Leenus A, Mbuagbaw L. Mapping the nomenclature, methodology, and reporting of studies that review methods: a pilot methodological review. Pilot Feasibility Studies. 2020;6(1):13.
Puljak L, Makaric ZL, Buljan I, Pieper D. What is a meta-epidemiological study? Analysis of published literature indicated heterogeneous study designs and definitions. J Comp Eff Res. 2020.
Abbade LPF, Wang M, Sriganesh K, Jin Y, Mbuagbaw L, Thabane L. The framing of research questions using the PICOT format in randomized controlled trials of venous ulcer disease is suboptimal: a systematic survey. Wound Repair Regen. 2017;25(5):892–900.
Gohari F, Baradaran HR, Tabatabaee M, Anijidani S, Mohammadpour Touserkani F, Atlasi R, Razmgir M. Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review. J Diabetes Metab Disord. 2015;15(1):36.
Wang M, Jin Y, Hu ZJ, Thabane A, Dennis B, Gajic-Veljanoski O, Paul J, Thabane L. The reporting quality of abstracts of stepped wedge randomized trials is suboptimal: a systematic survey of the literature. Contemp Clin Trials Commun. 2017;8:1–10.
Shanthanna H, Kaushal A, Mbuagbaw L, Couban R, Busse J, Thabane L. A cross-sectional study of the reporting quality of pilot or feasibility trials in high-impact anesthesia journals. Can J Anaesth. 2018;65(11):1180–95.
Kosa SD, Mbuagbaw L, Borg Debono V, Bhandari M, Dennis BB, Ene G, Leenus A, Shi D, Thabane M, Valvasori S, et al. Agreement in reporting between trial publications and current clinical trial registry in high impact journals: a methodological review. Contemporary Clinical Trials. 2018;65:144–50.
Zhang Y, Florez ID, Colunga Lozano LE, Aloweni FAB, Kennedy SA, Li A, Craigie S, Zhang S, Agarwal A, Lopes LC, et al. A systematic survey on reporting and methods for handling missing participant data for continuous outcomes in randomized controlled trials. J Clin Epidemiol. 2017;88:57–66.
Hernández AV, Boersma E, Murray GD, Habbema JD, Steyerberg EW. Subgroup analyses in therapeutic cardiovascular clinical trials: are most of them misleading? Am Heart J. 2006;151(2):257–64.
Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, Fruci V, Dennis B, Bawor M, Thabane L. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.
Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59(7):697–703.
Carrasco-Labra A, Brignardello-Petersen R, Santesso N, Neumann I, Mustafa RA, Mbuagbaw L, Etxeandia Ikobaltzeta I, De Stio C, McCullagh LJ, Alonso-Coello P. Improving GRADE evidence tables part 1: a randomized trial shows improved understanding of content in summary-of-findings tables with a new format. J Clin Epidemiol. 2016;74:7–18.
The Northern Ireland Hub for Trials Methodology Research: SWAT/SWAR Information [ https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/SWATSWARInformation/ ]. Accessed 31 Aug 2020.
Chick S, Sánchez P, Ferrin D, Morrice D. How to conduct a successful simulation study. In: Proceedings of the 2003 Winter Simulation Conference; 2003. p. 66–70.
Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.
Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mount Sinai J Med New York. 1996;63(3–4):216–24.
Areia M, Soares M, Dinis-Ribeiro M. Quality reporting of endoscopic diagnostic studies in gastrointestinal journals: where do we stand on the use of the STARD and CONSORT statements? Endoscopy. 2010;42(2):138–47.
Knol M, Groenwold R, Grobbee D. P-values in baseline tables of randomised controlled trials are inappropriate but still common in high impact journals. Eur J Prev Cardiol. 2012;19(2):231–2.
Chen M, Cui J, Zhang AL, Sze DM, Xue CC, May BH. Adherence to CONSORT items in randomized controlled trials of integrative medicine for colorectal Cancer published in Chinese journals. J Altern Complement Med. 2018;24(2):115–24.
Hopewell S, Ravaud P, Baron G, Boutron I. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.
The Cochrane Methodology Register Issue 2 2009 [ https://cmr.cochrane.org/help.htm ]. Accessed 31 Aug 2020.
Mbuagbaw L, Kredo T, Welch V, Mursleen S, Ross S, Zani B, Motaze NV, Quinlan L. Critical EPICOT items were absent in Cochrane human immunodeficiency virus systematic reviews: a bibliometric analysis. J Clin Epidemiol. 2016;74:66–72.
Barton S, Peckitt C, Sclafani F, Cunningham D, Chau I. The influence of industry sponsorship on the reporting of subgroup analyses within phase III randomised controlled trials in gastrointestinal oncology. Eur J Cancer. 2015;51(18):2732–9.
Setia MS. Methodology series module 5: sampling strategies. Indian J Dermatol. 2016;61(5):505–9.
Wilson B, Burnett P, Moher D, Altman DG, Al-Shahi Salman R. Completeness of reporting of randomised controlled trials including people with transient ischaemic attack or stroke: a systematic review. Eur Stroke J. 2018;3(4):337–46.
Kahale LA, Diab B, Brignardello-Petersen R, Agarwal A, Mustafa RA, Kwong J, Neumann I, Li L, Lopes LC, Briel M, et al. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey. J Clin Epidemiol. 2018;99:14–23.
De Angelis CD, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, Kotzin S, Laine C, Marusic A, Overbeke AJPM, et al. Is this clinical trial fully registered?: a statement from the International Committee of Medical Journal Editors*. Ann Intern Med. 2005;143(2):146–8.
Ohtake PJ, Childs JD. Why publish study protocols? Phys Ther. 2014;94(9):1208–9.
Rombey T, Allers K, Mathes T, Hoffmann F, Pieper D. A descriptive analysis of the characteristics and the peer review process of systematic review protocols published in an open peer review journal from 2012 to 2017. BMC Med Res Methodol. 2019;19(1):57.
Grimes DA, Schulz KF. Bias and causal associations in observational research. Lancet. 2002;359(9302):248–52.
Porta M, editor. A dictionary of epidemiology. 5th ed. Oxford: Oxford University Press; 2008.
El Dib R, Tikkinen KAO, Akl EA, Gomaa HA, Mustafa RA, Agarwal A, Carpenter CR, Zhang Y, Jorge EC, Almeida R, et al. Systematic survey of randomized trials evaluating the impact of alternative diagnostic strategies on patient-important outcomes. J Clin Epidemiol. 2017;84:61–9.
Helzer JE, Robins LN, Taibleson M, Woodruff RA Jr, Reich T, Wish ED. Reliability of psychiatric diagnosis. I. a methodological review. Arch Gen Psychiatry. 1977;34(2):129–33.
Chung ST, Chacko SK, Sunehag AL, Haymond MW. Measurements of gluconeogenesis and Glycogenolysis: a methodological review. Diabetes. 2015;64(12):3996–4010.
Sterne JA, Juni P, Schulz KF, Altman DG, Bartlett C, Egger M. Statistical methods for assessing the influence of study characteristics on treatment effects in 'meta-epidemiological' research. Stat Med. 2002;21(11):1513–24.
Moen EL, Fricano-Kugler CJ, Luikart BW, O’Malley AJ. Analyzing clustered data: why and how to account for multiple observations nested within a study participant? PLoS One. 2016;11(1):e0146721.
Zyzanski SJ, Flocke SA, Dickinson LM. On the nature and analysis of clustered data. Ann Fam Med. 2004;2(3):199–200.
Mathes T, Klassen P, Pieper D. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review. BMC Med Res Methodol. 2017;17(1):152.
Bui DDA, Del Fiol G, Hurdle JF, Jonnalagadda S. Extractive text summarization system to aid data extraction from full text in systematic review development. J Biomed Inform. 2016;64:265–72.
Bui DD, Del Fiol G, Jonnalagadda S. PDF text classification to leverage information extraction from publication reports. J Biomed Inform. 2016;61:141–8.
Maticic K, Krnic Martinic M, Puljak L. Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience. BMC Med Res Methodol. 2019;19(1):32.
Speich B. Blinding in surgical randomized clinical trials in 2015. Ann Surg. 2017;266(1):21–2.
Abraha I, Cozzolino F, Orso M, Marchesi M, Germani A, Lombardo G, Eusebi P, De Florio R, Luchetta ML, Iorio A, et al. A systematic review found that deviations from intention-to-treat are common in randomized trials and systematic reviews. J Clin Epidemiol. 2017;84:37–46.
Zhong Y, Zhou W, Jiang H, Fan T, Diao X, Yang H, Min J, Wang G, Fu J, Mao B. Quality of reporting of two-group parallel randomized controlled clinical trials of multi-herb formulae: A survey of reports indexed in the Science Citation Index Expanded. Eur J Integrative Med. 2011;3(4):e309–16.
Farrokhyar F, Chu R, Whitlock R, Thabane L. A systematic review of the quality of publications reporting coronary artery bypass grafting trials. Can J Surg. 2007;50(4):266–77.
Oltean H, Gagnier JJ. Use of clustering analysis in randomized controlled trials in orthopaedic surgery. BMC Med Res Methodol. 2015;15:17.
Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.
Balasubramanian SP, Wiener M, Alshameeri Z, Tiruvoipati R, Elbourne D, Reed MW. Standards of reporting of randomized controlled trials in general surgery: can we do better? Ann Surg. 2006;244(5):663–7.
de Vries TW, van Roon EN. Low quality of reporting adverse drug reactions in paediatric randomised controlled trials. Arch Dis Child. 2010;95(12):1023–6.
Borg Debono V, Zhang S, Ye C, Paul J, Arya A, Hurlburt L, Murthy Y, Thabane L. The quality of reporting of RCTs used within a postoperative pain management meta-analysis, using the CONSORT statement. BMC Anesthesiol. 2012;12:13.
Kaiser KA, Cofield SS, Fontaine KR, Glasser SP, Thabane L, Chu R, Ambrale S, Dwary AD, Kumar A, Nayyar G, et al. Is funding source related to study reporting quality in obesity or nutrition randomized control trials in top-tier medical journals? Int J Obes. 2012;36(7):977–81.
Thomas O, Thabane L, Douketis J, Chu R, Westfall AO, Allison DB. Industry funding and the reporting quality of large long-term weight loss trials. Int J Obes. 2008;32(10):1531–6.
Khan NR, Saad H, Oravec CS, Rossi N, Nguyen V, Venable GT, Lillard JC, Patel P, Taylor DR, Vaughn BN, et al. A review of industry funding in randomized controlled trials published in the neurosurgical literature-the elephant in the room. Neurosurgery. 2018;83(5):890–7.
Hansen C, Lundh A, Rasmussen K, Hrobjartsson A. Financial conflicts of interest in systematic reviews: associations with results, conclusions, and methodological quality. Cochrane Database Syst Rev. 2019;8:Mr000047.
Kiehna EN, Starke RM, Pouratian N, Dumont AS. Standards for reporting randomized controlled trials in neurosurgery. J Neurosurg. 2011;114(2):280–5.
Liu LQ, Morris PJ, Pengel LH. Compliance to the CONSORT statement of randomized controlled trials in solid organ transplantation: a 3-year overview. Transpl Int. 2013;26(3):300–6.
Bala MM, Akl EA, Sun X, Bassler D, Mertz D, Mejza F, Vandvik PO, Malaga G, Johnston BC, Dahm P, et al. Randomized trials published in higher vs. lower impact journals differ in design, conduct, and analysis. J Clin Epidemiol. 2013;66(3):286–95.
Lee SY, Teoh PJ, Camm CF, Agha RA. Compliance of randomized controlled trials in trauma surgery with the CONSORT statement. J Trauma Acute Care Surg. 2013;75(4):562–72.
Ziogas DC, Zintzaras E. Analysis of the quality of reporting of randomized controlled trials in acute and chronic myeloid leukemia, and myelodysplastic syndromes as governed by the CONSORT statement. Ann Epidemiol. 2009;19(7):494–500.
Alvarez F, Meyer N, Gourraud PA, Paul C. CONSORT adoption and quality of reporting of randomized controlled trials: a systematic analysis in two dermatology journals. Br J Dermatol. 2009;161(5):1159–65.
Mbuagbaw L, Thabane M, Vanniyasingam T, Borg Debono V, Kosa S, Zhang S, Ye C, Parpia S, Dennis BB, Thabane L. Improvement in the quality of abstracts in major clinical journals since CONSORT extension for abstracts: a systematic review. Contemporary Clin trials. 2014;38(2):245–50.
Thabane L, Chu R, Cuddy K, Douketis J. What is the quality of reporting in weight loss intervention studies? A systematic review of randomized controlled trials. Int J Obes. 2007;31(10):1554–9.
Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evidence Based Med. 2017;22(4):139.
METRIC - MEthodological sTudy ReportIng Checklist: guidelines for reporting methodological studies in health research [ http://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#METRIC ]. Accessed 31 Aug 2020.
Jager KJ, Zoccali C, MacLeod A, Dekker FW. Confounding: what it is and how to deal with it. Kidney Int. 2008;73(3):256–60.
Parker SG, Halligan S, Erotocritou M, Wood CPJ, Boulton RW, Plumb AAO, Windsor ACJ, Mallett S. A systematic methodological review of non-randomised interventional studies of elective ventral hernia repair: clear definitions and a standardised minimum dataset are needed. Hernia. 2019.
Bouwmeester W, Zuithoff NPA, Mallett S, Geerlings MI, Vergouwe Y, Steyerberg EW, Altman DG, Moons KGM. Reporting and methods in clinical prediction research: a systematic review. PLoS Med. 2012;9(5):1–12.
Schiller P, Burchardi N, Niestroj M, Kieser M. Quality of reporting of clinical non-inferiority and equivalence randomised trials--update and extension. Trials. 2012;13:214.
Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, Dosenovic S, Jakus D, Vrdoljak M, Poklepovic Pericic T, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;125(4):1348–54.
Thabut G, Estellat C, Boutron I, Samama CM, Ravaud P. Methodological issues in trials assessing primary prophylaxis of venous thrombo-embolism. Eur Heart J. 2005;27(2):227–36.
Puljak L, Riva N, Parmelli E, González-Lorenzo M, Moja L, Pieper D. Data extraction methods: an analysis of internal reporting discrepancies in single manuscripts and practical advice. J Clin Epidemiol. 2020;117:158–64.
Ritchie A, Seubert L, Clifford R, Perry D, Bond C. Do randomised controlled trials relevant to pharmacy meet best practice standards for quality conduct and reporting? A systematic review. Int J Pharm Pract. 2019.
Babic A, Vuka I, Saric F, Proloscic I, Slapnicar E, Cavar J, Pericic TP, Pieper D, Puljak L. Overall bias methods and their use in sensitivity analysis of Cochrane reviews were not consistent. J Clin Epidemiol. 2019.
Tan A, Porcher R, Crequit P, Ravaud P, Dechartres A. Differences in treatment effect size between overall survival and progression-free survival in immunotherapy trials: a Meta-epidemiologic study of trials with results posted at ClinicalTrials.gov. J Clin Oncol. 2017;35(15):1686–94.
Croitoru D, Huang Y, Kurdina A, Chan AW, Drucker AM. Quality of reporting in systematic reviews published in dermatology journals. Br J Dermatol. 2020;182(6):1469–76.
Khan MS, Ochani RK, Shaikh A, Vaduganathan M, Khan SU, Fatima K, Yamani N, Mandrola J, Doukky R, Krasuski RA. Assessing the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals. Eur Heart J Qual Care Clin Outcomes. 2019.
Rosmarakis ES, Soteriades ES, Vergidis PI, Kasiakou SK, Falagas ME. From conference abstract to full paper: differences between data presented in conferences and journals. FASEB J. 2005;19(7):673–80.
Mueller M, D’Addario M, Egger M, Cevallos M, Dekkers O, Mugglin C, Scott P. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations. BMC Med Res Methodol. 2018;18(1):44.
Li G, Abbade LPF, Nwosu I, Jin Y, Leenus A, Maaz M, Wang M, Bhatt M, Zielinski L, Sanger N, et al. A scoping review of comparisons between abstracts and full reports in primary biomedical research. BMC Med Res Methodol. 2017;17(1):181.
Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol. 2019;19(1):203.
Analytical study [ https://medical-dictionary.thefreedictionary.com/analytical+study ]. Accessed 31 Aug 2020.
Tricco AC, Tetzlaff J, Pham B, Brehaut J, Moher D. Non-Cochrane vs. Cochrane reviews were twice as likely to have positive conclusion statements: cross-sectional study. J Clin Epidemiol. 2009;62(4):380–6 e381.
Schalken N, Rietbergen C. The reporting quality of systematic reviews and Meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. 2017;8:1395.
Ranker LR, Petersen JM, Fox MP. Awareness of and potential for dependent error in the observational epidemiologic literature: A review. Ann Epidemiol. 2019;36:15–9 e12.
Paquette M, Alotaibi AM, Nieuwlaat R, Santesso N, Mbuagbaw L. A meta-epidemiological study of subgroup analyses in cochrane systematic reviews of atrial fibrillation. Syst Rev. 2019;8(1):241.
This work did not receive any dedicated funding.
Authors and affiliations.
Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
Lawrence Mbuagbaw, Daeria O. Lawson & Lehana Thabane
Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario, L8N 4A6, Canada
Lawrence Mbuagbaw & Lehana Thabane
Centre for the Development of Best Practices in Health, Yaoundé, Cameroon
Lawrence Mbuagbaw
Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000, Zagreb, Croatia
Livia Puljak
Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN, 47405, USA
David B. Allison
Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON, Canada
Lehana Thabane
Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON, Canada
Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON, Canada
LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.
Correspondence to Lawrence Mbuagbaw .
Ethics approval and consent to participate.
Not applicable.
Competing interests.
DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.
Publisher’s note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Cite this article.
Mbuagbaw, L., Lawson, D.O., Puljak, L. et al. A tutorial on methodological studies: the what, when, how and why. BMC Med Res Methodol 20 , 226 (2020). https://doi.org/10.1186/s12874-020-01107-7
Received : 27 May 2020
Accepted : 27 August 2020
Published : 07 September 2020
ISSN: 1471-2288
Published on 25 February 2019 by Shona McCombes. Revised on 10 October 2022.
Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation, or research paper, the methodology chapter explains what you did and how you did it, allowing readers to evaluate the reliability and validity of your research.
It should cover your overall methodological approach, how you collected your data, and how you analysed it, as detailed in the steps below.
This article covers:
Why is a methods section important?
Step 1: Explain your methodological approach
Step 2: Describe your data collection methods
Step 3: Describe your analysis method
Step 4: Evaluate and justify the methodological choices you made
Tips for writing a strong methodology chapter
Frequently asked questions about methodology
Your methods section is your opportunity to share how you conducted your research and why you chose the methods you did. It’s also the place to show that your research was rigorously conducted and can be replicated.
It gives your research legitimacy and situates it within your field, and it gives your readers a section to refer back to if they have questions or critiques about other parts of your paper.
You can start by introducing your overall approach to your research. You have two options here.
What research problem or question did you investigate, and what type of data did you need to achieve this aim?
Depending on your discipline, you can also start with a discussion of the rationale and assumptions underpinning your methodology. In other words, why did you choose these methods for your study?
Once you have introduced your reader to your methodological approach, you should share full details about your data collection methods .
For your results to be considered generalisable, you should describe your quantitative research methods in enough detail for another researcher to replicate your study.
Here, explain how you operationalised your concepts and measured your variables. Discuss your sampling method or inclusion/exclusion criteria, as well as any tools, procedures, and materials you used to gather your data.
Surveys: Describe where, when, and how the survey was conducted.
Experiments: Share full details of the tools, techniques, and procedures you used to conduct your experiment.
Existing data: Explain how you gathered and selected the material (such as datasets or archival data) that you used in your analysis.
The survey consisted of 5 multiple-choice questions and 10 questions measured on a 7-point Likert scale.
The goal was to collect survey responses from 350 customers visiting the fitness apparel company’s brick-and-mortar location in Boston on 4–8 July 2022, between 11:00 and 15:00.
Here, a customer was defined as a person who had purchased a product from the company on the day they took the survey. Participants were given 5 minutes to fill in the survey anonymously. In total, 408 customers responded, but not all surveys were fully completed. Due to this, 371 survey results were included in the analysis.
In qualitative research , methods are often more flexible and subjective. For this reason, it’s crucial to robustly explain the methodology choices you made.
Be sure to discuss the criteria you used to select your data, the context in which your research was conducted, and the role you played in collecting your data (e.g., were you an active participant or a passive observer?).
Interviews or focus groups: Describe where, when, and how the interviews were conducted.
Participant observation: Describe where, when, and how you conducted the observation or ethnography.
Existing data: Explain how you selected case study materials for your analysis.
In order to gain better insight into possibilities for future improvement of the fitness shop’s product range, semi-structured interviews were conducted with 8 returning customers.
Here, a returning customer was defined as someone who usually bought products at least twice a week from the store.
Surveys were used to select participants. Interviews were conducted in a small office next to the cash register and lasted approximately 20 minutes each. Answers were recorded by note-taking, and seven interviews were also filmed with consent. One interviewee preferred not to be filmed.
Mixed methods research combines quantitative and qualitative approaches. If a standalone quantitative or qualitative study is insufficient to answer your research question, mixed methods may be a good fit for you.
Mixed methods are less common than standalone analyses, largely because they require a great deal of effort to pull off successfully. If you choose to pursue mixed methods, it’s especially important to robustly justify your methods here.
Next, you should indicate how you processed and analysed your data. Avoid going into too much detail: you should not start introducing or discussing any of your results at this stage.
In quantitative research, your analysis will be based on numbers. In your methods section, you can include details of how you prepared the data, the software you used, and the statistical tests you ran.
In qualitative research, your analysis will be based on language, images, and observations (often involving some form of textual analysis).
Specific methods might include content analysis, thematic analysis, or discourse analysis.
Mixed methods combine the above two research methods, integrating both qualitative and quantitative approaches into one coherent analytical process.
Above all, your methodology section should clearly make the case for why you chose the methods you did. This is especially true if you did not take the most standard approach to your topic. In this case, discuss why other methods were not suitable for your objectives, and show how this approach contributes new knowledge or understanding.
In any case, it should be overwhelmingly clear to your reader that you set yourself up for success in terms of your methodology’s design. Show how your methods should lead to results that are valid and reliable, while leaving the analysis of the meaning, importance, and relevance of your results for your discussion section.
Remember that your aim is not just to describe your methods, but to show how and why you applied them. Again, it’s critical to demonstrate that your research was rigorously conducted and can be replicated.
The methodology section should clearly show why your methods suit your objectives and convince the reader that you chose the best possible approach to answering your problem statement and research questions.
Your methodology can be strengthened by referencing existing research in your field. This can help you to show that you followed established practice for your type of research, justify your choice of approach, and address any gaps in the literature.
Consider how much information you need to give, and avoid getting too lengthy. If you are using methods that are standard for your discipline, you probably don’t need to give a lot of background or justification.
Regardless, your methodology should be a clear, well-structured text that makes an argument for your approach, not just a list of technical details and procedures.
Methodology refers to the overarching strategy and rationale of your research. Developing your methodology involves studying the research methods used in your field and the theories or principles that underpin them, in order to choose the approach that best matches your objectives.
Methods are the specific tools and procedures you use to collect and analyse data (e.g. interviews, experiments, surveys, statistical tests).
In a dissertation or scientific paper, the methodology chapter or methods section comes after the introduction and before the results, discussion and conclusion.
Depending on the length and type of document, you might also include a literature review or theoretical framework before the methodology.
Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.
Quantitative methods allow you to test a hypothesis by systematically collecting and analysing data, while qualitative methods allow you to explore ideas and experiences in depth.
A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.
For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.
Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.
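For instance, the sampling step in the example above can be sketched in a few lines of Python, drawing a simple random sample of 100 students from a roster of 10,000 (the population size is an assumption made for the sketch):

```python
import random

# A sketch of simple random sampling: every member of the population has
# an equal chance of selection, and no one is selected twice.
random.seed(42)  # fixed seed so the sketch is reproducible

population = list(range(10_000))          # stand-in for a student roster
sample = random.sample(population, 100)   # sampling without replacement

print(len(sample))
print(len(set(sample)))  # no duplicates: each student drawn at most once
```

Real sampling frames are rarely this tidy; the point is only that a documented, reproducible selection procedure is what makes a sample defensible.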
McCombes, S. (2022, October 10). What Is a Research Methodology? | Steps & Tips. Scribbr. Retrieved 5 August 2024, from https://www.scribbr.co.uk/thesis-dissertation/methodology/
Theses and research papers are incomplete without a research methodology. Learn what it is, why it is important, and how to write one.
It is not easy to carry out your research and write your thesis or dissertation simultaneously. It becomes challenging to keep track of everything, especially for beginners who have no experience in research.
To manage this problem and carry out your research smoothly, your thesis, dissertation, or research paper includes a section called the research methodology, which outlines your research work and saves you from misery.
In our blog, there are many articles to help your journey in writing your paper, such as “Research paper: how to write from scratch in 5 easy steps” and “How to Write a Conclusion for a Research Paper”.
But in this one, you will learn what methodology in research is, its importance, its types, and how to write a research methodology.
Research Methodology is a systematic framework used to solve the research problem by using the best and most feasible methods to conduct the research while aligning with the aim and objectives of your research.
The research methodology includes answering the what, why, and how of your research.
To put it in simpler words, you will explain what data you collected, why you chose your particular methods, and how you applied them.
Research instruments are the tools you use to collect and analyze the data in your research. The topics below are the common tools used in carrying out research, and their usage depends on the type of research. Sometimes, a combination of these tools is used to solve the research problem.
Interviews help you to collect personalized information and are categorized as structured, semi-structured, or unstructured, based on the type and tone of the questions. While a one-on-one interview gives you detailed information about the respondent, group interviews reveal the perception of a group of people.
Through the survey, you seek responses to a set of questions you have designed targeting a specific group of people. In the survey, you can use open-ended and closed-ended questions or a mix of both to get the answers to your questions.
In focus group discussions, a group of people share their opinions on a topic while you take note of the answers they give. This is similar to group interviews.
This tool is used to study human behavior in different situations. You can study either the spontaneous actions of the respondent or use a structured approach, in which the researcher observes the respondent’s behavior in a pre-planned situation.
The three types of methodology used by researchers are qualitative, quantitative, and mixed methods.
Qualitative research encompasses the collection and analysis of written or spoken words and texts. Researchers generally use qualitative methods when their goals and objectives are exploratory, such as when they study the perception of an event, person, product, etc. This type of data is typically gathered through interviews, observations, and focus groups.
Example: What factors influence employee retention in a large organization?
In the quantitative method, researchers collect, measure, and analyze numerical data from a large number of participants. This method is mainly used to confirm something using facts and statistics. Data is gathered using surveys, questionnaires, tests, databases, and records.
Example: How many people were laid off in the UK due to the recession?
This method is a combination of both qualitative and quantitative methods. It provides a more realistic and rounded view of the findings and presents multiple possibilities for interpreting them. This method often produces interesting results for a specific set of approaches or findings.
Example: How many people have left their jobs post-pandemic (quantitative) and how does it affect the current employees (qualitative)?
Define your method properly.
Explain the methodology that you used to investigate the issue while carrying out the research. This allows you to walk your reader through your research step by step and gain their trust in your work. Your research methodology can either be quantitative, qualitative, or a combination of the two.
In order to make your approach relevant to the overall research design, you must establish a clear connection between your methods and the research problem. Therefore, your methodology must meet the objectives of your research paper and match the aim.
Describe the tools and instruments you plan to use to gather your data and how you will use them. These could be surveys, questionnaires, interviews, observations, or a mix of two or more. If you use any external method, explain it and the results obtained from it clearly.
Next, you need to explain how you will process and analyze the data you intend to collect. However, you need not discuss any results or conclusions here.
If your study will strictly be quantitative, explain how you will ensure the data is accurate, how you will analyze the numbers, and what statistical tests you will conduct. If your study is purely qualitative, specify whether you intend to conduct content analysis, thematic analysis, or discourse analysis.
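For the quantitative branch of this step, one common statistical test is a two-sample comparison of means. The sketch below computes Welch's t statistic in plain Python; the data are invented, and in practice you would use a statistics package (such as SPSS, R, or SciPy) to obtain the p-value as well.

```python
from math import sqrt
from statistics import mean, variance

# Welch's t statistic for two independent samples: the difference in means
# divided by the combined standard error. Invented "before/after" scores.

def welch_t(a, b):
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

before = [5.1, 4.8, 5.5, 5.0, 4.9, 5.2]
after  = [5.9, 6.1, 5.7, 6.3, 5.8, 6.0]

t = welch_t(after, before)
print(round(t, 2))
```

Naming the test, the software, and the significance threshold in your methodology is what allows a reader to judge whether the analysis fits the data.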
In this step, you will explain why you chose a particular approach, be it qualitative, quantitative, or mixed, given the problem faced and the results obtained so far. Also, explain why that approach is relevant to your study.
Always remember to take notes and outline your research as it ensures you have complete data, and also keeps a record of all approaches used in the research.
The research question is the key, so always focus on the research question and try to answer it with all the relevant facts, figures, and data possible.
The methodology should be written or drafted in such a way that it takes a third-person perspective and is written from the point of view of the audience.
Only relevant data should be included in the research; also, try to present the information in a graphical format so that it is clear and easy to understand.
Methodology | Methods
--- | ---
The main objective of the methodology is to identify and understand the methods applied in the research. | The main objective of the methods is to find a solution to the outlined problem.
The methodology is a proper study or analysis of all the methods used in the research. | Methods are simply the behaviors or tools used to select research techniques.
The methodology is applied at the initial stage of the research/study. | Methods are used and applied at a later stage of the study/research.
A methodology is a systematic approach to finding a solution to a problem. | Methods are a combination of different investigation and comparison techniques.
Learning becomes fun and easy when you add some visuals to the words. So, help your hippocampus store the information quickly with unique and visually appealing infographics through Mind the Graph.
Fabricio Pamplona is the founder of Mind the Graph - a tool used by over 400K users in 60 countries. He has a Ph.D. and solid scientific background in Psychopharmacology and experience as a Guest Researcher at the Max Planck Institute of Psychiatry (Germany) and Researcher in D'Or Institute for Research and Education (IDOR, Brazil). Fabricio holds over 2500 citations in Google Scholar. He has 10 years of experience in small innovative businesses, with relevant experience in product design and innovation management. Connect with him on LinkedIn - Fabricio Pamplona .
The methods section of a research paper provides the information by which a study’s validity is judged. The method section answers two main questions: 1) How was the data collected or generated? 2) How was it analyzed? The writing should be direct and precise and written in the past tense.
You must explain how you obtained and analyzed your results so that readers can assess the reliability and validity of your findings and so that other researchers can replicate your study.
Bem, Daryl J. Writing the Empirical Journal Article. Psychology Writing Center. University of Washington; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences. Thousand Oaks, CA: Corwin Press, 2008.
I. Groups of Research Methods
There are two main groups of research methods in the social sciences: the empirical-analytical group (quantitative methods, which study measurable variables) and the interpretative group (qualitative methods, which focus on understanding phenomena in context).
II. Content
An effectively written methodology section should introduce the overall methodological approach, describe the specific methods of data collection, explain how the data was analysed, and provide a rationale for each of these choices.
NOTE: Once you have written all of the elements of the methods section, subsequent revisions should focus on how to present those elements as clearly and as logically as possible. The description of how you prepared to study the research problem, how you gathered the data, and the protocol for analyzing the data should be organized chronologically. For clarity, when a large amount of detail must be presented, information should be presented in sub-sections according to topic.
III. Problems to Avoid
Irrelevant Detail: The methodology section of your paper should be thorough but to the point. Don’t provide any background information that doesn’t directly help the reader to understand why a particular method was chosen, how the data was gathered or obtained, and how it was analyzed.

Unnecessary Explanation of Basic Procedures: Remember that you are not writing a how-to guide about a particular method. You should make the assumption that readers possess a basic understanding of how to investigate the research problem on their own and, therefore, you do not have to go into great detail about specific methodological procedures. The focus should be on how you applied a method, not on the mechanics of doing a method. NOTE: An exception to this rule is if you select an unconventional approach to doing the method; if this is the case, be sure to explain why this approach was chosen and how it enhances the overall research process.

Problem Blindness: It is almost a given that you will encounter problems when collecting or generating your data. Do not ignore these problems or pretend they did not occur. Often, documenting how you overcame obstacles can form an interesting part of the methodology. It demonstrates to the reader that you can provide a cogent rationale for the decisions you made to minimize the impact of any problems that arose.

Literature Review: Just as the literature review section of your paper provides an overview of sources you have examined while researching a particular topic, the methodology section should cite any sources that informed your choice and application of a particular method [i.e., the choice of a survey should include any citations to the works you used to help construct the survey].
It’s More than Sources of Information! A description of a research study's method should not be confused with a description of the sources of information. Such a list of sources is useful in itself, especially if it is accompanied by an explanation about the selection and use of the sources. The description of the project's methodology complements a list of sources in that it sets forth the organization and interpretation of information emanating from those sources.
Azevedo, L.F. et al. How to Write a Scientific Paper: Writing the Methods Section. Revista Portuguesa de Pneumologia 17 (2011): 232-238; Butin, Dan W. The Education Dissertation: A Guide for Practitioner Scholars. Thousand Oaks, CA: Corwin, 2010; Carter, Susan. Structuring Your Research Thesis. New York: Palgrave Macmillan, 2012; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences. Thousand Oaks, CA: Corwin Press, 2008; Methods Section. The Writer’s Handbook. Writing Center. University of Wisconsin, Madison; Writing the Experimental Report: Methods, Results, and Discussion. The Writing Lab and The OWL. Purdue University; Methods and Materials. The Structure, Format, Content, and Style of a Journal-Style Scientific Paper. Department of Biology. Bates College.
Statistical Designs and Tests? Do Not Fear Them!
Don't avoid using a quantitative approach to analyzing your research problem just because you fear the idea of applying statistical designs and tests. A qualitative approach, such as conducting interviews or content analysis of archival texts, can yield exciting new insights about a research problem, but it should not be undertaken simply because you have a disdain for running a simple regression. A well-designed quantitative research study can often be accomplished in very clear and direct ways, whereas a similar study of a qualitative nature usually requires considerable time to analyze large volumes of data and imposes a tremendous burden in creating new paths for analysis where previously no path associated with your research problem had existed.
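To show that the "simple regression" mentioned above is nothing to fear, here is a minimal ordinary-least-squares fit for a single predictor, written in plain Python with invented data:

```python
# Ordinary least squares for one predictor:
#   slope = cov(x, y) / var(x),  intercept = mean(y) - slope * mean(x)

def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

hours  = [1, 2, 3, 4, 5]       # invented: weekly study hours
grades = [52, 55, 61, 64, 68]  # invented: exam scores

slope, intercept = ols(hours, grades)
print(slope, intercept)  # each extra hour is associated with ~4 extra marks
```

Statistical software adds standard errors and significance tests on top of this, but the underlying computation really is this direct.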
Knowing the Relationship Between Theories and Methods
There can be multiple meanings associated with the terms "theories" and "methods" in social sciences research. A helpful way to delineate between them is to understand "theories" as representing different ways of characterizing the social world when you research it and "methods" as representing different ways of generating and analyzing data about that social world. Framed in this way, all empirical social sciences research involves theories and methods, whether they are stated explicitly or not. However, while theories and methods are often related, it is important that, as a researcher, you deliberately separate them in order to avoid your theories playing a disproportionate role in shaping what outcomes your chosen methods produce.
Introspectively engage in an ongoing dialectic between theories and methods to help enable you to use the outcomes from your methods to interrogate and develop new theories, or ways of framing conceptually the research problem. This is how scholarship grows and branches out into new intellectual territory.
Reynolds, R. Larry. Ways of Knowing. Alternative Microeconomics. Part 1, Chapter 3. Boise State University; The Theory-Method Relationship. S-Cool Revision. United Kingdom.
What is a methodology?
The methodology is perhaps the most challenging and laborious part of the dissertation. Essentially, the methodology helps in understanding the broad, philosophical approach behind the methods of research you chose to employ in your study. The research methodology elaborates on the ‘how’ part of your research.
This means that your methodology chapter should clearly state whether you chose to use quantitative or qualitative data collection techniques or a mix of both.
Your research methodology should explain what data you collected, how you collected and analysed it, and why you chose these methods.
You will be required to provide justifications as to why you preferred a certain method over the others. If you are trying to figure out exactly how to write methodology or the structure of a methodology for a dissertation, this article will point you in the right direction.
Students must be sure of why they chose a certain research method over another. “I figured out” or “In my opinion” statements will not be an acceptable justification. So, you will need to come up with concrete academic reasons for your selection of research methods.
The methodology generally acts as a guideline or plan for exactly how you intend to carry out your research. This is especially true for students who must submit their methodology chapter before carrying out the research.
Your methodology should link back to the literature review and clearly state why you chose certain data collection and analysis methods for your research/dissertation project.
The methodology chapter covers your research design, philosophical approach, methods of data collection and analysis, ethical considerations, and the reliability and limitations of the study.
For those who are submitting their dissertation as a single paper, their methodology should also touch on any modifications they had to make as their work progressed.
However, it is essential to provide academic justifications for all choices made by the researcher.
The theme of your research methodology chapter should be related to your literature review and research question(s).
You can visit your college or university library to find textbooks and articles that provide information about the commonly employed research methods.
An intensive reading of such books can help you devise your research philosophy and choose the appropriate methods. Any limitations or weaknesses of your chosen research approach should also be explained, as well as the strategies to overcome them.
To research well, you should read well! Read as many research articles (from reputed journals) as you can. Seeing how other researchers use methods in their studies and why will help you justify, in the long run, your own research method(s).
Regardless of the chosen research approach, you will find researchers who either support it or don’t. Use the arguments for and against articulated in the literature to clarify why you decided to choose the selected research design and why its limitations do not undermine your research.
The typical structure of the methodology chapter is described in the sections that follow.
In research jargon, generalisability is termed external validity. It refers to how generalisable your research findings are to other contexts, places, times, people, etc. External validity is expected to be significantly high, especially in quantitative studies.
According to USC-Research Guides (2017), a research design’s primary function is to enable the researcher to answer the research questions through evidence effectively. Generally, this section will shed light on how you collected your data.
The researcher will have to justify their choice of data collection methods, such as those reviewed in the literature, and the use of data tools (interviews, phone surveys, questionnaires, observation, online surveys, etc.).
Moreover, data sampling choice should also be clearly explained with a focus on how you chose the ethnicity, group, profession and age of the participants.
It is recommended to prepare these questions at the start of your research, when you develop your research problem. This approach allows room to change or modify the research questions if your data collection methods do not give the desired results.
It’s a good practice to keep referring to your research questions whilst planning or writing the research design section. This will help your reader recall what the research is about and why you have done what you did. Even though this technique is recommended at the start of every section within a dissertation, it’s especially beneficial in the methodology section.
In short, you will need to make sure that the data you are going to collect relates to the topic you are exploring. The complexity and length of the research design section will vary depending on your academic subject and the scope of your research, but a well-written research design will clearly tie your data collection and analysis back to your research questions.
This will discuss your chosen philosophy to strengthen your research and the research model. Commonly employed philosophies in academia include positivism, interpretivism, pragmatism, and realism.
There are several other research philosophies that you could adopt.
The choice of philosophy will depend on many factors, including your academic subject and the type and complexity of the research study. Regardless of which philosophy is used, you will be required to make different assumptions about the world.
Once you have chosen your research philosophy, the next step will describe your research context to answer all the questions, including when, where, why, how and what of your research.
Essentially, as a researcher, you will be required to decide whether you will be using a qualitative method, a quantitative method or a mix of both.
Using both qualitative and quantitative methods leads to the use of a mixed-methods approach. This approach also goes by another seldom-used name: eclectic approach.
The process of data collection is different for each method. Typically, you would want to decide whether you will adopt the positivist approach, defining your hypothesis and testing it against reality.
If this is the case, you will be required to take the quantitative approach, collecting numerical data at a large scale (from 30 or more respondents) and testing your hypotheses with this data.
Collecting data from at least 30 respondents/participants ensures reliable statistical analysis . This is especially true for quantitative studies. If the data contains less than 30 responses, it won’t be enough to carry out reliable statistical analyses on such data.
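The intuition behind collecting more responses can be illustrated numerically: the standard error of a sample mean is sd/sqrt(n), so precision improves with the square root of the sample size. (The 30-respondent threshold itself is a rule of thumb rather than a consequence of this formula, and the standard deviation of 10 below is an arbitrary assumption for the sketch.)

```python
from math import sqrt

# Standard error of the mean for a few sample sizes, assuming a
# population standard deviation of 10 (invented for illustration).
sd = 10
errors = {n: sd / sqrt(n) for n in (5, 30, 100)}
for n, se in errors.items():
    print(n, round(se, 2))  # the estimate tightens as n grows
```

Going from 5 to 30 respondents cuts the standard error by more than half, which is why very small samples rarely support reliable statistical claims.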
The other option for you would be to base your research on a qualitative approach, which will point you in a direction where you will be investigating broader areas by identifying people’s emotions and perceptions of a subject.
With a qualitative approach, you will have to collect responses from respondents and look at them in all their richness to develop theories about the field you are exploring.
Finally, you can also use a mix of qualitative and quantitative methods (which is becoming increasingly popular among researchers these days). This method is beneficial if you are interested in putting quantitative data into a real-world context or reflecting different perspectives on a subject.
Research philosophy in the ‘research onion.’
This section will require you to clearly specify how you gathered the data and briefly discuss the tools you used to analyse it. For example, you may choose to conduct surveys and/or interviews as part of the data collection process.
Similarly, if you used software such as Excel or SPSS to process the data, you will have to justify your software choice. In this section of your methodology chapter, you will also have to explain how you arrived at your findings and how reliable they are.
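The same kind of processing you might do in Excel or SPSS can also be scripted, which makes the step itself documentable and repeatable. The sketch below uses an in-memory CSV of invented survey scores; in a real project you would read the file exported from your survey tool.

```python
import csv
import io
from statistics import mean, stdev

# Invented CSV export of survey scores, held in memory for the sketch.
raw = "respondent,score\n1,5\n2,7\n3,6\n4,4\n5,6\n"

scores = [int(row["score"]) for row in csv.DictReader(io.StringIO(raw))]
print(mean(scores), round(stdev(scores), 2))  # summary statistics
```

Whichever tool you choose, the methodology chapter should record the version and the exact operations applied, so the processing can be reproduced.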
It is important to note that your readers or supervisor would want to see a correlation between your findings and the hypothesis/research questions you based your study on at the very beginning.
Your supervisor or dissertation research assistant can play a key role in helping you write the methodology chapter according to established research standards. So, keep your supervisor in the loop to get their contributions and recommendations throughout the process.
In this section, you should briefly describe the methods you’ve used to analyse the data you’ve collected.
The qualitative method includes analysing language, images, audio, videos, or any textual data (textual analysis). The following types of methods are used in textual analysis:
Discourse analysis: an essential aspect of studying a language and its uses in day-to-day life.
Content analysis: a method of studying documents and retrieving meaningful information from them.
Thematic analysis: a method of identifying patterns of themes in the collected information, such as face-to-face interviews, texts, and transcripts.
Example: After collecting the data, it was checked thoroughly for missing information. The interviews were transcribed, and textual analysis was conducted. The repetitions in the text, the types of colours displayed, and the tone of the speakers were measured.
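One mechanical step of such a textual analysis, counting how often candidate theme keywords recur across transcripts, can be sketched as follows. The transcripts and keywords here are invented for illustration; real thematic analysis also involves interpretive coding that no script can replace.

```python
import re
from collections import Counter

# Invented interview snippets echoing the example above (colours, tone).
transcripts = [
    "The colours in the shop feel warm, and the staff tone is friendly.",
    "I like the warm colours; the tone of the adverts is friendly too.",
    "Prices are fair, but the colours could be brighter.",
]

keywords = ["colours", "tone", "friendly", "prices"]

counts = Counter()
for text in transcripts:
    words = re.findall(r"[a-z']+", text.lower())  # crude tokenisation
    for kw in keywords:
        counts[kw] += words.count(kw)

print(dict(counts))  # how often each candidate theme keyword recurs
```

Frequency counts like these can support a thematic analysis by showing which codes recur, but the themes themselves still come from the researcher's reading of the data.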
Quantitative data analysis is used for analysing numerical data. Include how you prepared the raw data, the software you used to process it, and the statistical methods you applied.
Other important sections of your methodology are:
Always consider how your research will influence other individuals who are beyond the scope of the study. This is especially true for human subjects. As a researcher, you are always expected to make sure that your research and ideas do not harm anyone in any way. A discussion concerning data protection, data handling, and data confidentiality will also be included in this brief segment.
Even though there is no established rule to include ethical considerations and limitations within the methodology section, it’s generally recommended to include it in this section, as it makes more sense than including it, say, after the discussions section or within the conclusion.
This is mainly because limitations almost always occur in the methodology stage of research. And ethical considerations need to be taken while sampling, an important aspect of the research methodology.
Here are some examples of ethical issues that you should be mindful of: informed consent, participant anonymity, data confidentiality, and the potential for harm to participants.
All such issues should be categorically addressed and a justification provided for your chosen research methodology by highlighting the study’s benefits.
Is your research study and findings reliable for other researchers in your field of work? To establish yourself as a reliable researcher, your study should be both authentic and reliable.
Reliability means the extent to which your research can yield similar results if it was replicated in another setting, at a different time, or under different circumstances. If replication occurs and different findings come to light, your (original) research would be deemed unreliable.
Good dissertation writers will always acknowledge the limitations of their research study. Limitations in data sampling can decrease your results’ reliability.
A classic example of a research limitation is collecting responses from people of a certain age group when you could have targeted a more representative cross-section of the population. Be humble and admit your own study’s limitations. Doing so makes your referees, editors, supervisors, readers, and anyone else involved in the research enterprise aware that you were also aware of the things that limited your study.
Limitations are NOT the same as implications. Sometimes, the two can be confused. Limitations lead to implications: for instance, because a certain factor was absent in the study (a limitation), future research could be carried out in a setting where that factor is present (an implication).
At this point, you might have a basic understanding of how to craft a well-written, organised, accurate methodology section for your dissertation. An example might help bring all the aforementioned points home. Here is a dissertation methodology example in pdf to better understand how to write methodology for a dissertation.
Sample Dissertation Methodology
A scientific or lab-based study.
A methodology section for a scientific study will need to elaborate on reproducibility and meticulousness more than anything else. If your methods have obvious flaws, the readers are not going to be impressed. Therefore, it is important to ensure that your chosen research methodology is rigorous.
Any information related to the procedure, setup and equipment should be clearly stated so other researchers in your field of study can work with the same method in the future if needed.
Variables that are likely to distort your data must be taken into account to avoid ambiguities. It is recommended to present a comprehensive strategy for dealing with these variables when gathering and analysing the data and drawing conclusions.
Statistical models employed as part of your scientific study will have to be justified, and so your methodology should include details of those statistical models.
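As a concrete illustration, the justification of a statistical model often comes down to a simple comparison like the one below. This is a hedged sketch with hypothetical numbers (plain Python, no external libraries): it computes a paired t-statistic for the book-promotion example used earlier in this guide. A real study would also report degrees of freedom and a p-value from a t-distribution table or a statistics package.

```python
import math
import statistics

def paired_t_statistic(before, after):
    """Paired t-statistic for before/after measurements on the same units.

    A positive value means the 'after' mean exceeds the 'before' mean.
    """
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_diff = statistics.mean(diffs)
    sd_diff = statistics.stdev(diffs)  # sample standard deviation (n - 1 denominator)
    return mean_diff / (sd_diff / math.sqrt(n))

# Hypothetical monthly sales for five titles before and after a promotion.
before = [120, 95, 140, 110, 130]
after = [150, 118, 160, 125, 155]
t = paired_t_statistic(before, after)  # ≈ 9.03
```

With five titles (four degrees of freedom), any t-statistic above roughly 2.78 would be significant at the 5% level, so a value near 9 would strongly support the promotion's effect; this is the kind of reasoning your methodology should spell out when justifying a model.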
Another scholar in the future might use any aspect of your methodology as the starting point for their research. For example, they might base their research on your methodology but analyse the data using other statistical models. Hence, this is something you should be mindful of.
Like scientific or lab-based research, a behavioural and social sciences methodology needs to be built along the same lines. The chosen methodology should demonstrate reproducibility and rigour so other scholars can use your whole dissertation methodology, or a part of it, based on their research needs.
But there are additional issues the researcher must take into consideration when working with human subjects. As a starting point, you will need to decide whether your analysis will be based on qualitative data, quantitative data, or a mixed-methods approach, where qualitative data provides contextual background to quantitative data or the other way around.
Here are some questions for you to consider:
While you will be required to demonstrate that you have taken care of the above questions, it is equally important to make sure that you address your research study’s ethical issues side-by-side.
Of course, the first step in that regard will be to obtain formal approval for your research design from the relevant ethics bodies (such as IRBs – institutional review boards), but even then there will be many issues that could cause discomfort or distress to participants and some of your readers.
The rigour and dependability of the research methods employed are just as important for humanities and arts-based dissertations. However, the way you convince your readers of your dissertation's thoroughness is slightly different.
Unlike a social science dissertation or a scientific study, the methodology of a dissertation in the arts and humanities needs to be directly linked to the literature review, regardless of how innovative your dissertation's topic might be.
For example, you could demonstrate the relationship between A and B to discover a new theoretical background or use existing theories in a new framework.
The methodology section of humanities and arts-based dissertations is typically less complex, so students can often achieve a seamless transition from the literature review to the analysis section. However, as with every other type of research methodology, it is still important to justify your chosen methodology and relate it to the research problem.
Failing to do so could leave some readers unconvinced of your theoretical foundations’ suitability, which could potentially jeopardise your whole research.
Make sure that you are paying attention to and giving enough information about the social and historical background of the theoretical frameworks your research methodology is based on. This is especially important if there is an essential difference of opinion between your research and the research done on the subject in the past.
You should clearly explain why opposing schools of thought disagree and why you still chose to use aspects of those schools in your methodology, so readers can understand how they support your readings.
Some degree programs in the arts allow students to submit a portfolio of artworks or creative writing rather than an extended dissertation research project. In practice, however, your creative work will need to be submitted along with a comprehensive evaluative paper, including background information and an explanation that contextualises your creative exercise.
While this might seem easy, critical evaluation of one's own work is notoriously difficult. This further reinforces the argument for developing a rigorous methodology and adhering to it.
As a scholar, you will be expected to critically analyse your methodology and show that you are capable of critically evaluating your own creative work. Such an approach will help you justify your method of creating the work and give readers confidence that your research is grounded in theory.
All chapters of a dissertation are interconnected, which means some information will inevitably overlap between chapters. For example, some material may seem appropriate to both the literature review and the methodology sections; you might even end up moving information back and forth between chapters as you edit and improve your dissertation.
However, make sure that you are not making the following a part of your dissertation methodology, even though it may seem appropriate to fit them in there:
It might seem relevant to include details of the models your dissertation methodology is based on. However, a detailed review of models and precedents used by other scholars and theorists will better fit in the literature review chapter, which you can link back to. This will help the readers understand why you decided to go in favour of or against a certain tactic.
There is absolutely no need to provide extensive details of things like lab equipment and experiment procedures. Having such information in the methodology chapter would discourage some readers who might not be interested in your equipment, setup, lab environment, etc.
Your aim as the author of the document will be to retain the readers’ interest and make the methodology chapter as readable as possible.
While it is important to include all the information others would need to reproduce your experiment, it is equally important to ensure your methodology section isn't unnecessarily long. Again, additional detail is better placed in the appendices.
The methodology is not the section to provide raw data, even if you are only discussing the data collection process. All such information should be moved to the appendices section.
Even if you feel some finding or numerical data is crucial to present within the methodology section, you can, at most, make brief comments about it. Its full discussion belongs in the discussion section.
Whether your dissertation methodology is 'great' depends on many factors, including the level of study you are currently enrolled in.
Undergraduate dissertations are, of course, less complex and less demanding. At most universities in the UK, undergraduate students are required to exhibit the ability to conduct thorough research as they engage for the first time with theoretical and conceptual frameworks in their chosen research area.
As an undergraduate student, you will be expected to show that you can reproduce what you have learnt from theorists in your academic subject, transform that learning into a methodology that helps you address the research problem, and test the research hypothesis stated in the introduction chapter.
A great undergraduate-level dissertation will incorporate different schools of thought and make a valuable contribution to existing knowledge. However, in general, undergraduate-level dissertations’ focus should be to show thorough desk-based and independent research skills.
Postgraduate dissertation papers are much more complex and challenging because they are expected to make a substantial contribution to existing knowledge.
Depending on the academic institute, some postgraduate students are even required to produce work publishable in leading academic journals as proof of their research skills.
A postgraduate dissertation can be important to building your professional career, especially if your work is considered impactful in your area of study and receives citations from multiple scholars, enhancing your reputation in academic communities.
Even if some academics cite your literature review and conclusion in their own work, your methodology framework is likely to attract many more citations, regardless of your academic subject.
Other scholars and researchers in your area of study are likely to give much more value to a well-crafted methodology, especially one they can use as the starting point for their own research.
Of course, they can alter, refine and enhance your methodology. They can even apply your methodological framework to a new data set, or to a completely new situation unrelated to your original work.
Finally, postgraduate dissertations are expected to be highly convincing and demonstrate in-depth engagement. They should be reproducible and show rigour, so the findings and conclusions can be regarded as authentic and reliable among scientific and academic communities.
The methodology is the door to success when it comes to dissertation projects. An original methodology that takes into consideration all aspects of research is likely to have an impact on the field of study.
As a postgraduate student, you should ask yourself: is my dissertation methodology reproducible and transferable? Producing a methodology that others can reproduce in the future is as important as answering your research questions.
The methodology chapter can make or break the grade of your research or dissertation paper. It is one of the elements that leave a lasting impression on your readers, so take your time when choosing the right design and philosophical approach for your research.
Always use authentic academic sources and discuss your plans in detail with your supervisor if you believe your research design or approach has flaws in it.
Did this article help you learn how to write and structure a dissertation methodology? Let us know in the comments.
Avail of our dissertation writing services! At ResearchProspect, we have Master's and PhD qualified dissertation writers for all academic subjects, so you can be confident that the writer we will assign to your dissertation order will be an expert in your field of study. They can help you with your whole dissertation or just a part of it. You decide how much or how little help you need.
Are you looking for intriguing and trending dissertation topics? Get inspired by our list of free dissertation topics on all subjects.
Looking for an easy guide to follow to write your essay? Here is our detailed essay guide explaining how to write an essay, with examples and types of essays.
Learn about the steps required to successfully complete your research project, and make sure to follow them in order.
What are Research Methods?
Imagine you’re starting on a journey of discovery, and research methods are your compass, map, and tools. These methods guide us in exploring the vast landscape of knowledge, ensuring our journey is structured, reliable, and fruitful.
Research Methods are systematic strategies, steps, and tools that researchers use to gather, analyze, and interpret data about a particular topic. It’s like cooking a new recipe; you need the right ingredients (data), a good method (research design), and the proper tools (instruments like surveys or experiments) to create a delightful dish (knowledge).
Qualitative Research
This is akin to painting a portrait. It focuses on understanding concepts, thoughts, and experiences through detailed, descriptive data. Imagine sitting down with someone and listening to their story to grasp the depth of their experiences. Tools for this might include interviews , observations , and textual analysis .
Now, imagine yourself counting stars in the sky. This method deals with numbers and statistical analysis. It seeks to quantify the problem by generating numerical data or data that can be transformed into usable statistics. Surveys with multiple-choice questions or experiments where you measure and compare are typical tools here.
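To make the survey example concrete, here is a minimal sketch (hypothetical responses, Python standard library only) of turning multiple-choice answers into the kind of usable statistics described above:

```python
from collections import Counter

# Hypothetical answers to a multiple-choice survey question:
# "How many hours per day do you spend on social media?"
responses = ["<1", "1-3", "1-3", "3-5", "1-3", "<1", ">5", "3-5", "1-3", "3-5"]

counts = Counter(responses)  # raw frequency of each answer option
total = len(responses)
percentages = {opt: 100 * n / total for opt, n in counts.items()}
# percentages == {"<1": 20.0, "1-3": 40.0, "3-5": 30.0, ">5": 10.0}
```

The same tallies feed directly into later steps of the process: the percentages become the numerical data you analyze for patterns and correlations.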
Sometimes, a single perspective isn’t enough. Mixed methods blend the colors of both qualitative and quantitative research, offering a more comprehensive picture. It’s like using both a microscope and a telescope; you get the detail and the big picture.
Identifying the Problem : Every journey begins with recognizing where you want to go. What’s the question you’re burning to answer? This step involves defining the scope and purpose of your research.
Literature Review : Before you set out, you need to map the terrain by exploring what others have discovered before you. This involves reading and summarizing existing research on your topic.
Designing the Study : Here’s where you plan your route. Will you conduct interviews? Send out surveys? Observe behaviors? This step involves deciding on your research method, participants, and tools.
Collecting Data : Time to hit the road and gather your data. This is the hands-on part of your research, where you implement your chosen methods to collect information.
Analyzing Data : With your treasures in hand, you now sift through your findings, looking for patterns, themes, or statistical relationships. This step often involves software for qualitative or quantitative analysis.
Interpreting Results : What have you discovered? This stage is about making sense of your data, connecting the dots, and understanding what your findings mean in the context of your research question.
Reporting and Sharing Findings : The final step is to share your journey’s story. This could be a research paper, a presentation, or any format that communicates your discoveries to others.
Imagine you’re a guest in someone’s home; you must be respectful and considerate. Similarly, ethical considerations are paramount in research. This means ensuring confidentiality, obtaining informed consent, and treating all subjects (people, animals, the environment) with respect and dignity.
Research methods are the compass, map, and tools that guide us through the terrain of knowledge. They enable us to ask important questions, systematically gather and analyze data, and contribute valuable insights to our understanding of the world. As you start on your research journey, embrace the adventure, respect the process, and look forward to the discoveries that await you.
Adam Hayes, Ph.D., CFA, is a financial writer with 15+ years Wall Street experience as a derivatives trader. Besides his extensive derivative trading expertise, Adam is an expert in economics and behavioral finance. Adam received his master's in economics from The New School for Social Research and his Ph.D. from the University of Wisconsin-Madison in sociology. He is a CFA charterholder as well as holding FINRA Series 7, 55 & 63 licenses. He currently researches and teaches economic sociology and the social studies of finance at the Hebrew University in Jerusalem.
Thomas J Catalano is a CFP and Registered Investment Adviser with the state of South Carolina, where he launched his own financial advisory firm in 2018. Thomas' experience gives him expertise in a variety of areas including investments, retirement, insurance, and financial planning.
A white paper is an informational document issued by a company or not-for-profit organization to promote or highlight the features of a solution, product, or service that it offers or plans to offer.
White papers are also used as a method of presenting government policies and legislation and gauging public opinion.
White papers are sales and marketing documents used to entice or persuade potential customers to learn more about a particular product, service, technology, or methodology.
White papers are commonly designed for business-to-business (B2B) marketing purposes between a manufacturer and a wholesaler, or between a wholesaler and a retailer. They can provide an in-depth report or guide about a specific product or topic and are meant to educate readers.
The facts presented in white papers are often backed by research and statistics from reliable sources and can include charts, graphs, tables, and other ways of visualizing data. A white paper can communicate an organization’s philosophy or present research findings related to an industry.
A startup , large corporation, or government agency will use white papers differently. There are three main types of white papers: backgrounders, numbered lists, and problem/solution white papers.
Backgrounders detail the technical features of a new product or service. Designed to simplify complicated technical information, they are used to:
Numbered lists highlight the key takeaways of a new product or service, and are often formatted with headings and bullet points such as the following familiar format:
Problem/solution papers identify specific problems faced by potential customers and suggest a data-driven argument about how a featured product or service provides a solution to:
White papers differ from other marketing materials, such as brochures. Brochures and traditional marketing materials might be flashy and obvious, but a white paper is intended to provide persuasive and factual evidence that solves a problem or challenge.
White papers are commonly at least 2,500 words in length and written in an academic style.
A white paper should provide well-researched information that is not found with a simple Internet search and has a compelling narrative to keep the reader’s attention. The author of a white paper should:
All of the documents listed below, publicly available on Microsoft’s website, focus on aspects of the company’s suite of cloud services. In contrast with brochures, these white papers don’t have a clear sales pitch. Instead, they dive into relevant topics, such as cloud security, hybrid clouds, and the economic benefits of adopting cloud computing.
Cryptocurrency projects also publish white papers, frequently issuing them during initial coin offerings (ICOs) to entice users and “investors” to their projects.
Bitcoin launched a few months after the pseudonymous Satoshi Nakamoto issued the project’s famous white paper online in October 2008.
White papers may have developed from the use of “Blue Papers” in 19th-century Britain, where Parliamentary reports were bound in blue covers. When a topic was less weighty, the blue cover was discarded and the report was published with a white cover; these became known as White Papers. In the United States, a government white paper often means a background report or guidance on a specific issue.
A white paper is an informational document issued by a company, government agency, or not-for-profit organization to promote the features of a solution, product, or service that it offers or plans to offer. The facts presented in white papers are often backed by research and statistics from reliable sources and are commonly written in one of three formats: backgrounders, numbered lists, and problem/solution papers.
Bitcoin.org. " Bitcoin: A Peer-to-Peer Electronic Cash System ."
Michigan State University. " Finding British Parliamentary Papers in the M.S.U. Libraries, Collections Guide No. 6 (Advanced): Parliamentary, or Sessional Papers--Discussion ."
A study published this week from researchers at the American Cancer Society in The Lancet Public Health adds to growing evidence that young adults are more likely to develop several types of cancer than older generations.
We’ve covered this phenomenon before. Research published last year in JAMA Network Open showed that people in their 30s, and young women in particular, saw disproportionately high rates of cancer between 2010 and 2019. Breast, thyroid, and colon or rectal cancers were diagnosed the most in people under 50. Cancers of the appendix and the intrahepatic bile duct grew the fastest among this group.
How do the latest findings add to the conversation about rising cancer rates in young adults?
Different from other studies of early-onset cancer rates, this new paper tracked both case rates and mortality rates. It also encompasses more cancer types than many of the major studies published recently.
Here’s what you need to know.
17 of the 34 cancers included in the study became more common in younger adults compared to older generations:
Many of these cancers are considered rare, meaning there are fewer than 200,000 cases per year.
To understand how cancer diagnoses and death rates have changed over the years, researchers divided patients into “birth cohorts,” which were separated by five-year intervals from 1920 to 1990.
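The core idea of the cohort construction can be sketched as follows. This is a simplified illustration, not the study's actual code (the paper's analysis is more involved), but bucketing approximate birth years into five-year intervals between 1920 and 1990 looks roughly like this:

```python
def birth_cohort(diagnosis_year, age, start=1920, end=1990, width=5):
    """Map a patient to a five-year birth cohort, labelled by its start year.

    Birth year is approximated as diagnosis year minus age at diagnosis,
    then clamped to the study's 1920-1990 range.
    """
    birth_year = min(end, max(start, diagnosis_year - age))
    return start + ((birth_year - start) // width) * width

# A patient diagnosed in 2015 at age 30 was born around 1985,
# so they land in the 1985-1989 cohort.
assert birth_cohort(2015, 30) == 1985
assert birth_cohort(2010, 84) == 1925  # born ~1926 -> 1925-1929 cohort
```

Grouping patients this way lets researchers compare, for example, how often people born in 1985-1989 were diagnosed with a given cancer at age 30 versus people born in 1950-1954 at the same age.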
In addition to the cancers identified in previous research, the authors found that each cohort had a higher risk of developing eight cancer types than the groups before them:
Rates for the other nine cancer types declined for the first few decades of the study, but have picked up in younger generations.
The risk of developing cancer is two to three times greater for millennials compared to baby boomers for small intestine cancer, kidney cancer, pancreatic cancer, and liver and intrahepatic bile duct cancer in females.
While cancer rates have climbed over time, death rates don’t always follow suit. However, mortality risk increased alongside incidence rates for liver cancer in females, endometrial cancer, gallbladder cancer, testicular cancer, and colorectal cancer.
It’s important to understand cancer trends in young people because they can provide insight about exposure to carcinogenic factors in early life and young adulthood, the authors write. Elevated cancer risk in young people “foreshadow future disease burden as these young cohorts carry their increased risk into older age, when cancers most frequently occur.”
There are lots of reasons why cancer rates may be rising in young adults, and they likely differ by cancer type:
The researchers analyzed health records data for more than 23 million patients ages 25 to 84 years old who were diagnosed with cancer between 2000 and 2019. They analyzed incidence rates for 34 cancer types and death rates for 25 cancer types.
Sung H, Jiang C, Bandi P, et al. Differences in cancer rates among adults born between 1920 and 1990 in the USA: an analysis of population-based cancer registry data . Lancet Public Health . 2024;9(8):e583-e593. doi:10.1016/S2468-2667(24)00156-7
Koh B, Tan DJH, Ng CH, et al. Patterns in cancer incidence among people younger than 50 years in the US, 2010 to 2019 . JAMA Netw Open. 2023;6(8):e2328171. doi:10.1001/jamanetworkopen.2023.28171
National Cancer Institute. Cancer stat facts: common cancer sites .
Sung H, Siegel R, Rosenberg PS, et al. Emerging cancer trends among young adults in the USA: analysis of a population-based cancer registry . Lancet Public Health . 2019;4(3):e137-e147. doi:10.1016/S2468-2667(18)30267-6
Bell CF, Lei X, Haas A, et al. Risk of cancer after diagnosis of cardiovascular disease . JACC CardioOncol . 2023;5(4):431-440. doi:10.1016/j.jaccao.2023.01.010
By Claire Bugos. Bugos is a senior news reporter at Verywell Health. She holds a bachelor's degree in journalism from Northwestern University.
A Psychologist Explains How the ‘Lion’s Gate Portal’ Can Benefit You
Days like 8/8 can benefit you regardless of your belief in them, as they create the perfect storm of positivity, placebo and manifestation practice.
Research has confirmed time and again that the gaps between psychological science and spirituality are wide. While one uses treatment modalities developed through scientific rigor, the other banks on faith, belief and optimism.
Paradoxically, however, psychological healing often intersects with spirituality in the realm of practice. “Manifestation” exercises such as meditation and chanting, positive visualization, journaling and affirmations are prescribed in both spaces regularly and are often rooted in gaining more knowledge of and control over the subconscious and unconscious mind.
Research published in 2023 also indicates that certain psychological constructs, like being in a “flow state,” mirror spiritual experiences. The study further argues that incorporating spirituality into your life may enhance self-understanding and potential through self-belief, a goal therapists often set for clients they treat.
All of this is to say that there are many paths that lead to a desired destination. Whether you are a realist with elaborate plans for the future or you’re a spiritual soul building a deeper connection with the universe, manifestation exercises can help you break substantial ground on the journey you’re already on.
And while there is no perfect time to start this journey, many swear by certain fated days, meant to be more powerful and “bountiful” than others. Today is supposed to be one such day, marking the opening of the “Lion’s Gate portal.” Here’s the lore behind the popular legend.
The Astrological Tale Behind the Lion’s Gate Portal
Spiritual practitioners claim the eighth of August to be the day the universe supposedly opens a cosmic gateway known as the Lion’s Gate Portal. With Sirius rising and the Sun in Leo, believers claim this is a magical window for transformation and manifestation, as if the universe itself is conspiring to grant all wishes.
For those who believe the lore, it presents a tantalizing chance to harness the universe’s supposed powers. Whether it’s celestial truth or just a fanciful story lacking scientific or cosmic corroboration, the intent to start manifesting in your life is never wasted. Regardless of these beliefs, manifestation can help people achieve their best potential.
While they may use vastly different language, construct different arguments and are trying to prove different things—spiritual healing and psychological healing often coincide when it comes to execution. Here’s a psychologist’s take on why manifestation works in both worlds:
Whether ordained by the universe or not, there may not be a better time than now to channelize your mental and spiritual energy toward manifesting the goals you desire to achieve. Here’s why the efficacy of these tools can feel like magic:
While the myths surrounding events like the Lion’s Gate portal may blend astrological assumptions into daily life, the practice of manifestation itself holds significant psychological value at all times in life. The power of intention, belief and structured practice can have profound effects on cognitive health and personal growth. By understanding and harnessing these psychological techniques, individuals can achieve positive transformations, regardless of their spiritual beliefs.
Test your levels of spirituality by taking the science-backed Ego Dissolution Scale here.
Title: A Comparison of LLM Finetuning Methods & Evaluation Metrics with Travel Chatbot Use Case
Abstract: This research compares large language model (LLM) fine-tuning methods, including Quantized Low Rank Adapter (QLoRA), Retrieval Augmented Fine-Tuning (RAFT), and Reinforcement Learning from Human Feedback (RLHF), and additionally compares LLM evaluation methods, including the End to End (E2E) benchmark method of "Golden Answers", traditional natural language processing (NLP) metrics, RAG Assessment (Ragas), OpenAI GPT-4 evaluation metrics, and human evaluation, using a travel chatbot use case. The travel dataset was sourced from the Reddit API by requesting posts from travel-related subreddits to obtain travel-related conversation prompts and personalized travel experiences, and was augmented for each fine-tuning method. Two pretrained LLMs were used for fine-tuning: LLaMa 2 7B and Mistral 7B. QLoRA and RAFT were applied to both pretrained models, and the resulting inferences were extensively evaluated against the aforementioned metrics. The best model according to human evaluation and some GPT-4 metrics was Mistral RAFT, which then underwent a Reinforcement Learning from Human Feedback (RLHF) training pipeline and was ultimately evaluated as the best model. Our main findings are that: 1) quantitative and Ragas metrics do not align with human evaluation; 2) OpenAI GPT-4 evaluation aligns most closely with human evaluation; 3) it is essential to keep humans in the loop for evaluation because 4) traditional NLP metrics are insufficient; 5) Mistral generally outperformed LLaMa; 6) RAFT outperforms QLoRA but still needs post-processing; and 7) RLHF improves model performance significantly. Next steps include improving data quality, increasing data quantity, exploring RAG methods, and focusing data collection on a specific city, which would improve data quality by narrowing the focus while creating a useful product.
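To see why traditional NLP metrics can fall short of human judgment, consider token-overlap F1, one of the simplest metrics in that family. This is a generic illustration, not the paper's implementation: two answers a human would rate very differently can score almost identically on overlap alone.

```python
def token_f1(prediction, reference):
    """Token-overlap F1: harmonic mean of token precision and recall."""
    pred = prediction.lower().split()
    ref_pool = reference.lower().split()
    n_ref = len(ref_pool)
    common = 0
    for tok in pred:
        if tok in ref_pool:
            ref_pool.remove(tok)  # each reference token may only match once
            common += 1
    if common == 0:
        return 0.0
    precision = common / len(pred)
    recall = common / n_ref
    return 2 * precision * recall / (precision + recall)

reference = "visit the louvre early in the morning"
good = token_f1("visit the louvre in the morning", reference)  # ~0.92
bad = token_f1("avoid the louvre in the morning", reference)   # ~0.77, despite opposite advice
```

A one-word change flips the meaning of the answer but barely moves the score, which is exactly the kind of failure that keeps human (or GPT-4-based) evaluation in the loop.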
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI)
Advances in Research on Bacterial Oxidation of Mn(II): A Visualized Bibliometric Analysis Based on CiteSpace
2. Materials and Methods
2.1. Data Source and Retrieval Strategy
2.2. Data Analysis and Visualization
3.1. Trend Analysis of Annual Publications
3.2. Analysis of Countries and Institutions
3.3. Analysis of Disciplinary Classifications
3.4. Analysis of Authors of Co-Occurrence and Co-Citation
3.5. Analysis of Journal Citations
3.6. Analysis of Co-Cited References
3.7. Analysis of Temporal and Burst of Keywords
3.8. Analysis of Keyword Clusters
4. Discussion
4.1. Research Hotspots and Trends
4.1.1. Species and Ecological Distribution
4.1.2. Factors Influencing Bacterial Mn(II) Oxidation
4.1.3. Mechanisms of Mn(II) Oxidation in Bacteria
4.1.4. Environmental Applications
4.2. Outlook
5. Conclusions
Author Contributions
Data Availability Statement
Conflicts of Interest
Rank | Country | Count | Rank | Country | Centrality |
---|---|---|---|---|---|
1 | China | 197 | 1 | USA | 0.38 |
2 | USA | 128 | 2 | China | 0.34 |
3 | Japan | 33 | 3 | Germany | 0.15 |
4 | Germany | 20 | 4 | Netherlands | 0.14 |
5 | India | 18 | 5 | England | 0.09 |
6 | Canada | 11 | 6 | Japan | 0.04 |
7 | England | 10 | 7 | South Korea | 0.04 |
8 | France | 9 | 8 | Pakistan | 0.04 |
9 | South Korea | 9 | 9 | Mexico | 0.04 |
10 | Australia | 8 | 10 | France | 0.03 |
Rank | Count | Centrality | Institution | Country |
---|---|---|---|---|
1 | 41 | 0.19 | Chinese Academy of Science | China |
2 | 28 | 0.06 | Harbin Institute of Technology | China |
3 | 27 | 0.23 | Oregon Health and Science University | USA |
4 | 20 | 0.07 | Huazhong Agricultural University | China |
5 | 14 | 0.00 | University of Chinese Academy of Sciences | China |
6 | 12 | 0.00 | Xi’an University of Architecture and Technology | China |
7 | 9 | 0.05 | Hiroshima University | Japan |
8 | 8 | 0.04 | Woods Hole Oceanographic Institution | USA |
9 | 8 | 0.03 | Smithsonian Institution | USA |
10 | 8 | 0.00 | Beijing University of Technology | China |
Rank | Category | Count | Rank | Category | Centrality |
---|---|---|---|---|---|
1 | Environmental Sciences & Ecology | 151 | 1 | Environmental Sciences & Ecology | 0.41 |
2 | Environmental Sciences | 122 | 2 | Chemistry | 0.41 |
3 | Engineering | 87 | 3 | Biotechnology & Applied Microbiology | 0.39 |
4 | Microbiology | 73 | 4 | Biochemistry & Molecular Biology | 0.34 |
5 | Engineering, Environmental | 72 | 5 | Environmental Sciences | 0.32 |
6 | Biotechnology & Applied Microbiology | 52 | 6 | Engineering | 0.18 |
7 | Water Resources | 39 | 7 | Chemistry | 0.18 |
8 | Geology | 32 | 8 | Microbiology | 0.15 |
9 | Geosciences | 28 | 9 | Agriculture | 0.14 |
10 | Biochemistry & Molecular Biology | 25 | 10 | Toxicology | 0.09 |
Rank | Top Ten Productive Author | Count | Rank | Top Ten Co-Cited Author | Citation |
---|---|---|---|---|---|
1 | Tebo BM | 22 | 1 | Tebo BM | 212 |
2 | Bai YH | 13 | 2 | Francis CA | 104 |
3 | Qu JH | 12 | 3 | Learman DR | 94 |
4 | Zhang J | 10 | 4 | Villalobos M | 89 |
5 | Pan XL | 9 | 5 | Geszvain K | 87 |
6 | Hansel CM | 8 | 6 | Dick GJ | 80 |
7 | Liu F | 8 | 7 | Webb SM | 79 |
8 | He ZF | 7 | 8 | Miyata N | 74 |
9 | Santelli CM | 7 | 9 | Anderson CR | 71 |
10 | Wei Z | 7 | 10 | Brouwers GJ | 69 |
Rank | Citation | Cited Journal | IF | JCR | Country |
---|---|---|---|---|---|
1 | 301 | Applied and Environmental Microbiology | 4.32 | Q2 | USA |
2 | 232 | Environmental Science & Technology | 11.09 | Q1 | USA |
3 | 223 | Water Research | 12.75 | Q1 | England |
4 | 214 | Geochimica et Cosmochimica Acta | 4.97 | Q1 | USA |
5 | 205 | Geomicrobiology Journal | 2.30 | Q3 | USA |
6 | 188 | Journal of Bacteriology | 3.06 | Q3 | USA |
7 | 180 | Proceedings of the National Academy of Sciences of the United States of America | 10.71 | Q1 | USA |
8 | 168 | Annual Review of Earth and Planetary Sciences | 14.29 | Q1 | USA |
9 | 158 | Chemosphere | 8.80 | Q1 | England |
10 | 153 | PLoS One | 3.64 | Q2 | USA |
Title | Authors | Year | Citation Frequency |
---|---|---|---|
Synergistic effects of biogenic manganese oxide and Mn(II)-oxidizing bacterium Pseudomonas putida strain MnB1 on the degradation of 17 α-ethinylestradiol | Tran TN et al. [ ] | 2018 | 30 |
A novel manganese oxidizing bacterium-Aeromonas hydrophila strain DS02: Mn(II) oxidization and biogenic Mn oxides generation | Zhang Y et al. [ ] | 2019 | 29 |
Elimination of Manganese(II,III) Oxidation in Pseudomonas Putida GB-1 by a Double Knockout of Two Putative Multicopper Oxidase Genes | Geszvain K et al. [ ] | 2013 | 27 |
Mn(II, III) oxidation and MnO2 mineralization by an expressed bacterial multicopper oxidase | Butterfield CN et al. [ ] | 2013 | 23 |
Diverse manganese(II)-oxidizing bacteria are prevalent in drinking water systems | Marcus DN et al. [ ] | 2017 | 22 |
Effective start-up biofiltration method for Fe, Mn, and ammonia removal and bacterial community analysis | Cai YN et al. [ ] | 2015 | 22 |
Extracellular haem peroxidases mediate Mn(II) oxidation in a marine Roseobacter bacterium via superoxide production | Andeer PF et al. [ ] | 2015 | 21 |
CotA, a multicopper oxidase from Bacillus pumilus WH4, exhibits manganese-oxidase activity | Su JM et al. [ ] | 2013 | 21 |
Formation of manganese oxides by bacterially generated superoxide | Learman DR et al. [ ] | 2011 | 20 |
Identification of a third Mn(II) oxidase enzyme in Pseudomonas putida GB-1 | Geszvain K et al. [ ] | 2016 | 20 |
Rank | Keyword | Frequency | Centrality |
---|---|---|---|
1 | Mn(II) oxidation | 104 | 0.08 |
2 | oxidation | 92 | 0.16 |
3 | iron | 81 | 0.19 |
4 | identification | 65 | 0.10 |
5 | removal | 62 | 0.09 |
6 | multicopper oxidase | 61 | 0.04 |
7 | water | 42 | 0.06 |
8 | oxides | 42 | 0.07 |
9 | spores | 38 | 0.07 |
10 | mechanisms | 38 | 0.06 |
11 | manganese oxidation | 38 | 0.05 |
12 | biogenic manganese oxides | 34 | 0.09 |
13 | microbial community | 32 | 0.11 |
14 | bacteria | 32 | 0.08 |
15 | adsorption | 32 | 0.13 |
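The keyword frequencies in the table above reduce, at their core, to counting how often terms occur and co-occur across bibliographic records. A minimal pure-Python sketch of that counting step (toy records invented for illustration; this is not the study's dataset or CiteSpace itself):

```python
from collections import Counter
from itertools import combinations

# Invented toy records: each paper's list of author keywords
records = [
    ["Mn(II) oxidation", "multicopper oxidase", "bacteria"],
    ["Mn(II) oxidation", "biogenic manganese oxides", "removal"],
    ["iron", "removal", "bacteria"],
    ["Mn(II) oxidation", "multicopper oxidase", "mechanisms"],
]

# Frequency: how many records each keyword appears in
freq = Counter(kw for rec in records for kw in rec)

# Co-occurrence: keyword pairs appearing together in the same record
cooc = Counter()
for rec in records:
    for pair in combinations(sorted(rec), 2):
        cooc[pair] += 1

print(freq["Mn(II) oxidation"])                            # 3
print(cooc[("Mn(II) oxidation", "multicopper oxidase")])   # 2
```

The centrality column in the table is a separate graph measure (betweenness) computed on the resulting co-occurrence network; with these pair counts as edge weights, a library such as networkx could compute it.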
Mo, W.; Wang, H.; Wang, J.; Wang, Y.; Liu, Y.; Luo, Y.; He, M.; Cheng, S.; Mei, H.; He, J.; et al. Advances in Research on Bacterial Oxidation of Mn(II): A Visualized Bibliometric Analysis Based on CiteSpace. Microorganisms 2024 , 12 , 1611. https://doi.org/10.3390/microorganisms12081611
Wu, Y, Han, F, Peng, X, Gao, L, Zhao, Y, & Zhang, J. "Research on a Fast Resistance Performance Prediction Method for SUBOFF Model Based on Physics-Informed Neural Networks (PINNs)." Proceedings of the ASME 2024 43rd International Conference on Ocean, Offshore and Arctic Engineering . Volume 6: Polar and Arctic Sciences and Technology; CFD, FSI, and AI . Singapore, Singapore. June 9–14, 2024. V006T08A041. ASME. https://doi.org/10.1115/OMAE2024-126300
Physics-Informed Neural Networks (PINNs) represent a novel intelligent algorithm for solving partial differential equations (PDEs), which has been partially validated in solving Navier-Stokes equations. However, numerous challenges remain in the application of PINNs to the calculation of hydrodynamic performance of ships.
This paper utilizes discrete solutions of the Navier-Stokes (N-S) equations obtained via the Finite Volume Method (FVM) and Reynolds Averaged Navier-Stokes (RANS) approach of the CFD method for training PINNs to directly solve flow field information. This approach achieves a solution with approximate accuracy to that based on the CFD method, but with significantly reduced time consumption. The development of this algorithm could offer a novel perspective for calculating the hydrodynamic performance of ships.
The study focuses on the DARPA SUBOFF-B model, using field information obtained at specific speeds through the CFD method, including pressure and velocity fields, as the training set for the neural parameters within PINNs. This enables the PINNs model to predict the flow field around the hull and to calculate the hull resistance through integration. The results predicted by the PINNs method are compared and analyzed against those calculated via the CFD method, including the predicted hull-surface pressure field and the relative error of the predicted total resistance, thereby demonstrating the feasibility of the method.
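The core idea behind PINNs, fitting a parametric function by penalizing the governing-equation residual at collocation points, can be shown without a neural network or the SUBOFF geometry. The toy sketch below is my own illustration (unrelated to the paper's setup): it fits a degree-6 polynomial to the ODE u' + u = 0 with u(0) = 1 by linear least squares on the physics residual; the exact solution is e^(-x).

```python
import numpy as np

# Collocation points on [0, 1]
x = np.linspace(0.0, 1.0, 50)

# Ansatz: u(x) = sum_k c_k x^k (a degree-6 polynomial standing in for the network)
powers = np.arange(7)
U = x[:, None] ** powers                               # basis values for u
dU = powers * x[:, None] ** np.maximum(powers - 1, 0)  # basis values for u'

# Stack the physics residual u'(x) + u(x) = 0 at every collocation point
# with the boundary condition u(0) = 1, then solve in the least-squares sense.
A = np.vstack([dU + U, (0.0 ** powers)[None, :]])
b = np.concatenate([np.zeros_like(x), [1.0]])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)

u = U @ coef
err = np.max(np.abs(u - np.exp(-x)))
print(f"max deviation from exp(-x): {err:.1e}")
```

A real PINN replaces the polynomial basis with a neural network and the linear solve with gradient descent on the same residual loss; the paper additionally supervises on CFD-computed field data.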
Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation, or research paper, the methodology chapter explains what you did and how you did it, allowing readers to evaluate the reliability and validity of your research and your dissertation topic.
The research methodology is an important section of any research paper or thesis, as it describes the methods and procedures that will be used to conduct the research. It should include details about the research design, data collection methods, data analysis techniques, and any ethical considerations.
Definition, Types, and Examples. Research methodology is a structured and scientific approach used to collect, analyze, and interpret quantitative or qualitative data to answer research questions or test hypotheses. A research methodology is like a plan for carrying out research and helps keep researchers on track by limiting the scope of ...
One of the most common deficiencies found in research papers is that the proposed methodology is not suited to achieving the stated objective of the paper. Describe the specific methods of data collection you are going to use, such as surveys, interviews, questionnaires, observation, or archival research. If you are analyzing existing data ...
What is research methodology? Research methodology simply refers to the practical "how" of a research study. More specifically, it's about how a researcher systematically designs a study to ensure valid and reliable results that address the research aims, objectives and research questions. Specifically, how the researcher went about deciding:
A research methodology encompasses the way in which you intend to carry out your research. This includes how you plan to tackle things like collection methods, statistical analysis, participant observations, and more. You can think of your research methodology as being a formula. One part will be how you plan on putting your research into ...
Research methods are specific procedures for collecting and analyzing data. Developing your research methods is an integral part of your research design. When planning your methods, there are two key decisions you will make. First, decide how you will collect data. Your methods depend on what type of data you need to answer your research question:
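As a concrete instance of the quantitative branch of such a design, e.g. the social-media-and-grades study described earlier, the core analysis is often just a correlation on survey responses. A minimal sketch with fabricated toy numbers (illustration only, pure standard library):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented survey responses: weekly social-media hours vs. GPA
hours = [5, 12, 8, 20, 15, 3, 25, 10]
gpa   = [3.8, 3.2, 3.5, 2.7, 3.0, 3.9, 2.5, 3.4]

r = pearson(hours, gpa)
print(f"r = {r:.2f}")  # strongly negative for this toy sample
```

For this invented sample r comes out around -0.99; a real study would also report a significance test and, per the mixed-methods design, pair the number with qualitative interview findings.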
Research methodology is the systematic process of planning, executing, and evaluating scientific investigation. It encompasses the techniques, tools, and procedures used to collect, analyze, and interpret data, ensuring the reliability and validity of research findings.
1. Qualitative research methodology. Qualitative research methodology is aimed at understanding concepts, thoughts, or experiences. This approach is descriptive and is often utilized to gather in-depth insights into people's attitudes, behaviors, or cultures. Qualitative research methodology involves methods like interviews, focus groups, and ...
Research methodology is the process or the way you intend to execute your study. The methodology section of a research paper outlines how you plan to conduct your study. It covers various steps such as data collection, statistical analysis, participant observation, and other procedures involved in the research process.
The methodology section of your paper describes how your research was conducted. This information allows readers to check whether your approach is accurate and dependable. A good methodology can help increase the reader's trust in your findings. First, we will define and differentiate quantitative and qualitative research.
Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research. ... Authors' expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. ...
The methodology in a research paper, thesis paper or dissertation is the section in which you describe the actions you took to investigate and research a problem and your rationale for the specific processes and techniques you use within your research to identify, collect and analyze information that helps you understand the problem. ...
Research methodology is a way of explaining how a researcher intends to carry out their research. It's a logical, systematic plan to resolve a research problem. A methodology details a researcher's approach to the research to ensure reliable, valid results that address their aims and objectives. It encompasses what data they're going to collect ...
A research methodology is different from a research method because research methods are the tools you use to gather your data (Dawson, 2019). You must consider several issues when it comes to selecting the most appropriate methodology for your topic. Issues might include research limitations and ethical dilemmas that might impact the quality of ...
Background Methodological studies - studies that evaluate the design, analysis or reporting of other research-related reports - play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste. Main body We provide an overview of some of the key aspects of ...
components of methodology one could add. For example, the historical roots of science and science and social policy are legitimate topics that could be covered as well. Yet, in developing an appreciation for methodology and the skills involved in many of the key facets of actually conducting research, the five will suffice.
In a research paper, thesis, or dissertation, the methodology section describes the steps you took to investigate and research a hypothesis and your rationale for the specific processes and techniques used to identify, collect, and analyze data. The methodology element of your research report enables readers to assess the study's overall ...
Revised on 10 October 2022. Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation, or research paper, the methodology chapter explains what you did and how you did it, allowing readers to evaluate the reliability and validity of your research.
The methodology is a proper study or analysis of all the methods used in the research. Methods are simply the behaviors or tools used to carry out research techniques. The methodology is applied at the initial stage of the research/study; methods are used and applied at a later stage of the study/research.
The research method must be appropriate to the objectives of the study. For example, be sure you have a large enough sample size to be able to generalize and make recommendations based upon the findings. ... One of the most common deficiencies found in research papers is that the proposed methodology is unsuited to achieving the stated ...
The methodology is perhaps the most challenging and laborious part of the dissertation. Essentially, the methodology helps in understanding the broad, philosophical approach behind the methods of research you chose to employ in your study. The research methodology elaborates on the 'how' part of your research.
Research methods are the compass, map, and tools that guide us through the terrain of knowledge. They enable us to ask important questions, systematically gather and analyze data, and contribute valuable insights to our understanding of the world. As you start on your research journey, embrace the adventure, respect the process, and look ...
Purpose: The general objective of this study was to analyze smart grid technologies and their role in sustainable energy management. Methodology: The study adopted a desktop research methodology. Desk research refers to secondary data, or data that can be collected without fieldwork. Desk research basically involves collecting data from existing resources and is hence often considered a ...
Manganese (Mn) pollution poses a serious threat to the health of animals, plants, and humans. The microbial-mediated Mn(II) removal method has received widespread attention because of its rapid growth, high efficiency, and economy. Mn(II)-oxidizing bacteria can oxidize toxic soluble Mn(II) into non-toxic Mn(III/IV) oxides, which can further participate in the transformation of other heavy ...