- Collecting Data
Make sure to turn WiFi off on the computer before opening NetStation Acquisition. Instead of collecting your own data, you can also use publicly available EEG data; see how to access it under Open-Access EEG Datasets.
- Channel Locations
If you’re using Dr. Thom’s dissertation experiment data (AA & AB), you need to choose the file that specifies the channel locations for the data. For the dissertation experiment data, this file is GSN256.sfp; it can be found in the Matlab folder on the collection room Mac, or you can search for it. If you’re using new data from the 64-channel net, use the GSN64.sfp or the GSN65.sfp file (also in the Matlab folder), depending on how many channels the imported data has; you can check that in eeglab.

Setting Channel Locations in eeglab (step-by-step instructions with screenshots):
1. Click Edit → Channel Locations.
2. Click Read locations.
3. Find and select the GSN256.sfp file and click Open.
4. Click Ok.
5. Click Ok again.
- Converting & Importing Data
Convert to .raw

Once you record the data, it will be saved as a .mff file. Find this .mff file and convert it to .raw using Netstation Tools. [Add steps/tutorial for how to convert to raw] Alternatively, you can use the mff2eeglab.m function that came with the EGI Matlab MFF API (/Users/eeglab/Documents/MATLAB/EGI_Matlab_MFF_API_v2_2014-07-25/demoCode/mff2eeglab.m) if you can figure out how to get it to work.

Learn How to Use Matlab

Matlab has some good introductory tutorials that can be found here. Ask Dr. Thom (or someone) for the login and password. You should start with the Matlab Onramp, and then move on to the MATLAB Fundamentals Academic Tutorial if you want more detailed tutorials. These will answer important questions such as: "What is a function?" "How do I use an array?" "How do I create a loop?" After doing the Matlab Onramp tutorials, you may want to look at the following Matlab Fundamentals Academic Tutorials:
- 01: Introduction
- 02: Working with the MATLAB User Interface
- 03: Variables and Expressions
- 06: Automating Commands with Scripts
- 10: Logic and Flow Control
- 13: Writing Functions

Open Matlab

Open eeglab and Import a .raw File
- Re-reference Data
When the data is recorded, it is referenced to a single reference electrode at the center of the scalp. We need to convert it to an average reference: the average voltage over the entire scalp is calculated, and the data is then referenced to that average value.
1. Click Tools → Re-reference.
2. Make sure “Compute Average Reference” is checked, then click Ok.
3. In the next window that pops up, click Ok.
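Conceptually, average referencing just subtracts the mean across all channels at each time point. A minimal NumPy sketch of that idea (illustrative only; this is not eeglab's internal implementation, and the voltages below are made up):

```python
import numpy as np

# Simulated data: 4 channels x 5 time points (microvolts),
# originally referenced to a single central electrode.
data = np.array([
    [10.0, 12.0,  9.0, 11.0, 10.0],
    [ 4.0,  5.0,  6.0,  5.0,  4.0],
    [-2.0, -1.0,  0.0, -1.0, -2.0],
    [ 8.0,  8.0,  9.0,  7.0,  8.0],
])

# Average reference: subtract the mean across channels at each sample.
avg = data.mean(axis=0)
reref = data - avg

# After re-referencing, the channel mean at every time point is zero.
print(np.allclose(reref.mean(axis=0), 0.0))  # True
```

With a real 256-channel net the principle is identical; eeglab simply performs this subtraction (with its own handling of excluded channels) when "Compute Average Reference" is checked.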
- Step 1: Understanding Meta-Analysis
Because meta-analyses summarize (all) the existing literature of a particular field in a quantitative manner, each individual study, in a sense, becomes a participant from which you must collect your data. Data collection occurs through a process called coding (see Step 4 and Step 5). Once you have coded all the articles, you will need to select the meta-analytic model that best fits the type of data you have collected (Step 6). You will then begin the analysis process using SPSS. Before you begin this adventure, it is important to understand that doing a meta-analysis is an iterative process; you may have to repeat steps multiple times before your meta is complete.

STRENGTHS OF META-ANALYSIS:
- Takes into account multiple factors (moderators) outside the variable of interest that may influence the dependent variable.
- Indicates areas for improvement in existing techniques and suggests future research possibilities.
- Combines multiple methods of measuring the same variable into a single model.

LIMITATIONS:
- There may be too few or too many relevant articles on a particular subject.
- Can only show correlation, not causation.
- Dependent on the information that others have reported.
- functional Near-Infrared Spectroscopy
About fNIRS

Functional near-infrared spectroscopy and imaging (fNIRS/fNIRI) is a tool that detects neuroactivation by measuring oxygenated and deoxygenated blood flow. This is accomplished by spectroscopically measuring the absorbance of the chromophore hemoglobin in blood that flows to neurally activated regions. Figure 1 shows rat brain vasculature, which constantly modifies itself in response to nutrient and oxygen demands, pruning and sprouting new vessel branches within days.

Figure 1. Brain vasculature. Red arrows indicate areas of vessel pruning determined by blood flow, and white arrowheads indicate vessel sprouting. Obtained from http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1001375

BASED ON THE HEMODYNAMIC RESPONSE

Neuroactivation is coupled with the hemodynamic response. Activated neurons increase metabolism and oxygen consumption, a change that is reflected in the local abundance of oxygenated hemoglobin (HbO) and deoxygenated hemoglobin (HHb). This process is called the hemodynamic response and can be measured noninvasively by fNIRI. Figure 2 shows the hemodynamic response curve: a rapid increase in blood flow followed by a return to a homeostatic level once the need is met.

Figure 2. The hemodynamic response: relative blood flow in a given location of the brain over time. Obtained from http://www.jarrodmillman.com/rcsds/lectures/convolution_background.html

MEASURED BY SPECTROSCOPY

Optical imaging relies on the property of light scattering and requires a modification to the Beer-Lambert law. As photons penetrate the opaque medium of scalp, skull, meninges, and several centimeters of cortical tissue, they are scattered along an effectively infinite number of paths. Although it is impossible to predict the path of an individual photon, bulk photon movement can be predicted.
One particular sensitivity profile, called the 'photon banana', describes how some photons are scattered along a path that arcs back toward the scalp, where a photodetector is positioned several centimeters away from the source. Along the way, photons are absorbed by the chromophores HbO and HHb. The detector measures the change in absorbance over time as an electrical signal. Detectors "hear" signals from their nearest-neighbor sources. Varying the distance between sources and detectors alters the depth of light penetration into the cortical tissue. Furthermore, depth measurements vary with scalp and skull thickness, which are unique to every individual.

THE PARTS

OPTODES/PROBES: These terms are used interchangeably. Optodes are sensor devices that use light to measure a substance; "optode" is the general term covering both sources and detectors. Ensure complete and proper contact with the scalp.

SOURCES: Emit light at 760 nm and 850 nm, the optimal wavelengths at which HHb and HbO absorb, respectively. HANDLE WITH CARE! Do not scratch the bulb surface, because this will skew absorbance readings. Clean with alcohol wipes following each trial.

DETECTORS: Propagate the transmitted light along a fiber-optic cable. HANDLE WITH CARE! Do not bend or fold the cable into sharp corners, as this will break the glass mirrors inside. Clean with alcohol wipes following each trial.

CAP: Two sizes are available, 56 cm and 58 cm, assigned according to subject head circumference.

DOGBONES: Plastic loops that secure a 3 cm distance between a source and a detector. Their positions represent channels and are the purple lines that connect sources and detectors on the montage images.

MONTAGES: A montage is a unique arrangement of sources and detectors that targets a specific region of the cortex. The two montages of interest to us are the motor cortex 8x4 (sources x detectors) and the prefrontal cortex 8x4. Go to [link] to view other montages.
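The modified Beer-Lambert law behind this measurement relates the absorbance change at each wavelength to the concentration changes of HbO and HHb: dA(λ) = (ε_HbO(λ)·Δ[HbO] + ε_HHb(λ)·Δ[HHb]) · d · DPF(λ). With measurements at the two wavelengths above, this is a 2x2 linear system. A sketch of solving it (the extinction coefficients, DPF, and absorbance values below are illustrative placeholders, not calibrated values; real coefficients come from published tables):

```python
import numpy as np

# Extinction coefficient matrix: rows are wavelengths, columns are
# [e_HbO, e_HHb]. PLACEHOLDER values chosen only so that HHb dominates
# at 760 nm and HbO at 850 nm, matching the wavelength choice above.
E = np.array([
    [1.5, 3.8],   # 760 nm
    [2.5, 1.8],   # 850 nm
])
d = 3.0    # source-detector separation (cm), per the dogbone spacing
DPF = 6.0  # differential pathlength factor (assumed)

# Measured absorbance changes at 760 and 850 nm (made-up numbers).
dA = np.array([0.010, 0.012])

# Solve (E * d * DPF) @ dC = dA for the concentration changes
# dC = [d_HbO, d_HHb].
dC = np.linalg.solve(E * d * DPF, dA)
print(dC)
```

The sign and relative magnitude of the two components of `dC` are what fNIRS analysis software reports as the HbO/HHb time courses, channel by channel.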
- Step 2: Establishing a Question
Possibly one of the most important steps in conducting a meta-analysis is developing a question. Your question should reflect your interests and can be as broad or specific as desired. The question may need to be modified slightly throughout the process due to a lack of sufficient articles or an overwhelming number of relevant articles. During this process, it is also important to verify that your work in this area will be original. Use the various search engines to determine whether or not there is a pre-existing meta-analysis in your area of interest. If there is not, you can begin to create search terms designed to gather all the relevant articles (Step 3). However, if someone has already meta-analyzed the field of interest, you should consider several factors before abandoning your topic in despair:
- How old is the existing meta-analysis, and how many studies have been published in the field since its publication date? It may be time for an update!
- Do you agree with the existing meta-analysis' exclusion criteria? If they were too exclusive/limiting, you can argue the case that it did not accurately summarize the field.
- Are there moderators it did not examine that you would be interested in? Your focus could take a different direction.

If none of these produce a valid reason to redo the existing meta, it's time to give up and return to the top of this page with a new question.
- Heart Rate Variability
What is Heart Rate Variability (HRV)?

Heart rate variability (HRV) is the variation in time from one R peak of a QRS cycle to the next throughout the duration of a recording (as seen in the figure below). It is closely associated with the autonomic regulation of cardiac functioning and can act as a measure of sympathetic and parasympathetic interactions (Thayer 2006; Shaffer et al. 2014). Individuals with lower HRV are thought to have less adaptability to stress, and low HRV has been linked to greater physiological and psychological health problems. There are numerous ways to measure HRV, which has led to a wide variety of methodological differences between studies. Two articles (Laborde et al. 2017; Malik et al. 1996) have set forth suggestions for universal methods and measures to be implemented across studies. However, not all of the suggestions in these articles are followed or possible based on study design. Thus, it is critical to compare methods and agreement between studies before comparing results.

Data collection

Device

The standard for measuring HRV is ECG recording, which measures HRV from the raw heart rate waveform. However, there are several other devices, such as finger monitors and chest straps, that record inter-beat intervals (IBI) and can also be used to analyze HRV. Unlike ECG, devices that record IBI capture only the interval times between R peaks, and thus are more difficult to inspect to ensure data quality (Laborde et al. 2017). It is therefore important to assess the reliability of a device against ECG recording before assessing data.

Lead configuration

There are many different lead configurations for measuring ECG. One way to collect ECG data is through a lead II configuration, where one electrode is placed just under the right collarbone and the other electrode is placed on the left side of the body on the rib cage.
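Once the R-R (inter-beat) intervals are extracted, the basic time-domain HRV statistics are simple to compute. A minimal sketch with made-up interval values (this illustrates the definitions only; it is not Kubios's implementation):

```python
import numpy as np

# RR intervals in milliseconds (illustrative values for a resting adult).
rr = np.array([812.0, 790.0, 845.0, 830.0, 805.0, 820.0, 798.0])

# SDNN: standard deviation of all RR (NN) intervals,
# an index of overall variability.
sdnn = rr.std(ddof=1)

# RMSSD: root mean square of successive differences,
# a common index of parasympathetic (vagal) activity.
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```

A real recording would supply hundreds of intervals from a 5-minute (or longer) sample, but the two formulas are exactly these.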
Sampling rate

There is no set sampling rate for collecting ECG data; however, it has been proposed that ECG data should be sampled between 250 and 1000 Hz (Malik et al. 1996). One study tested this suggestion by examining lead II ECG recordings sampled at 1000 Hz and down-sampling these recordings to 19 different frequencies (Ellis et al. 2015). This study found that frequencies down to 125 Hz without interpolation yielded consistent results compared to 1000 Hz, confirming the suggestions of Malik et al. (1996). Another study, which examined emergency room patients rather than healthy participants, found similar results after down-sampling a 1000 Hz ECG recording to 500, 250, 100, and 50 Hz (Kwon et al. 2018). These two studies demonstrate that ECG sampling frequencies ranging from 250 Hz (even as low as 100 Hz in some cases) to 1000 Hz can still yield comparable results, further reinforcing the suggestions of Malik et al. (1996).

Analysis

Artifact correction

Artifact correction is a critical part of HRV analysis. Artifacts (missed, abnormal, or added beats) can significantly alter the results of an analysis and thus need to be corrected (Berntson and Stowell 1998). Artifact correction can be performed manually; however, the Kubios Premium software also provides a function for automatic artifact correction (Lipponen and Tarvainen 2019). This function detects artifacts through a decision-based algorithm that works in conjunction with Kubios’s RR interval detection algorithm.

Analysis parameters

The three major domains of HRV analysis are time-domain, frequency-domain, and non-linear analysis. Each method of analysis has different parameters that act as measures of various physiological functions (Laborde et al. 2017).

Fast Fourier transform (FFT)

Of the three HRV analysis domains, frequency-domain parameters are commonly reported measures in the literature.
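The reason sampling rate matters is timing resolution: at sampling frequency fs, an R peak can only be localized to the nearest sample, so each RR interval is quantized to 1/fs seconds. A quick sketch of the worst-case timing error for the rates discussed above (illustrative arithmetic only, not a validation of any study):

```python
# R-peak timing resolution at the sampling rates discussed above.
for fs in (1000, 500, 250, 125, 50):
    resolution_ms = 1000.0 / fs
    print(f"{fs:5d} Hz -> RR resolution {resolution_ms:.1f} ms "
          f"(worst-case peak error about ±{resolution_ms / 2:.1f} ms)")
```

At 1000 Hz the quantization error (1 ms) is negligible relative to typical RR variability; at 50 Hz it grows to 20 ms, which is on the order of RMSSD itself for low-variability recordings, explaining why the very low rates are only acceptable "in some cases".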
There are two main methods for frequency analysis: fast Fourier transformation (FFT) and autoregressive modeling (AR) (Laborde et al. 2017). Each of these methods has certain advantages and disadvantages, and which method is used will vary from study to study. An FFT analysis measures HRV by estimating power from power spectral density (PSD) calculations (Malik et al. 1996). The data is windowed, which helps to reduce spectral leakage while preserving spectral resolution (Badilini and Blanche 1996). One consideration a researcher must take into account for an FFT analysis is what type of window and window settings will be used. For a five-minute sample, Kubios recommends a Hanning window with a window length of 150 s, an overlap of 50%, and an FFT width of 512 samples (Tarvainen MP, email correspondence, 2020). Additionally, when data is windowed, the power of the spectrum becomes mismatched with the variance of the original signal (Badilini and Blanche 1996). To adjust for this, a scale factor is typically applied. For Kubios, the scale factor is given as equation 1 in Badilini and Blanche (1996), while AcqKnowledge applies no scale factor.

Log transformation

Since HRV measures typically have a non-normal distribution, it is common in the literature to transform the data by taking the natural log (LN) prior to statistical analysis (Laborde et al. 2017). Proper treatment can be checked by creating histograms of the distribution before and after transformation.

Kubios methods

A standard software package for HRV analysis is Kubios (Tarvainen et al. 2014), which allows for automated analysis of HRV data. The methods Kubios uses for HRV analysis can be found on their website (About HRV… 2020).
Additionally, further information is provided in this statement from Kubios support (Tarvainen MP, email correspondence, 2020):

“Before both spectrum estimates the RR data is interpolated (by default at 4 Hz) using cubic spline, to have equidistantly sampled data.

FFT spectrum estimate: We use Welch's periodogram approach, where the spectrum is computed by averaging spectra from overlapping windows. The window width and overlap can be adjusted from preferences. Typically, I recommend that you should have at least couple of windows to average, e.g. in case of 5-min analysis sample, you could set the window width=150 secs, overlap=50%; to have three overlapping windows. In addition, you can adjust the points in frequency domain Nfft. If Nfft is higher than points within a windowed data, then zero padding is applied to do spectral interpolation. The spectral resolution of FFT spectrum depends on the window width, the longer the window the better the frequency resolution. For a 150 sec window, the frequency resolution is roughly 1/150s ~ 0.007 Hz. Spectral power is scaled with the sampling frequency, length of data and window function (we use Hanning window inside the Welch's periodogram). The scaling is done according to Parseval's theorem, meaning that total power of spectrum is equal to variance of the data.

AR spectrum estimate: You can adjust the AR model order in this approach, which also defines the frequency resolution of the spectra. In theory the AR method has better frequency resolution due to implicit extrapolation of the data, but in practical application this may be insignificant. The AR model coefficients are solved using a forward-backward least squares method and the AR spectrum is scaled according to Parseval's theorem (model residual variance and sampling frequency are required for this).”

References

- About HRV - Kubios HRV. 2020. Kuopio, Finland: Kubios Oy; [accessed 2020 June 22]. https://www.kubios.com/about-hrv/#top
- Badilini F, Blanche P. 1996. HRV spectral analysis by the averaged periodogram: Does the total power of the spectrum really match with the variance of the tachogram? Ann Noninvasive Electrocardiol. 1(4):423–429. doi:10.1111/j.1542-474X.1996.tb00300.x.
- Berntson GG, Stowell JR. 1998. ECG artifacts and heart period variability: Don’t miss a beat! Psychophysiology. 35(1):127–132. doi:10.1111/1469-8986.3510127.
- Ellis RJ, Zhu B, Koenig J, Thayer JF, Wang Y. 2015. A careful look at ECG sampling frequency and R-peak interpolation on short-term measures of heart rate variability. Physiol Meas. 36(9):1827–1852. doi:10.1088/0967-3334/36/9/1827.
- Kwon O, Jeong J, Kim HB, Kwon IH, Park SY, Kim JE, Choi Y. 2018. Electrocardiogram sampling frequency range acceptable for heart rate variability analysis. Healthc Inform Res. 24(3):198–206. doi:10.4258/hir.2018.24.3.198.
- Laborde S, Mosley E, Thayer JF. 2017. Heart rate variability and cardiac vagal tone in psychophysiological research – Recommendations for experiment planning, data analysis, and data reporting. Front Psychol. 8:213. doi:10.3389/fpsyg.2017.00213.
- Lipponen JA, Tarvainen MP. 2019. A robust algorithm for heart rate variability time series artefact correction using novel beat classification. J Med Eng Technol. 43(3):173–181. doi:10.1080/03091902.2019.1640306.
- Malik M, Bigger JT, Camm AJ, Kleiger RE, Malliani A, Moss AJ, Schwartz PJ. 1996. Heart rate variability: Standards of measurement, physiological interpretation, and clinical use. Eur Heart J. 17(3):354–381. doi:10.1093/oxfordjournals.eurheartj.a014868.
- Tarvainen MP, Niskanen J-P, Lipponen JA, Ranta-aho PO, Karjalainen PA. 2014. Kubios HRV – Heart rate variability analysis software. Comput Methods Programs Biomed. 113(1):210–220. doi:10.1016/j.cmpb.2013.07.024.
- Data Collection
Introduction

In order to achieve precise and trustworthy data, proper setup and calibration of the NIRScout is key. The setup process involves fitting the participant with headgear to hold the fNIRS optodes. During this process, hair sweeping is required to create optimal optode-scalp contact. Because of the vast differences in hair length and thickness between participants, ultrasound gel may be required for participants with thick and/or long hair. Calibration of the NIRScout system shows how well the optodes are placed.

Warming up the NIRScout and Preparation

1. Thirty minutes prior to the arrival of the participant, switch the NIRS system on to allow for warm-up time.
2. Open NIRStar.
3. Welcome the participant and invite them to have a seat in the EEG collection room.
4. Begin by explaining the procedure, what we will be doing, and what the participant will be doing during the testing, as well as the time requirement.
NOTE: Determine whether their hair will need gel. Ideal is short-to-medium fine hair; thick, longer hair makes the sweeping process more difficult and may require ultrasound gel from the beginning.

Placing the NIRScap and Optodes

Consult this video for step-by-step instructions on how to apply the NIRScap.

Selecting a Montage

1. Click “configure hardware”, located in the top left-hand corner of the window.
2. Within this page, select the desired montage from the center dropdown window. For the purposes of this tutorial, “motor_8x4” is selected.
3. Once the montage has been selected, close the window.

Calibrating the NIRScout

Once the cap and over-cap are secure, turn off the lights, close the door, and click “calibrate” under “Main Functions” to calibrate the optodes. After the system is calibrated, click “details” under “Main Functions”; this will bring up the quality scale window, where you are able to view the topographical layout and operational values of the optodes. The numbers within the boxes describe the sensors and detectors.
For instance, if a box reads “3-2”, this indicates sensor 3 and detector 2 (the sensor is listed first, the detector second). Within this screen, toggle between the tabs at the top of the screen to view how well the optodes are placed. Boxes displayed in green are optimal; values should be compared against the quality scale bar on the right end of the window.

Visualizing Data and Adjusting Skin Blood Flow Levels

Following the selection of the montage, adjust the skin blood flow waves by sliding “Scale Factor of each channel [mmol/l]” and “LP Filter Cutoff (Hz)” until the waves are more linear, as shown in the image.

Beginning the Recording

Before beginning the recording, click “configure hardware” and, within the Hardware configuration window, click “Preview Display”. This will open a GUI that allows the visualization of blood flow by hemisphere. Once the recording begins, you will not be able to open this GUI. After the recording has begun, click “Prepare display” within the GUI window to get a real-time view of blood flow.

The recording process for NIRx NIRStar can be confusing. To begin recording, click “Preview”; this toggles the recording on, but the file will not be saved until the “Record” button is clicked. Once “Preview” has been selected, a Subject Demographic window will appear. Fill in the information as desired and click “Done”. When ready, click “Record”. Run the baseline as long as desired before adding events. In a motor test, the first marker can be added by pressing “F1”, while the second marker is added by pressing “F2”. This is visualized as a vertical black line in the data visualization window. In dot probe tasks, the computer will add events automatically. After the test has ended, click “Stop” to end the recording.

Removing Gear and Thanking the Participant

Thank the participant for their time and help them out of the headgear.
- Step 3: Collecting Articles
Before you can begin collecting articles, you must create search terms (that can be put into different databases) in order to retrieve all of the relevant articles pertaining to the meta-analysis. To begin, use broad search terms in order to get a feel for how many relevant articles there may be. Once you have created a set of broad search terms, you can begin to narrow the search. In order to narrow the search terms, it is important to:
- Put all phrases in quotation marks. Ex: “flow mediated dilation” or “physical activity”
- Put a capitalized OR between different variations of a word. Ex: fatigue OR energy; glucose OR lipid OR fat
- Put NOT before any search bar with phrases that the researcher specifically does not want to appear in the search results (this may or may not be necessary). Ex: NOT animal OR rat OR mouse

The last step in developing search terms is making sure that they work. In general, about half of the search results will be rejected, and a meta-analysis requires roughly 20 relevant articles to be considered sufficient. Therefore, the search terms should yield approximately 40 articles or more. Once you have established your search terms, it is time to start collecting articles. Begin this step by searching in databases such as Web of Science, PubMed, and PsycInfo rather than Google Scholar. The use of these databases will yield a more specific collection of studies that will be easier to work with. The process of collecting articles requires:
- Application of knowledge of the question/field.
- Everyone to have an understanding of any necessary procedures, measurements, etc.
- An established set of exclusion criteria. It is crucial that everyone understand which studies can be accepted as well as which ones should be rejected. Be sure to take careful notes on established rejection criteria so that you can refer back to them when questionable articles appear.
Keep a careful list (preferably in Excel) of articles as you go through them. Remember that your goal is to collect all articles relevant to your field of interest, so recheck your searches regularly for new publications. The Excel sheet should include the following information:
- First author's last name
- Study title
- Study year
- Search engine (ex. Web of Science, PubMed, etc.)
- Article's number in the search
- Status (whether the article was accepted or rejected)
- Reason for rejection

It is also helpful to color-code the Excel sheet, with rejected articles one color and accepted articles another. HOWEVER: DO NOT REJECT ARTICLES IN THIS INITIAL PHASE OF COLLECTING UNLESS YOU ARE 100% SURE THEY DO NOT MEET THE INCLUSION CRITERIA.
- HRV Analysis Methods
ECG analysis

One method of obtaining heart rate variability (HRV) data is through electrocardiogram (ECG) recordings. ECG recordings are collected through a lead II configuration using Physio16 and NetStation. From there, ECG recordings are exported to a txt file using Matlab. To pre-process the data prior to analysis, recordings are then filtered and artifacts are corrected manually in AcqKnowledge. After pre-processing, data sets are ready for analysis in Kubios. The features used in Kubios will vary based on the particular study. For a 5-minute analysis, a 5-minute sample will be taken from the data set, automatic artifact correction will be applied (in case of any potential error during pre-processing), and the FFT settings will be adjusted to 512 points, a window size of 150 s, and an overlap of 50%. All data sets from a given study will be saved to an SPSS batch file for statistical analysis. Before analysis, data should be checked for a normal distribution using histograms and transformed through a natural log (LN) transformation.

RR analysis

Alternative recording devices, such as finger monitors or chest straps, record RR intervals as opposed to the raw waveform. After recording, the RR intervals can be emailed as txt files and then uploaded to Box from email. Since AcqKnowledge has no option to import RR intervals without an ECG waveform, there are no pre-processing steps for this form of data. Instead, RR data can be imported straight into Kubios, where analysis follows the same procedure as with an ECG recording (select sample length, turn on automatic artifact correction, adjust FFT settings according to the recording length, etc.). Since only the RR intervals are recorded with these devices, there is no ECG waveform; analysis is instead based on the tachogram. As with ECG, all data sets from the same study should be saved to an SPSS batch file for statistical analysis.
Before analysis, data should be checked for a normal distribution using histograms and transformed through a natural log (LN) transformation. For an in-depth explanation of these methods, proceed to the HRV tutorials section.
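The LN transformation step can be sketched in a few lines. The values below are synthetic lognormal "power" samples standing in for real HRV measures, and the skewness helper is only a rough stand-in for visually inspecting histograms as described above:

```python
import numpy as np

# Synthetic right-skewed values standing in for an HRV power measure.
rng = np.random.default_rng(42)
lf_power = rng.lognormal(mean=6.0, sigma=0.8, size=500)

# Natural log (LN) transformation prior to statistics.
ln_lf = np.log(lf_power)

def skewness(x):
    # Sample skewness: mean of standardized values cubed.
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 3)

# Skew should drop toward zero after the transform; in practice,
# confirm with before/after histograms rather than a single statistic.
print(f"raw skew {skewness(lf_power):.2f}, LN skew {skewness(ln_lf):.2f}")
```

A raw skew well above zero with a post-transform skew near zero is the pattern the before/after histograms should show.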
- Step 5: Code Articles
Once you have developed a coding framework, you can begin the actual data collection process. By the end of this step you will know more about your variable of interest than you ever wanted to. There are a few supplies you may want to acquire before beginning this process:
1. A set of different colored highlighters. These are helpful for transferring the information from each article into your spreadsheet. The process goes much faster when you assign a specific color to each piece of information you need to extract from the article. Then, as you read the paper, you can mark it with the various colors and easily find this information again when it comes time to enter the study's data into the coding spreadsheet.
2. A stack of post-its. As you will most likely not remember the specifics of every paper you read, it is helpful to stick a post-it on the front of each paper with a brief description of the information contained in that article and/or any unique features.
3. An unlimited supply of printer paper and ink.

As you begin to code, it is extremely important that you read each article carefully so as not to miss any relevant information. Enter the data from each study into the corresponding column as you code. In addition, take special note of whether data is reported as means +/- SD or SEM so that you can report the variance in the appropriate column. It is also helpful to mark cells without data as either NR (not reported) or NA (not applicable) in order to indicate that you looked for that data and could not find it. Do not be afraid to ask questions throughout this process, and work together at first to ensure the entire research team is coding articles in the same manner.

DIVIDING STUDIES INTO MULTIPLE EFFECTS:

Perhaps the most complicated aspect of the coding process is the concept of dividing one study into multiple effects. "Effect" is a meta-analytic term used to describe a row of data.
Each row corresponds to an article; there can be multiple effects in one article, but not multiple articles in one effect. Several factors influence the number of effects that can be derived from a single article:
1. Whether or not you need a control group for comparison.
2. How much data the article reports.

If your variable of interest requires a control group (see Analysis: Step 1), then the number of possible effects from a study is limited by the number of corresponding controls. For example, if you are interested in studying the effects of exercise on energy and fatigue, each exercise intervention must be compared with a control group. Thus, if you have a study looking at the varying effects of exercise, diet + exercise, and doing nothing, you can only code one relevant effect (not including the control group), because you have no way of separating out the interacting effects of diet combined with exercise. However, if this same study also included a diet-only group, you would then code two relevant effects, because you can compare the exercise group to the do-nothing group, and the diet + exercise group to the diet-only group. Conversely, if you do not need a control group for each effect, the number of effects you can derive from any particular study is limited only by the amount of data it reports. For example, if the study tests both men and women but does not report their data regarding your variable of interest separately, you will only have one effect. However, if it does divide the data into two groups, you can code two effects.

REJECTING ARTICLES:

Articles that have made it past the initial round of collection should not be rejected without careful consideration. It is important to read the entire article before making a decision on its inclusion/exclusion status. At times, you may be tempted to reject an article because it does not report part of the necessary information. Depending on your exclusion criteria, you may be able to do this.
However, if the information is reported in a graph, you can estimate the values using a technique described in more detail HERE. It may also be possible to calculate the necessary data from reported p or t values. In addition, there is always the option of emailing the corresponding author listed on the paper to request the necessary information. If the author never responds, you may exclude the study in good conscience, knowing you have done your part to try to include it. When you do reject articles, be sure to keep a record of which ones were rejected and why. This can be included in the spreadsheet you used to collect articles.