F Ratio
The F ratio is the ratio of variances or variance-like quantities. A simple example is testing, in analysis of variance, whether the variance between groups is significantly larger than the variance within groups. The general notion underlying such a test is that the differences between individuals within groups are only random differences (error variance), whereas the differences between group means stand out above chance (the alternative hypothesis).
An Example
An example of this calculation arises when the total variance is partitioned. A series of scores 2 2 3 4 4 5 6 7 8 9 with a mean of 5 and a variation (sum of squared deviations) of 54 is partitioned into two groups. We can think of two groups (e.g., a group of men and a group of women) for which the scores of a quantitative variable are considered, let us say their smoking behavior measured as the number of cigarettes per day. The group of men with scores 2 2 3 4 4 has a mean of 3 and a variation of 4. The group of women with scores 5 6 7 8 9 has a mean of 7 and a variation of 10. The within-group variation is 4 + 10 = 14. The between-group variation is defined by the group means 3 and 7: their mean is 5 and their variation is 8. Because each group mean represents 5 individuals, the between-group variation is obtained by multiplying this variation by the number of individuals per group: 8 × 5 = 40. It follows that 54 = 40 + 14, or SST = SSB + SSW (sum of squares total = sum of squares between + sum of squares within). An analogous partition into between and within components can be carried out for the degrees of freedom. One degree of freedom is lost to the mean of a series, so there are 9 degrees of freedom for the total of 10 scores. For the between component, we work with 2 group means, leaving 1 degree of freedom. For the within component, we work with 2 × 5 individuals, each group losing one degree of freedom to its group mean, so that 2 × (5 − 1) = 8 remain. Here, too, total = between + within, because 9 = 1 + 8. To work with variances instead of variations (mean squares, MS, instead of sums of squares, SS), one divides each variation by its corresponding degrees of freedom: MSB = 40/1 and MSW = 14/8.
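The partition above can be checked with a short Python sketch. All numbers come from the worked example in the text; the function and variable names are illustrative.

```python
# Partition of the total sum of squares into between-group and
# within-group components (values from the smoking example).
men = [2, 2, 3, 4, 4]      # group mean: 3
women = [5, 6, 7, 8, 9]    # group mean: 7
scores = men + women

def ss(values):
    """Sum of squared deviations from the mean (the 'variation')."""
    m = sum(values) / len(values)
    return sum((x - m) ** 2 for x in values)

sst = ss(scores)                        # total: 54
ssw = ss(men) + ss(women)               # within: 4 + 10 = 14
grand_mean = sum(scores) / len(scores)  # 5
# Between-group variation: each group's squared deviation from the
# grand mean, weighted by the group's size.
ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
          for g in (men, women))        # between: 40

print(sst, ssb, ssw)  # → 54.0 40.0 14.0 (SST = SSB + SSW)
```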
The ratio of this between-variance MSB to this within-variance MSW is the statistical quantity F = (40/1)/(14/8) = 22.86, whose sampling distribution, with 1 degree of freedom for the numerator and 8 for the denominator, is the F distribution (the symbol F was chosen to commemorate the contributions of Sir Ronald Fisher). The critical value of F for 1 numerator and 8 denominator degrees of freedom, found in statistical tables of the theoretical F distribution, is 5.32 for a postulated α of .05. The F ratio found in our empirical example is greater, so even with these small numbers we can reject the null hypothesis at the .05 level of significance. The null hypothesis states that the ratio MSB/MSW is equal to 1; in other words, that the differences between groups (MSB) are no greater than the random differences between individuals within groups (MSW, the error variance).
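The F ratio itself follows directly from the mean squares. The sketch below reuses the sums of squares and the tabled critical value quoted in the text (5.32 for F with 1 and 8 degrees of freedom at α = .05):

```python
# F ratio for the smoking example: MSB = SSB / df_between,
# MSW = SSW / df_within, F = MSB / MSW.
ssb, ssw = 40, 14
df_between = 2 - 1           # 2 groups
df_within = 2 * (5 - 1)      # 2 groups of 5, one df lost per group mean

msb = ssb / df_between       # 40.0
msw = ssw / df_within        # 1.75
f_ratio = msb / msw          # 22.857...

critical_f = 5.32            # tabled F(1, 8) at alpha = .05
print(round(f_ratio, 2), f_ratio > critical_f)  # → 22.86 True
```

Because 22.86 exceeds 5.32, the null hypothesis is rejected at the .05 level.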