Using DIF to Investigate Strengths and Weaknesses in Mathematics Achievement. Profiles of 10 Different Countries. Enis Dogan


Using DIF to Investigate Strengths and Weaknesses in Mathematics Achievement Profiles of 10 Different Countries

Enis Dogan, Teachers College, Columbia University
Anabelle Guerrero, Teachers College, Columbia University
Kikumi Tatsuoka, Teachers College, Columbia University

Paper Presented at the Annual Meeting of the National Council on Measurement in Education (NCME), Montreal, Canada, April 12-14, 2005

INTRODUCTION

The Third International Mathematics and Science Study (TIMSS, 1995) and the Third International Mathematics and Science Study-Repeat (TIMSS-R, 1999) have provided invaluable information that has helped researchers and educators understand differences and similarities in the teaching and learning of mathematics across nations. Recent work by Corter and Tatsuoka (2002) demonstrated new ways of conducting micro-level analysis of mathematics achievement as measured by TIMSS-R. These researchers used the Rule Space Model (Tatsuoka, 1987; Tatsuoka & Tatsuoka, 1989) in their work. The Rule Space Model (RSM) was developed for analyzing latent cognitive and educational skill variables, such as whether or not a student possesses a particular piece of knowledge or the cognitive processing skills required to solve a particular problem. The model is used first to determine these variables, or item attributes, and then to make probabilistic statements about whether individual students have mastered these attributes. Using this model, Corter and Tatsuoka (2002) determined and validated twenty-seven content, process, and skill attributes (Appendix 1) that explain the underlying constructs measured by the TIMSS-R Population 2 (7th and 8th grade levels) mathematics test. They also determined how these attributes are mastered across different countries by using the attribute mastery probabilities, again produced by the RSM. As Van Linden (1998, p. 574) described, In international assessments the achievements are bound to represent multidimensionality rather than unidimensional knowledge. Also, national populations can be expected to have different distributions on

each of the dimensions; in fact international assessments are designed just to detect such differences. Corter and Tatsuoka's above-mentioned work served to unfold the multidimensional structure of the TIMSS-R mathematics test and thereby reveal the specific strengths and weaknesses of participating countries on the existing dimensions. Shortly after the release of the TIMSS-R results, the emphasis was on the relative rankings of the participating countries, which Schmidt et al. (1998) refer to as the horse-race aspect of cross-national comparisons. These rankings, however, were based on mean total scores obtained from each country. Many researchers criticized this approach. For example, Schmidt et al. (1998) argued that student achievement in mathematics and science is inherently multidimensional and that highly aggregated scores of broadly sampled domains are inherently misleading and mask fundamental, educationally relevant diversities at more specific levels of the curriculum (p. 503). They suggested that explaining country performance differences requires looking beyond total scores, even looking beyond specific topics to items (p. 519). What, then, was the reason behind this soon-criticized approach? Klieme and Baumert (2001) argued that unidimensional test models, and the total scores derived within the context of these models, are preferred most of the time because of convenience: The relatively good fit of unidimensional test models is of great benefit for system monitoring purposes and international comparisons because it allows for parsimonious descriptions of results and provides a robust basis for time-series analyses. From the perspective of teaching and learning, however, this parsimony is a drawback, as results provide no information on the specific strengths and weaknesses of different student populations or educational treatments. For teaching and learning to be improved,

specific diagnostic information is needed that helps to identify potential points of intervention. (p. 386)

As discussed above, these researchers all point to the necessity of looking deeper in order to understand the strengths and weaknesses of different countries, going beyond unidimensional test scores and the rankings based on them. The work of Corter and Tatsuoka (2002) addressed this necessity. The purpose of this study is to strengthen these results and demonstrate how Differential Item Functioning (DIF) methods can also be used to explore country-specific strengths and weaknesses.

Alternative Explanations for DIF

The traditional use of Differential Item Functioning (DIF) aims to detect items that are biased against certain subgroups of test takers in a population. DIF occurs when test takers from different groups with identical overall test scores, or ability levels, differ systematically with regard to the probability of solving particular test items. The first step in a DIF analysis is to determine the subgroups to be examined. Many DIF studies, for example, focus on whether certain items in a test show bias toward a particular gender or ethnic group. Technically speaking, the group that is hypothesized to be more likely to be subject to item bias is called the focal group. The performance of individuals in the focal group is compared to the performance of individuals in a reference group. After accounting for overall ability, or total score, if the probability of correctly answering a particular item differs between the two groups, the item is said to display DIF. Therefore the traditional understanding of DIF is that it indicates unwelcome item bias.
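The definition above can be sketched in a few lines of code: DIF is a gap in the conditional probability of a correct response, given the same total score, between the two groups. All data below are invented for illustration and are not from the TIMSS-R analyses.

```python
# Illustrative sketch of the DIF definition: compare the proportion
# answering one item correctly, between focal and reference groups,
# among examinees with the SAME total score. Data are made up.
# Each examinee is (group, total_score, item_correct).
examinees = [
    ("focal", 20, 1), ("focal", 20, 0), ("focal", 20, 0),
    ("reference", 20, 1), ("reference", 20, 1), ("reference", 20, 0),
    ("focal", 30, 1), ("focal", 30, 0),
    ("reference", 30, 1), ("reference", 30, 1),
]

def p_correct(group, score):
    """Proportion answering the item correctly among examinees of a
    given group at a given total-score level."""
    hits = [c for g, s, c in examinees if g == group and s == score]
    return sum(hits) / len(hits)

# If these gaps are systematically nonzero across score levels,
# the item displays DIF against the focal group.
for score in (20, 30):
    gap = p_correct("reference", score) - p_correct("focal", score)
    print(score, round(gap, 3))
```

The key point the sketch makes concrete: the comparison is conditional on the matching score, so a simple difference in overall pass rates between countries is not, by itself, DIF.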

On the other hand, several researchers have attempted to investigate alternative explanations of DIF. The focus has been on the multidimensionality of tests as a source of DIF and on how DIF can be used to explain group-specific strengths and weaknesses. Klieme and Baumert (2001) point to these attempts: Recent work in educational measurement … assumes that DIF reflects the multidimensionality that is inherent in broad competency constructs and leads to differential achievement profiles. Thus, DIF parameters can be used to identify the relative strengths and weaknesses of certain student subpopulations. (p. 385) Several other researchers share the position that DIF can provide substantial information that helps to gain a deeper understanding of relative strengths and weaknesses in subpopulations (Calvert, 2001). According to this view, DIF is not necessarily an indicator of bias, but can provide information of value in education, because the existence of bias reflects … differences in the learning experiences involved in providing a correct response to the item (Keeves & Masters, 1999, p. 12). Other researchers have also suggested that not all cases of DIF have to be interpreted as item bias that jeopardizes the fairness of the test. They argued that DIF can be viewed as an indicator of differential effects of specific curricular or instructional conditions (Miller & Linn, 1988; Tatsuoka, Linn, Tatsuoka & Yamamoto, 1988). De Ayala et al. (1999) hold a similar position in this regard. They suggested that DIF can be conceptualized as a type of multidimensionality occurring when an item measures multiple abilities and when the manifest groups differ in their relative locations to one another on these abilities. In certain situations examinee samples may indeed consist of examinees from different latent classes or subpopulations: the mechanism that gives

rise to DIF is best modeled by the assumption of latent classes within which the items can be scaled (p. 6). This study capitalizes on these arguments. The work of Corter and Tatsuoka (2002) has already shown that different countries have different learning patterns in mathematics. They showed that different TIMSS-R items pose different cognitive demands and that students in different countries exhibit different mastery profiles on these demands. One implication is that on certain TIMSS items, individuals from certain countries may perform differently from others even after controlling for overall ability levels. These items may therefore demonstrate DIF not because they are biased but because they involve attributes that are mastered differently in the focal and reference groups defined in the DIF analysis. In other words, differential performance of different countries on the different cognitive demands of a test may lead to DIF on the items in which these attributes are involved. In fact, the link between the cognitive profiles of manifest groups and DIF was demonstrated by Tatsuoka, Linn, Tatsuoka and Yamamoto (1988) with a different data set. These researchers showed that when focal and reference groups are defined according to their cognitive profiles, the groups perform differently on items with different underlying cognitive tasks. They argued that such items do not necessarily show unwelcome bias; in fact they are desirable items in the sense that they demonstrate the differences between the cognitive mastery profiles of the manifest groups. In sum, the present study aims to illustrate two points: (1) the cognitive demands of an item may cause that item to display DIF if these demands are mastered in different ways by the focal and reference groups; (2) DIF can, therefore, serve to illustrate the specific strengths and weaknesses of different countries when they are entered as the focal and

reference groups into the DIF analysis. Most early researchers interested in country-specific strengths and weaknesses focused on the effects of curriculum. The idea was that a country's achievement results will be better in areas that constitute an important part of that country's curriculum. They hypothesized that if a topic is given more time and learning opportunity, the country will perform better in that topic compared to countries that do not give it similar emphasis (Schmidt, Jakwerth & McKnight, 1998). Here we suggest that exploring the cognitive demands of a test and how these demands are mastered across different countries is also an effective way of exploring country-specific strengths and weaknesses. The specific questions addressed in this study are the following:

1. Different items in the TIMSS-R mathematics test involve different attributes. Is there a relationship between the DIF parameters obtained from these items and the attribute mastery levels of countries when these countries are used to define focal and reference groups?

2. Which of the 44 Booklet 1 items of the TIMSS-R mathematics test display DIF when each one of the 10 countries is used as the focal group and the other 9 are used as the reference group? What are the attribute contents of these DIF items? Do the focal and reference groups perform differently on the attributes that are involved in the DIF items?

METHODS

Data

Data come from 10 different countries that participated in TIMSS-R. These countries are USA, Japan, Russia, Turkey, Korea, Czech Rep., Canada, Netherlands,

Australia and Finland. These countries were chosen for the study because of their differential performance on TIMSS-R. The data consist of student response vectors (coded as right or wrong) to 44 questions from Booklet 1 of the TIMSS-R Population 2 mathematics test, and attribute mastery probability vectors for the same students as calculated in Corter and Tatsuoka's 2002 work (Appendix 3). An attribute mastery probability is a measure of a student's level of mastery of an attribute, inferred by the RSM from his or her item response patterns. The sample size for the study, broken down by country, is tabulated below:

Table 1. Sample size according to country

Country         Sample Size
USA             4411
JAPAN           2371
RUSSIA          2178
TURKEY          3900
KOREA           3045
CZECH REP.
CANADA          4364
NETHERLANDS     1480
AUSTRALIA       2023
FINLAND         1452
Total

Procedure

DIF analyses in this study were performed using the Simultaneous Item Bias Test (SIBTEST). SIBTEST is a nonparametric method of detecting differential item functioning (DIF) that was developed as an extension of Shealy and Stout's (1993) multidimensional item response theory. In this framework, DIF is conceptualized as a difference in the probability of endorsing a keyed item response,

occurring when individuals in groups having the same levels of the latent attribute of interest possess different amounts of nuisance abilities that influence responding. SIBTEST detects bias by comparing the responses of examinees in the reference and focal groups who have been allocated to bins using their scores on a matching subtest (Stout & Roussos, 1996). An important feature of SIBTEST is that it uses a regression estimate of the true score, rather than an observed score, as the matching variable. As a result, examinees are matched on an estimated latent ability score rather than an observed score. Although Mantel-Haenszel (MH) procedures may be considered the gold standard in DIF detection (Roussos & Stout, 1996), researchers have demonstrated that SIBTEST has superior statistical characteristics compared to MH procedures for detecting uniform DIF (Gierl, Jodoin, & Ackerman, 2000). In this study, countries were used to define the focal and reference groups. Analyses were performed in two stages in order to answer the two research questions stated above. In the first stage, focal and reference groups were defined by contrasting each country with one another. As mentioned earlier, the data consist of students' response vectors on 44 items from 10 different countries. This allowed for 45 different DIF analyses on the 44 items (see Appendix 2 for the complete list of comparisons). As a result, a 45 by 44 matrix was created to store the DIF parameters (β coefficients) produced by SIBTEST. This matrix was then combined with another matrix which contained the attribute mastery probability differences for all 45 comparisons (see Appendix 3). These differences were calculated by subtracting the mastery probability of the focal group from that of the reference group.
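The core SIBTEST idea can be sketched as a weighted difference in proportions correct across matching-subtest score bins. This is a simplification: the real procedure also applies the regression-based true-score correction described above, which is omitted here, and all numbers are invented for illustration.

```python
# Simplified sketch of a SIBTEST-style beta statistic: a weighted
# average of per-bin differences in item proportion-correct between
# the reference and focal groups, with examinees binned by their
# matching-subtest score. The regression correction to the matching
# scores used by the real SIBTEST is omitted. Data are invented.

# bin -> (reference proportion correct, focal proportion correct,
#         number of focal-group examinees in the bin)
bins = {
    0: (0.30, 0.25, 40),
    1: (0.55, 0.45, 80),
    2: (0.80, 0.70, 60),
}

def beta_hat(bins):
    """Weighted average of per-bin (reference - focal) differences,
    weighted by the focal-group bin counts."""
    n_total = sum(n for _, _, n in bins.values())
    return sum((p_ref - p_foc) * n
               for p_ref, p_foc, n in bins.values()) / n_total

# Positive values indicate DIF against the focal group.
print(round(beta_hat(bins), 4))
```

Because the differences are aggregated across bins with a common sign convention, this statistic targets uniform DIF, which matches the setting in which SIBTEST is reported to outperform MH procedures.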

This second matrix was 45 by 27 because it stored the mastery probability differences on 27 different attributes for the same 45 country comparisons. We hypothesized that differences in attribute mastery profiles among countries can be a source of DIF when these countries are contrasted, as focal and reference groups, on items that involve these attributes. For instance, an item that involves three different attributes may display DIF when countries define the focal and reference groups if these countries perform differently on one or more of the attributes involved in the item. The involvement of different attributes creates a multidimensional structure for the item, and because of this multidimensional structure countries may perform differently on the item even after controlling for their unidimensional (overall) performance. Therefore we expected a significant relationship between the two matrices described above. In order to investigate this relationship we focused on 5 geometry items. We performed a multivariate regression in which we regressed the DIF parameters obtained from the 5 geometry items (Appendix 4) on the attribute mastery differences on 4 geometry-related attributes. To answer the second research question, 10 additional DIF analyses were performed. This time, in each analysis, one of the 10 countries defined the focal group and the other 9 countries, combined, defined the reference group. For each run SIBTEST was asked to detect items that displayed significant DIF against the focal group. These items were then tabulated with their corresponding attribute involvements. Separate t-tests were also performed on these attributes to see whether the focal group performed lower than the reference group, as this was expected to be a source of DIF.
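The construction of one row of the 45 by 27 difference matrix can be sketched as follows. The mastery probabilities below are fabricated; only the sign convention (reference minus focal, so positive entries mean the focal group performed lower) comes from the text.

```python
# Sketch of building one row of the 45 x 27 attribute mastery
# probability difference matrix: (reference mastery) - (focal mastery)
# for each attribute in one country comparison. Probabilities are
# invented for illustration.
reference = {"C4": 0.82, "P2": 0.75, "P3": 0.68, "P7": 0.71}
focal     = {"C4": 0.64, "P2": 0.70, "P3": 0.61, "P7": 0.73}

row = {a: round(reference[a] - focal[a], 2) for a in reference}
# Positive entries mean the focal group performed lower on that attribute.
print(row)
```

Stacking one such row per country comparison yields the second matrix, which is then related to the matrix of DIF parameters.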

RESULTS

Investigating the Relationship between DIF Parameters and Attribute Mastery Probability Differences

In order to investigate the relationship between DIF parameters and attribute mastery probability differences, we focused on 5 geometry items, A5, C3, E2, I7, T3 (T3 is a released item, see Appendix 5), and four geometry-related attributes: C4, P2, P3 and P7. A multivariate regression analysis was performed using the DIF parameters from these 5 items as the dependent variables. The attribute mastery probability differences on attributes C4, P2, P3 and P7 were used as the independent variables. We hypothesized that the magnitude of the DIF parameters on these items would depend on the magnitude of the difference between the focal and the reference groups in mastering the above-mentioned attributes. If the differential performances of the focal and reference groups lead to DIF on certain items, then there must be a relationship between the DIF parameters and how the groups perform on the attributes involved in these items. Therefore, the DIF parameters obtained from the above-mentioned items, across all 45 comparisons, were regressed on the attribute mastery probability differences, across the same 45 comparisons, on the related attributes. The relationship between the DIF parameters and the attribute mastery probability differences was significant. The multivariate test produced an approximate F-value of 43.3 with a p-value of essentially zero. Details of the multivariate test results are displayed in Table 2.
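As a sketch of one univariate piece of this analysis, the following regresses a vector of DIF parameters on a vector of attribute mastery differences with ordinary least squares. The five data points are invented (the real analysis used 45 country comparisons), and a positive slope corresponds to the hypothesized direction: larger mastery gaps in favor of the reference group go with larger DIF against the focal group.

```python
# Simple OLS sketch: regress DIF parameters (beta coefficients) on
# attribute mastery probability differences for one item/attribute
# pair. Data are invented; the study used 45 country comparisons.
x = [0.02, 0.05, 0.10, 0.12, 0.18]   # mastery differences (reference - focal)
y = [0.01, 0.03, 0.06, 0.07, 0.11]   # DIF parameters against the focal group

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x

# A positive slope supports the hypothesis that larger mastery gaps
# (in favor of the reference group) accompany larger DIF against the
# focal group.
print(round(slope, 3), round(intercept, 4))
```

The multivariate analysis in the paper does this jointly for the five items, but the sign interpretation of each B coefficient is the same as for this single slope.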

Table 2. Multivariate tests of significance predicting DIF parameters from attribute mastery probability differences

Test Name      Value    Approx. F    Hypoth. DF    Error DF    Sig. of F
Pillai's
Hotelling's
Wilks'
Roy's

Univariate F-tests also yielded significant results. The attribute mastery probability differences on attributes C4, P2, P3 and P7 explained 71% of the variance in the DIF parameters for item A5, 39.82% for item C3, and 75.44% for item T3; the corresponding figures for items E2 and I7 appear in Table 3.

Table 3. Univariate F-tests predicting DIF parameters from attribute mastery probability differences

Variable    Sq. Mul. R    Adj. R-sq.    Hypoth. MS    Error MS    F    Sig. of F
β (A5)
β (C3)
β (E2)
β (I7)
β (T3)

Furthermore, individual univariate regression analyses revealed which of the four independent variables were significant predictors of these DIF parameters.

Univariate Regression Analyses

Predicting β (A5) from C4, P2, P3 and P7

Univariate regression analysis using the DIF parameters obtained from item A5 as the dependent variable indicated that the attribute mastery probability differences on attributes C4, P3 and P7 were significant predictors of this variable (Table 4).

Table 4. Univariate regression analysis. Dependent variable: β (A5)

Predictor    B    Beta    Std. Error    t-value    Sig. of t
C4
P2
P3
P7

However, the B coefficients (the slopes) of two of these three variables were not in the expected direction. When the attribute mastery probability differences were computed, the mastery probability of the focal group was subtracted from that of the reference group. Therefore positive values on these variables indicate that the focal group performed lower on these attributes. On the other hand, positive DIF parameters indicate bias against the focal group. Therefore, according to our hypothesis, higher values on the DIF parameters were expected to be associated with higher values on the attribute mastery probability differences, and vice versa. That would mean that the predictors in Table 4 were expected to have positive and significant B coefficients. The only predictor that did so was C4. The results therefore indicate that the greater the difference in attribute mastery probabilities for attribute C4 between the focal and the reference country (in favor of the reference group), the higher the DIF parameter on item A5 (against the focal country). This is exactly the kind of relationship we expected, because we hypothesized

that the DIF parameters against the focal group, for a particular item, would be higher when the focal group performed lower on one or more of the attributes involved in that item.

Predicting β (C3) from C4, P2, P3 and P7

In the univariate regression analysis using the DIF parameters obtained from item C3 as the dependent variable, only the attribute mastery probability difference on attribute P7 was found to be a significant predictor of this variable (Table 5). However, the relationship was not in the expected direction.

Table 5. Univariate regression analysis. Dependent variable: β (C3)

Predictor    B    Beta    Std. Error    t-value    Sig. of t
C4
P2
P3
P7

Predicting β (E2) from C4, P2, P3 and P7

Below are the results of the univariate regression analysis using the DIF parameters obtained from item E2 as the dependent variable (Table 6). The attribute mastery probability differences on attributes C4 and P2 were found to be positively and significantly related to the magnitude of DIF (against the focal group) on item E2. This result was again in line with our expectation. When the focal group performed lower on these attributes (C4 and P2), the magnitude of DIF against it increased on item E2.

Table 6. Univariate regression analysis. Dependent variable: β (E2)

Predictor    B    Beta    Std. Error    t-value    Sig. of t
C4
P2
P3
P7

Predicting β (I7) from C4, P2, P3 and P7

When the dependent variable was the DIF parameters obtained from item I7, the regression results indicated the attribute mastery probability difference on attribute P3 as a significant predictor, positively related to the dependent variable (Table 7). This means that as the gap between the focal group and the reference group increases (on attribute P3), so does the magnitude of DIF (against the focal group) on item I7.

Table 7. Univariate regression analysis. Dependent variable: β (I7)

Predictor    B    Beta    Std. Error    t-value    Sig. of t
C4
P2
P3
P7

Predicting β (T3) from C4, P2, P3 and P7

The results of the last univariate regression indicated the attribute mastery probability differences on attributes P2 and P3 to be significantly and positively related to the DIF parameters obtained from item T3 (Table 8). This confirmed our expectation that

the magnitude of the DIF parameters against the focal group would increase as the performance of the focal group decreased (relative to the reference group) on the relevant attributes.

Table 8. Univariate regression analysis. Dependent variable: β (T3)

Predictor    B    Beta    Std. Error    t-value    Sig. of t
C4
P2
P3
P7

Investigating the DIF Items for 10 Countries and Their Attribute Contents

In order to answer the second research question listed above, 10 separate DIF analyses were performed. The results in the previous section come from 45 DIF runs in which focal and reference groups were defined by matching each country with one another (Appendix 2). In this section, however, each one of the 10 countries was used as the focal group while the other 9 defined the reference group (Table 9). The purpose was to detect items displaying significant DIF against the focal groups. Again SIBTEST was used for this purpose. The Bonferroni correction was employed to control for Type I error by dividing the original alpha level (.05) by the number of items (44) entered into the analyses. Table 9 summarizes how many such items were detected in each run. The number of significant DIF items reached its maximum when Turkey was the focal group: SIBTEST reported 25% of all items as displaying significant DIF against the focal group. When Canada was the focal group, however, the number of significant DIF items dropped to only 6.
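The Bonferroni step described above amounts to the following; the alpha level and item count come from the text, while the per-item p-values are hypothetical.

```python
# Sketch of the Bonferroni correction used for the per-item DIF tests.
# alpha and the item count come from the text; p-values are invented.
alpha, n_items = 0.05, 44
adjusted_alpha = alpha / n_items  # 0.05 / 44, roughly .0011

p_values = {"A5": 0.0004, "C3": 0.0200, "E2": 0.0009}  # hypothetical
flagged = [item for item, p in p_values.items() if p < adjusted_alpha]
# Only items significant at the corrected level are counted in Table 9.
print(flagged)
```

This is why an item must clear a much stricter threshold than the nominal .05 to be counted as a significant DIF item in any single run.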

Table 9. Number of significant DIF items. Total number of items entered is 44.

Focal group     Reference group    Number of sig. DIF items    % of sig. DIF items
USA             All other          9
JAPAN           All other
RUSSIA          All other          10
TURKEY          All other          11                          25
KOREA           All other
CZECH REP.      All other
CANADA          All other          6
NETHERLANDS     All other
AUSTRALIA       All other          8
FINLAND         All other

Five of these analyses are discussed in more detail below. A complete list of the DIF items in all 10 analyses is given in Appendix 6.

Analysis 1

In this analysis USA was used as the focal group, and the data from the other 9 countries were combined and used as the reference group. Table 10 displays the items that displayed DIF as well as their attribute contents. As seen in Table 10, nine different items from the set of 44 Booklet 1 items of the TIMSS-R mathematics test displayed significant DIF against USA. All items were significant at the Bonferroni-adjusted alpha level. However, the beta coefficient for each of these items was further classified as negligible, moderate, or large following the criteria set by Roussos & Stout (1996). If the beta coefficient was smaller than .05 it was classified as negligible, coefficients between .05 and .1 were classified as moderate, and coefficients greater than .1 were classified as large. This way, 1 of the DIF items that passed
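The effect-size classification of the beta coefficients can be sketched directly from the thresholds stated above; the example betas are invented.

```python
# Sketch of the Roussos & Stout (1996) effect-size classification of
# SIBTEST beta coefficients, using the thresholds stated in the text.
# The example coefficients are invented.
def classify(beta):
    """Classify a SIBTEST beta coefficient by absolute magnitude."""
    b = abs(beta)
    if b < 0.05:
        return "negligible"
    elif b <= 0.1:
        return "moderate"
    return "large"

for beta in (0.03, 0.07, 0.15):
    print(beta, classify(beta))
```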

the significance test for this analysis was classified as negligible; 5 were classified as moderate and 3 as large.

Table 10. Significant DIF items and their attribute contents

ITEM    β    Std error    p-value    DIF-class    Content-Attributes / Process-Attributes / Skill-Attributes
A    C2* C4* P7* S5 S12
A    C4* P3* P8* S3 S8*
C    C4* P3* S3 S5 S8*
C    negligible    C2* P3* S3 S5 S7* S8*
E    C4* P2* P5* P8* S6*
E    C4* P1 P2* P7* P8*
I    C1 P2* P5* P8* S11* P2* P5* P6* P7*
I    C4* P8* S3 P1 P2* P4* P5*
T    C1 C3* P8* P9* S7* S10* S11*

*Attributes on which the focal group performed significantly lower (p < .05) than the reference group.

Corter and Tatsuoka (2002) coded all TIMSS-R mathematics items according to their attribute involvements. We used their coding to determine which attributes were involved in which of the DIF items. Then, separate t-tests were run on the mastery probabilities between the focal and reference groups on these attributes. Results indicated that on all nine DIF items there was at least one attribute on which the focal group (USA) performed significantly lower than the reference group (Table 10). This was in line with our expectations, because we hypothesized that the low performance of the focal group on the relevant attributes was part of the reason why certain items displayed DIF against the focal group. When the attribute contents of these 9 DIF items are examined, it is seen that content attribute C4 (Basic concepts and properties of two-dimensional geometry)
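The per-attribute focal-versus-reference comparisons can be sketched with a two-sample t-test on mastery probabilities. The probabilities are invented, and this simple pooled-variance version stands in for whichever exact t-test variant the authors ran.

```python
# Sketch of a two-sample t-test comparing focal- and reference-group
# attribute mastery probabilities on one attribute. Data are invented;
# the study ran such tests for each attribute in each DIF item.
from statistics import mean, variance

focal     = [0.52, 0.48, 0.61, 0.55, 0.50, 0.47]
reference = [0.68, 0.71, 0.64, 0.70, 0.66, 0.73]

def pooled_t(a, b):
    """Pooled-variance two-sample t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

t = pooled_t(focal, reference)
# A large negative t suggests the focal group is significantly lower,
# which is the pattern marked with asterisks in Tables 10 through 14.
print(round(t, 2))
```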

appears in 5 of them. Another geometry-related attribute, P2 (Computational applications of knowledge in arithmetic and geometry), also appeared in 5 of these 9 DIF items. Attribute P8 (Applying and evaluating mathematical correctness) is another common attribute: it appeared 6 times in these 9 DIF items. These 3 attributes appear to have served as a leading source of DIF for this particular country.

Analysis 2

In this second analysis Russia defined the focal group, and the data from the other 9 countries were combined and used as the reference group. There were 10 items in this analysis that displayed DIF against the focal group, namely against students from Russia. Table 11 lists these items as well as their attribute contents.

Table 11. Significant DIF items and their attribute contents

ITEM    β    Std error    p-value    DIF-class    Content-Attributes / Process-Attributes / Skill-Attributes
B    C4 P5* P7* S3 S6*
C    C2 P3 S3 S5* S7 S8
C    Moderate    C1* C4 P3 P6 S3 S6*
I    Large    C2 P1 P5* P8 S11
I    Moderate    C2 P2 P9 S2 P2 P5* P6 P7*
I    C4 P8 S3
I    C2 C5* P1 P3 P9 S11 S3 S5* S6*
S1A    C1* P5* P6 P7* P8 S10 S12 S3 S5* S6*
S1B    C1* C3 C5* P5* P6 P8 S10 S11
S2C    C3 C4 P4 S10

*Attributes on which the focal group performed significantly lower (p < .05) than the reference group.

Following the same procedure described above, these items were classified according to the magnitude of the DIF parameters obtained from them. As a result, 4 of these 10 items were classified as large and the rest as moderate DIF items. When separate t-tests were run on the mastery probabilities between the focal and reference groups on the attributes involved in these 10 DIF items, it was seen that on 8 of the 10 items there was at least one attribute on which Russia performed significantly lower than the reference group. The most frequently appearing attributes were as follows: P5 (Logical reasoning, including case reasoning, deductive thinking skills, if-then, necessary and sufficient, and generalization skills); S6 (Patterns and relationships); C1 (Basic concepts, properties and operations in whole numbers and integers); P7 (Generating, visualizing and reading figures and graphs); and S5 (Evaluate/verify/check options).

Analysis 3

In this third analysis Turkey was used as the focal group, and the data from the other 9 countries were combined and used as the reference group. Table 12 displays the items detected as DIF items as well as their attribute contents. Turkey was the country with the maximum number of DIF items, with 11 items. This was expected, because among all ten countries Turkey had the poorest attribute performance on 21 of the 27 attributes. We hypothesized that the difference in attribute mastery probabilities between the focal and reference groups serves as a source of DIF. Therefore it was not surprising that the number of items that displayed

DIF against the focal group reached its maximum value when the country with the poorest attribute mastery profile was used as the focal group.

Table 12. Significant DIF items and their attribute contents

ITEM    β    Std error    p-value    DIF-class    Content-Attributes / Process-Attributes / Skill-Attributes
B    P9* S3* S5*
B    C2* P2* P8* P9* S2* S5*
B    C4* P5* P7* S3* S5*
B    C1* C3* P1* S5*
C    C4* P3* S3* S5* S8*
E    C5* P3* S3* S4* S8*
E    C1* P5* P8* S2* S5*
I    C2* P2* P9* S2*
I    C5* P2* P7*
S1a    P5* P6* P7* S3* S5* S6* S10* C1* P8* S11*
T    C4* P2* P6* P8* S3*

*Attributes on which the focal group performed significantly lower (p < .05) than the reference group.

These 11 DIF items were further classified using the criteria discussed above. Two of these 11 items were classified as large and the rest as moderate DIF items. Following the procedure applied to the DIF items for USA, we performed t-tests on the mastery probabilities between the focal and reference groups on the attributes involved in these 11 DIF items. On all of the attributes involved in these 11 DIF items, the focal group (Turkey) performed significantly lower than the reference group. This result strengthened our conviction that the performance of the focal group on the attributes involved in an item has an impact on whether that item displays DIF against the focal group. Among these attributes, content attribute C1 (Basic concepts, properties and operations in whole numbers and integers), process attributes P2 (Computational applications of knowledge in arithmetic and geometry) and P8 (Applying and evaluating

mathematical correctness), and skill attributes S5 (Evaluate/verify/check options) and S3 (Using figures, tables, charts and graphs) appeared most often in these DIF items, with 3, 4, 4, 7 and 6 counts respectively.

Analysis 4

In this fourth analysis Canada was used as the focal group, and the data from the other 9 countries were combined and used as the reference group. Table 13 displays the items detected as DIF items as well as their attribute contents.

Table 13. Significant DIF items and their attribute contents

ITEM    β    Std error    p-value    DIF-class    Content-Attributes / Process-Attributes / Skill-Attributes
A    C4 P3 P8 S3 S7
A    C1 C5* P1 P2 P8 S7 S11*
E    C4 P2 P5* P8 S5
E    C4 P1 P2 P7 P8 S11*
I    C3* C5* P3 P5* P9* S2 S5 S11*
I    C2 P1 P5* P8 S11*

*Attributes on which the focal group performed significantly lower (p < .05) than the reference group.

When Canada was the focal group, 6 items displayed DIF. One of these 6 items was classified as a large DIF item and the other five as moderate DIF items. On 5 of these 6 items there was at least one attribute on which Canada performed significantly lower than the reference group. S11 (Using words to communicate questions (word problems)) was the attribute on which Canada performed lower most frequently in these 6 DIF items; it appeared in 4 of the 6 DIF items. P5 (Logical reasoning)

followed it with 3 appearances, and C5 (Data, probability, and basic statistics) followed P5 with 2 appearances.

Analysis 5

In this analysis Finland was used as the focal group, and the data from the other 9 countries were combined and used as the reference group. Table 14 displays which items were detected as DIF items, as well as their attribute contents. When Finland was the focal group, 8 items displayed DIF. Two of the 8 items fell into one DIF class and the other six into the other [the class labels were not preserved in the transcription].

Table 14. Significant DIF items and their attribute contents.
[The β, standard error, p-value, and DIF-class columns were not preserved in the transcription; the wrapped attribute cells for items S1b and S2a have been restored to column order.]

ITEM    Content-Attributes    Process-Attributes    Skill-Attributes
A       C4                    P3* P8                S3 S5
B       C1*                   P1 P2                 S7 S11
B       C1* C3*               P1                    S5
I       C3* C5                P3 P5* P9*            S2 S5 S11
I       C2                    P1 P7                 S8 S11
S1b     C1* C3* C5            P5* P6 P8             S3 S5 S6* S10* S11
S2a     C1* C3* C5            P5* P6 P8             S3 S6* S10* S11
S2b     C2                    --                    S10*

*Attributes on which the focal group performed significantly lower (p < .05) than the reference group.

On 7 of these 8 items there was at least one attribute on which Finland performed significantly lower than the reference group. Content attributes C1 (Basic concepts, properties and operations in whole numbers and integers) and C3 (Basic concepts, properties and operations in elementary algebra) were the attributes most frequently involved in the DIF items. Both

appeared 4 times in these 8 DIF items. Process attributes P3 (Judgmental applications of knowledge in arithmetic and geometry) and P5 (Logical reasoning) and skill attribute S10 (Open-ended item type) followed, with 3 appearances each.

SUMMARY and DISCUSSIONS

The purpose of this study was twofold: (1) to investigate whether there is a significant relationship between the DIF parameters obtained from 44 TIMSS-R mathematics items and countries' mastery levels on the attributes these items involve, when these countries are used to define focal and reference groups; and (2) to detect strengths and weaknesses of ten TIMSS-R countries by combining DIF analysis with RSM techniques.

Corter and Tatsuoka (2002) demonstrated the different cognitive demands, or attributes, of the TIMSS-R mathematics items and showed that countries performed differently on these demands. We hypothesized that the differential performance of these countries on certain attributes would lead to DIF on the items in which those attributes are involved. More specifically, if the country that defines the focal group performs lower on one or more of the attributes involved in an item, that item is expected to display DIF against that country. When DIF is conceptualized this way, items displaying DIF can no longer be regarded simply as unwelcome biased items. They become indicators of micro-level performance differences among countries after controlling for their macro-level, or overall, performance.

In order to address the research questions listed above, we performed two different series of DIF analyses. Data came from 10 different countries that participated in TIMSS-R.
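The attribute tallies used throughout these analyses (e.g., S5 appearing in 7 of Turkey's 11 DIF items) are plain frequency counts over the attribute lists of the flagged items. A minimal sketch with hypothetical item labels and attribute lists (the real lists are in Tables 12 to 14):

```python
from collections import Counter

# Hypothetical attribute lists for a focal group's flagged DIF items.
dif_items = {
    "B1": ["P9", "S3", "S5"],
    "B2": ["C2", "P2", "P8", "P9", "S2", "S5"],
    "E1": ["C1", "P5", "P8", "S2", "S5"],
    "T1": ["C4", "P2", "P6", "P8", "S3"],
}

# Tally how often each attribute appears across the flagged items.
counts = Counter(attr for attrs in dif_items.values() for attr in attrs)
for attr, n in counts.most_common(3):
    print(attr, n)
```

Run on the actual Table 12 lists, the same tally would yield the counts reported above for C1, P2, P8, S5, and S3.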

First we ran 45 different DIF analyses, using every pairwise combination of these countries to define the focal and reference groups. These analyses produced DIF parameters for all 44 items under investigation in all 45 comparisons. The parameters obtained for 5 geometry items were regressed, across the same 45 comparisons, on attribute mastery probability differences for 4 geometry-related attributes. Positive values on these attribute mastery probability differences indicated lower performance on the part of the focal group; positive values on the DIF parameters indicated DIF against the focal group. Therefore we expected a positive relationship between the two sets of variables. On 4 of these 5 geometry items we detected such a relationship. We used only four predictors to explain variance in 5 dependent variables, with only 45 observations. Finding significant results with so few degrees of freedom strengthened our conviction that the magnitude of DIF depended on the differential performance of countries on the relevant attributes.

The second part of the study involved 10 further DIF analyses, again using the 10 countries as manifest groups. This time each country in turn was used to define the focal group while the other 9 served as the reference group. We detected a total of 90 items displaying DIF against the focal group across these 10 analyses (Appendix 6). We explored the attribute contents of the DIF items for 4 of these 10 analyses. There were 34 DIF items in these 4 analyses, and 32 of them involved at least one attribute on which the focal group performed significantly lower than the reference group. We also determined which attributes were common to the DIF items in each of these 4 analyses.

This study revealed one of the sources of DIF in international assessments: the differential performance of countries on the different cognitive demands that underlie the

test. Further studies should focus on building models that take into account other possible sources of DIF. An item showing DIF might be the worst item in a test, or it might be the best one, depending on the source of the DIF. Teasing out the effect of each source of DIF therefore makes all the difference. Future research should focus on doing just that, as this is the way to understand DIF rather than merely to detect it.
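The β, standard error, and p-value triplets reported per item above are the output of a logistic-regression style DIF analysis (the approach examined in the Gierl, Jodoin, and Ackerman reference below); the paper does not spell out its estimation details, so the following is a sketch under that assumption, on simulated data, with only the coefficients estimated by plain batch gradient descent:

```python
import math
import random

def fit_logistic(X, y, lr=0.5, iters=2000):
    """Fit a logistic regression by batch gradient descent.
    Returns weights [intercept, w1, ..., wd]."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(iters):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * gj / n for wj, gj in zip(w, grad)]
    return w

# Simulate responses to one studied item with uniform DIF of 0.8 logits
# against the focal group (group = 1), at equal underlying ability.
random.seed(7)
X, y = [], []
for _ in range(800):
    theta = random.gauss(0.0, 1.0)            # latent ability
    group = random.randint(0, 1)              # 1 = focal group
    total = theta + random.gauss(0.0, 0.4)    # noisy matching variable (score proxy)
    p = 1.0 / (1.0 + math.exp(-(theta - 0.8 * group)))
    X.append([total, group])
    y.append(1 if random.random() < p else 0)

w = fit_logistic(X, y)
# w[2] is the group coefficient: a clearly negative value here means the
# focal group is disadvantaged at a matched score, i.e., uniform DIF.
print("group coefficient:", round(w[2], 2))
```

Under this sign convention a negative group coefficient indicates disadvantage for the focal group; the paper's convention (positive β = DIF against the focal group) is simply the opposite sign.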

REFERENCES

Calvert, T. (2001). Exploring differential item functioning (DIF) with the Rasch model: A cross-country comparison of gender differences on eighth grade science items. Paper presented at the annual meeting of the American Educational Research Association (AERA), Seattle, WA.

Corter, J. E., & Tatsuoka, K. (2002). Cognitive and measurement foundations of diagnostic assessment in mathematics (Technical report). The College Board.

De Ayala, R. J., Kim, S., Stapleton, L. M., & Dayton, C. M. (1999). A reconceptualization of differential item functioning. Paper presented at the annual meeting of the American Educational Research Association (AERA).

Gierl, M. J., Jodoin, M. G., & Ackerman, T. A. (2000). Performance of Mantel-Haenszel, Simultaneous Item Bias Test, and logistic regression when the proportion of DIF items is large. Paper presented at the annual meeting of the American Educational Research Association (AERA).

Keeves, J. P., & Masters, G. N. (1999). Introduction. In G. N. Masters & J. P. Keeves (Eds.), Advances in measurement in educational research and assessment. Oxford: Pergamon.

Klieme, E., & Baumert, J. (2001). Identifying national cultures of mathematics education: Analysis of cognitive demands and differential item functioning in TIMSS. European Journal of Psychology of Education, 16(3).

Miller, M. D., & Linn, R. L. (1988). Invariance of item characteristic functions with variations in instructional coverage. Journal of Educational Measurement, 25(3).

Schmidt, W. H., Jakwerth, P. M., & McKnight, C. C. (1998). Curriculum sensitive assessment: Content does make a difference. International Journal of Educational Research, 29.

Shealy, R., & Stout, W. (1993). A model-based standardization approach that separates true bias/DIF from group ability differences and detects test bias/DTF as well as item bias/DIF. Psychometrika, 58(2).

Roussos, L., & Stout, W. (1996). A multidimensionality-based DIF analysis paradigm. Applied Psychological Measurement, 20(4).

Tatsuoka, K. K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory.
Journal of Educational Measurement, 20(4).

Tatsuoka, K. K., & Tatsuoka, M. M. (1983). Spotting erroneous rules of operation by the individual consistency index. Journal of Educational Measurement, 20(3).

Tatsuoka, K. K., Linn, R. L., & Yamamoto, K. (1988). Differential item functioning resulting from use of different solution strategies. Journal of Educational Measurement, 25.

Tatsuoka, K. K., & Tatsuoka, C. M. (1989). Rule space. In Kotz & Johnson (Eds.), Encyclopedia of statistical sciences. New York, NY: Wiley.

Van der Linden, W. J. (1998). A discussion of some methodological issues in international assessments. International Journal of Educational Research, 29.

APPENDIX 1

A List of Knowledge, Skill, and Process Attributes Derived to Explain Performance on the TIMSS-R (1999) Mathematics Items

CONTENT ATTRIBUTES
C1  Basic concepts, properties and operations in whole numbers and integers
C2  Basic concepts, properties and operations in fractions and decimals
C3  Basic concepts, properties and operations in elementary algebra
C4  Basic concepts and properties of two-dimensional geometry
C5  Data, probability, and basic statistics
C6  Using tools to measure (or estimating) length, time, angle, temperature

PROCESS ATTRIBUTES
P1  Translate/formulate equations and expressions to solve a problem
P2  Computational applications of knowledge in arithmetic and geometry
P3  Judgmental applications of knowledge in arithmetic and geometry
P4  Applying rules in algebra
P5  Logical reasoning, including case reasoning, deductive thinking skills, if-then, necessary and sufficient, and generalization skills
P6  Problem search; analytic thinking, problem restructuring, and inductive thinking
P7  Generating, visualizing, and reading figures and graphs
P8  Applying and evaluating mathematical correctness
P9  Management of data and procedures
P10 Quantitative and logical reading

SKILL (ITEM TYPE) ATTRIBUTES
S1  Unit conversion
S2  Apply number properties and relationships; number sense/number line
S3  Using figures, tables, charts and graphs
S4  Approximation/estimation
S5  Evaluate/verify/check options
S6  Patterns and relationships (be able to apply inductive thinking skills)
S7  Using proportional reasoning
S8  Solving novel or unfamiliar problems
S9  Comparison of two or more entities (deleted because of low frequencies in each booklet)
S10 Open-ended item, in which an answer is not given
S11 Using words to communicate questions (word problems)

APPENDIX 2

List of all 45 DIF analyses

Analysis  Focal Group   Reference Group
1   USA           JAPAN
2   USA           RUSSIA
3   USA           TURKEY
4   USA           KOREA
5   USA           CZECH R.
6   USA           CANADA
7   USA           NETHERLANDS
8   USA           AUSTRALIA
9   USA           FINLAND
10  JAPAN         RUSSIA
11  JAPAN         TURKEY
12  JAPAN         KOREA
13  JAPAN         CZECH R.
14  JAPAN         CANADA
15  JAPAN         NETHERLANDS
16  JAPAN         AUSTRALIA
17  JAPAN         FINLAND
18  RUSSIA        TURKEY
19  RUSSIA        KOREA
20  RUSSIA        CZECH R.
21  RUSSIA        CANADA
22  RUSSIA        NETHERLANDS
23  RUSSIA        AUSTRALIA
24  RUSSIA        FINLAND
25  TURKEY        KOREA
26  TURKEY        CZECH R.
27  TURKEY        CANADA
28  TURKEY        NETHERLANDS
29  TURKEY        AUSTRALIA
30  TURKEY        FINLAND
31  KOREA         CZECH R.
32  KOREA         CANADA
33  KOREA         NETHERLANDS
34  KOREA         AUSTRALIA
35  KOREA         FINLAND
36  CZECH R.      CANADA
37  CZECH R.      NETHERLANDS
38  CZECH R.      AUSTRALIA
39  CZECH R.      FINLAND
40  CANADA        NETHERLANDS
41  CANADA        AUSTRALIA
42  CANADA        FINLAND
43  NETHERLANDS   AUSTRALIA
44  NETHERLANDS   FINLAND
45  AUSTRALIA     FINLAND
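The 45 analyses above are exactly the 2-element combinations of the 10 countries, which is easy to generate programmatically:

```python
from itertools import combinations

countries = ["USA", "JAPAN", "RUSSIA", "TURKEY", "KOREA",
             "CZECH R.", "CANADA", "NETHERLANDS", "AUSTRALIA", "FINLAND"]

# (focal, reference) pairs in the listed order: C(10, 2) = 45
pairs = list(combinations(countries, 2))
print(len(pairs))   # 45
print(pairs[0])     # ('USA', 'JAPAN')
```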

APPENDIX 3

Mean Attribute Mastery Probabilities for 10 Countries

[The probability values in this table were not preserved in the transcription. Rows list the attributes (C1-C6, P1-P10, and the skill attributes); columns list the 10 countries: USA, Japan, Russia, Turkey, Korea, Czech R., Canada, Netherlands, Australia, Finland.]

APPENDIX 4

DIF Parameters (Beta Coefficients) Obtained from DIF Analyses for 5 Geometry Items

[The β values for items A5, C3, E2, I7, and T3 were not preserved in the transcription. Rows list the 45 focal-reference comparisons, in the same order as the analyses in Appendix 2 (USA-JAPAN through AUSTRALIA-FINLAND).]
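As described in the summary, the β coefficients tabulated here were regressed on attribute mastery probability differences (Appendix 3) across the 45 comparisons. Because the numeric values did not survive transcription, the sketch below uses invented numbers and a single predictor (the study used four); the point is only the direction of the slope:

```python
def ols_fit(x, y):
    """Least-squares slope and intercept for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Invented illustration: per-comparison mastery-probability differences
# (reference minus focal, so positive = focal group weaker) for one
# geometry attribute, and the corresponding item DIF parameters
# (positive = DIF against the focal group).
mastery_diff = [0.05, 0.12, -0.03, 0.20, 0.08, -0.10, 0.15]
dif_beta     = [0.10, 0.30, -0.05, 0.45, 0.20, -0.20, 0.35]

slope, intercept = ols_fit(mastery_diff, dif_beta)
print("slope:", round(slope, 2))  # positive, as hypothesized, for these invented data
```

A positive slope corresponds to the study's hypothesis: the weaker the focal group on the attribute, the more the item works against it.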

APPENDIX 5

Item T3

The figure shows a shaded rectangle inside a parallelogram. [The figure was not preserved in the transcription; a marked dimension of 3 cm is legible, with a second marked dimension lost.]

What is the area of the shaded rectangle?

Answer: ________

APPENDIX 6

DIF items among 10 countries

[The β, standard error, and p-value columns were not preserved in the transcription; DIF-class labels survive for only two rows. The table is also truncated: it breaks off partway through the KOREA rows.]

Focal Group  Reference Group  ITEM  DIF-class
USA          All other 9      A
USA          All other 9      A
USA          All other 9      C
USA          All other 9      C     negligible
USA          All other 9      E
USA          All other 9      E
USA          All other 9      I
USA          All other 9      I
USA          All other 9      T
JAPAN        All other 9      A
JAPAN        All other 9      A
JAPAN        All other 9      B
JAPAN        All other 9      B
JAPAN        All other 9      C
JAPAN        All other 9      E
JAPAN        All other 9      I
JAPAN        All other 9      S2b
JAPAN        All other 9      T
RUSSIA       All other 9      S2c
TURKEY       All other 9      B
TURKEY       All other 9      B
TURKEY       All other 9      B
TURKEY       All other 9      B
TURKEY       All other 9      C
TURKEY       All other 9      E
TURKEY       All other 9      E
TURKEY       All other 9      I     Large
TURKEY       All other 9      I
TURKEY       All other 9      S1a
TURKEY       All other 9      T
KOREA        All other 9      A
KOREA        All other 9      A
KOREA        All other 9      B
KOREA        All other 9      C


More information

THE ROLE OF COMPUTER BASED TECHNOLOGY IN DEVELOPING UNDERSTANDING OF THE CONCEPT OF SAMPLING DISTRIBUTION

THE ROLE OF COMPUTER BASED TECHNOLOGY IN DEVELOPING UNDERSTANDING OF THE CONCEPT OF SAMPLING DISTRIBUTION THE ROLE OF COMPUTER BASED TECHNOLOGY IN DEVELOPING UNDERSTANDING OF THE CONCEPT OF SAMPLING DISTRIBUTION Kay Lipson Swinburne University of Technology Australia Traditionally, the concept of sampling

More information

Q-Matrix Development. NCME 2009 Workshop

Q-Matrix Development. NCME 2009 Workshop Q-Matrix Development NCME 2009 Workshop Introduction We will define the Q-matrix Then we will discuss method of developing your own Q-matrix Talk about possible problems of the Q-matrix to avoid The Q-matrix

More information

WORLD GEOGRAPHY GRADE 10

WORLD GEOGRAPHY GRADE 10 Parent / Student Course Information SOCIAL STUDIES WORLD GEOGRAPHY GRADE 10 Counselors are available to assist parents and students with course selections and career planning. Parents may arrange to meet

More information

Compiled by Dr. Yuwadee Wongbundhit, Executive Director Curriculum and Instruction

Compiled by Dr. Yuwadee Wongbundhit, Executive Director Curriculum and Instruction SUNSHINE STATE STANDARDS MATHEMATICS ENCHMARKS AND 2005 TO 2009 CONTENT FOCUS Compiled by Dr. Yuwadee Wongbundhit, Executive Director OVERVIEW The purpose of this report is to compile the content focus

More information

Smarter Balanced Assessment Consortium Claims, Targets, and Standard Alignment for Math Interim Assessment Blocks

Smarter Balanced Assessment Consortium Claims, Targets, and Standard Alignment for Math Interim Assessment Blocks Smarter Balanced Assessment Consortium Claims, Targets, and Standard Alignment for Math Interim Assessment Blocks The Smarter Balanced Assessment Consortium (SBAC) has created a hierarchy comprised of

More information

Trends in Human Development Index of European Union

Trends in Human Development Index of European Union Trends in Human Development Index of European Union Department of Statistics, Hacettepe University, Beytepe, Ankara, Turkey spxl@hacettepe.edu.tr, deryacal@hacettepe.edu.tr Abstract: The Human Development

More information

Analysis of Covariance. The following example illustrates a case where the covariate is affected by the treatments.

Analysis of Covariance. The following example illustrates a case where the covariate is affected by the treatments. Analysis of Covariance In some experiments, the experimental units (subjects) are nonhomogeneous or there is variation in the experimental conditions that are not due to the treatments. For example, a

More information

The pre-knowledge of the physics I students on vectors

The pre-knowledge of the physics I students on vectors The pre-knowledge of the physics I students on vectors M R Mhlongo, M E Sithole and A N Moloi Physics Department, University of Limpopo, Medunsa Campus. RSA mrebecca@ul.ac.za Abstract. The purpose of this

More information

Area1 Scaled Score (NAPLEX) .535 ** **.000 N. Sig. (2-tailed)

Area1 Scaled Score (NAPLEX) .535 ** **.000 N. Sig. (2-tailed) Institutional Assessment Report Texas Southern University College of Pharmacy and Health Sciences "An Analysis of 2013 NAPLEX, P4-Comp. Exams and P3 courses The following analysis illustrates relationships

More information

Mifflin County School District Planned Instruction

Mifflin County School District Planned Instruction Mifflin County School District Planned Instruction Title of Planned Instruction: Advanced Algebra II Subject Area: Mathematics Grade Level: Grades 9-12 Prerequisites: Algebra I with a grade of A or B Course

More information

NORTH ALLEGHENY SCHOOL DISTRICT MATHEMATICS DEPARTMENT HONORS PRE-CALCULUS SYLLABUS COURSE NUMBER: 3421

NORTH ALLEGHENY SCHOOL DISTRICT MATHEMATICS DEPARTMENT HONORS PRE-CALCULUS SYLLABUS COURSE NUMBER: 3421 NORTH ALLEGHENY SCHOOL DISTRICT MATHEMATICS DEPARTMENT HONORS PRE-CALCULUS SYLLABUS COURSE NUMBER: 3421 Units of Credit: 1.0 credits, honors weight Course Length: 184 days (full year) Course Overview This

More information

Career and College Readiness in Terms of Next Generation Science Standards (NGSS)

Career and College Readiness in Terms of Next Generation Science Standards (NGSS) Career and College Readiness in Terms of Next Generation Science Standards (NGSS) Michigan An NGSS Lead State Partner Next Generation Science Standards for Today s Students and Tomorrow s Workforce Building

More information

Testing and Interpreting Interaction Effects in Multilevel Models

Testing and Interpreting Interaction Effects in Multilevel Models Testing and Interpreting Interaction Effects in Multilevel Models Joseph J. Stevens University of Oregon and Ann C. Schulte Arizona State University Presented at the annual AERA conference, Washington,

More information

TASC Test Math Practice Items

TASC Test Math Practice Items TASC Test Practice Items Use these items to practice for the TASC subtest. Before you begin, review the information below about Using Gridded Response Item Blocks. Once you reach the end of the test, check

More information

Amarillo ISD Algebra II Standards

Amarillo ISD Algebra II Standards Amarillo Independent School District follows the Texas Essential Knowledge and Skills (TEKS). All of AISD curriculum and documents and resources are aligned to the TEKS. The State of Texas State Board

More information

MATHEMATICS (IX-X) (Code No. 041)

MATHEMATICS (IX-X) (Code No. 041) MATHEMATICS (IX-X) (Code No. 041) The Syllabus in the subject of Mathematics has undergone changes from time to time in accordance with growth of the subject and emerging needs of the society. The present

More information

Planned Course: Algebra IA Mifflin County School District Date of Board Approval: April 25, 2013

Planned Course: Algebra IA Mifflin County School District Date of Board Approval: April 25, 2013 : Algebra IA Mifflin County School District Date of Board Approval: April 25, 2013 Glossary of Curriculum Summative Assessment: Seeks to make an overall judgment of progress made at the end of a defined

More information

Item Response Theory (IRT) Analysis of Item Sets

Item Response Theory (IRT) Analysis of Item Sets University of Connecticut DigitalCommons@UConn NERA Conference Proceedings 2011 Northeastern Educational Research Association (NERA) Annual Conference Fall 10-21-2011 Item Response Theory (IRT) Analysis

More information

STOCKHOLM UNIVERSITY Department of Economics Course name: Empirical Methods Course code: EC40 Examiner: Lena Nekby Number of credits: 7,5 credits Date of exam: Saturday, May 9, 008 Examination time: 3

More information

PARCC MODEL CONTENT FRAMEWORKS MATHEMATICS GRADE 8. Version 3.0 November 2012

PARCC MODEL CONTENT FRAMEWORKS MATHEMATICS GRADE 8. Version 3.0 November 2012 PARCC MODEL CONTENT FRAMEWORKS MATHEMATICS GRADE 8 Version 3.0 November 2012 PARCC MODEL CONTENT FRAMEWORK FOR MATHEMATICS FOR GRADE 8 Examples of Key Advances from Grade 7 to Grade 8 Students build on

More information

Amarillo ISD Math Curriculum

Amarillo ISD Math Curriculum Amarillo Independent School District follows the Texas Essential Knowledge and Skills (TEKS). All of AISD curriculum and documents and resources are aligned to the TEKS. The State of Texas State Board

More information

Design of Experiments

Design of Experiments Design of Experiments DOE Radu T. Trîmbiţaş April 20, 2016 1 Introduction The Elements Affecting the Information in a Sample Generally, the design of experiments (DOE) is a very broad subject concerned

More information

Algebra 1. Mathematics Course Syllabus

Algebra 1. Mathematics Course Syllabus Mathematics Algebra 1 2017 2018 Course Syllabus Prerequisites: Successful completion of Math 8 or Foundations for Algebra Credits: 1.0 Math, Merit The fundamental purpose of this course is to formalize

More information

TASC Test Math Practice Items

TASC Test Math Practice Items TASC Test Practice Items Use these items to practice for the TASC subtest. Before you begin, review the information below titled Using Gridded- Response Item Blocks. Once you reach the end of the test,

More information

An Analysis of Field Test Results for Assessment Items Aligned to the Middle School Topic of Atoms, Molecules, and States of Matter

An Analysis of Field Test Results for Assessment Items Aligned to the Middle School Topic of Atoms, Molecules, and States of Matter An Analysis of Field Test Results for Assessment Items Aligned to the Middle School Topic of Atoms, Molecules, and States of Matter Cari F. Herrmann Abell and George E. DeBoer AAAS Project 2061 NARST Annual

More information

LINKING IN DEVELOPMENTAL SCALES. Michelle M. Langer. Chapel Hill 2006

LINKING IN DEVELOPMENTAL SCALES. Michelle M. Langer. Chapel Hill 2006 LINKING IN DEVELOPMENTAL SCALES Michelle M. Langer A thesis submitted to the faculty of the University of North Carolina at Chapel Hill in partial fulfillment of the requirements for the degree of Master

More information

Solving Algebraic Equations in one variable

Solving Algebraic Equations in one variable Solving Algebraic Equations in one variable Written by Dave Didur August 19, 014 -- Webster s defines algebra as the branch of mathematics that deals with general statements of relations, utilizing letters

More information

Essential Academic Skills Subtest III: Mathematics (003)

Essential Academic Skills Subtest III: Mathematics (003) Essential Academic Skills Subtest III: Mathematics (003) NES, the NES logo, Pearson, the Pearson logo, and National Evaluation Series are trademarks in the U.S. and/or other countries of Pearson Education,

More information

300-Level Math Courses

300-Level Math Courses 300-Level Math Courses Math 250: Elementary Differential Equations A differential equation is an equation relating an unknown function to one or more of its derivatives; for instance, f = f is a differential

More information

PP

PP Algebra I Reviewers Nancy Akhavan Lee Richmond School Hanford, CA Kristi Knevelbaard Caruthers Elementary School Caruthers, CA Sharon Malang Mickey Cox Elementary School Clovis, CA Jessica Mele Valley

More information

A Learning Progression for Complex Numbers

A Learning Progression for Complex Numbers A Learning Progression for Complex Numbers In mathematics curriculum development around the world, the opportunity for students to study complex numbers in secondary schools is decreasing. Given that the

More information

Center for Advanced Studies in Measurement and Assessment. CASMA Research Report

Center for Advanced Studies in Measurement and Assessment. CASMA Research Report Center for Advanced Studies in Measurement and Assessment CASMA Research Report Number 25 A Analysis of a Large-scale Reading Comprehension Test. Dongmei Li Robert L. Brennan August 2007 Robert L.Brennan

More information

TABLE OF CONTENTS POLYNOMIAL EQUATIONS AND INEQUALITIES

TABLE OF CONTENTS POLYNOMIAL EQUATIONS AND INEQUALITIES COMPETENCY 1.0 ALGEBRA TABLE OF CONTENTS SKILL 1.1 1.1a. 1.1b. 1.1c. SKILL 1.2 1.2a. 1.2b. 1.2c. ALGEBRAIC STRUCTURES Know why the real and complex numbers are each a field, and that particular rings are

More information

Fairfield Public Schools

Fairfield Public Schools Mathematics Fairfield Public Schools PRE-CALCULUS 40 Pre-Calculus 40 BOE Approved 04/08/2014 1 PRE-CALCULUS 40 Critical Areas of Focus Pre-calculus combines the trigonometric, geometric, and algebraic

More information

ANCOVA. ANCOVA allows the inclusion of a 3rd source of variation into the F-formula (called the covariate) and changes the F-formula

ANCOVA. ANCOVA allows the inclusion of a 3rd source of variation into the F-formula (called the covariate) and changes the F-formula ANCOVA Workings of ANOVA & ANCOVA ANCOVA, Semi-Partial correlations, statistical control Using model plotting to think about ANCOVA & Statistical control You know how ANOVA works the total variation among

More information

Part 2 Number and Quantity

Part 2 Number and Quantity Part Number and Quantity Copyright Corwin 08 Number and Quantity Conceptual Category Overview Students have studied number from the beginning of their schooling. They start with counting. Kindergarten

More information

Working Scientifically Physics Equations and DfE Maths skills BOOKLET 1

Working Scientifically Physics Equations and DfE Maths skills BOOKLET 1 Working Scientifically Physics Equations and DfE Maths skills BOOKLET 1 Published date: Summer 2016 version 1 3 Working scientifically Science is a set of ideas about the material world. We have included

More information

Geometry Mathematics Item and Scoring Sampler 2018

Geometry Mathematics Item and Scoring Sampler 2018 Geometry Mathematics Item and Scoring Sampler 2018 COPYRIGHT GEORGIA DEPARTMENT OF EDUCATION. ALL RIGHTS RESERVED. TABLE OF CONTENTS Introduction.... 1 Types of Items Included in the Sampler and Uses of

More information

Content Descriptions Based on the state-mandated content standards. Analytic Geometry

Content Descriptions Based on the state-mandated content standards. Analytic Geometry Content Descriptions Based on the state-mandated content standards Analytic Geometry Introduction The State Board of Education is required by Georgia law (A+ Educational Reform Act of 2000, O.C.G.A. 20-2-281)

More information

Trinity Christian School Curriculum Guide

Trinity Christian School Curriculum Guide Course Title: Calculus Grade Taught: Twelfth Grade Credits: 1 credit Trinity Christian School Curriculum Guide A. Course Goals: 1. To provide students with a familiarity with the properties of linear,

More information

Scaling Methodology and Procedures for the TIMSS Mathematics and Science Scales

Scaling Methodology and Procedures for the TIMSS Mathematics and Science Scales Scaling Methodology and Procedures for the TIMSS Mathematics and Science Scales Kentaro Yamamoto Edward Kulick 14 Scaling Methodology and Procedures for the TIMSS Mathematics and Science Scales Kentaro

More information

Westerville City Schools COURSE OF STUDY Math Lab for Algebra MA101. Course Details

Westerville City Schools COURSE OF STUDY Math Lab for Algebra MA101. Course Details : Recommended Grade Level: 9, 10 Westerville City Schools COURSE OF STUDY for Algebra MA101 Course Details Course Length: Semester Credits:.25/semester mathematics elective credit Course Weighting: 1.0

More information

REVIEW 8/2/2017 陈芳华东师大英语系

REVIEW 8/2/2017 陈芳华东师大英语系 REVIEW Hypothesis testing starts with a null hypothesis and a null distribution. We compare what we have to the null distribution, if the result is too extreme to belong to the null distribution (p

More information

PREDICTING THE DISTRIBUTION OF A GOODNESS-OF-FIT STATISTIC APPROPRIATE FOR USE WITH PERFORMANCE-BASED ASSESSMENTS. Mary A. Hansen

PREDICTING THE DISTRIBUTION OF A GOODNESS-OF-FIT STATISTIC APPROPRIATE FOR USE WITH PERFORMANCE-BASED ASSESSMENTS. Mary A. Hansen PREDICTING THE DISTRIBUTION OF A GOODNESS-OF-FIT STATISTIC APPROPRIATE FOR USE WITH PERFORMANCE-BASED ASSESSMENTS by Mary A. Hansen B.S., Mathematics and Computer Science, California University of PA,

More information

TIMSS 2011 The TIMSS 2011 Instruction to Engage Students in Learning Scale, Fourth Grade

TIMSS 2011 The TIMSS 2011 Instruction to Engage Students in Learning Scale, Fourth Grade TIMSS 2011 The TIMSS 2011 Instruction to Engage Students in Learning Scale, Fourth Grade The Instruction to Engage Students in Learning (IES) scale was created based on teachers responses to how often

More information