SGarza.BroadChapters1-3 History


March 06, 2012, at 04:10 PM CST by 192.168.198.85 -
Changed line 173 from:
%color=purple% End of Denise's definitions
to:
%color=purple% End of Denise's section
March 06, 2012, at 03:24 PM CST by 192.168.198.85 -
Changed lines 160-177 from:
%color=purple% Denise's Section
to:
%color=purple% Denise's Section

scoring guide: in formal evaluation of writing, explanation for judges of the procedures by which the writing is to be scored, including description of different performance levels, the rubric to be followed, anchor essays, etc. (from comppile.org)


rubric: set of criteria, sometimes with scaled values, used to train raters or scorers in formal assessment of extended pieces of discourse, usually whole essays; use of similar criterial frames in the writing classroom (from comppile.org)

reliability: In language testing, issues connected with the degree to which a measurement outcome will be duplicated when performed again under similar conditions; for instance when a test is given again (TEST-RETEST), when a person's writing ability is tested again (WRITER-RELIABILITY), when a person scores a piece of writing again (RATER-RELIABILITY), or when a piece of writing is scored by more than one person (INTERRATER-RELIABILITY) (from comppile.org)

instructional validity: Validation of language testing that correlates results with subsequent academic performance (from comppile.org)

data: Any study that systematically collects and reports facts usable in further study, through whatever research method (interview, ethnography, experimentation, descriptive measurement, case study, etc.); discussion of the rhetorical or methodological use of data. (from comppile.org)

%color=purple% End of Denise's definitions
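To make the "scaled values" idea in the rubric definition above concrete, here is a minimal sketch of a rubric as weighted criteria combined into a holistic score. The criterion names, weights, and four-point scale are invented for illustration, not taken from Broad or comppile.org.

```python
# Illustrative sketch only: criterion names, weights, and the 0-4 scale
# are invented, not drawn from any actual scoring guide.
RUBRIC = {
    # criterion: (weight, max points on the scale)
    "focus":        (0.3, 4),
    "organization": (0.3, 4),
    "style":        (0.2, 4),
    "mechanics":    (0.2, 4),
}

def holistic_score(ratings):
    """Weighted average of per-criterion ratings, normalized to the 0-4 scale."""
    total = 0.0
    for criterion, (weight, max_points) in RUBRIC.items():
        points = ratings[criterion]
        if not 0 <= points <= max_points:
            raise ValueError(f"{criterion} rating out of range")
        total += weight * points
    return round(total, 2)

print(holistic_score({"focus": 4, "organization": 3, "style": 3, "mechanics": 2}))
```

A scoring guide in Broad's sense would also include the prose descriptions of each performance level and anchor essays; this sketch captures only the arithmetic of scaled criteria.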


March 06, 2012, at 03:13 PM CST by 192.168.198.85 -
Changed line 160 from:
%color=purple% Denise
to:
%color=purple% Denise's Section
March 06, 2012, at 03:12 PM CST by 192.168.198.85 -
March 06, 2012, at 03:11 PM CST by 192.168.198.85 -
Changed lines 158-160 from:
'''HOWARD OUT'''
to:
'''HOWARD OUT'''

%color=purple% Denise
Changed line 97 from:
"Factor analysis is a collection of methods used to examine how underlying constructs the
to:
"Factor analysis is a collection of methods used to examine how underlying constructs influence the
Changed lines 94-98 from:
Factor analysis\\ (from JanetW)
to:
(JanetW interrupting Ben's entry)

Factor analysis\\
"Factor analysis is a collection of methods used to examine how underlying constructs the
responses on a number of measured variables." [For more see:
Added lines 100-101:

(End of JanetW's interruption)
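As a rough illustration of the factor-analysis idea quoted above (underlying constructs influencing responses on several measured variables), here is a minimal simulation. The loadings, sample size, and noise level are invented, and the leading eigenvalue of the correlation matrix stands in for a full factor extraction.

```python
# Minimal sketch (not Broad's method): simulate one latent construct
# ("writing ability") driving three observed rubric scores, then show
# how strongly a single factor accounts for their shared variance.
import numpy as np

rng = np.random.default_rng(0)
n = 500
ability = rng.normal(size=n)            # latent construct, never observed directly
loadings = np.array([0.9, 0.8, 0.7])    # invented: how strongly each measure reflects it
noise = rng.normal(scale=0.4, size=(n, 3))
scores = ability[:, None] * loadings + noise   # observed variables

# The shared construct shows up as strong inter-item correlation:
corr = np.corrcoef(scores, rowvar=False)

# The largest eigenvalue of the correlation matrix approximates the
# proportion of variance a one-factor model would explain:
eigvals = np.linalg.eigvalsh(corr)
first = eigvals[-1] / eigvals.sum()
print(corr.round(2))
print(f"one factor explains {first:.0%} of the variance")
```

Real factor-analytic work (see the stat-help.com PDF linked above) adds rotation, communality estimates, and model-fit checks that this sketch omits.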
Changed lines 93-95 from:
Factor analysis\\
to:

Factor analysis\\ (from JanetW)
http://www.stat-help.com/factor.pdf
Added line 125:
Changed lines 129-132 from:
Constructivism (from www.word-iq.com)
In education, constructivism is a learning theory which holds that knowledge is not transmitted unchanged from teacher to student, but instead that learning is an active process of learning. Constructivists teach techniques that place emphasis on the role of learning activities in a good curriculum. See constructivism (pedagogical). \\

Grounded theory (definition from Corbin, J.M. Basics of Qualitative Research, 2008,
to:

'''
Constructivism''' (from www.word-iq.com) In education, constructivism is a learning theory which holds that knowledge is not transmitted unchanged from teacher to student, but instead that learning is an active process of learning. Constructivists teach techniques that place emphasis on the role of learning activities in a good curriculum. See constructivism (pedagogical).

'''
Grounded theory''' (definition from Corbin, J.M. Basics of Qualitative Research, 2008,
Changed lines 129-133 from:
In education, constructivism is a learning theory which holds that knowledge is not transmitted unchanged from teacher to student, but instead that learning is an active process of learning. Constructivists teach techniques that place emphasis on the role of learning activities in a good curriculum. See constructivism (pedagogical).
Grounded theory (definition from Corbin, J.M. Basics of Qualitative Research, 2008)
(from Corbin
, J.M. Basics of Qualitative Research, 2008).
http://books.google.com/books?hl=en&lr=&id=0TI8Ugvy2Z4C&oi=fnd&pg=PT7&dq=definition+grounded+theory&ots=bkPdojPQTv&sig=V16dVcrPvxch415ALJJGzzDQXQ8#v=onepage&q=definition%20grounded%20theory&f=false
to:
In education, constructivism is a learning theory which holds that knowledge is not transmitted unchanged from teacher to student, but instead that learning is an active process of learning. Constructivists teach techniques that place emphasis on the role of learning activities in a good curriculum. See constructivism (pedagogical). \\

Grounded theory (definition from Corbin, J.M. Basics of Qualitative Research, 2008,
(from Corbin,
J.M. Basics of Qualitative Research, 2008, located on googlebooks)-
Added lines 126-136:

JanetW interrupting Ben's notes with proposed definitions for constructivism and grounded theory:
Constructivism (from www.word-iq.com)
In education, constructivism is a learning theory which holds that knowledge is not transmitted unchanged from teacher to student, but instead that learning is an active process of learning. Constructivists teach techniques that place emphasis on the role of learning activities in a good curriculum. See constructivism (pedagogical).
Grounded theory (definition from Corbin, J.M. Basics of Qualitative Research, 2008)
(from Corbin, J.M. Basics of Qualitative Research, 2008).
http://books.google.com/books?hl=en&lr=&id=0TI8Ugvy2Z4C&oi=fnd&pg=PT7&dq=definition+grounded+theory&ots=bkPdojPQTv&sig=V16dVcrPvxch415ALJJGzzDQXQ8#v=onepage&q=definition%20grounded%20theory&f=false

A specific methodology developed by Glaser and Strauss (1967) for the purpose of building theory from data. In this book grounded theory is used in a more generic sense to denote theoretical constructs derived from qualitative analysis of data.

(End of JanetW's interruption of Ben's notes)
Added lines 120-123:
from m-w.com: Definition of COMPOSITIONIST
: a teacher of writing especially in a college or university
First Known Use of COMPOSITIONIST: 1985 (JanetW)
Changed lines 82-85 from:
Interest
Tone
Legibility
Normative and formative purposes
to:
Interest\\
Tone \\
Legibility\\
Normative and formative purposes\\
Changed lines 92-94 from:
Statistical analysis
Factor analysis
to:
Statistical analysis\\
Factor analysis\\
Changed lines 105-109 from:
Validity theory
Flux capacitor
Inter-rater agreement
Dynamic criteria mapping (DCM)
to:
Validity theory\\
Flux capacitor\\
Inter-rater agreement\\
Dynamic criteria mapping (DCM)\\
Changed lines 119-123 from:
Compositionists
Communal Evaluation
Constructivist Grounded Theory
to:
Compositionists\\
Communal Evaluation\\
Constructivist Grounded Theory\\
Changed lines 128-129 from:
Textual Qualities
Textual Features
to:
Textual Qualities\\
Textual Features\\
Added line 78:
Added line 134:
Added lines 78-133:
'''BEN HERE'''
I'm going to need help with these. Hopefully I came up with something useful; I'm pretty sure I just reiterated a bunch that other people have already put on here.

Interest
Tone
Legibility
Normative and formative purposes
Descriptive and informative potential

Nietzschean and Foucauldian: our epistemological best. (which means doing our best as far as our knowledge goes)

Self authorship: the ability to collect, interpret, and analyze information and reflect on own beliefs in order to form judgements

Statistical analysis
Factor analysis

Positivist, experimentalist paradigm: I think this means positivist theory of knowledge, see below.

Psychometrics: a branch of psychology dealing with the measurement of mental traits, capacities, and processes

Positivist Theory of Knowledge: the theory that knowledge can be acquired only through direct observation and experimentation, and not through metaphysics or theology.

Positivist Psychometric: The theory that knowledge can only be measured through direct observation and experimentation, and not through metaphysics or theology?

Scientistic: use of scientific method. I'm not sure how it applies in context...

Validity theory
Flux capacitor
Inter-rater agreement
Dynamic criteria mapping (DCM)

Generalizability: not a word

Hermeneutic: relating to or consisting in the interpretation of texts. Serving to interpret or explain something
grounded theory approach

Emic: Analyzing structural and functional elements. Using categories of people studied. Relating to the organization and interpretation of data that makes use of the categories of the people being studied.

Emically: Not a word

Compositionists
Communal Evaluation
Constructivist Grounded Theory


Concurrent Analysis: Not a clue. In context, it seems like he just meant to say

QSR Nvivo Software: http://www.qsrinternational.com/#tab_you

Textual Qualities
Textual Features

Significance: an intellectual, emotional, and poetic experience indicating that the reader experiences something meaningful, weighty, important, worthwhile, or affecting during her encounter with the text.

Epistemic: about knowledge or relating to knowledge
'''HOWARD OUT'''
Added lines 52-78:
'''Eda's Definitions'''
Definition of Truth: "doing our epistemological best" (3)

Self-authorship: "the ability to collect, interpret, and analyze information and reflect on one's own beliefs in order to form judgements" (3)

Direct assessment of writing: "assessment that took actual writing as the object of judgement" (8)

Inter-rater agreement: "a key aspect of reliability" (10)

Michael Neal's unified theory of validity: criterion-related validity

reliability (especially inter-rater agreement)

traditional construct validity

content validity

social consequences

Dynamic Criteria Mapping: "a streamlined form of qualitative inquiry that yields a detailed, complex, and useful portrait of any writing program's evaluative dynamics" (13)

Constructivist: "is used to designate the methodology actually employed in doing evaluation. It has its roots in an inquiry paradigm that is an alternative to the scientific paradigm" (14)

Significance: 40-41

'''End of Eda's definitions'''
Added line 46:
Changed lines 42-50 from:
to:
-
Omar Corona

'''Superior Writing''' Page 2 - The student whose texts follow the guidelines and expectations explained on page 2.
'''Incompetent Writing''' Page 2 - The student whose writing fails to live up to these expectations and does not meet the qualifications of a superior writer.

These terms spoke to me due to the idea of there being more to writing than just the expectations of those who have set up the guidelines and rules of writing a text. Labels such as these are becoming harder to apply to today's writers, as there are fewer and fewer methods of determining what qualifies as a superior text.

------------
Deleted lines 32-33:

Truth - Page 3. Doing our epistemological best.
Deleted line 23:
Added lines 31-43:

'''Willma's Post Beginning'''

Truth - Page 3. Doing our epistemological best.

Educational Testing Service - Page 5 - Develops, administers and scores more than 50 million assessment tests annually in more than 180 countries, at more than 9,000 locations worldwide. In addition to assessments, we conduct educational research, analysis and policy studies and develop a variety of customized services and products for teacher certification, English language learning and elementary, secondary and postsecondary education. http://www.ets.org/about/fast_facts

Constructivist Evaluation - Page 14 - Evaluation used to designate the methodology actually employed in doing an evaluation. Other names for Constructivist Evaluation are Interpretive and Hermeneutic (Guba and Lincoln's emphasis).

Emically - Page 16 - To develop an analysis by drawing it out of the words and concepts of the (My-Broad's) research participants.

'''Willma's Post Ending'''
Changed lines 21-31 from:
to:
Reliability and validity in educational assessment: (JanetW quoting from
http://www.ncrel.org/sdrs/areas/issues/methods/assment/as5relia.htm)


In Chapter 5 of A Tool Kit for Professional Developers: Alternative Assessment, the authors suggest guidelines for evaluating the quality of assessment instruments. Good assessment requires minimizing factors that could lead to misinterpretation of results. Three criteria for meeting this requirement are reliability, validity, and fairness.

Reliability is defined as "an indication of the consistency of scores across evaluators or over time." An assessment is considered reliable when the same results occur regardless of when the assessment occurs or who does the scoring. There should be compelling evidence to show that results are consistent across raters and across scoring occasions.

Validity is defined as "an indication of how well an assessment actually measures what it is supposed to measure." The chapter identifies three aspects of an assessment that must be evaluated for validity: tasks, extraneous interference, and consequences.

Laboratory Network Program. (1993). A Tool Kit for Professional Developers: Alternative Assessment. Portland, Oregon: Northwest Regional Educational Laboratory.
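The rater-to-rater consistency described above is often quantified with Cohen's kappa, which corrects raw percent agreement for the agreement two raters would reach by chance. The raters and pass/fail scores in this sketch are invented for illustration; this is one common statistic, not the NCREL chapter's own procedure.

```python
# Hedged sketch: Cohen's kappa for two raters scoring the same eight
# papers. All data below are invented for illustration.
from collections import Counter

rater_a = ["pass", "pass", "fail", "pass", "fail", "pass", "fail", "fail"]
rater_b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "fail"]

def cohens_kappa(a, b):
    n = len(a)
    # observed agreement: fraction of papers the raters scored identically
    observed = sum(x == y for x, y in zip(a, b)) / n
    # chance agreement: probability both raters pick a label independently,
    # based on how often each rater uses each label
    pa, pb = Counter(a), Counter(b)
    expected = sum(pa[k] * pb[k] for k in set(a) | set(b)) / n**2
    return (observed - expected) / (1 - expected)

kappa = cohens_kappa(rater_a, rater_b)
print(f"kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance level
```

Kappa near 1.0 indicates the scoring is reliable across raters; values near 0 mean the raters agree no more often than chance, which undermines any validity claim built on their scores.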
Changed line 16 from:
"I propose this definition of truth...truth means doing our epistemiological best" (3) (JanetW).
to:
"I propose this working definition of truth...truth means doing our epistemological best" (3) (JanetW).
Added line 12:
Added lines 12-21:
---


"I propose this definition of truth...truth means doing our epistemiological best" (3) (JanetW).

Definition of EPISTEMOLOGY (from m-w.com):
the study or a theory of the nature and grounds of knowledge especially with reference to its limits and validity (JanetW).

Added lines 1-11:
Charles Riss

'''Self authorship'''- ability to collect, interpret, and analyze information and reflect on own beliefs in order to form judgements (3).

'''Inter-rater agreement'''- An agreement among assessors with minimal variance (a key aspect of reliability). Can there be validity without reliability? (10). Reliability is not a quality of an assessment, but rather of the quality of the decisions people make (10).

'''Shared Evaluation'''- "[For] assessments to be relevant, valid, and fair, [...] it must judge students according to the same skills and values by which they have been [taught]" (11). This is similar to the chicken-and-egg scenario. How is one assessor (instructor) to know what each student has been taught? Broad never discusses the quantity of papers or the subject matter, which in my opinion make a difference. Is it fair to hold a freshman business student to the same standards as a science, nursing, or English major? Hardly so, unless the topic is so ambiguous that it could be masked by selective word choices, and then we are back at the chicken and the egg, or the dog chasing its own tail. Ultimately, what I'm getting at is that if we want to assess the power and insight of the writing, it needs to be done by scholars in each particular subject, not just random assessors. And then again, if judging hundreds of papers at one time, how often does that occur? Are we talking about classroom assessment of approximately 20-25 students, in which the instructor should be able to explain the writing instructions with clarity, or are we talking about a lecture hall full of potential 1301 students? All of this matters, but Broad mentions classroom pedagogy several times yet fails to mention admission testing.

I'm not trying to be cynical about Broad, but when he claims that "open and meaningful discussion and debate [...] would provide participants with one of the most powerful opportunities for professional development available to teachers of writing" (12), I have to ask myself what more we could negotiate here. Shouldn't each instructor be able to validate their classroom writing expectations through a rubric that students could use as a reference guide?

'''Dynamic Criteria Mapping'''- "a streamlined form of qualitative inquiry that yields a detailed, complex, and useful portrait of any writing program's evaluative dynamics" (13).