Historical paths or roots
- programmed instruction -
B.F. Skinner
- CAI
- a teaching machine
movement
- effective but repetitive and could be boring
- Federal support of Education 1957
- network of labs and centers - mid 60's
- curriculum development
- produced models and designs
- techniques are primarily for K-6
- a few great guidelines for adult ed.
- The Department of Defense produced a 4-volume set,
one for each branch
- Human Performance Engineering
- Organization Development
- Individual Employee Growth
In the area of education
Instructional Systems Design (ISD)
focuses on the acquisition and application of knowledge.
It's an outgrowth of
- technical advancements of 'media'
- educational psychology - learning
- business and psych - systems learning
Characteristics
- achieve specific and measurable objectives
- process is intended to yield a product, whose results
are replicable
- demonstrably effective - data has been collected that supports
usage and drives revisions
Cycle of development
- formulation of objectives
- materials development
- field testing under conditions of intended use.
- revisions
- recycling
Process
- formulation of objectives
the 'needs analysis'
compare status quo with the 'ideal situation'
- identify resources
- identify subject matter expert
- what instructional modules exist at this time
- what's the budget, content, and timelines
- identify the manager of the product
What need are you meeting?
Materials Development
types of materials to be generated
- instructional material and procedures for learners
- and for the instructor
- assessment materials
- diagnostic for placement
- diagnostic within a program - how are we doing?
- end of unit tests - have the objectives been met?
Formative Evaluation
- formative evaluation
field test or tryout i.e. use under conditions of intended usage
with data collected focusing on
- effectiveness
- attitudes of students and teacher(s)
- transactional variables e.g. Was the manual used?
Could the students use the computer?
Could the teacher use the computer?
- revisions - checking expectations or objectives which were NOT met
- continue through the cycle as time, budget, and attitudes permit
Ed Techs are applied researchers.
Emphasis is on process 'models' with revisions!!
Focus is on instruction with no particular development model.
Focus the thinking of 'content specialists' - a kind of teamwork.
- construct and/or acquire
- test
- give and receive feedback
- REPEAT = goto #1
Objectives
- define content
- what is important to know or do
- they structure instruction
- test students
- students and instructor 'know' what is expected of them
Sequence - of objectives
The activities are the means
of learning the objective.
Information delivered via instruction is
- keyed to the specific objective - no anecdotes, nothing extra
- clearly stated
- definitions
- rules
- characteristics
- procedures
- concepts/principles
- examples and nonexamples (for concepts)
- range of content/difficulty
teach the rule, not just exceptions
don't confuse the learners
- sample problems
work it through for them
- delivered or given objective by objective
Practice
- learn by doing
- the exact skill that's in the objective
- check verb
- check givens (performance conditions)
- directions should be simple with no new information
- distributed not massed practice
- give range of content/difficulty
- if possible give summary practice
if many skills are drawn together
Information Processing Model
yields a flow chart of operations or procedures
steps of performance imply steps of Instructional Design (ID)
Task classifications
5 categories
- Intellectual Skills - how to do something
  of an intellectual nature - in terms of performance (show, apply, carry out)
  molecular learning
  component parts - discrimination learning (color, size, shape)
- Cognitive Strategies
  an internal process
  the learner brings a strategy or approach to bear on a particular problem
- Verbal Information - stored in memory (state, name, describe)
  less worthwhile than Intellectual Skills
  is a prerequisite to Intellectual Skills
- Motor Skills
  executing a performance of bodily movement
- Attitude Learning
  choosing a course of personal action
  helping people to modify an action
Learning task analysis
- start with target objective and work backwards
- what does the person need to do to carry out the terminal objective?
- at some point make a cut off for entry behaviors
Specify criteria to judge objectives and subobjectives
- is it an observable learner behavior (performance) - unambiguous
- conditions under which performance is assessed - the givens
- include criteria or characteristics of acceptable performance whenever the learner is asked to
  demonstrate or construct
  "criteria for acceptable performance is ..."
- worthwhileness - is it lifelike?
- is it easily understood - as clearly as possible
  if written - proper grammar and syntax
- can content be taught within the time limits set
  reasonable for time constraints
subobjectives = enroute and enabling objectives
Are there definitions - verbal info
rules, formulas - how to apply
are there concepts - simple comparisons - simple
procedures or techniques that might be subobjectives.
Entry Skills
- the knowledge, skills, and attitudes brought
by the learner to the situation
- Is it an entry skill or a subobjective
5 types of learning
- psychomotor
- attitudes
- verbal info
- cognitive strategy
- intellectual skills
  - discrimination
  - concrete concepts
  - rules - define its critical attributes - give + and - examples
    - present from easy to hard....
  - problem solving - use of many rules
Concepts
- Concrete - this is a circle
- Defined concept - the definition of a circle
- Rule learning
- Problem solving - a novel situation
Job Aid
- presents steps in a list or flow chart form
- provides a summary
- assists in practice
- concrete example for future use
Purposes of Evaluation
accountability check
- judge impact or effectiveness (summative)
- Front-End Analysis
- evaluability assessment - can it be evaluated?
- formative decisions used to improve or revise
- the larger the project is in scope the
greater the number of 'points of view'
that must be checked
- impact will be more diffuse and less predictable
- time is just about everything -
anticipate needs - Be Proactive!
Don't feel proud if you're
'running around putting out fires'
Don't be reactive!
- Program Monitoring - is the program being carried out?
- Gain Assessment - e.g. 'pre and post test gain'
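The 'pre and post test gain' above can be sketched as a simple computation over paired scores (the function name and score values are illustrative assumptions, not from the notes):

```python
# Minimal sketch of a pre/post "gain assessment" (names assumed).
def gain(pre_scores, post_scores):
    """Return the mean raw gain between paired pre- and post-test scores."""
    assert len(pre_scores) == len(post_scores), "scores must be paired"
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Three students, tested before and after instruction.
print(gain([50, 60, 55], [70, 80, 75]))  # 20.0
```

A mean raw gain is the simplest measure; in practice an evaluator would also consider normalized gain or a control-group comparison.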
Model of Evaluation influences the outcome
evaluation measures accountability
1. evaluate the audience
2. determine what questions should be addressed
3. what will be the 'acceptable' behavior or test score
   for #2
4. what resources are currently available
5. what will be the data gathering techniques
- phases of the evaluation describe factors of the
  planning, process, and product phases.
- for evaluation use
  - standardized measures
  - locally developed measures
  - attitude measurements
Revision
revise until students score at least xy% or time and money run out
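The revision rule above - revise until students reach a score threshold or time and money run out - can be sketched as a loop (the 80% mastery threshold and cycle limit are illustrative assumptions standing in for the notes' 'xy%'):

```python
# Sketch of the revision cycle: field test, check mastery, revise, repeat
# until the threshold is met or the budgeted cycles run out (values assumed).
def revise_until_mastery(field_test, mastery=80, max_cycles=5):
    """field_test(cycle) -> percent of students meeting the objectives."""
    for cycle in range(1, max_cycles + 1):
        score = field_test(cycle)
        if score >= mastery:
            return cycle, score      # objectives met; stop revising
    return max_cycles, score         # time and money ran out

# Toy field test: each revision cycle raises scores by 10 points.
cycles, score = revise_until_mastery(lambda c: 55 + 10 * c)
print(cycles, score)  # 3 85
```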
5 research- or principle-based revisions
- review task analysis (check all subskills)
  check prerequisite or entry skills
- determine if practice has been properly used
  did the instructor follow the designer's intent? -- criteria
  - all tasks and subtasks were practiced
  - range of tasks and range of difficulty
  - practice and posttest are of equal difficulty
  - verify unprompted practice
    inform students of the 'correctness' of their answers
- determine if feedback has been properly used e.g.
  if example answers were available to the learner,
  did he/she use them as a check of their work?
- make sure the learner knew what was expected of him/her
- determine that there is appropriate motivation.
Patterns of Problems
- IF performance on posttest is poor - check program data
  IF performance on test is great - check attitudinal data with survey or interview
Concerns
1. check task analysis if practice and test results are poor
2. was bad info given? - rules and definitions might be inadequate
3. add practice and feedback if 1 and 2 above are OK
4. IF test results are poor and practice results are OK -
   - is the test = to the practice?
   - verify unprompted practice
   - add practice and check task analysis - was something left out?
5. IF test results are OK and practice results are poor -
   don't worry about it.........?
6. IF test results are OK but attitudes are bad -
   add motivational stimuli
CHECKING -
1. Effectiveness
2. Value
3. Instruction delivered as intended? (teacher's guide,
student materials, recordkeeping, etc.)
the combinations of the above
Systems Theory
Influences
- organization of numerous interrelated variables (observable and
measurable)
variables are broken down without losing their interrelatedness
< prerequisites and sub-prerequisites >
- synergistic nature of the whole system
the whole > sum of the parts
teamwork with content specialists and others
- cybernetic - self-correcting
empirical validation step by step (feedback looping)
ed tech is based on empirical data
Task Analysis Procedure
front-end analysis
- analysis (most critical)
task list
settings
check existing training requirements
job performance measures
- design
- objectives
- describe entry behavior
- pretests
- sequence and structure
- development
- specifying learning activities
- instructional plan
- check existing materials and develop if not available
- schedule space for follow-up
- implement - using instructional management plan in its setting
- validate real setting simulation
- control - internal evaluation - Does the intended = actual outcomes??
- validate external evaluation on the job
revision - materials, objectives
last step of development is validation with individual testing, group
testing and revisions.
student achievement is proof of effectiveness
effectiveness should include norm-referenced tests and
criterion-referenced tests.
Models of Needs Assessments
needs assessment = gap = discrepancy
The different procedures (models) boil down to Kaufman's "gap".
evaluation works backwards - what should be vs. what is
3 approaches to Needs Assessment
- Inductive - begin from the beginning with a determination of present behaviors
- Deductive - start with identified goals - a preexisting but tentative list of goals
- Classical - hit and miss = intuitive
Check societal variables in Needs Assessment - get input from all.
All must agree to the methods (weightings) of Needs Assessment.
simple and logical
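Kaufman's "gap" - the discrepancy between what should be and what is - can be sketched as a per-goal subtraction (the goal names and scores are illustrative assumptions):

```python
# Sketch of a needs assessment as gap = ideal - status quo, per goal,
# with the largest gaps (greatest needs) listed first.
def needs_gaps(status_quo, ideal):
    """Return (goal, gap) pairs, biggest discrepancy first."""
    gaps = {goal: ideal[goal] - status_quo.get(goal, 0) for goal in ideal}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

actual = {"reading": 60, "math": 75}   # what is (e.g. archival test scores)
target = {"reading": 90, "math": 80}   # what should be (agreed-upon goals)
print(needs_gaps(actual, target))  # [('reading', 30), ('math', 5)]
```

Ranking the gaps gives the priority order in which terminal objectives might be written.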
Tools used in Needs Assessment (goals and objectives of the program)
- archival data (test scores)
- questionnaires
- surveys
- critical incident technique - checking target group's behaviors
compare with norms
- interviews
- standardized test
Evaluation looks at what is (status quo) and compares it to what
should have been.
Terminal Objectives are the result!
Other possibilities (sources of objectives)
- Federal and State Mandates
- Local and National groups
- articles, editorials
- ERIC
- PTA meetings
- pressure groups
- school records
- NEA
- Federal Labs
- support groups
a developer relies on specialists (content)
NOT ADDRESSED by design - the learner and instructor interaction,
learning characteristics
ID works backwards
i.e. what will the learner do after instruction on the objectives?
Gagne 5 kinds of learning, how to write objectives for each and even
attitudes.
he says 'education is to prepare citizens'
2 reasons to teach something
- does it prepare the learner for something used in life
- does it prepare the learner for something he'll learn later
5 learning outcomes
( which of these is your curriculum directed?)
- intellectual skills - learning how
- cognitive strategies - govern your learning
how the learner organizes
creative problem solving - logical systems thinking - hypothesis testing
- verbal info or knowledge
- motor skills (part skills)
- attitudes
plan your objectives around these
generates -> problem solving
varieties of learning
prerequisites (skills, verbal info, cognitive strategies,
supportive prerequisites)
Gagne - ID is concerned with outcomes of learning
he's always focused on the learner
behavioral objectives operationalize instruction
check verbs in objective
verb tells kind of outcome (one of the 5 above)
Events of learning (instruction)
should be present in learning situation
- gain attention
- stimulate recall of prerequisite learning
- presenting the stimulus material
- providing learning guidance
- eliciting the performance
- providing feedback on performance correctness
- assessing performance
- enhancing retention
OBJ - what the learner does or produces
states observable behavior
critique development
? media and scripting - was it justified
? need for 'advance organizers'
check attitude when presenting info
check digit span, how much info is thrown out?
Instruction maps -
Knowledge maps
check differences of ways of doing needs assessment in bus & ed
(when they're done and why) distinguishes
in industry
observe a 'master'
in education
we're not observing but the 'best'
both yield hierarchies
glossary is for the learner too
Briggs and Wager
ISD is the advance organizer of e.t.
e.t. is a macro level of ISD
Media selection process
deliver it via microcomputer
modes of instruction
categories of delivery systems and subgroups
Individual - tutor, peer tutor, individual resources (discovery)
group - lecture, discussion, activities (field trips), projects
adapt delivery system to the events of learning
clinical judgement = informed common sense.
media selection
make a list of the totally unacceptable and go from there
Is this model effective? If not,
is there empirical evidence to support the model?
Instructional strategies
2 levels
- macrostrategies = organization of course -> flow of course
  selection of course materials, sequencing
  Ausubel - advance organizers
- microstrategies = development and selection of strategies at lesson level
  after you have the objective(s) - develop / select the strategies
microlevel - sequence of instructional events leading to the
accomplishment of a single objective.
macrolevel - sequence of instructional events that teaches a group or
set
instructional strategies = series of displays presented to the students,
and from which the student is to learn
sequence and relationship among displays make up the instructional
strategies
line them up in a logical order, use, revise
applicable only to cognitive objectives
----
A general model for effective instructional strategies at the microlevel
- intro -->
- generality <--> help
- instances <--> help
- practice <--> feedback
generality - statement of fact, definition, rule, procedure
help == supplemental info, example
instances = examples and nonexamples with help links to the generality
practice = opportunity for learners to judge themselves
not a test
feedback = correct answer and how to determine it
summary = after last segment of instruction, recap
lesson test = provides info to the learner and instructor
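A minimal sketch of the microlevel model above, representing one objective's instruction as an ordered series of displays (the field names and text are illustrative assumptions, not a standard notation):

```python
# One objective's instruction as a sequence of "displays" the learner sees:
# intro, generality (with help), instances (with help), practice (with
# feedback), then a summary recap. Field names are assumed for illustration.
segment = [
    {"display": "intro",      "text": "In this lesson you will ..."},
    {"display": "generality", "text": "Rule: ...", "help": "supplemental info"},
    {"display": "instances",  "text": "Examples and nonexamples ...", "help": "links to generality"},
    {"display": "practice",   "text": "Try it yourself ...", "feedback": "Correct answer and how to find it"},
    {"display": "summary",    "text": "Recap of the segment ..."},
]

# Deliver the displays in order; practice gets feedback, not a grade.
for d in segment:
    print(d["display"])
```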
IF objective is -
- FACTual information/instruction
- explain learner outcome
- fact statement
- help
- practice
- practice feedback
- CONCEPT to be remembered
- explain learner outcome
- definition
- help
- example
- help
- practice
- practice feedback
- CONCEPT to be used (seen and used)
- explain learner outcome
- definition
- help
- example and nonexamples
- helps
- practice
- practice feedback
Algorithms
- a cookbook or flowchart with words
procedure which possesses 2 attributes
By following an algorithm you must get results.
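A concrete instance of an algorithm in this sense - a fixed procedure that, followed exactly, must get results - is Euclid's gcd, chosen here for illustration (it is not from the notes):

```python
# Euclid's algorithm: every step is unambiguous, no judgment is needed,
# and following the procedure is guaranteed to produce the answer.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b   # replace the pair until the remainder is zero
    return a

print(gcd(48, 36))  # 12
```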
Heuristics
- generalized process
3 components
- operators - tell the user to perform an operation
- discriminators - a decision point, the user must discriminate
- relate operator and discriminator
uses in education
- aid to learning
- allow a person who possesses entry skills to arrive at a desired end.
- perform a task accurately
yields consistency & they remove excess discriminators
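The three components above - operators (do something), discriminators (decision points), and the links relating them - can be sketched as a flowchart-with-words (the lamp-troubleshooting steps are illustrative assumptions):

```python
# Sketch of operators and discriminators: each `if` is a discriminator
# (a decision the user must make) and each appended step is an operator
# (an action to perform). The relation between them is the branch itself.
def troubleshoot(lamp_plugged_in, bulb_ok):
    steps = []
    if not lamp_plugged_in:                 # discriminator: plugged in?
        steps.append("plug in the lamp")    # operator
    elif not bulb_ok:                       # discriminator: bulb good?
        steps.append("replace the bulb")    # operator
    else:
        steps.append("call a repairer")     # operator: fallback action
    return steps

print(troubleshoot(False, True))  # ['plug in the lamp']
```

Written this way, a person with the entry skills (recognizing a plug, a bulb) can reach the desired end consistently.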
Evaluator's role
- judgements of worth
- decision - terminate, modify or maintain
- demands performance data
- control group can come from archival data if measurements are the
  same
- use multiple criterion measures
- types of data
- norm-referenced tests
- Likert type scales
- interviews with teachers, students , parents
- check priorities of others (opinions)
- description
- time tables
- promises vs. performance
- report what went on and why
- collect data
- ongoing and at intervals not just post hoc
- check unobtrusive measures
  - attendance records
  - focus-behavior and 'smiling rate' vs. base rate
An evaluator is a facilitator who feeds information
back to the development team.