Notes From This Week’s Tech Trends in Music Edu Readings

The following is an outline of notes/observations from this week’s readings for Technological Trends in Music Education.

Reading 1: Methodological Alignment in Design-Based Research – Christopher M. Hoadley

  • Data isn’t biased, but the way it is collected can be (hence the emphasis on rigor). Science itself can be biased.
  • In education it is hard to be “double-blind,” where neither teacher nor student knows who is receiving the treatment.
  • Design-based Research
    • complement to experimentation, but separate (although he admits in reality it may fall on a continuum)
    • the “problem of context” is fundamental: it is hard to ensure both ‘control’ and ‘universality’
    • design-based research views outcomes as the culmination of the interaction between:
      • designed interventions
      • human psychology (nature)
      • personal histories or experiences (nurture)
      • local contexts.
    • it is all about variables; rather than controlling them away, it embraces them
    • different from Experimental Research because
      • teachers are aware, subjective not objective
      • nothing is expected to be universal, just “tentative generalization”
      • less about the plan, more about following revelations/leads
      • more open to results that fall outside of the initial hypothesis
    • This is a problem for “Rigor” which has been well defined for Experimental Research, but not for the relatively new Design-Based Research.
  • Hoadley: Design-based research is more rigorous in certain ways: can help connect the dots in complex realistic settings like the classroom.

Alignment

  • align measurements, theories, and treatments –> because no measurement should stand in isolation.
  • types of validity
    • Consequential Validity – how to apply the results of the experiment in practice
    • Systemic Validity – core: does research + inferences/results help us answer the question?
      • studies must inform our theories, which must inform practice.
    • Treatment Validity …?
    • Because the classroom has so many variables, design-based research is the way to go: it embraces the complexity and encourages us to analyze every detail.
  • it becomes necessary to document everything

Researching Collaboration

  • lots of variations in context. Settles over time.

MFK (Multimedia Forum Kiosk) => SpeakEasy – one of the first web-based discussion tools

  • designed from a model of how collaboration would foster knowledge building (realizing that, poorly implemented, it could also hinder)
  • challenges: 1. foster discussion 2. foster learning by discussion
  • necessary features: inclusiveness, participation, etc.
  • long process: 1. develop tools & activities that function, 2. ensure that they actually meet the challenges, 3. demonstrate this empirically
  • Usability does not always lead to adoption/use; contrast with “Context of Use”
    • Context of Use, i.e., does it relate to the class?
  • “kiosk groupies” – rarity makes it cool, but can erode discussion for other students (social issues); groups can encourage or discourage others from joining depending on social factors.
  • –> once online (moving from MFK to SpeakEasy), it became a new social space with new rules. Fortunately the teacher “staked it out” as an intellectual space. Three years later, once the web was popular, students approached it differently and the intellectual stakeout lost its impact. The example shows design vs. context.
  • Social context vs anonymity? interesting question.
    • example of importance of iterative design: they created a feature (anonymity) that was important even to those who didn’t use it.
    • but… students are less likely to read anonymous comments
    • students look to others for cues about how to participate in this space: if others post anonymously, they’ll be anonymous too, unless they don’t look to others for cues and have confidence.
    • –> intervention: require students to formulate their own thoughts before reading others’.
    • LESSON: the design-based researchers’ view on anonymity changed over the iterations

Key difference: interpretation is central, rather than trying to limit bias. Problem: tough to generalize. Upside: easy to apply.

 

Q: how does the MFK / SpeakEasy example demonstrate the power of “design-based research”?

Q: design-based research: so we should experiment on students?

Q: how does this relate to ___ concept of “iteration”?

 

Christopher Hoadley’s slideshare:

  • does media help learning? It depends. Not enough studies.
  • Stanford VHS video: the pause & rewind buttons are an important piece of technology. Students get together and engage / interact with the lecture. It’s not about the tech itself but how people use it. Why are we bringing a tool of any sort into the educational context?
  • With tech, we have to evaluate: Learning more, learning more efficiently, or learning differently?
  • What’s the same w/ tech?
  • What’s different w/ tech? Experiences, habits, culture, roles, economics, teaching?
    1. example: email allows less hierarchical communication in business
    2. Skills that teachers learn in class are different than skills for online
  • Hoadley’s Three Laws of Educational Technology
    1. It’s not the technology. It’s what you do with it.
    2. It’s not what the tech makes possible. It’s what the tech makes easy.
      1. example: Brewster Kahle’s WAIS (Wide Area Information Server)
    3. Follow trends in learning, not in tech
      1. need to be able to learn, navigate knowledge, rather than memorize.
      2. know who vs know what.
      3. democratic media
      4. connected/global vs. isolated/local
      5. context / engagement (new) vs. content (i.e. MOOCs)
      6. summarize: we learn differently in an interconnected world where so much information is available
    4. Schools aren’t gatekeepers anymore, but a resource
      1. pick and choose rather than bundled education
      2. engagement (experience) > outcomes
      3. learning is fun > learning was forced
      4. schools are tailored / personalized > uniform

Q: What are Hoadley’s thoughts on MOOCs? On testing?

 

Karen Brennan & Mitchel Resnick – New Frameworks for studying and assessing the development of computational thinking


The framework emerged from studying interactive media designers. The context is Scratch.

part 1: computational concepts (iteration, parallelism), practices (debugging, remixing), perspectives (world/self)

part 2: approach to assessing this development.

Conclusion: suggestions for assessing the learning that happens when young people program. Ultimately, a combination of three types of assessment.

 

Computational Thinking – the thought processes involved in formulating problems and developing solutions so that the solutions can be carried out by a computer (an information-processing agent)

  • the authors take a “constructionist approach to learning”: learning through design and through engagement.
  • CT defined:
    • Computational Thinking Concepts (see the Python sketch after this list)
      • sequences – like a recipe to make something
      • loops – more succinct
      • parallelism – sequences of instructions happening at the same time.
      • events – one thing causes another to happen
      • conditionals – if this then that
      • operators – mathematical, logical (and/or/not), string (e.g., concatenation and getLength), and random
      • data – variables and lists. Keeping score in a game.
    • Computational Thinking Practices – the process of learning, moving beyond what to how
      • Be Incremental and Iterative – adapt, get feedback, try new ideas
      • Test and Debug – what is the problem? how do you deal?
      • Reuse and Remix – build off others to create things much more complex than solo. What is reasonable to borrow? How do you credit? How to assess?
      • Abstract and Modularize – easier to understand / communicate ideas
    • Computational Thinking Perspectives
      • Expressing – media is to make, not consume.
      • Connecting – interact with others
      • Questioning – interrogate the world, consider how it is programmed. You can even reprogram Scratch itself.
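
As a reference, here is a minimal sketch of how several of these concepts might look in code. It is written in Python rather than Scratch (which is block-based), and the game scenario and names are mine, not from the paper – just an illustration of sequences, loops, events, conditionals, operators, and data.

```python
# Illustrative only: a tiny "game" that touches several computational
# thinking concepts from Brennan & Resnick, expressed in Python.
import random

# data: variables and lists, e.g. keeping score in a game
score = 0
high_scores = []

# sequences: steps carried out in order, like a recipe
def play_round(player_name):
    global score
    roll = random.randint(1, 6)            # random operator
    greeting = "Player: " + player_name    # string operator (concatenation)
    name_length = len(player_name)         # string operator (length)

    # conditionals: if this, then that (with a logical operator)
    if roll > 3 and name_length > 0:
        score = score + roll               # mathematical operator
        print(greeting, "scored! Total:", score)
    else:
        print(greeting, "missed this round.")

# loops: a more succinct way to repeat a sequence
for _ in range(3):
    play_round("Ada")

# events: one thing (the game ending) causes another (recording the score)
def on_game_over():
    high_scores.append(score)
    print("High scores so far:", high_scores)

on_game_over()

# parallelism (several scripts running at once in Scratch) would need
# threads or async code in Python, so it is only noted here.
```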

Assessing learning thru design –

  1. Project Portfolio Analysis – analyze portfolios for which types of blocks they’re using and how often (see the sketch after this list)
    • – not everything is available to analyze
    • – gives no insight into the process!
  2. Artifact-based interviews – select two projects and tell us about them (project creation)
    •  background (how have you evolved?)
    • project creation (framing / process)  *most important for assessing computational thinking concepts and practices.
    • online community
    • looking forward
      • Interviews can reveal conceptual gaps that computer analysis can’t capture
    • + nuanced; shifts the focus from product to process
    • – time-consuming; limited by memory and by boasting (“I never get stuck!”); constrained by the projects chosen
  3. Design scenarios (tests)
    • process in action rather than from recollection
    • nature of the questions might appeal to some users more intrinsically than others.
    • might feel like a test even though they are designed to feel like “helping a peer”
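
For my own reference, the sketch below shows what a block-level portfolio analysis could look like. This is not the tool Brennan & Resnick used – it is a hypothetical example that assumes a Scratch 3-style project.json in which each target (stage or sprite) stores its blocks in a dictionary with an "opcode" field, and the path "project.json" is a placeholder.

```python
# Hypothetical portfolio-analysis sketch: count which block types a single
# Scratch project uses and how often, from its project.json file.
import json
from collections import Counter

def count_block_types(project_json_path):
    """Return a Counter mapping block opcodes to how often they appear."""
    with open(project_json_path) as f:
        project = json.load(f)

    counts = Counter()
    for target in project.get("targets", []):            # stage + sprites
        for block in target.get("blocks", {}).values():
            # some entries are lists (top-level reporters), so skip those
            if isinstance(block, dict) and "opcode" in block:
                counts[block["opcode"]] += 1
    return counts

if __name__ == "__main__":
    counts = count_block_types("project.json")           # placeholder path
    for opcode, n in counts.most_common(10):
        print(f"{opcode}: {n}")
```

This captures which blocks appear and how much, but, as noted in the cons above, it says nothing about the process by which they got there.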

Six Suggestions for Assessing Computational Thinking via Programming

  1. Support further learning – make it useful / connect to the learners.
  2. Incorporate Artifacts (project examples) for rich assessment, to see progress over time
  3. Illuminate the Process – it’s important to think about our thinking if we are going to become self-regulating learners
  4. Checkpoints throughout learning experience
  5. Value multiple ways of knowing – not just definition, but analyze and critique each other, debug, etc
  6. Include multiple viewpoints – don’t just rely on interviewer / interviewee but incorporate self, peer, parent, teacher and researcher assessments as much as possible.

 

Thoughts: the Concepts and Practices are useful for outlining and thinking about how Scratch is designed to foster them. For example, the practice of Reuse and Remix – the ability to build off of the work of others – is reinforced by design. Perhaps a bias of the designers, but I like it. I’m interested in the way Scratchers address some of the unanswered questions.

–Perspectives: COOL that Scratch gives new insight into the world.

Central issue: How do we assess learning?

 

Q: reusing and remixing –> What are some of the ways that Scratch is designed to encourage Computational Thinking Practices?

Q: questions about Scratch

Q: What might Hoadley say about the fact that Design Scenarios (tests for using scratch) were presented in the classroom?

Q: Brennan & Resnick emphasize the importance of developing “self-regulating learners.” How does this concept relate to ideas from last week’s readings?


Andrew R. Brown – Software Development as Music Education Research 

SoDaR (Software Development as Research) approach

3 stages

  1. identify learning opportunity / define the activity
    • a situation in which new tech is likely to encourage interaction leading to learning
    • document the situation, describe activity and educational potential (objectives?)
    • initial specs w/ explicit theories, expectations and hypotheses.
    • research & reflect
  2. design and produce
    • mockup
    • reflective questions w/ focus on research objectives
  3. implement usage and refine in an educational setting (repeat this!)
    • reflection-in-action
    • collect supporting data
    • reflective questions

Different from traditional software development because the software is exposed to real-world situations at each stage of the development process. More like XP (eXtreme Programming).

 

Action Research – repeated observations in a deliberately altered situation. Established educational strategy. Stage 3 of SoDaR.

 

Case Study – gather data from multiple perspectives.

 

Activity Theory – the study of technologically-mediated experiences. Psychological, about the everyday. Real-world situations. Contextualized. But unlike Activity Theory, SoDaR uses experiences to test hypotheses that may elicit new discoveries.

 

SoDaR – a research tool that enables new ideas about interaction, understanding (learning?), and behavior to be tested through activities using designed software that facilitates specific interactions.

 

Implementation Issues in Educational Settings:

  1. skills and teams: work with a software developer!
  2. when does software add value? When it opens up opportunities not previously possible to create a unique learning experience.
  3. Designing Interaction: you can iterate based on how the software is used, unlike traditional software engineering, whose goal is to deliver complete functionality at release time. “Improv.”
  4. Usage Context – SoDaR relies on strong link between software and a learning activity (i.e. experience / curricula)

 

jam2jam – developed by Steve Dillon

 

Q: Why do you think Scratch has taken off, but Jam2Jam is not as popular? Is it just a matter of timing?

Q: How does Brown’s description of the “usage context” for Jam2Jam development compare with Hoadley’s account of how different contexts factored into MFK/SpeakEasy?

Q: Brown / Dillon looked at existing research design approaches for jam2jam; did the Scratch team? Brennan’s article mostly cited her own experiments. In what ways might this be indicative of their approaches to design?

Q: Scratch is open ended, while SoDaR/Jam2Jam is focused on specific interactions

Q: How does Steve Dillon’s __ (  ) come into play throughout the three stages (define activity, software design/production, implement and refine) of the SoDaR approach defined here by fellow jam2jam founder Andrew R Brown?

Q: students as beta-testers
