Audio Transformer update

Audio Transformer, the Web Audio Editor, is online in a functional demo mode. It’s not ready for public testing until I prepare my server, and there are many features (and bug fixes) yet to come. But if you’d like to check it out, here it is (source code here)…

[Screenshot: the Web Audio Editor]

I had the chance to user-test with my ICM class, and observed that graduate students tend to start by pressing the button on the left and moving to the right. Only those who really knew what they were doing with audio editing would jump straight to the fun stuff (effects). A few people also thought to press Spacebar for play/pause, but most didn’t think to do that, so I added a “spacebar” text hint to this button. At that point, there was a button to place markers at a spot on the waveform, and everyone wanted click’n’drag to select an area, so I’ve begun to implement this. I also added Loop mode, with a symbol that shows up when loop is on, though if I have the time I’d like to develop my own buttons that look like physical buttons.
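
For reference, here's a minimal sketch of how loop mode can work with the raw Web Audio API. This is not the editor's actual code (the editor is built on the wavesurfer plugin); it just illustrates the idea, assuming an AudioContext and a decoded AudioBuffer already exist:

    // Minimal sketch: looping a decoded AudioBuffer with the Web Audio API.
    // (Not the editor's actual code; audioCtx and buffer are assumed to exist.)
    function playBuffer(audioCtx, buffer, loopOn) {
      const source = audioCtx.createBufferSource();
      source.buffer = buffer;
      source.loop = loopOn;                 // when true, playback wraps around
      source.loopStart = 0;                 // could point at the selected region instead
      source.loopEnd = buffer.duration;
      source.connect(audioCtx.destination);
      source.start();
      return source;                        // keep a reference so we can call source.stop()
    }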

“Speed Up/Down” has little visible effect on the waveform, so there needs to be a way to show that the length of the file is changing; otherwise it doesn’t look like those buttons do anything. I added a timer in the top-right, but I’d like to visualize this more clearly by showing the full waveform in a skinny timeline at the top and the selected area at the bottom. As the file shortens, the zoom level stays the same, so the selected area will grow in proportion to the full file. This’ll make a lot more sense if I can just show you what I mean. Other comments were that the frequency spectrum colors didn’t seem to correlate with the waveform colors, which raised the question of whether the colors that represent a sound should be linked.
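
As a rough sketch of the timer idea: when a speed change resamples the file, the new duration is just the old duration divided by the speed factor, and the readout can be updated from that. The "duration" element id here is hypothetical, not the editor's actual markup:

    // Sketch: new duration after a speed change is oldDuration / speedFactor,
    // e.g. a 1.5x speed-up turns a 10-second file into ~6.7 seconds.
    // The "duration" element id is hypothetical.
    function updateDurationDisplay(durationSeconds) {
      const mins = Math.floor(durationSeconds / 60);
      const secs = (durationSeconds % 60).toFixed(1).padStart(4, '0');
      document.getElementById('duration').textContent = mins + ':' + secs;
    }

    updateDurationDisplay(10 / 1.5);   // displays "0:06.7"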

Before presenting this to kids in my workshop, I need to indicate when WAE is processing audio, and grey out the buttons so that they can’t freeze the program by overloading the “speed up” button.
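
A minimal sketch of what I mean, assuming the effect buttons share a class and the processing call returns a promise (the ".effect-button" class and processAudio() are hypothetical names, not the editor's actual code):

    // Sketch: grey out the effect buttons while audio is being processed so that
    // repeated clicks can't pile up and freeze the editor.
    function setButtonsEnabled(enabled) {
      document.querySelectorAll('.effect-button').forEach((btn) => {
        btn.disabled = !enabled;        // disabled buttons also render greyed out
      });
    }

    async function runEffect(effectName) {
      setButtonsEnabled(false);         // lock the UI
      try {
        await processAudio(effectName); // e.g. the request that applies the effect
      } finally {
        setButtonsEnabled(true);        // always unlock, even if processing fails
      }
    }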

I’ve been subbing for my friend’s Scratch video-game workshop, where I had the chance to work on sound effects using the Scratch interface:
[Screenshot: the Scratch 2.0 sound editor]

Scratch has been a big influence on my approach to “designing for tinkerability,” as have many of the projects and research coming out of MIT Media Lab’s Lifelong Kindergarten Group. Its audio editor has a concise, efficient design. They don’t overload the user with too many parameters; for example, they leave things at “louder” and “softer” rather than offering a volume slider. This is the way I’ve implemented my effects, though I think that in the name of tinkerability, I should not only provide a preset starting point, but also offer an advanced mode for users who wish to dig deeper.

Scratch gives kids three options for sounds: 1. record your own, 2. choose from preset sounds, 3. upload a sound. The kids wanted to try each of these. One group wanted to record their own voices saying “crap!” Another went to YouTube trying to figure out a way to bring the Dr Who theme into their project. And others explored the library of existing sounds. I think that offering all of these starting points would strengthen the web audio editor.

Designing for kids as young as 2nd grade is difficult because they aren’t all able to read at the same level. This applies to words, but it also applies to symbols. For example, when I asked the kids to play a sound in Scratch, some didn’t know which button to press. They hadn’t all been exposed to a sideways triangle symbol as a play button. Even if it said “play,” they might not know what it means to “play” a sound. I don’t know if there’s a better way to convey the meaning of these abstract audio concepts, but I think that using the simplest, most conventional names and symbols will help establish meaning that will stick with them later in life.

As my Physical Computing teacher Tom Igoe says, there’s no such thing as ‘intuitive’, just learned behavior. So in an educational setting for kids who’ve never worked with audio before, it will be necessary to point out some things.

Just this morning, I had the opportunity to present this project to a 5-year old. At first, thanks to her guide pointing out the foam chair, she was more interested in picking up the foam chair than in working with the Audio Transformer. When she sat down, I gave a short explanation that this is a way to listen to sounds and change the way they sound. I showed her how to click and drag a file from a desktop folder into the browser, then pressed buttons to change the sound. She was much more interested in dragging the sounds than in modifying them. Click’n’drag is a difficult skill for novice computer users, but she told me she’s been working on it with her dad, and she seemed intent on mastering it now. The dragging distance proved too far for her to manage, so I helped load the sound and then encouraged her to try pressing the buttons. She didn’t understand which button to press to play the sound until I pointed it out, and from there she successfully slowed down and reversed the sound and played it back. She was on a tour of ITP so my project had a lot of competition for her time, but afterwards she said that the project was “fun.” I asked if there was anything that wasn’t fun and she said no. I think this is a good sign, but I’d like to try to make it easier to load readymade sounds—perhaps within the browser itself the way Scratch does—without the need to click and drag.

As things stand, I have several features I hope to implement:

  • Don’t afford the ability to press buttons while audio is processing, because it causes errors (this could be done more elegantly)
  • Allow Edits w/ better highlighting of selected area
  • Zoom mode w/ additional waveform view update, highlight selection
  • Spiff up interface with symbols that can help bridge a child’s current level of understanding with audio-related symbols that’ll carry meaning later on in life.
  • Allow Record (WebRTC?) https://github.com/muaz-khan/WebRTC-Experiment/tree/master/RecordRTC/RecordRTC-to-PHP (but recording gets glitchy and stops working properly after ~three recording sessions or if a file is played until the end…why??) (see the recording sketch below this list)
  • More options for starting sounds (preload a range of cool sounds and waveforms)
  • Oscilloscope ( http://stuartmemo.com/wavy-jones/ ) because the wavesurfer plugin isn’t precise enough to illustrate the concept of a sine wave, triangle wave, etc.; they just look like big blocks of sound…
  • Better Undo/Redo (offer a download page with all files at the end of the session, then delete them?); on close, delete all of the files; add a file-size limit. These are important before making the website public so as not to overload my server.
  • “Advanced Mode” allowing user to tweak effect parameters. Audacity has too many parameters, Scratch has too few, WAE should provide a simple starting point but allow tinkering for those who wish to dig deeper and explore

[Dec 7th update: crossed two items off the list]
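
For the Record item above, here’s a minimal sketch of in-browser recording using the standard MediaRecorder API. The list links to RecordRTC, so this isn’t necessarily how I’ll implement it; it’s just one way it could look:

    // Sketch: in-browser recording with the standard MediaRecorder API.
    // (The list above links to RecordRTC; this is just one possible approach.)
    async function recordClip(durationMs) {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const recorder = new MediaRecorder(stream);
      const chunks = [];
      recorder.ondataavailable = (e) => chunks.push(e.data);

      return new Promise((resolve) => {
        recorder.onstop = () => {
          stream.getTracks().forEach((t) => t.stop());        // release the mic
          resolve(new Blob(chunks, { type: recorder.mimeType }));
        };
        recorder.start();
        setTimeout(() => recorder.stop(), durationMs);
      });
    }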

Reading Notes: Music Education with Digital Technology

The four chapters we read this week were very inspiring as I reflect on the experience of my digital music workshop.

:: In “The DJ Factor,” Mike Challis describes a curriculum for at-risk 14-16 year olds in the UK who have had no previous music education. The key to his approach is to start from music that the students are interested in, which is novel for students who often experience a sharp divide between music in and out of school. In the UK, these kids listen to electronic music, UK garage and hip-hop. The first stage puts music into their hands as DJs, and through a four-stage process of gradually replacing existing elements with original ones, they end up producing a piece of original music.

In Stage 1, beatmatching is used as a way to learn about beats and bars. I’ve struggled a bit to teach this concept to the kids in my workshop, and despite the fact that they are much younger (some are as young as 7), I can’t think of a more hands-on approach than using two turntables and a mixer.

In Stage 2, Reason is introduced as a beat-making tool, and students create original beats to layer over their DJ mixes. The main lesson here for me is that the students aren’t given a comprehensive overview of how the program works, but instead they just dive in to figure it out on their own. Reason gives immediate feedback, so it may in fact be easier to learn from diving right in than by a general tutorial. The instructor’s role is to answer questions, monitor, intervene when necessary and provide feedback.

In Stage 3, students add a bassline, either with a live performance using MIDI or by sequencing each step of the melody. And in Stage 4, structure is introduced by listening to the music that the students brought to the program. At this point, the original mix can even be removed to leave each student with a completely original composition (not to say that remixed elements couldn’t also be used in original ways). The article mentions that Dizzee Rascal had this type of opportunity when he was a teenager, which I found really inspiring.

:: The next chapter, “Composing & Graphical Technologies” by Kevin Jennings, gave me a lot to think about as I design my web-based audio editor. I’m working on the design, and the visual representation of sound will have a significant impact on how people use this tool.

Graphical interfaces can provide novel ways for young people to dive into composing without needing to learn notation or a sequencer. However, music is not visual, so any attempt to represent it visually biases and affords certain types of uses, whether it’s traditional notation or sequencers or Finale or Logic or the original programs Hyperscore and Drum Steps mentioned in this paper.

I’m really inspired by the idea of Hyperscore, in which ‘motives’ can be brought into a ‘sketch.’ The author believes that this particular interface is best suited to teaching rhythmic concepts, texture and form, but despite his efforts, questions concerning pitch did not arise as naturally.

If you set a young person free to explore music composition in an environment with certain affordances, they will come to understand many different types of musical concepts on their own. After an initial stage of “bricolage” or “just messing around” to become familiar with the interface, Jennings observes the way students settle on a clearer idea to explore. This bricolage is an important part of the process that reminds me of Mike Challis’ approach in teaching Reason. The questions come up as part of the exploration process, and in the case of Hyperscore, concepts like inversion, repetition, and variation appear in students’ composition without any mention of these concepts by the teacher. In short, musical concepts don’t need to be explained as long as the system affords the potential to discover them through exploration.

In contrast to Hyperscore, Drum-Step is a graphical interface that students seem more likely to interpret in non-musical ways. Their motivations are more about making things that look cool than about musical inspiration. So graphical interfaces can have drawbacks: they inevitably incorporate non-musical elements, and there is a point where those might eclipse the musical ones. If the goal is music education, then it’s best to design an accessible path into musical understanding.

In the case of my web-based audio editor, there is a question of whether to follow the established ways of representing sound or to invent my own. There is also a question of how to describe the various effects that can be used. The message of Jennings’ chapter is that if an effect is offered, it will be used. However, in the case of a “feature-bloated” program like Microsoft Word, only a small percentage of the program’s features are actually used regularly by most users. That is the problem I’m trying to solve by creating a program that streamlines some of the audio editing features available in other programs like Audacity. But I need to think very carefully about how those features are represented visually.

:: Reading Steve Dillon and Andrew Brown’s 2005 chapter on networked collaborative music-making here in the year 2013, I’m struck by the novelty with which they approach “computer as instrument,” “cyberspace as venue” and “network as ensemble.” The trick that makes this type of program work is to send data (i.e. MIDI notes, OSC messages) instead of the actual audio (which is also data, but a much heavier kind to send over a network).
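
To make that concrete, here’s a tiny sketch of what sending note data (rather than audio) over a network might look like. This is not jam2jam’s actual protocol, and the server URL is made up:

    // Sketch: sending note events instead of audio. A note message is a handful
    // of bytes; the equivalent audio would be tens of kilobytes per second.
    // (Not jam2jam's actual protocol; the URL is made up.)
    const socket = new WebSocket('wss://example.com/jam');

    function sendNote(pitch, velocity, durationMs) {
      socket.send(JSON.stringify({ type: 'note', pitch, velocity, durationMs }));
    }

    socket.onmessage = (event) => {
      const msg = JSON.parse(event.data);
      if (msg.type === 'note') {
        // play the note locally, e.g. with an OscillatorNode
      }
    };

    sendNote(60, 100, 250);   // middle C, loud, a quarter of a second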

Some of the benefits of networked jam2jam are that this type of experience invites listening and creating at the same time, and that participants need to use musical terms to communicate their ideas; jam2jam provides chat boxes to facilitate this.

Dillon & Brown identify three ways that students collaborate without being in the same physical space (or even the same country): Disputational collaboration involves individual decisions without agreement. Cumulative collaborators largely agree, but avoid confrontation. Exploratory collaboration is a give-and-take, modifying each other’s ideas to build something greater than the sum of its parts, and that seems like the goal here.

Abstract

A simple audio editor that works in the browser.

Audience: 2nd-8th graders in an afterschool workshop. I’m interested in designing (and ideally building) some kind of technology that supports this curriculum:

  • Explore timbre. One exercise could have us draw shapes that we turn into sounds, inspired by Daphne Oram’s ‘Oramics.’ I know of a Max/MSP patch that can do this, so we could develop the software/experience.
  • Create samples by editing+modifying our own original recordings (using Audacity because it’s free until we can find/build something better)
  • Make MIDI instruments to play kid-made samples (possibly using MaKey MaKey + conductive ink + found objects…and Garageband?)
  • Group improvisation with kid-made instruments in a system where each participant gets a chance to be the conductor/mixer/producer of a recording (possibly the soundtrack to a film made by another workshop)
  • Record/sequence loops (in Garageband unless we can find a simpler way). Layer/arrange loops to create songs. Remix each other’s loops to create new songs / remixes.

I’ve decided to focus on an in-browser audio editor. Why? Because Audacity is too confusing:

[Screenshot: Audacity on Mac OS X]

All the features we need:

Functions:

  • Upload a file (windows pcm wav)
  • Save a file (windows pcm wav)
  • Play a file
  • Reverse
  • Change amplitude
  • Select a piece of the waveform –> Cut, Copy, Paste
  • Pitch shift
  • Time shift
  • Fades at the beginning/end of file (created automatically before saving; see the sketch after this list)
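
As a sketch of two of these functions (Reverse and the automatic fades), here’s how they could be done on a decoded AudioBuffer in the browser. The actual editing may end up happening server-side (see the Technology section below), so this is just an illustration:

    // Sketch: Reverse and the automatic fades, applied to a decoded AudioBuffer.
    function reverseBuffer(buffer) {
      for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
        buffer.getChannelData(ch).reverse();   // flip the samples in place
      }
    }

    function applyFades(buffer, fadeSeconds = 0.05) {
      const fadeSamples = Math.floor(fadeSeconds * buffer.sampleRate);
      for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
        const data = buffer.getChannelData(ch);
        for (let i = 0; i < fadeSamples && i < data.length; i++) {
          const gain = i / fadeSamples;
          data[i] *= gain;                     // fade in
          data[data.length - 1 - i] *= gain;   // fade out
        }
      }
    }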

UI

  • Buttons to upload a file and save current file
  • View waveform (possibly also include frequency spectrum)
  • Spacebar to play / pause (see the sketch after this list)
  • Click to set start point
  • Select to set start/end point
  • Keyboard shortcuts for Cut, Copy, Paste
  • Effects menu to modify selection with effects (pitch shift, time shift, reverse)
  • Zoom in / out (with keyboard shortcuts and buttons?)
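
Here’s a minimal sketch of the Spacebar shortcut from the list above. The player object and its methods are hypothetical stand-ins for whatever playback wrapper (e.g. a wavesurfer instance) ends up being used:

    // Sketch: Spacebar toggles play/pause. "player" is a hypothetical wrapper
    // around whatever playback object ends up being used.
    document.addEventListener('keydown', (event) => {
      if (event.code === 'Space' && event.target === document.body) {
        event.preventDefault();   // keep the page from scrolling
        player.isPlaying() ? player.pause() : player.play();
      }
    });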

Sketch:

[Interface sketch]

Technology:

I have been looking at SoX, and I think it is possible to do all of the editing on the server side. On the browser side, I’m looking at representing the waveform data visually in Processing.js and also delving into the HTML5 Web Audio API.
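
As a sketch of the server-side idea, a Node.js process could shell out to SoX roughly like this. The file paths are placeholders and this isn’t the editor’s actual code; ‘reverse’ and ‘speed’ are real SoX effects:

    // Sketch: server-side editing by shelling out to SoX from Node.js.
    // 'reverse' and 'speed' are real SoX effects; paths are placeholders.
    const { execFile } = require('child_process');

    function applySoxEffect(inputPath, outputPath, effectArgs, callback) {
      // e.g. effectArgs = ['reverse']  or  ['speed', '1.5']
      execFile('sox', [inputPath, outputPath, ...effectArgs], (err) => {
        callback(err, outputPath);
      });
    }

    applySoxEffect('uploads/clip.wav', 'processed/clip.wav', ['reverse'], (err) => {
      if (err) console.error('SoX failed:', err);
    });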

Tech Trends in Music Education wk1 reading reflections: Designing For Designers

Mitchel Resnick is the director of MIT Media Lab’s Lifelong Kindergarten Group. In a paper called All I Really Need to Know (About Creative Thinking) I Learned (By Studying How Children Learn) in Kindergarten, Resnick defines “Kindergarten-style learning” and explains why he believes it is essential for people of all ages; it even guides his own approach to designing educational technology like Scratch and Cricket.

Resnick defines the Kindergarten approach as a cycle, pictured at left, and I’ll reflect on each component below:

Imagine – Resnick’s role is to design for designers: kindergarteners of all ages. He is inspired by Friedrich Froebel’s pioneering vision of the first Kindergarten, where he presented a room full of materials (“Froebel’s Gifts”) that serve as the building blocks for creativity.

Here, Resnick identifies an interesting design challenge: the environment and its gifts must be general enough to allow for creative applications beyond what Froebel himself could have imagined. But there need to be some boundaries; the environment must also be specific enough to be easily understood so that learning can take place.

Create – design environments made up of building blocks (i.e. Froebel’s Gifts) for designers to construct with.

Play – Resnick critiques other approaches like “Edutainment,” which seem to view education as a bitter pill that must be sugarcoated with entertainment. Edutainment combines two passive things that we might expect other people to provide for us. By contrast, Kindergarten style is active, emphasizing learning through play. Resnick believes that play is “intimately linked” with learning because both involve “experimentation, exploration, and testing the boundaries.” In Scratch, users are encouraged to “play with code.”

Share – important in what Henry Jenkins calls our increasingly “Participatory Culture.” We learn from each other, as a community, exemplified by Scratch’s website of open source games. So not only play/tinker with your own code, but with each other’s.

Reflect – document the process. This reminds me of ITP, where we’re encouraged to document everything, hence this blog. It’s one thing to get something working once, but another to be able to recreate it and tinker with it down the line. I love these reflections from children who participated in a Cricket workshop, who observed that it’s important to…

  • Start simple
  • Work on things that you like
  • If you have no clue what to do, fiddle around
  • Don’t be afraid to experiment
  • Find a friend to work with, share ideas!
  • It’s OK to copy stuff (to give you an idea)
  • Keep your ideas in a sketch book
  • Build, take apart, rebuild
  • Lots of things can go wrong, stick with it

Failed experiments can be part of the learning process as long as they are supported by these sorts of encouraging reflections. Resnick called the workshop a success because the children not only learned, but learned how to learn.

Resnick’s approach to learning isn’t limited to kindergartners, but geared to learners of all ages, and he follows this approach himself when designing software like Scratch and Cricket. There are a lot of parallels to the way we approach learning at ITP, a program that provides a community of learners with certain building blocks and then encourages us to play with these tools; we share our work with each other, often through open source code, and share skills through group prototyping; we document and reflect on everything; and then we iterate in an imaginative cycle.

Resnick’s “Kindergarten approach to learning” reminds me of Chris Crawford’s definition of “Interaction,” which I read for this week’s Physical Computing course at ITP. Both authors call for iteration through a cycle of steps. Crawford defines interaction as two actors Listening, Thinking, and Speaking. Resnick adds steps because he wants to see learning take place through the interaction. Crawford is writing for interaction designers, while Resnick is writing for what I might call “education designers,” who design experiences and environments in which young designers take part in the programming. Interaction Designers are different from User Interface Designers because the former tweak the “thinking” aspect of the interaction, i.e. core functionality that is not traditionally made available to User Interface Designers. Resnick takes things a step further because he believes that putting “thinking” into the hands of the people who are interacting with the technology is the key to learning.

When Resnick refers to “designing for designers,” I sometimes wonder whether this is always the best approach to learning. It’s not as if all children are going to grow up to become designers…

:: Dale Dougherty has some persuasive answers to this question in his article The Maker Mindset. The founder of MAKE Magazine and, more recently, the nonprofit Maker Education Initiative, Dougherty believes that the Maker Mindset can transform education.

The Maker Mindset builds off of Carol Dweck’s book Mindset, in which the Stanford psychology professor found that some people have “Fixed Mindsets” (meaning they believe their knowledge and abilities have a limit, and that they’re pretty close to maxed out) while others have “Growth Mindsets.” People with a Growth Mindset believe they can do anything they put their mind to, and Dougherty emphasizes how essential this becomes in a world that is constantly changing. The Maker Mindset is a type of Growth Mindset through which people believe they can turn their ideas into reality. This is achieved by constantly experimenting, tinkering, and pushing the boundaries of what’s possible using what you already know. In the process, you’ll expand your own body of knowledge and probably create some cool things. Dougherty acknowledges that makers are not mainstream these days, and that they are driven largely by internal goals rather than external or social rewards. But he believes that making can become even more of a social mindset if it is built around sharing, and that this starts with education.

To bring the Maker Mindset to education, Dougherty calls for the development of projects, kits and curricula. He wants young people to take leadership roles and join communities outside of school. Communities can provide exhibition space for makers that invites and inspires further participation. Dougherty believes that a record of participation in maker communities could be a valuable part of a young person’s résumé. What Resnick described as passive “edutainment,” Dougherty might call “consuming” and he believes that education by making things is the best way forward.

I’m not sure how I feel about the way Dougherty has built a brand out of the word “MAKE”, but I certainly do admire his mindset!

[Image: MaKey MaKey Drum Machine using Scratch]

:: In Designing For Tinkerability, co-authors Resnick and Eric Rosenbaum critique Make’s approach because it is sometimes narrowly misinterpreted as “making” physical things when really, for these guys, it’s more about the broad approach to making, no matter the form. Tinkering too often sits in the shadow of the more logical Planning, they argue, but in fact the ability to improvise and incorporate bricolage is increasingly essential. Resnick and Rosenbaum go on to share some of the thinking behind their respective tinkerable kits: Scratch and MaKey MaKey. Both kits are designed for tinkerability, which in their view requires three things:

1.) Immediate Feedback – because tinkering means lots of quick experiments, and you’re not gonna sit around waiting for the results. Scratch shows the values of all variables (not the norm for programming languages) and lets you tinker while it’s running to see the results in real time. Makey has indicator lights to let you know what’s going on.

2.) Fluid Experimentation – it’s important to minimize the setup so that you can get started right away. Many programming languages require a lot of boilerplate code and compiling just to get started; Scratch takes care of all of that so the user can focus on creating. Scratch has a “visual syntax,” meaning that you can see whether a certain object will take a type of input just by looking at the LEGO-like connector shapes. Like LEGO pieces, you can unplug blocks and leave them in your workspace and they won’t interfere with your other code the way stray code would in most programming environments. Following the LEGO analogy, the kits may not be the sturdiest way to build something, but they aren’t designed for that; they’re designed to be tinkerable.

3.) Open Exploration – Both kits let you use a variety of materials to create projects that fall under a variety of genres. Scratch materials include the preloaded sprites and objects, but you can also make your own and find millions more sources of inspiration from the online community, all licensed under the tinkering-friendly Creative Commons Attribution-ShareAlike license. MaKey doesn’t come with anything preloaded, but with this tool “the world is your construction kit.” Both MaKey and Scratch are tools that can be used to play across genres (in more than just the musical sense) which is valuable because tinkerers tend to work “bottom-up” where they start out not knowing what they’re building and through many iterations they may touch upon a bunch of different genres before they determine the goal at the top.

Side note: I wonder if my application ought to encourage non-musical exploration…

In their final note on the article, Resnick and Rosenbaum touch upon the idea that the kits they’ve built can’t encourage tinkerability on their own; they need to be used in a tinker-friendly context. Educators looking to get the most out of these kits are advised to emphasize the process over the product, themes over challenges (broad enough so that each person cares about what they’re working on, but specific enough so that everybody feels it’s a shared experience), highlight examples for inspiration, and give questions as answers. This is very inspiring to me because, just as we learn by doing, if we can answer our own questions we’ll understand the answers better than if they’re handed to us through edutainment.

:: Some notes from What Develops in Musical Development? A View of Development As Learning by Jeanne Bamberger

  • Hearing music is a performance because we reconstruct it in our minds. So the same tune can sound different when we hear it in different contexts.
  • Cognitive development tracks how we organize perception and the strategies we use to understand the world. Some are situational (context) and others are abstract (systems to categorize the world).
  • When educators use just one organizing symbol system (like musical notation), it privileges certain types of categorization (“ontological imperialism”); it’s important to have many different ways to understand the world/music. Interactions between these various understandings foster complexity.
  • Trust musical intuition and embrace exceptions to “the rule”
  • Whatever children do, there is a reason. Explore…