October 18, 2016

Learning to use Seesaw to draw and record

Seesaw drawing Day 1

April 2, 2016

Blended and Personalized Learning Conference

I wanted to attend this conference because technology is once again being heralded as the means of individualizing learning, and this seemed like a good opportunity to get updates on what is happening in schools.

Lunchtime speaker Richard Culatta, former director of the US Department of Education's Office of Educational Technology and recently named Rhode Island's first Chief Innovation Officer, talked about being very careful about the words we use in describing "personalized learning". The definition in the recent National Education Technology Plan is the following:
Personalized learning refers to instruction in which the pace of learning and the instructional approach are optimized for the needs of each learner. Learning objectives, instructional approaches, and instructional content (and its sequencing) all may vary based on learner needs. In addition, learning activities are meaningful and relevant to learners, driven by their interests, and often self-initiated. (tech.ed.gov/netp/)
This definition sounds more like our efforts to implement Genius Hour, built on individual students' passion for learning, than like the products of companies offering algorithm-driven branching instruction that is largely unproven and untested. In a recent blog post, Alfie Kohn spoke to the use of "personalized" vs. "personal" learning:
If you haven’t given much thought to the kind of intellectual life we might want schools to foster, then it might sound exciting to “personalize” or “customize” learning. But as I argued not long ago, we shouldn’t confuse personalized learning with personal learning. The first involves adjusting the difficulty level of prefabricated skills-based exercises based on students’ test scores, and it requires the purchase of software. The second involves working with each student to create projects of intellectual discovery that reflect his or her unique needs and interests, and it requires the presence of a caring teacher who knows each child well. (http://www.alfiekohn.org/blogs/ed-tech/)
As I ponder this new trend, I find that I continue to be dubious about the potential of corporate solutions to meet the cognitive needs of our students. While I think there is a place for online practice and even tutorials, time must still be allotted for teachers to review student progress, meet with students, and work closely with colleagues to determine each student's learning path.

As an update, today (4/26/16) Will Richardson posted about his concerns with personalized learning "products".

November 18, 2015

Hour of Code - Computer Science Week - December 7-11, 2015

For the third year, our students will participate in the Computer Science Week "Hour of Code" in early December. The materials and resources posted on the Hour of Code website support learning for tens of millions of students worldwide.

Our plan is for first graders to use the iPad app Kodable, gathering all the iPads so that each student has one for that day's class time. Second graders will use the web version of The Foos to explore planning steps, repeating actions, and building simple programming loops. Third graders will use the Lightbot Hour of Code site to move into the next levels of logical thinking and programming. By fourth grade, students have a wider range of experience with coding and programming, so they can learn to program the Frozen characters, create in Scratch, solve puzzles in a Minecraft game, build a Star Wars galaxy, and more. Students enjoy revisiting the Blockly Angry Birds puzzles, coding a Flappy Bird game, and many other activities that introduce the concepts of programming and game development.

October 28, 2015

Why Teach Google Earth: What are second graders learning?

Second graders study communities, with one focus being our school garden and observing the plants and animals there. To support their learning about butterflies, the technology connections include using the Journey North materials on the migration of monarch butterflies.

The teacher uses a classroom projector to show students maps of the butterflies' migration path to Mexico. After students have seen the maps, they can use Google Earth to visit the states pictured on them. As a literacy skill, they learn that there are two-letter postal abbreviations they can enter in the search box to "fly to" a new state (e.g. MA, NH, MN, TX) and then zoom in and out to view the overall colors and features of the landscape that give them visual cues about the topography. What can they see when they travel? (Questions: Does the state of Minnesota look like the state of Texas? What do you see? What do you think the land looks like there?) What is going on in their minds as they zoom in closer and then back out to view the state boundaries and country borders? Is there a mathematical-spatial learning process going on?

Recently I've been reading about "partner learning" and the power of having students talk out loud about their explorations and questions. For this Google Earth session we paired the students, with a "navigator" entering the search text and a "pilot" pressing Return or clicking Search to "fly" to the new destination.

What are students learning as they "fly", "zoom", and explore a virtual Earth? The room is full of engagement, "what ifs", and wondering as they talk about going north, south, east, and west and look at geographic features like lakes, mountain ranges, and deserts. They are talking about countries, continents, oceans, and more as they move around and explore. They are also asking basic questions about typing on the laptops: finding letters, the space bar, the delete key, and so on.

A classroom teacher wrote a blog post about the session and about a follow-up activity in which students visited their own homes and explored the town in Google Earth. Part of the power of this lesson was the collaboration between the classroom teacher and me in my role as instructional integrator.