Summertime Plans


So, here’s some stuff I’m working on this summer:

1.) Re-working Algebra I units and learning targets to make them more coherent.  After using SBG this year, I found that some units had way too many targets and some had too few.  Also, some targets were really the same as others and needed to be combined, while others were too complex and needed to be split into two separate ones.  I also need to sort my learning targets into the categories Fundamental (procedural), Core (conceptual), and Advanced (synthesis).  I think I’ll find that after doing this, I am woefully lacking in the Advanced category.  Must fix!

2.) Writing algorithm-generated questions for practice and assessments.  For any target that I can legitimately assess with a multiple choice or numerical response question, I want to have a bank of algorithm-generated questions so that I don’t have to write new ones every time a student wants to re-assess.  They just go to SocraticBrain, log in, then enter the quiz password to re-assess.  I want to get these written and out of the way so I can spend more time writing task- and project-based (Advanced category) assessments.
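The idea above can be sketched in plain Python.  This is just an illustration, not SocraticBrain's actual question format (the post doesn't show it): every call randomizes the parameters, the correct answer is computed from them, and a few distractors are built around it, so the same target can be re-assessed with a fresh question each time.

```python
import random

def make_slope_question(rng=None):
    """Generate one multiple-choice slope question with randomized points.

    Hypothetical sketch of an "algorithm-generated question" -- the
    prompt wording, choice count, and dict format are all assumptions.
    """
    rng = rng or random.Random()
    x1, y1 = rng.randint(-5, 5), rng.randint(-5, 5)
    x2 = x1 + rng.choice([1, 2, 3])          # avoid a vertical line
    y2 = rng.randint(-5, 5)
    slope = (y2 - y1) / (x2 - x1)
    # Distractors built from common errors; a set drops any duplicates,
    # and none of them can collide with the correct answer.
    distractors = {slope + 1, slope - 1, -slope if slope != 0 else slope + 2}
    choices = sorted(distractors | {slope})
    return {
        "prompt": f"What is the slope of the line through "
                  f"({x1}, {y1}) and ({x2}, {y2})?",
        "choices": choices,
        "answer": slope,
    }
```

Seeding the generator (`make_slope_question(random.Random(42))`) makes a particular question reproducible, which is handy if a student disputes a score.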

3.) Getting to know SocraticBrain, the SBAR system I will be using for Algebra I and Physics next year.  There are some new features and I’ll need to figure out how they work and how I’m going to implement them.  For instance, it is now set up to automatically generate practice for students based on their scores on any recent work they’ve done.  Awesome!  But it means that I’ll have to be very careful when deciding what practice problems or tasks the system will spit out based on the areas the student is struggling in.  When a student keeps missing a fundamental question, maybe they just need to see an example worked out and then practice some more.  In this case, the system should spit out both resources (a video with a worked-out example, and some practice problems).  When a student fails on an Advanced task, more skills practice won’t necessarily help them.  What they probably need is more specific feedback.  I think in this case the system should spit out a self-reflection form based on the task.  Questions like “What parts of the task did you get stuck on?” and “What skills do you think are required for this task?” could go on there.  Then when it comes time for me to enter the picture, they will already have thought about these things and my feedback will be more useful to them.
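The routing rule I'm describing is simple enough to write down.  A minimal sketch, assuming the Fundamental/Core/Advanced category names from item 1 and made-up resource labels (this is not SocraticBrain's API, just the decision logic):

```python
def pick_resources(category, passed):
    """Decide what the system should spit out after an attempt.

    Hypothetical sketch: grouping Core with Fundamental is my
    assumption, and the resource names are placeholders.
    """
    if passed:
        return []                             # nothing to remediate
    if category in ("Fundamental", "Core"):
        # Skill gap: show a worked example, then more practice.
        return ["worked-example video", "practice problems"]
    if category == "Advanced":
        # Synthesis task: more drill won't help; prompt reflection
        # so my later feedback lands better.
        return ["self-reflection form"]
    raise ValueError(f"unknown category: {category}")
```

Keeping the rule this explicit, whatever form it takes inside the system, should make it easy to adjust once I see what students actually do with the auto-generated practice.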

4.) Re-working physics units and learning targets to match the NGSS.  This one I’ll be working on with the whole science dept at my school. We’ve got a full two weeks blocked off for this.  It’s awesome that we’re doing it together; we will hopefully end up with a nice vertically aligned science curriculum.  Once the physics learning targets are made, I can start to align the Algebra targets to them.

I plan to post my aligned Algebra I/Physics units and learning targets by the end of the summer for review by the mathtwitterblogosphere.

Real Differentiation! Finally…


I had an education professor at UNO who told us a class should pretty much run itself.  He said by the end of the semester, there would be at least one occasion where he would not even show up, and yet our class would go on without him.

It did.  We were working on short lessons and presenting them to each other to get feedback.  He never showed, so we started without him and everyone stayed the entire time.

Today I came into class and basically said “You know what you need to work on, let’s get to work!” and it happened.  Here’s why I think it worked:

1.) They were all very aware of exactly what skills they need to work on.  I use standards based grading and report their scores with ActiveGrade.  When they log in, each student gets a report that looks like this:


2.) They are accustomed to getting help from outside sources.  I have been getting better at providing good resources for my students this year.  My newest adventure has been in using Edcanvas.  It is super easy to use for teachers and for students, and it tracks views (and emails you a daily digest if you want).  Here’s one of mine (click it if you want to see how it works):

Edcanvas pic

3.) They are accustomed to getting help from each other. Since I teach using modeling in physics (and try to in algebra as well), the students are used to not really getting answers from me (hence the “Never” tagline of this blog) 😉 so they have gotten much better at learning from each other.  They are getting way better at spotting when someone has a skill that they do not, and asking that student for assistance.

BTW, even though it was awesome (!!), we didn’t use the whole class period for this “extra practice” session.  The second half of class was used to work through this handout (which you can see in the Edcanvas as well):

The awesomest thing was how I had never used the term “zeroes” before and they already knew what I was talking about!  I also love how it really ties everything together nicely.  I got a lot of “ooohhhh, now I see why we did that” out of it.  The thing it’s missing is a real, quality real-life application.  Any suggestions??

This assessment is missing something


Ugh, it’s getting hard to keep up this blog.  I’m working hard to keep trying new things in class, while still making my tutorial videos, and now I’ve got the Daily Desmos blog taking up half my planning time!  So this one’s gonna be short.  Here goes:

I have an assessment that basically asks students to describe the effects of the variables “a,” “h,” and “k” on the graph of the function y = a(x – h)^2 + k.  I was thinking that this would be great because it is so open ended (I’ve heard this type of thing referred to as a “goal-less” problem).  The idea was that students would have to not only know what the effects of the variables are, but also be able to describe them to me using graphs and charts.
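For anyone rusty on vertex form, the effects in question can be checked numerically.  A quick sketch in plain Python (no plotting; the function name `f` is just for illustration): the vertex sits at (h, k), and “a” controls steepness and which way the parabola opens.

```python
def f(x, a=1, h=0, k=0):
    """Vertex form of a quadratic: y = a(x - h)^2 + k."""
    return a * (x - h) ** 2 + k

# h shifts the vertex right and k shifts it up, so the vertex is (h, k):
assert f(3, a=2, h=3, k=-1) == -1      # at x = h, y is exactly k
# a stretches vertically: one step from the vertex changes y by a * 1^2:
assert f(4, a=2, h=3, k=-1) == 1
# negative a flips the parabola to open downward:
assert f(4, a=-2, h=3, k=-1) == -3
```

A goal-less prompt asks students to discover and describe exactly these patterns on their own, which is the whole point of the assessment.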

For some students it worked out great.  They clearly understood what I was asking of them and were able to demonstrate that understanding by comparing several graphs and their equations.  But then some students were completely stumped.  I’ve figured out that they were afraid of this assessment because they didn’t understand the basic principles.

I think the solution is to create a separate, lower-level assessment.  Maybe a few multiple choice questions where they have to compare two graphs, or even a constructed response, but one where I give them the graphs to compare.  That way, students who only sort of understand the skill can at least show what they know.

So I feel like the lesson on this one is that having a “goal-less” problem as an assessment is great, but it shouldn’t be the only assessment of that skill.

What do you think?  Do you use goal-less problems?