
Student Response Systems -- A Year Later

I promised to come back with my experience after the Fall 2013 semester. Though the end of that semester is now long past, here are some thoughts:

Overall, my experience with Socrative was very positive. During the first week I polled the class to see what percentage of the students had some kind of wifi-enabled device that they could use. 90 out of the 95 students had a device that they were willing to bring to class, so I decided to give Socrative a try.

Socrative helped me tremendously to stay on top of what everyone in the class knew. I used it as follows: after every block of new material (usually 5-10 slides), I inserted a few questions to verify whether the students "got" the new material. This was all the "extra work" that Socrative added to designing the class material, and since I would have done something similar without Socrative anyway, I did not feel much of a difference here. Once a question was projected, I started it on Socrative with the push of a button, and the students could begin entering their answers. The answers were shown on the teacher's screen as they arrived. Based on this feedback, I could then decide whether I needed to re-explain something.

I used a tablet to connect to Socrative and a second computer to project my slides. This was handy, as I did not have to exit the presentation to start a question in Socrative. The classroom had a projector that could show the tablet's screen. Sometimes I showed the feedback screen even before all the answers were in; the students seemed to like this, as it created an interesting dynamic in the class.

I mostly used multiple-choice questions, with occasional yes/no questions. In addition, I experimented with "short answer" questions. This latter question type proved very useful, and I generally prefer it: not only are "short answer" questions less work to prepare, but they are usually better at testing what the students know. To put it another way, it is really hard (and time-consuming) to design good multiple-choice questions with valid-looking alternatives (if you have ideas on how to automate this, I am all ears; one possible direction is sketched below). Examples of the "short answer" questions I asked are: "What is the keyword that should go in this program in this place?" or "What will this program print?".
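For what it is worth, here is a minimal sketch of one way distractor generation could be automated for numeric "what will this program print?" questions. The mutation rules below (off-by-one, sign flip, doubling, zero) are ad-hoc choices for illustration, not a tested method:

import random

def numeric_distractors(answer, k=3):
    # Generate up to k plausible-looking wrong answers for a numeric result.
    # The mutations (off-by-one, sign flip, doubling, zero) are ad-hoc choices.
    candidates = {answer + 1, answer - 1, -answer, 2 * answer, 0}
    candidates.discard(answer)   # never offer the true answer as a distractor
    return random.sample(sorted(candidates), min(k, len(candidates)))

# Example: "What will print(sum(range(5))) print?" -- the true answer is 10.
true_answer = sum(range(5))
options = numeric_distractors(true_answer) + [true_answer]
random.shuffle(options)
print(options)   # e.g. [9, 10, 20, -10]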

The feedback screen after a quiz-type question shows a histogram that is updated on the fly, so if you want, you can steer things in the class by showing how the histogram evolves. The same works for yes/no questions. Unfortunately, the feedback screen for short answer questions is less than ideal: it shows the list of answers as they come in, and on this list the same answer can be repeated many times. Needless to say, this is not the best way of presenting this information. It would be much nicer, for example, if there was a histogram representing the top answers.
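To illustrate: aggregating such a raw answer list into a top-answer histogram takes only a few lines (a sketch with made-up answers):

from collections import Counter

# Raw short-answer submissions as they might come in (made-up data).
answers = ["for", "for", "while", "For ", "foreach", "for", "while"]

# Normalize case and surrounding whitespace, then count.
counts = Counter(a.strip().lower() for a in answers)

# Show the top answers as a simple text histogram.
for answer, n in counts.most_common(5):
    print(f"{answer:10s} {'#' * n}  ({n})")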

I also replaced the end-of-class questionnaire of previous years with the "exit quiz" that Socrative provides. Unfortunately, the questions on the exit quiz are not configurable, and some of Socrative's standard questions caused persistent confusion ("solve the problem on the board" -- we did not even have a board!). Also, to my surprise, the exit quiz appeared to be less effective than the paper-based questionnaire at eliciting comments. Later, I realized that I could leave the exit quiz open for a few days after class to collect more feedback; this helped, but unfortunately only a little. Reading through the exit-quiz responses is a bit of extra work if you have not done something similar before, though it was not for me, since I had been reading the paper questionnaires anyway. And I actually like reading through these extra comments; they are very useful most of the time.

Once, I also experimented with Socrative's "quiz" component, which allows teachers to compile a longer list of questions (a quiz) that the students can solve in class (timed or untimed). The quiz was not marked. Unfortunately, there were too many technical difficulties: Socrative was overloaded (the class size was around 90). Also, Socrative stripped leading whitespace characters from the answers, which was quite unfortunate, as whitespace is crucially important in Python, the programming language we used in this class (see the toy example below). Thus, I decided not to use the quiz again.
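To see why the whitespace stripping is fatal, consider a pair of answers that are identical except for indentation (a toy example):

# With the intended indentation, the print is part of the loop body:
for i in range(3):
    print(i)        # prints 0, 1, 2

# If the response system strips the leading spaces, the same answer becomes
#     for i in range(3):
#     print(i)
# which is no longer valid Python (IndentationError), so a correct answer
# can no longer be recognized as correct.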

In conclusion, although the system is not perfect, it was usable, and overall it helped both the students and me. I received many comments from students praising how useful Socrative was for engaging everyone, including the shy students. As this was my main motivation for trying the system, I conclude that the experiment was successful.

PS: The next semester, I tried again. Unfortunately, we were in a different classroom, where the wifi service first became unreliable and then stopped working completely. I have no idea why; no one did. I asked the technical support people to fix it, but all I got were promises. This was sad. Luckily, this was a smaller class (around 50 students), so we could still have some genuine interaction with the students.

PPS: In the first semester I also used a website where students could solve hundreds of little programming problems. The site provided the questions and also gave feedback, even including things like "did you forget to initialize your variable?". Again, the tool had glitches (the question bank was not very well developed), but overall I think it was also a great tool.

Comments

Matt Taylor said…
Great post - I'll have to try out socrative. I've wanted to try using clickers in the past, but balked at making students buy a separate device.

What was the supplemental website that you used? In the past, students liked it when I suggested they try specific problems at: http://codingbat.com/python

I also like http://www.pythontutor.com/ to visualize execution, but that takes more initiative on the students' part.
I used myprogramminglab.com; this is a commercial site that was free for the students, as the textbook we used was new and the problems were still under development. We were sort of the guinea pigs. This tool was not perfect either (it uses ancient web technology).
I am also sensitive to the cost of education, but seeing how myprogramminglab.com could help the students in ways I could not (because there were 90+ of them!), I would now recommend that everyone adopt something like this, even if it comes at a cost.
I did not know about codingbat; does it use Python 2 or Python 3? I just tried it and it looks great, though the number of problems is smallish (I think students need hundreds of small problems). Also, I noticed that it is entirely based on functions, which is somewhat unfortunate if you teach functions later.
I knew about pythontutor, which is great, but has a different role.
