2.c Tools & Methods


**Tools**

We drew on several tools and perspectives to shape our analysis. The data sources for our analysis included:

 * Pre-pilot and post-pilot surveys completed by all participants
 * Weekly debriefing surveys from all participants, used as a formative assessment to ensure a well-run pilot
 * Daily journals kept by the project creators
 * Sociograms representing links between users and their connections through karma
 * Google Analytics, which measured site traffic
 * The website's Ruby on Rails database, which tabulated user participation, including completed actions, karma points, and comments

Participants for the Take Action project pilot were recruited through University of Michigan - Flint colleagues and other educational connections. Participants were identified as educators committed to bringing a global experience to their students. Student participants were typically required to participate by their teachers, either through individual logins during class or by logging in as a class, with students then expected to "act" on a deed on their own. Participating teachers agreed to take part in the project in good faith and to complete the pre-pilot and post-pilot surveys.

The pilot lasted four weeks, from Monday, February 1, 2010 through Sunday, February 28, 2010. Participation in the project depended wholly on the time teachers had to devote each day in class, on their dedication to completing their month-long commitment (link to the agreement made between teachers and creators), and on the weekly reminders sent to teachers encouraging them to continue the good work. Three to four classrooms participated dependably for the entire 28 days; the rest did well through the first week but tapered off over the following three weeks. (Google Analytics showing site visits.) Student participation was strongly influenced by the classroom teacher, as evidenced by the low participation numbers on weekend days of the project. For this reason, the project's success relied a great deal on the pilot teachers.


**Methods**

Using Ruby on Rails as the framework for developing the Take Action Web community lent itself well to data collection. Ruby on Rails is a database-driven web development framework, so we were able to easily collect data on how many users acted on a specific action or "deed," the date a deed was featured (called the Action of the Day), how many karma points a user received, how many times karma was given and received between users, and which Millennium Development Goals linked to each Action of the Day. We also tracked site traffic using Google Analytics; however, that application was not up and running until the third day of the pilot. In addition, we surveyed the committed participants before the pilot began and after it ended, using the same questions both times for "accurate" comparison. All of this data was pulled from the database and organized into separate views, including:

 * comparisons of users to the deeds they completed
 * the number of users who completed each particular deed
 * users who gave and received karma
 * an alignment of each Action of the Day with its feature date and the total number of users who completed that action
 * deeds completed compared to the number of people who visited the site according to Google Analytics
 * a comparison of multiple-choice answers concerning motivation on the pre-pilot and post-pilot surveys
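The kind of tabulation these views required can be sketched in plain Ruby. This is a minimal illustration, not the pilot's actual code: the record fields and sample rows below are hypothetical stand-ins for the Rails database records.

```ruby
# Hypothetical completion records (stand-ins for the Rails database rows).
completions = [
  { user: "a", deed_id: 1 },
  { user: "b", deed_id: 1 },
  { user: "a", deed_id: 2 }
]

# Number of distinct users who completed each particular deed.
users_per_deed = completions.group_by { |c| c[:deed_id] }
                            .transform_values { |cs| cs.map { |c| c[:user] }.uniq.size }

# The deeds each user completed.
deeds_per_user = completions.group_by { |c| c[:user] }
                            .transform_values { |cs| cs.map { |c| c[:deed_id] } }
```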

As we analyzed the data, the numbers led us to look at the deeds featured during the month-long pilot in a different way than expected. At first glance, the deeds with an above-average number of users completing the Action of the Day showed no pattern other than a clear decrease in the number of people acting on weekend days. However, as any teacher will tell you, there are certain homework assignments students are more apt to complete because they view them as easier or more rewarding - extra credit always seems to get done! With this in mind, we took a closer look at the types of deeds users were more apt to complete during weekdays and decided to use Bloom's Revised Taxonomy (Krathwohl, 2002) to determine where these actions might fit in the cognitive dimension. The cognitive dimensions identified include Remember, Apply, and Create, in the following ways (definitions adjusted from Krathwohl, 2002, p. 212):
 * Remember: retrieving relevant knowledge from a reputable source and being able to recall the information
 * Apply - Online: carrying out a procedure or "action" virtually via a computer
 * Apply - Offline: carrying out a procedure or "action" in the physical community. This may include, but is not limited to, the physical classroom, school, home, church, or local community.
 * Apply - Online, Apply - Offline: carrying out a procedure or "action" virtually via a computer that has a direct effect on a physical community, e.g., online games that directly result in giving money, food, or other needed services to a community in need
 * Create: putting elements together to form an original product

(The Apply classification was divided into subcategories because we decided there was a significant difference between virtual application and real-world, physical application.) After making these classifications, the completed actions for the 28 deeds were totaled by Bloom's classification (an action is considered completed when a user clicks "I've Acted" for that particular deed). Since the classifications did not have an equal number of deeds, the raw totals of completed actions could not be compared fairly. To make a fair comparison, the total of completed actions for each classification was divided by the number of deeds in that classification - to put it simply, we averaged it. This allowed us to fairly compare which classifications were more effective at motivating users to act.
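The normalization step above amounts to a per-deed average. A minimal sketch follows; the classification names match the taxonomy above, but the totals and deed counts are illustrative values, not the pilot's actual data.

```ruby
# Hypothetical totals of completed actions and number of deeds per
# Bloom's classification (illustrative values, not the pilot data).
totals = {
  "Remember"        => { completed: 120, deeds: 4 },
  "Apply - Online"  => { completed: 90,  deeds: 3 },
  "Create"          => { completed: 40,  deeds: 2 }
}

# Average completed actions per deed, so classifications with more
# deeds attached to them are not unfairly favored.
averages = totals.transform_values { |t| t[:completed].to_f / t[:deeds] }
```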

Karma was filtered to subtract any points users gave themselves. Unfortunately, this does not omit users who engaged in gamesmanship or who campaigned for others to give them karma. Users could give a comment between 0 and 5 karma points, and only once. So while we expect that some of this probably happened, we did our best to filter the data so it would give an accurate picture of how the Take Action community viewed a particular user's actions and/or comments on the Action of the Day. After filtering the data, karma points were analyzed to identify "super users," defined as the top ten percent of users with the greatest sum of karma points. We then used a sociogram to display how users of the site interacted, showing just the connections between who gave whom karma points and how many times each user did so.
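The filtering and "super user" cut can be sketched as follows. The karma events below are made up for illustration; only the logic (drop self-given karma, sum per receiver, take the top ten percent) reflects the method described above.

```ruby
# Hypothetical karma events: (giver, receiver, points).
events = [
  { giver: "a", receiver: "a", points: 5 },  # self-karma, filtered out
  { giver: "b", receiver: "a", points: 4 },
  { giver: "c", receiver: "a", points: 3 },
  { giver: "a", receiver: "b", points: 2 }
]

# Drop self-given karma, then sum points received per user.
totals = Hash.new(0)
events.reject { |e| e[:giver] == e[:receiver] }
      .each { |e| totals[e[:receiver]] += e[:points] }

# "Super users": the top ten percent of users by total karma
# (rounded up, so at least one user always qualifies).
cutoff = [(totals.size * 0.10).ceil, 1].max
super_users = totals.sort_by { |_, pts| -pts }.first(cutoff).map(&:first)
```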

We were also interested in which Millennium Development Goals (MDGs) attracted the most attention from users. To do this, we counted the frequency with which each MDG was referenced in an Action of the Day, both to establish a base number and to see whether each MDG was represented equally during the 28-day pilot. Once we tabulated the frequency of each MDG during the pilot, we looked at the deeds completed by an above-average number of users. The number of times each MDG was referenced in an above-average deed was counted and compared in a table across all eight MDGs.
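The tally described above can be sketched like this. Each deed's MDG references and completion counts below are illustrative, not the pilot's actual numbers.

```ruby
# Hypothetical deeds: the MDGs each deed references and how many
# users completed it (illustrative data only).
deeds = [
  { mdgs: [1, 2], completed: 30 },
  { mdgs: [2],    completed: 10 },
  { mdgs: [3],    completed: 20 }
]

# Average number of completions across all deeds.
average = deeds.sum { |d| d[:completed] }.to_f / deeds.size

# Count MDG references among the deeds with above-average completion.
mdg_counts = Hash.new(0)
deeds.select { |d| d[:completed] > average }
     .each { |d| d[:mdgs].each { |m| mdg_counts[m] += 1 } }
```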
