Marmoset is an automated snapshot, submission, and testing system developed at the University of Maryland. It provides a web-based framework that supports computer science programming courses. Instructors can post projects and test cases, while students can make submissions and receive immediate feedback on their grades. Marmoset also keeps a record of submissions and other useful statistics that researchers can use to study how students program.
Allows instructors to post projects and test cases on the web. Students can also submit their implementations over the web, and Marmoset automatically compiles and runs them against the test cases.
Gives students early feedback on their submissions. This includes their grade, some of the test cases they are failing, and the bugs found in their code by FindBugs (see the first example below).
Uses "Release Tests" to give students limited access to the test cases used to grade their projects. To perform a release test, students consume a release token -- they have just 3 tokens which regenerate after 24 hours. This encourages students to start early.
Gives instructors an overview of the performance of all students before the project deadline.
Allows instructors to change the testing setup after the project has been posted to correct errors in the setup or the specification.
Retests submissions to catch inconsistent results caused by nondeterminism such as thread scheduling, hash-code-dependent iteration order, and transient system failures (see the nondeterminism example below).
Provides a detailed code-evolution history of student projects, allowing researchers to non-intrusively study the software development process of novice programmers.
Allows researchers to identify bug patterns that may also occur in production code.
Supports projects in Java, C/C++, Ruby, and OCaml, and runs submissions with security restrictions and timeouts (see the timeout sketch below). It can be configured to support projects in other languages.
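
One kind of feedback FindBugs provides: the snippet below (hypothetical student code, not from Marmoset itself) triggers the ES_COMPARING_STRINGS_WITH_EQ warning, a bug pattern that also shows up in production code.

    // Hypothetical student code, not part of Marmoset. FindBugs flags the
    // comparison below as ES_COMPARING_STRINGS_WITH_EQ: == compares object
    // identity, not string contents, so the check can fail even when the
    // two strings contain the same characters.
    public class GradeCheck {
        static boolean isPassing(String grade) {
            return grade == "A";   // bug: should be "A".equals(grade)
        }
    }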
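
A minimal sketch of the release-token accounting described above, assuming exactly the rules as stated (three tokens, each regenerating 24 hours after use). The class and method names are illustrative, not Marmoset's actual API.

    import java.time.Duration;
    import java.time.Instant;
    import java.util.List;

    public class ReleaseTokens {
        static final int MAX_TOKENS = 3;
        static final Duration REGENERATION = Duration.ofHours(24);

        /** Tokens available now, given the timestamps of past token uses. */
        static int tokensAvailable(List<Instant> uses, Instant now) {
            // A token is still "spent" if it was used within the last 24 hours.
            long spent = uses.stream()
                    .filter(use -> Duration.between(use, now).compareTo(REGENERATION) < 0)
                    .count();
            return MAX_TOKENS - (int) spent;
        }
    }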
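
An example of why retesting matters: a class that does not override hashCode gets an identity hash that differs between JVM runs, so any output that depends on HashSet iteration order can pass one test run and fail the next.

    import java.util.HashSet;
    import java.util.Set;

    public class UnstableOutput {
        // No hashCode/equals override, so each Point gets an identity hash
        // that differs from one JVM run to the next.
        static class Point {
            final int x, y;
            Point(int x, int y) { this.x = x; this.y = y; }
            @Override public String toString() { return "(" + x + "," + y + ")"; }
        }

        public static void main(String[] args) {
            Set<Point> points = new HashSet<>();
            points.add(new Point(0, 0));
            points.add(new Point(1, 1));
            System.out.println(points); // element order may differ between runs
        }
    }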
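
A minimal sketch of running a test under a timeout, assuming a plain ExecutorService; Marmoset's actual build servers also enforce security restrictions and process isolation not shown here, and the names below are illustrative.

    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.TimeoutException;

    public class TimedTestRunner {
        /** Runs a test, returning true only if it finishes within the limit. */
        static boolean runWithTimeout(Runnable test, long seconds)
                throws InterruptedException {
            ExecutorService executor = Executors.newSingleThreadExecutor();
            Future<?> result = executor.submit(test);
            try {
                result.get(seconds, TimeUnit.SECONDS); // wait for completion
                return true;
            } catch (TimeoutException e) {
                result.cancel(true);                   // interrupt the runaway test
                return false;
            } catch (ExecutionException e) {
                return false;                          // the test itself threw
            } finally {
                executor.shutdownNow();
            }
        }
    }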
Marmoset Website: http://marmoset.cs.umd.edu
Marmoset Demo Server: http://marmoset-demo.cs.umd.edu (use this to try out the Marmoset system).
Jaime Spacco, David Hovemeyer, and William Pugh. An Eclipse-based course project snapshot and submission system. In 3rd Eclipse Technology Exchange Workshop (eTX), Vancouver, BC, October 24, 2004.
Jaime Spacco, Jaymie Strecker, David Hovemeyer, and William Pugh. Software repository mining with Marmoset: An automated programming project snapshot and testing system. In Proceedings of the Mining Software Repositories Workshop (MSR 2005), St. Louis, Missouri, USA, May 2005.
David Hovemeyer, Jaime Spacco, and William Pugh. Evaluating and tuning a static analysis to find null pointer bugs. In PASTE '05: Proceedings of the 6th ACM SIGPLAN-SIGSOFT Workshop on Program Analysis for Software Tools and Engineering, Lisbon, Portugal, September 5-6, 2005. ACM.
Jaime Spacco, David Hovemeyer, William Pugh, Jeff Hollingsworth, Nelson Padua-Perez, and Fawzi Emad. Experiences with Marmoset. Technical report, 2006.
Jaime Spacco, David Hovemeyer, William Pugh, Jeff Hollingsworth, Nelson Padua-Perez, and Fawzi Emad. Experiences with Marmoset: Designing and using an advanced submission and testing system for programming courses. In ITiCSE '06: Proceedings of the 11th Annual Conference on Innovation and Technology in Computer Science Education. ACM Press, 2006.
Jaime Spacco, Titus Winters, and Tom Payne. Inferring use cases from unit testing. In AAAI Workshop on Educational Data Mining, New York, NY, USA, July 2006. ACM Press.