Wednesday, February 1, 2017

CrowdGrader Pricing


CrowdGrader runs on the Google Cloud, which guarantees top-notch security, high availability, full scalability, and geographically distributed replicas of the data, ensuring that student work is not lost. To help pay for this infrastructure, we have started charging for the service. The intention is to provide an affordable service while covering the costs.

Each assignment can be either instructor-sponsored or student-supported.

Instructor-Sponsored Assignments

Instructor-sponsored assignments come in four tiers:
  • Free: up to 10 students
  • Small: $2.99, up to 35 students
  • Medium: $6.99, up to 100 students
  • Large: $13.98, unlimited students
The prices above are per-assignment.
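
If you script your course budgeting, the tier schedule is easy to encode.  Here is a minimal Python sketch (the function and its name are our own illustration, not part of CrowdGrader):

    # Illustrative sketch of the per-assignment tier schedule (not a CrowdGrader API).
    # Each tier: (name, maximum number of students or None for unlimited, price in USD).
    TIERS = [
        ("Free",   10,   0.00),
        ("Small",  35,   2.99),
        ("Medium", 100,  6.99),
        ("Large",  None, 13.98),
    ]

    def tier_for(num_students):
        """Return (tier name, price) for an instructor-sponsored assignment."""
        for name, cap, price in TIERS:
            if cap is None or num_students <= cap:
                return name, price

    print(tier_for(60))   # ('Medium', 6.99)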

Student-Supported Assignments

  • Three-week free trial. A three-week trial period gives students time to sort out any payment issues at the beginning of a class.
  • $1.99/six months: Students pay a non-refundable $1.99 subscription fee for six months of access to CrowdGrader.
Subscriptions can be renewed, but they do not auto-renew. The access period limits only when students can submit solutions to CrowdGrader: students can perform reviews, or view their feedback, at any time.

In a month or so, we will compare costs and income, and assess whether we can lower prices.  We hope you consider our service good value for the money, and we thank our users for their support.

Payment Terms and Conditions

Please refer to the Payment Terms and Conditions governing payments and refunds.

Wednesday, November 16, 2016

Consensus-based grading: more accuracy and less stress

We have just introduced a new grading method, called consensus-based grading. In consensus-based grading, the grades students assign are compared on each rubric item separately, to obtain a consensus grade for each item. The per-item consensus grades are then combined into an overall grade for the submission.

Rubric-based consensus increases precision: errors are eliminated on each rubric entry before the per-entry grades are combined into a total.
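
As a rough illustration, here is a Python sketch of per-item consensus grading, assuming the consensus on each rubric item is the median of the received grades (the actual aggregation CrowdGrader uses may differ):

    # Sketch of rubric-based consensus grading.  The median is an assumption
    # made for illustration; CrowdGrader's actual consensus computation may differ.
    from statistics import median

    def consensus_grade(reviews):
        """reviews: one dict per reviewer, mapping rubric item -> grade.
        The consensus is taken per rubric item first; the per-item
        consensus grades are then summed into the overall grade."""
        items = reviews[0].keys()
        per_item = {item: median(r[item] for r in reviews) for item in items}
        return sum(per_item.values()), per_item

    reviews = [
        {"correctness": 8, "style": 4},
        {"correctness": 9, "style": 5},
        {"correctness": 2, "style": 4},   # outlier on correctness only
    ]
    total, per_item = consensus_grade(reviews)
    print(per_item)   # {'correctness': 8, 'style': 4} -- the outlier is filtered per item
    print(total)      # 12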

A more streamlined, less stressful student experience

Alongside this change, we have made a few other changes to CrowdGrader, based on our experience running large classes:

  • Students see only consensus grades. If you choose rubric-based consensus, students will see only the consensus grades for their submission, rather than the grades individually submitted by students. In our experience, this increases student satisfaction, as students no longer see anxiety-inducing, but harmless, outlier grades in their evaluation.
  • The review feedback phase is no longer present. We found that the review feedback phase was only mildly useful, and often worked as a disincentive for students to submit honest evaluations, even though CrowdGrader does eliminate tit-for-tat behavior in feedback.

Sunday, October 11, 2015

Stars for reviews

In CrowdGrader, students can give feedback on the quality of the reviews they receive.  The grades assigned by CrowdGrader take this feedback into account, creating an incentive for students to write helpful and insightful reviews.
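
As a purely illustrative sketch (the actual formula is internal to CrowdGrader and may look quite different), one can think of the star feedback entering a student's review grade roughly like this:

    # Purely illustrative: NOT CrowdGrader's actual formula.
    # A review grade might blend grading accuracy with the helpfulness
    # reported by submission authors via star ratings.
    def review_grade(accuracy, stars_received, helpfulness_weight=0.25):
        """accuracy: 0..1 agreement with the consensus grades.
        stars_received: star ratings (1..5) left by submission authors."""
        avg_stars = sum(stars_received) / len(stars_received)
        helpfulness = (avg_stars - 1) / 4          # rescale 1..5 to 0..1
        return (1 - helpfulness_weight) * accuracy + helpfulness_weight * helpfulness

    print(review_grade(0.9, [5, 4, 5]))   # helpful reviews raise the grade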

Students can now leave feedback simply by assigning stars to the reviews, ranging from 1 star (very unhelpful, bogus) to 5 stars (very helpful).  A guide to the meaning of each star rating appears when students hover over the stars.  Here is how the new interface looks.

A student's submission feedback.  Students can rate
the helpfulness of the reviews via a 1-to-5 star rating.


This change is part of an ongoing effort to streamline and simplify the CrowdGrader UI.  The new star-based feedback replaces an implementation based on forms and links.




Sunday, September 20, 2015

CrowdGrader can now check submission similarity

One of the problems of crowd-grading is that no one grades all submissions.  Thus, it is difficult to detect when students submit the same solution.

To help with the detection of similar submissions, we have implemented a similarity checker in CrowdGrader.  And not just any similarity checker: the most full-featured similarity checker we could wish for.

The feature is accessible from an assignment page by selecting Submissions > Check Similarity.
Note that it is still in Beta; please report any problems.

Input formats

The CrowdGrader similarity checker can process:

  • Text typed directly in CrowdGrader
  • Attached Word (docx, not doc), PDF, HTML, RTF documents.
  • Attached source files in any programming language (it deals correctly with comments in C, C++, Java, Python).
  • zip, tar, tgz archives.  In the archives, you can specify which subset of files should be processed, so the similarity results are meaningful and not drowned out in a myriad of standard files. 
  • gzip-compressed versions of the above files.
CrowdGrader also accepts any nesting of the above: for instance, multiple Word files included in a single zip file are fine.
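
To give an idea of what handling arbitrary nesting involves, here is a simplified Python sketch that recursively unpacks zip archives and gzip files to collect the leaf documents (illustrative only; the actual implementation also handles tar/tgz and honors the instructor's file selection):

    # Simplified sketch of recursive unpacking; not CrowdGrader's actual code.
    import gzip, io, zipfile

    def collect_files(name, data):
        """Yield (filename, bytes) for every leaf file, unpacking containers."""
        if name.endswith(".zip"):
            with zipfile.ZipFile(io.BytesIO(data)) as zf:
                for inner in zf.namelist():
                    if not inner.endswith("/"):           # skip directory entries
                        yield from collect_files(inner, zf.read(inner))
        elif name.endswith(".gz"):
            yield from collect_files(name[:-3], gzip.decompress(data))
        else:
            yield name, data                              # leaf: Word, PDF, source, ...

A zip of gzipped source files, or Word files inside a zip, are thus flattened into the individual documents before comparison.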

Comparison output

CrowdGrader distinguishes between text that is: 
  • Unchanged: equal in the two submissions
  • Renamed: uniformly renamed (for instance, when a variable is renamed; see the sketch below)
  • Different
CrowdGrader also clusters the similar submissions for you, according to a threshold of your choice.
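
The Renamed category is the subtle one.  Here is a sketch of how one can test whether two token sequences are identical up to a uniform (one-to-one) renaming; this illustrates the idea, not CrowdGrader's actual algorithm:

    # Sketch: are two token sequences equal up to a consistent renaming?
    def uniformly_renamed(tokens_a, tokens_b):
        if len(tokens_a) != len(tokens_b):
            return False
        fwd, bwd = {}, {}                 # mappings in both directions
        for a, b in zip(tokens_a, tokens_b):
            if fwd.setdefault(a, b) != b or bwd.setdefault(b, a) != a:
                return False              # renaming is not consistent
        return True

    # "total" was consistently renamed to "acc":
    print(uniformly_renamed(["total", "=", "total", "+", "x"],
                            ["acc",   "=", "acc",   "+", "x"]))   # True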
Perhaps it is best to look at a couple of screenshots.



Submissions are clustered according to their similarity.
You can dynamically vary the similarity threshold and explore the resulting clusters.

You can examine similar submissions side-by-side.
Identical content is highlighted in blue; content that has been renamed is highlighted in green.



Saturday, April 25, 2015

Calendar view, multiple assignment download, and more

This weekend brought several improvements to CrowdGrader.  This is a mix of what users wanted, and what we thought they would like.

Calendar View

We have introduced a calendar view for assignments.  Keeping track of the various deadlines should be much easier now!


Multiple Assignment Download

Instructors can now download the grades for multiple assignments at once, simplifying class management.
 

Clone Group Membership

When cloning assignments, it is now possible to also clone the group membership of students.  This should make it easier to run classes where students work in the same groups throughout the class.  Students can still edit their membership, should something change from one assignment to the next.


We hope these changes will make your work as instructors easier.  Enjoy! And give us feedback: we love implementing features that make your life easier.

Saturday, March 21, 2015

Getting reviews, not stress

Students can learn much from feedback from their peers -- but sometimes, the grades that come with such feedback can be a cause for undue stress.  Often, what is desired is peer feedback -- without the judgement component.

For instance, a typical way in which CrowdGrader is used is to provide students feedback on their work -- be it an essay, an Android application, or a lab report.  The students can then use the feedback to improve their work, and resubmit it.  When CrowdGrader is used in this way, the grades that are associated with reviews can be a spurious source of stress: students are often better off focusing on improving their work than on the grades.  Nevertheless, having some grades can be useful to the teacher to get an idea of class progress.

Teachers can now check the "Hide review grades" box in assignments:


This has the effect of showing students only the final instructor-assigned grade.  Teachers and their assistants have access to the full information.  Presto, feedback without judgement!

Towards supporting submit-feedback-resubmit

The above change is one in a series of planned changes aimed at supporting a submit-feedback-resubmit workflow.  In this workflow, students will be able to submit a draft of their work, and receive feedback (but no grades) on it.  They will then be able to improve their work in light of the peer feedback, and resubmit it for evaluation.

Implementing the option to hide review grades is a first step.  We will soon allow cloning an assignment together with its reviewer assignments.  In this way, instructors can create two assignments for a given homework.  In the first one, students get feedback (but no grades) on their preliminary work.  Students can then improve their work in view of the received feedback, and resubmit it to the second assignment.  In this second assignment, their work is assigned to the same reviewers who wrote the initial feedback.  These reviewers can judge how well the feedback has been taken into account, and assign an overall grade to the work.  This is similar to what happens in scientific conferences with an author-response phase.

We plan to provide support for submit-feedback-resubmit in less than a month.


Sunday, March 8, 2015

Giving more information and control to instructors

Today we made two changes to CrowdGrader that we hope will give instructors better control of the grades, and students more assurance that their work is properly evaluated.

Average grade, and grade difference, for every submission

First, the list of all submissions now includes both the average grade received by a submission, and the delta between the highest and lowest grades.  You can sort on either column, giving you quick access to the students who got the lowest average grades, and to those who got the most widely differing grades.
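
For instance, given the list of grades each submission received, the two columns can be computed as follows (a sketch in Python, not the actual implementation):

    # Sketch: average grade and highest-lowest delta per submission.
    def grade_summary(grades_by_submission):
        """grades_by_submission: dict mapping submission id -> list of grades."""
        rows = [(sub, sum(gs) / len(gs), max(gs) - min(gs))
                for sub, gs in grades_by_submission.items()]
        # Sort by delta, descending: most widely differing grades first.
        return sorted(rows, key=lambda row: row[2], reverse=True)

    print(grade_summary({"alice": [8, 9, 8], "bob": [3, 9, 6]}))
    # [('bob', 6.0, 6), ('alice', 8.33..., 1)]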

This is how this looks on a sample assignment, and on a real assignment (with student identities hidden for privacy):

Average grade and grade difference for a sample assignment.
Average grade and grade difference for a real assignment.

Instructors can now also grade submissions

The other change, which goes hand in hand with the previous one, is that instructors can now grade submissions, if they wish.  The option is offered when viewing the details of a submission, and here is how it looks: 


If instructors select Review and grade this submission, they are led to a page where they enter a review, and a grade, for the submission, using an interface that is the same as the one used by regular reviewers.  When the assignment grades are computed, CrowdGrader gives precedence to the teacher grade over the student-assigned grades. 
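
In other words, the final grade computation can be thought of along these lines (a simplified sketch; the plain average stands in for CrowdGrader's actual crowd-consensus computation):

    # Sketch of the precedence rule; the plain average is a placeholder
    # for CrowdGrader's actual crowd-consensus computation.
    def final_grade(instructor_grade, crowd_grades):
        if instructor_grade is not None:
            return instructor_grade           # teacher grade takes precedence
        return sum(crowd_grades) / len(crowd_grades)

    print(final_grade(None, [7, 8, 9]))   # 8.0: crowd consensus
    print(final_grade(6,    [7, 8, 9]))   # 6:   instructor grade wins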

The result: more information, and more control

These two changes together give instructors more information and more control.  Instructors can easily see which submissions received the most widely differing (and thus less reliable) grades, and they can easily re-grade a subset of the submissions, ensuring that they receive proper grades.

Furthermore, in cases where students assign widely exaggerated grades, instructors can easily intervene and assign proper grades to the submissions.

Since instructor-assigned grades take precedence over student-assigned grades, this also creates a powerful incentive for students to be fair in their grading: students whose grades differ markedly from the instructor-assigned grade will receive a low review grade.


Saturday, February 14, 2015

Authors, reviewers, and instructors can discuss submissions

Associated with each CrowdGrader submission is a forum where the submission author, reviewers, and instructors can discuss the submission.  This can help clarify any problems that might arise during the review process.  For example, authors who make minor mistakes that might prevent the reviewers from properly evaluating their submissions can alert the reviewers.  Reviewers who are unable to understand parts of the submission can ask the author for clarifications.  The goal is a more transparent review process, where submission authors have confidence that their submission is properly evaluated, and reviewers have all the information they need.

Recently, we have given instructors the ability to actively participate in these discussion forums.  Here is an example of how this works.

A discussion between submission author and reviewers, as seen by the instructor.
The instructor can add a message.
The discussion, with the message added by the instructor.

Saturday, February 7, 2015

Reviewers can annotate submissions, and upload them with their reviews

A new feature of CrowdGrader enables reviewers to upload files with their reviews.  In this way, students can collaborate with reviewers using any file format, from pdf to doc to file bundles for code.  For example, reviewers can download essays or papers in pdf/doc format, annotate them in place, and then upload the annotated versions of the files together with their reviews.

This feature is already in successful use in some early-adopter classes.  We hope it might be useful to you!

Friday, January 30, 2015

Visual Tour

It can be difficult for a user of CrowdGrader to get a full picture of the tool.  Instructors and students each see only one side of it, and due to the various phases of assignments, from submission to evaluation and grading, it is difficult to form an overall idea of how it all works.

To help you, we have created a visual tour, which gives you both the instructor and the student perspective along the various phases of an assignment.
Enjoy!