Abstract: People are relying on online product reviews and evaluations more and more (Chandler et al., 2013). As the use of online reviews persists and more smartphone applications are created, the demand for online product reviews continues to grow; yet there is no indication of the quality of these reviews. In fact, some online reviews have been found to be fraudulent and misleading. Many online product reviews come from Internet-based crowdsource organizations. Few studies have explored evaluation practices within these organizations, and as a result, it is unclear what, if any, evaluation standards are used by crowdsource reviewers, particularly those found on open, self-serve sites such as MTurk. The purpose of this study, therefore, was to determine (a) what, if any, evaluation standards are used by crowdsource organizations and their requesters, and (b) to what extent these standards adhere to the Joint Committee on Standards for Educational Evaluation (JCSEE) Program Evaluation Standards (Yarbrough et al., 2011). Descriptive survey data were collected from 454 MTurk product reviewers. Findings indicate that these product reviewers do not appear to use any standards. The MTurk product reviewers who participated in this survey base their online reviews on personal, experience-based opinions. The literature tells us, however, that such opinions are not reliable, as they change with the provider's experience and knowledge of the product. Results further indicate that participants do not appear to follow systematic procedures, and document management seemed to be reviewer-dependent. Moreover, open-ended follow-up questions revealed that when asked whether they used more technical review designs, the majority of participants answered "often," while simultaneously indicating that their reviews were based on personal experience. This response conflicted with the survey results and further points to a misperception among MTurk product reviewers that they are providing reliable online product reviews.