Google Glass Impacting Teacher Evaluations: An Investigation of Feasibility


Abstract

Teacher evaluation and observation in K–12 has important outcomes. Changes in technology may lead to innovative approaches to evaluation and observation. Google Glass video recordings may help emphasize positives and negatives in an evaluation and potentially encourage self-evaluation by educators. This study examined the feasibility of using Google Glass for the evaluation and observation process. Administrative feedback (with administrators defined as principals, vice principals, and supervisors) was secured via questionnaire to identify whether Google Glass was feasible for use within the teacher evaluation process. Respondents indicated that a more appropriate use of Google Glass would be teacher self-reflection. The results of this study suggest that an experimental review of Google Glass in action for teacher evaluation and observation is necessary to define its effectiveness. Limitations of the study and of Google Glass are discussed relative to the findings.

Keywords: teacher evaluation, Google Glass, self-reflection

Because teacher evaluations affect educators’ careers, it is important to consider tools that provide the most unbiased and accurate information possible. Educators often use evaluations to improve their teaching; administrators, alternatively, use evaluations to make employment decisions. Accordingly, videotaping and recording teacher evaluations has been proposed in the past (Darling-Hammond et al., 2012; Pickering & Walsh, 2011; Selmes, 1986). However, videotaping may be viewed as an outdated approach, and recording devices such as smart wearables may be a more current one. This approach allows administrators to cite specific examples when detailing evaluation critiques to educators. While privacy concerns and permissions must be addressed, recordings made with smart devices might provide the most in-depth evaluation and subsequent review possible.

Teacher Evaluation

Thomas (1974) stated, “Success or failure of schools is largely determined by what teachers do with children in classrooms” (p. 7). Teacher evaluations once focused on teacher competence but now focus on teacher improvement, growth, and development (Mayo, 1997). Identifying the procedures for teaching, learning, and assessment makes it easier to conduct an evaluation (Withers, 1994). Because teachers work in varied environments, a flexible evaluation tool is necessary (Hallinger et al., 2014). Any approach to evaluation requires a review of literature related to teacher effectiveness (Darling-Hammond et al., 2012). The evaluation tool used has an ongoing impact on the future success of a teacher’s performance (Taylor & Tyler, 2012).

Evaluations need to represent a teacher’s performance (Mayo, 1997) and should not be rushed, lest the evaluation become biased or inadequate (Hallinger et al., 2014). The focus of teacher evaluations should be providing accurate feedback to educators (Hill & Grossman, 2013). Evaluations should follow a three-step approach of pre-evaluation, evaluation, and post-evaluation, with detailed critiques after each step (Mayo, 1997). Feedback should be provided promptly after an evaluation so an educator can best address any areas needing improvement (Hill & Grossman, 2013). This allows educators to self-evaluate based on feedback and improve on weaknesses (Darling-Hammond et al., 2012; Mayo, 1997; Taylor & Tyler, 2012; Withers, 1994). Through self-evaluation, educators can address weaknesses through professional development or other resources provided by administrators (Mayo, 1997).

Teacher evaluations are useful for administrators in helping educators improve instruction (McNally, 1977). Administrators need to collect information through evaluations that helps determine whether an educator belongs in the classroom (Wilson & Wood, 1996). Teacher evaluations need to focus on accountability and growth (Hill & Grossman, 2013; Wilson & Wood, 1996). The evaluation instrument used strongly affects evaluation results and must have high validity to properly identify weaknesses (Hill & Grossman, 2013). It is important to be able to differentiate teaching quality and to follow up on evaluations (Hill & Grossman, 2013). Failing to follow up on evaluations may lead to a lack of improvement of weaknesses, which is detrimental to student learning (Hill & Grossman, 2013).

Google Glass

Previous studies have focused on Google Glass and its applications in education for teaching and learning (Adapa et al., 2018; Knight et al., 2015; Lee et al., 2015; Muensterer et al., 2014; Paro et al., 2015; Parslow, 2014; Parton, 2017). Reviewing video recordings is an effective means of self-reflection (Gorham, 2017; Nagro et al., 2017; Ritchie, 2016), and Google Glass is able to record video. Nagro et al. (2017) found that student teachers who received supervisor feedback after writing a self-evaluation report based on a video recording significantly improved their instructional skills, an approach that could be applied to the administrator-teacher relationship. Ritchie (2016) conducted a similar study, finding that self-evaluation of a recording produced improvement from one task to the next. Soliciting feedback from colleagues on video recordings may also be an effective self-reflection tool (Knapp et al., 2017). Self-reflection on video recordings improves practice (Gorham, 2017). Additionally, Google Glass can be used as a tool for instant feedback (Ebner et al., 2016).

Individuals have held favorable attitudes toward using Google Glass (Muensterer et al., 2014), and there are various ways it may affect education. Google Glass allows information to be shared in a variety of ways for teachers and students to interact, including letting both view recordings (Parslow, 2014). Video is being used in multiple fields as a self-improvement tool (Paro et al., 2015). Who records and who reviews the Google Glass video matters to how useful it is as a teaching tool (Knight et al., 2015). Google Glass has been used within the field of medicine as a tool to enhance learning and may be helpful to educators as well.

Method

Videotaping may be used to provide a record of an educator’s effectiveness (Darling-Hammond et al., 2012). Videotaping teacher performance allows administrators to refer to examples of strengths and weaknesses (Darling-Hammond et al., 2012; Pickering & Walsh, 2011; Selmes, 1986). Since evaluations should be accurate and eliminate bias, such as human error (Wilson & Wood, 1996), seeking tools that may eliminate bias is necessary, and Google Glass might be such a tool. This study examined the feasibility of Google Glass replacing videotaping for teacher evaluations. Barriers to adoption were examined with respect to the administrators who would implement the use of Google Glass in teacher evaluations.

Participants

Through LinkedIn, I approached participants who met the definition of administrator (principals, vice principals, and supervisors) about completing a survey featuring ten questions on the feasibility of using Google Glass for teacher evaluations. Twenty-six participants responded to the online survey. All 26 participants responded to the eight non-open-ended questions. Five participants chose to skip the open-ended question asking for personal views on the feasibility of Google Glass for teacher evaluations. Twelve participants chose to skip the open-ended question asking for any other comments in relation to the survey.

Materials and Procedure

Data were collected through an online survey. Survey respondents were all administrators, identified as principals, vice principals, and/or supervisors; these administrators were identified as those most heavily involved in the evaluation process. The survey featured ten questions. Two questions were open-ended, seeking further input on the feasibility of using Google Glass for teacher evaluations. Six questions used Yes or No answers to identify previous knowledge of Google Glass and procedural information about evaluations. The remaining two questions used Likert scale responses to determine whether knowledge of the product and its price point affected responses. Data from the non-open-ended responses addressed assumptions about feasibility concerns, while the open-ended questions allowed administrators to identify other areas that need to be addressed.

Results

Yes or No Results

All 26 participants responded to all six of the questions that featured Yes or No as the primary answering choices. All 26 participants indicated that their current district does not videotape teacher evaluations. Five participants (19.2%) indicated that their school district would consider videotaping teacher evaluations in the future, six (23.1%) indicated that their district would not, and 15 (57.7%) responded maybe. Eighteen participants (69.2%) indicated videotaping teacher evaluations would encourage self-reflection, one (3.9%) indicated it would not, and seven (26.9%) were uncertain whether it would. Eight participants (30.8%) expressed that enough time is devoted to teacher evaluations, while 18 (69.2%) suggested not enough time is devoted to them. All participants indicated they were trained in the evaluation procedures their district uses. When explicitly asked if Google Glass would be effective for analyzing teacher evaluations with the observed educator, 13 participants (50%) answered Yes, 2 (7.7%) answered No, and 11 (42.3%) were unsure.

Likert Scale Results

All 26 respondents answered both Likert scale questions. When asked about their familiarity with Google Glass prior to this survey, 8 participants (30.8%) reported not being familiar at all with Google Glass, 11 (42.3%) reported mild familiarity, and 7 (26.9%) reported being fairly familiar; no respondent reported being quite familiar or extremely familiar with Google Glass. Participants were also asked about the practicality of the price point for their district, with the most recent price point of $1,500 identified. No participant strongly agreed that the price point would be practical for their school district. One participant (3.9%) agreed the price point was practical, 10 (38.5%) were undecided, 8 (30.8%) disagreed, and 7 (26.9%) strongly disagreed that the identified price point was practical for their school district.

Open-Ended Results

Twenty-one of the 26 participants responded to the open-ended question regarding their view of feasibility; three of the responses were deemed insufficient to address the question. Four participants suggested that Google Glass would either be ineffective, a distraction, or make staff and students uncomfortable. Six participants expressed concern over union approval or administrative overload. Three participants discussed how Google Glass has the potential to improve self-reflection. One participant suggested that the technology is only as effective as the educator using it, but that it should be used if it benefited student achievement. One participant expressed concerns over teachers approving the recording, one over student confidentiality, and one over the cost of Google Glass. One participant suggested piloting Google Glass teacher evaluations with a few select teachers.

Fourteen of the 26 participants chose to provide additional comments pertaining to the feasibility of using Google Glass for improving teacher evaluations; three of these responses were deemed insufficient to address the question posed. Three participants identified that there is more to a teacher evaluation than the observation itself. Two participants expressed a need for the Google product to be brought in line with evaluation practices; one specifically suggested a Google Form for collecting observation data in the evaluation process. Three participants expressed that this tool would be more appropriate for teacher self-reflection or expressed a lack of understanding. Two participants identified that trust between the administration and teachers must be developed after the teachers’ association approves usage. One participant expressed concerns over the quality of video and cost of Google Glass.

Discussion

The results indicate that Google Glass, in its current iteration, was not viewed as a feasible tool for teacher evaluations. Respondents identified issues with using Google Glass related to video quality and cost. Video quality lagging behind audio quality was found to be an issue previously (Paro et al., 2015), and video quality is especially important if recordings will be used for documentation purposes. While half of the respondents seemed open to using Google Glass for analyzing teacher evaluations, they suggested that self-reflection may be a better purpose for this tool. Respondents were not informed about the video quality of Google Glass but were informed of the price point of $1,500.

Despite a lack of familiarity with Google Glass, the participants were able to identify other concerns, such as potential distractions and approval to use the device. Distractions were identified because educators and students may feel uncomfortable with the device being used in the classroom. Participants identified that teachers must feel comfortable when an administrator enters the classroom to complete an evaluation. A previous study (Zarraonandia et al., 2019) suggested that distractions to students and teachers were not a concern, though employment concerns were not considered in that study. Participants identified the need for teachers, students, and the teachers’ union to sign off on the usage and recordings. Participants identified that the recordings could be used to terminate an educator, which the union would oppose. Therefore, knowledge of and consent to recording are necessary (Muensterer et al., 2014). These parties need to be knowledgeable about Google Glass, and policies regarding how Google Glass would be used for teacher evaluations would need to be defined and approved. Policies and penalties for misuse would also need to be identified.

Two other issues mentioned were administrator overload and the need for a tool more in line with current teacher evaluation processes. Administrator overload was often mentioned because the teacher evaluation includes more than just the observation within the classroom. Adding a Google Glass recording of the observation to the evaluation process seemed to be an additional burden; participants identified a lack of time devoted to evaluations as they are presently conducted. The responses suggest that something else would need to be eliminated if this tool were to be used. Google Glass would be able to record an entire lecture (Ebner et al., 2016) but might put more work on administrators, and teacher evaluations already put administrators in a difficult position (Wilson & Wood, 1996). Participants believed the tool would be more feasible if it were in line with current teacher evaluation practices. A participant identified that Google could create a form to collect data related to teacher evaluations. Google has several educational products, and this could influence future use of Google Glass in education. If tools were created to make it easier to use Google Glass for teacher evaluation purposes, administrators and teachers might agree on its usage.

Privacy concerns were also mentioned, which reflects similar studies (Ebner et al., 2016; Halpern & Caron, 2016; Lee et al., 2015; Muensterer et al., 2014). However, previous studies have identified other issues associated with Google Glass that respondents did not mention. Adapa et al. (2018) examined Google Glass and other smart wearable devices through the lens of the Technology Acceptance Model. Functionality, compatibility, and battery life were among the factors influencing perceived usefulness (Adapa et al., 2018, p. 405), while efficiency, time saving, and dependability were among the reasons to adopt smart wearable technologies (Adapa et al., 2018, p. 407). Ease of use was not found to be a reason for or against using Google Glass (Adapa et al., 2018). The following issues have been identified in previous studies of Google Glass: communication delay (Martinez-Millana et al., 2016), limited battery power (Adapa et al., 2018; Muensterer et al., 2014), synchronization of data to the web causing a potential data breach (Muensterer et al., 2014), longevity of recording time (Paro et al., 2015), longer instruction time than other technology such as iPads (Parton, 2017), and data storage and data display issues (Lee et al., 2015).

Limitations

Limitations of the Study

The limited number of participants (26) made it difficult to generalize any of the results. The responses also suggest a flaw in the group identified as appropriate for addressing this question. Since a teachers’ union would have to approve the use of Google Glass for teacher evaluation purposes, not including union members as potential participants limited the study’s effectiveness; members of a teachers’ union could have expressed further considerations beyond approval, cost, and lack of knowledge of Google Glass. The small sample also restricted survey responses to areas already mentioned within the survey tool. Because participants had limited knowledge of Google Glass prior to completing the survey, responses focused on educational enforcement rather than on whether the tool would be effective in teacher evaluations. Additionally, providing participants a clearer explanation of Google Glass prior to the study would have helped, similar to issues identified by Zarraonandia et al. (2019).

Limitations of Google Glass

The primary limitation of Google Glass is that it has been taken off the market. Another limitation, expressed in a survey response, is the quality of the video Google Glass can capture. Google Glass was designed to record video clips only seconds in duration, and a participant suggested it would be more cost-efficient to take a video on a cell phone. The majority of respondents cited the cost of Google Glass as a roadblock to usage. If Google Glass were integrated with Google’s educational initiatives, such as GAFE (Google Apps for Education), or, as suggested, featured a form administrators could use to collect observation data, it might be better received.

Conclusion

Based on the results of this study, Google Glass is not a feasible tool for teacher evaluations in its current iteration. Participants identified a lack of time devoted to teacher evaluations and a lack of district interest in videotaping them; this venture would require both time and videotaping, which respondents suggest is not practical. Participants expressed a lack of familiarity with Google Glass but indicated the tool would be more appropriate for teacher self-reflection, given approval issues, its potential as a distraction, administrative overload, and a weak connection to the teacher evaluation process. A pilot program with selected teachers could produce usable data on the effectiveness of Google Glass for teacher evaluations. Video quality and cost were concerns that need to be addressed in future iterations of Google Glass, and a larger participant pool is needed to generalize these findings. Dafoulas et al. (2018) used Google Glass to provide descriptive feedback to students, using pictures, on in-class presentations, which may serve as a potential starting point for any pilot program. Ebner et al. (2016) similarly found Google Glass to be an effective tool for instant feedback.

References

Adapa, A., Nah, F. F. H., Hall, R. H., Siau, K., & Smith, S. N. (2018). Factors influencing the adoption of smart wearable devices. International Journal of Human–Computer Interaction, 34(5), 399-409.

Dafoulas, G., Cardoso Maia, C., & Tsiakara, A. (2018). Google Glass as a learning tool: Sharing evaluation results for the role of optical head mounted displays in education.

Darling-Hammond, L., Amrein-Beardsley, A., Haertel, E., & Rothstein, J. (2012). Evaluating Teacher Evaluation: Popular Modes of Evaluating Teachers Are Fraught with Inaccuracies and Inconsistencies, but the Field Has Identified Better Approaches. Phi Delta Kappan, 93(6), 8-15.

Ebner, M., Mühlburger, H., & Ebner, M. (2016). Google Glass in face-to-face lectures: Prototype and first experiences. International Journal of Interactive Mobile Technologies, 10(1).

Gorham, J. K. (2017). Video-based reflection of beginning teachers: An induction strategy to promote teaching efficacy (Doctoral dissertation, Johns Hopkins University).

Hallinger, P., Heck, R., & Murphy, J. (2014). Teacher evaluation and school improvement: An analysis of the evidence. Educational Assessment, Evaluation and Accountability, 26(1), 5-28.

Halpern Jelin, D. M., & Caron, A. (2016). To Glass or Not to Glass: A Study on Attitudes towards a Wearable Device (Google Glass).

Hill, H., & Grossman, P. (2013). Learning from Teacher Observations: Challenges and Opportunities Posed by New Teacher Evaluation Systems. Harvard Educational Review, 83(2), 371-384.

Knapp, S., Gottlieb, M. C., & Handelsman, M. M. (2017). Enhancing professionalism through self-reflection. Professional Psychology: Research and Practice, 48(3), 167.

Knight, H. M., Gajendragadkar, P. R., & Bokhari, A. (2015). Wearable technology: Using Google Glass as a teaching tool. Case Reports, 2015, bcr2014208768.

Lee, V. R., Drake, J., & Williamson, K. (2015). Let’s get physical: K-12 students using wearable devices to obtain and learn about data from physical activities. TechTrends, 59(4), 46-53.

Martinez-Millana, A., Bayo-Monton, J. L., Lizondo, A., Fernandez-Llatas, C., & Traver, V. (2016). Evaluation of Google Glass technical limitations on their integration in medical systems. Sensors, 16(12), 2142.

Mayo, R. (1997). Trends in Teacher Evaluation. Clearing House, 70(5), 269-270.

McNally, H. (1977). Performance-based teacher evaluation. NASSP Bulletin, 61(411), 104-105.

Muensterer, O. J., Lacher, M., Zoeller, C., Bronstein, M., & Kübler, J. (2014). Google Glass in pediatric surgery: An exploratory study. International Journal of Surgery, 12(4), 281-289.

Nagro, S. A., deBettencourt, L. U., Rosenberg, M. S., Carran, D. T., & Weiss, M. P. (2017). The effects of guided video analysis on teacher candidates’ reflective ability and instructional skills. Teacher Education and Special Education, 40(1), 7-25.

Paro, J. A., Nazareli, R., Gurjala, A., Berger, A., & Lee, G. K. (2015). Video-based self-review: Comparing Google Glass and GoPro technologies. Annals of Plastic Surgery, 74, S71-S74.

Parslow, G. R. (2014). Commentary: Google Glass: A head-up display to facilitate teaching and learning. Biochemistry and Molecular Biology Education, 42(1), 91-92.

Parton, B. S. (2017). Glass vision 3D: Digital discovery for the deaf. TechTrends, 61(2), 141-146.

Pickering, L., & Walsh, E. (2011). Using Videoconferencing Technology to Enhance Classroom Observation Methodology for the Instruction of Preservice Early Childhood Professionals. Journal of Digital Learning in Teacher Education, 27(3), 99-108.

Ritchie, S. M. (2016). Self-assessment of video-recorded presentations: Does it improve skills? Active Learning in Higher Education, 17(3), 207-221.

Savage, J. (1982). Options for Administrators Teacher Evaluation Without Classroom Observation. NASSP Bulletin, 66(458), 41-45.

Selmes, C. (1986). Teacher Evaluation in the Classroom. Educational Management Administration & Leadership, 14(3), 191-196.

Taylor, E., & Tyler, J. (2012). Can Teacher Evaluation Improve Teaching? Evidence of Systematic Growth in the Effectiveness of Midcareer Teachers. Education Next, 12(4), 79-84.

Thomas, D. (1974). The Principal and Teacher Evaluation. NASSP Bulletin, 58(386), 1-7.

Wilson, B., & Wood, J. (1996). Teacher evaluation: A national dilemma. Journal of Personnel Evaluation in Education, 10(1), 75-82.

Withers, G. (1994). Getting value from teacher evaluation. Journal of Personnel Evaluation in Education, 8(2), 185-194.

Zarraonandia, T., Díaz, P., Montero, Á., Aedo, I., & Onorati, T. (2019). Using a Google Glass-based classroom feedback system to improve students to teacher communication. IEEE Access, 7, 16837-16846.
