Educational Leadership and Policy Studies


Archives for 2023

Learning to Learn New Research Methods: How Watching YouTube Helped Me Complete My First Client-Facing Project

December 15, 2023 by jhall152

By Austin Boyd

Every measurement, evaluation, statistics, and assessment (MESA) professional has their own “bag of tricks” to help them get the job done: the go-to set of evaluation, statistical, and methodological skills and tools they are most comfortable applying. For many, these are the skills and tools they were taught directly while obtaining their MESA degrees. But what do we do when we need new tools and methodologies that we weren’t taught directly by a professor?

My name is Austin Boyd, and I am a researcher, instructor, UTK ESM alumnus, and, most importantly, a lifelong learner. I have had the opportunity to work on projects in several different research areas, including psychometrics, parasocial relationships, quality in higher education, and social network analysis. I seek out opportunities to learn about new areas of research while applying my MESA skill set wherever I can. My drive to enter new research areas often leads me to realize that, while I feel confident in the MESA skills and tools I currently possess, these are only a fraction of what I could be using in a given project. This leaves me two options: 1) use a method that I am comfortable with but that might not be the perfect choice for the project; or 2) learn a new method that fits the needs of the project. Obviously, we have to choose option 2, but where do we even start learning a new research method?

In my first year of graduate school, I took on an evaluation client who had recently learned about Social Network Analysis (SNA), a method of visually displaying the social structure between social objects in terms of their relationships (Tichy & Fombrun, 1979). The client had decided that this new analysis would revolutionize the way they looked at their professional development attendance but had no idea how to use it. This is where I came in, a new and excited PhD student, ready to take on the challenge. Except SNA wasn’t something we would be covering in class. In fact, it wasn’t covered in any of the classes I could take. I had to begin teaching myself something I had only just heard of. This is where I learned two of the best starting points for any new researcher: Google and YouTube.

Although they aren’t the most conventional starting points for learning, you would be surprised how convenient they can be. I could have begun by looking in the literature for articles or textbooks that covered SNA. However, I didn’t have time to work through an entire textbook on the topic in addition to my normal coursework, and most of the articles I found were applied research, far above my understanding at the time. What I needed was an entry point that began with the basics of conducting an SNA. Google, unlike the journal articles, took me to several websites covering the basics of SNA and even led me to free online trainings for beginners. YouTube supplemented this knowledge with step-by-step video instructions on how to conduct my own SNA, both in software I was already proficient in and in Gephi (Bastian, Heymann, & Jacomy, 2009), a software package designed specifically for this analysis. For examples of these friendly starting points, see the SNA resources below.

[Image: Marvel social network web]

 

These videos and websites weren’t perfect, and they certainly weren’t what I ended up citing in my final report to my client, but they were a starting point: a stepping stone that got me to a place where reading the literature didn’t leave me confused, frustrated, and scared that I would have to abandon the project. This allowed me to successfully complete my first client-facing research project, and the client was equally thrilled with the results. Eventually, I became comfortable enough to see areas for improvement in the literature, leading me to author my own paper presenting a function that reformats data for use in one- and two-mode undirected social network analysis (Boyd & Rocconi, 2021). I’ve even used my free time to apply what I learned for fun, creating social networks for the Marvel Cinematic Universe and the Pokémon game franchise (see the Marvel image above).
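To give a flavor of what that kind of reformatting accomplishes, here is a minimal sketch in R using the igraph package. This is an illustration under my own assumptions (invented attendance records, and igraph rather than the tooling used in the paper): it builds a two-mode person-by-event network and projects it onto a one-mode network in which two people are tied when they attended the same event.

```
# Minimal two-mode SNA sketch with igraph (install.packages("igraph")).
# The attendance records below are invented for illustration.
library(igraph)

attendance <- data.frame(
  person = c("Ana", "Ana", "Ben", "Ben", "Cam"),
  event  = c("W1",  "W2",  "W1",  "W3",  "W2")
)

g <- graph_from_data_frame(attendance, directed = FALSE)
V(g)$type <- V(g)$name %in% attendance$event  # FALSE = person, TRUE = event

# One-mode projection: people tied by shared events, weighted by count
people_net <- bipartite_projection(g)$proj1
E(people_net)$weight  # Ana-Ben share W1; Ana-Cam share W2
```

This is the same idea, at toy scale, as a professional development attendance network: the projection turns “who attended what” into “who is connected to whom.”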

It is unrealistic to expect to master every type of data analysis method that exists in just four years of graduate school. And even if we could, the field continues to expand every day, with new methods, tools, and programs being added to aid in conducting research. This requires us all to be lifelong learners who aren’t afraid to learn new skills, even if it means starting by watching some YouTube videos.

 

References

Bastian, M., Heymann, S., & Jacomy, M. (2009). Gephi: An open source software for exploring and manipulating networks. International AAAI Conference on Weblogs and Social Media.

Boyd, A. T., & Rocconi, L. M. (2021). Formatting data for one and two mode undirected social network analysis. Practical Assessment, Research & Evaluation, 26(24). Available online: https://scholarworks.umass.edu/pare/vol26/iss1/24/  

Tichy, N., & Fombrun, C. (1979). Network analysis in organizational settings. Human Relations, 32(11), 923–965. https://doi.org/10.1177/001872677903201103

SNA Resources 

Aggarwal, C. C. (2011). An introduction to social network data analytics. In Social Network Data Analytics. Springer, Boston, MA.

Yang, S., Keller, F., & Zheng, L. (2017). Social network analysis: methods and examples. Los Angeles: Sage. 

https://visiblenetworklabs.com/guides/social-network-analysis-101/ 

https://github.com/gephi/gephi/wiki 

https://towardsdatascience.com/network-analysis-d734cd7270f8 

https://virtualitics.com/resources/a-beginners-guide-to-network-analysis/ 

https://ladal.edu.au/net.html 

Videos 

https://www.youtube.com/watch?v=xnX555j2sI8&ab_channel=DataCamp 

https://www.youtube.com/playlist?list=PLvRW_kd75IZuhy5AJE8GUyoV2aDl1o649 

https://www.youtube.com/watch?v=PT99WF1VEws&ab_channel=AlexandraOtt 

https://www.youtube.com/playlist?list=PL4iQXwvEG8CQSy4T1Z3cJZunvPtQp4dRy 

 

Filed Under: Evaluation Methodology Blog

Make Your Feedback to Teachers Matter: Leadership Content Knowledge is Key

December 12, 2023 by jhall152

By Dr. Mary Lynne Derrington & Dr. Alyson Lavigne 

Please Note: This is Part 1 of a four-part series on actionable feedback. Stay tuned for the next posts that will focus on Leadership Content Knowledge (LCK) and teacher feedback in the areas of STEM, Literacy, and Early Childhood Education.

The most important job of a school leader is to focus on the central purpose of schools—teaching and learning. Feedback to teachers on how to improve instructional practice is a critical element in promoting school success.

On average, principals spend 9 hours a week observing, providing feedback, and discussing instruction with teachers. Including documentation, this equates to nearly six 40-hour work-weeks and as much as 25% of a principal’s time.

Besides consuming principals’ time, these tasks are costly: observing all 3.1 million K-12 public school teachers just twice a year costs $700 million annually. All these efforts rest on the belief that, when school leaders observe teachers, they provide teachers with meaningful feedback, and that this feedback, in turn, improves teaching and learning.

So, how does a school leader ensure that their feedback impacts practice? Feedback only matters when it can be acted upon, so what makes feedback actionable? We can all agree that for feedback to be actionable it must be timely, concrete, and clear. But it must also relate to the task at hand—teaching subject matter content.

When researchers ask teachers about the feedback they receive from school leaders, half report that the feedback from principals is not useful. Teachers say that they rarely receive feedback about their teaching content. Yet we know that pedagogical content knowledge is important for effective teaching and for student learning.

If you want to make your feedback to teachers matter, emphasize the teacher’s curriculum subject matter content as part of your feedback. This requires differentiating for each teacher by subject matter and classroom context. Differentiation personalizes the feedback and emphasizes that the subject, content, and context of the classroom matter.

How can school leaders meet this lofty goal and possess expertise in every content area? First, a strong background in effective teaching practices is an important start. Second, leaders need deep knowledge of a subject, how students learn it, and how it is taught; building this depth one subject at a time is sometimes referred to as post-holing.

Principals can gain content expertise in many ways. For example:

  • Work with a content PLC team
  • Learn the standards for the subject
  • Review discipline-specific best practice research
  • Attend a subject-specific conference

Post-holing provides a great opportunity to align with other activities that might be occurring in the school, and it demonstrates that you care about the subject matter and the teacher by providing deeper, differentiated feedback. Challenge yourself to tackle one subject each year.

This blog entry is part of a four-part series on actionable feedback. Stay tuned for our next three posts, which will focus on Leadership Content Knowledge (LCK) and concrete ways to provide feedback to teachers in the areas of STEM, Literacy, and Early Childhood Education.

If you want to dig into this content (pun intended!) a bit more, check out our book, Actionable Feedback to PK-12 Teachers. And for other suggestions on differentiated feedback, see Chapter 3 by Ellie Drago-Severson and Jessica Blum-DeStefano.

Filed Under: News

The What, When, Why, and How of Formative Evaluation of Instruction

December 1, 2023 by jhall152

By M. Andrew Young

Hello! My name is M. Andrew Young. I am a second-year Ph.D. student in the Evaluation, Statistics, and Methodology program here at UT-Knoxville. I currently work in higher education assessment as the Director of Assessment at East Tennessee State University’s College of Pharmacy, where I am frequently called upon to conduct classroom assessments.

Higher education assessment often relies on end-of-course summative evaluation of instruction, commonly known as course evaluations, summative assessment of instruction (SAI), or summative evaluation of instruction (SEI), among other titles. At my institution, the purpose of summative evaluation of instruction centers primarily on evaluating faculty for tenure, promotion, and retention. What if there were a more student-centered approach to classroom evaluation feedback, one that benefits not only students in future classes (as summative assessment does) but also students currently enrolled in the class? Enter formative evaluation of instruction (FEI).

 

What is FEI? 

FEI, sometimes referred to as midterm evaluation, entails seeking feedback from students prior to the semester midpoint so that mid-stream changes can address each cohort’s individual learning needs. Collecting meaningful and actionable FEI can prove challenging. Faculty may prefer not to participate in formative evaluation because they do not find the student feedback actionable, or because they do not value student input. Furthermore, there is little direction on how to collect this feedback and use it for continual quality improvement in the classroom. While there is a great deal of literature on summative evaluation of teaching, there seems to be a dearth of research on best practices for formative evaluation of teaching. The few articles I have been able to find offer the suggestions for FEI covered later in this post.

 

When Should We Use FEI? 

In my opinion, every classroom can benefit from formative evaluation. When to administer it is as much an art as a science: timing is everything, and the results can differ greatly depending on when the evaluation is administered. In my time as a Director of Assessment, I have found that the most meaningful feedback can be gathered in the first half of the semester, directly after a major assessment, when students have a better understanding of their comprehension of the material and of the effectiveness of the classroom instruction. There is very little literature to support this, so it is purely anecdotal. None of the resources I have found prescribe precisely when FEI should be conducted, but the name implies that feedback should be sought at or around the semester midpoint.

 

Why Should We Conduct FEI? 

FEI Can Help:

  • Improve student satisfaction as reflected in summative evaluation of instruction (Snooks et al., 2007; Veeck et al., 2016)
  • Make substantive changes to the classroom experience, including textbooks, examinations/assessments of learning, and instructional methods (Snooks et al., 2007; Taylor et al., 2020)
  • Strengthen teaching and improve rapport between students and faculty (Snooks et al., 2007; Taylor et al., 2020)
  • Support faculty development, including promotion and tenure (Taylor et al., 2020; Veeck et al., 2016), and encourage active learning (Taylor et al., 2020)
  • Bolster communication of expectations in a reciprocal relationship between instructor and student (Snooks et al., 2007; Taylor et al., 2020)

 

How Should We Administer the FEI? 

Research offers a wide variety of suggested practices, including, but not limited to: involving a facilitator for the formative evaluation, asking open-ended questions, using no more than ten minutes of classroom time, keeping it anonymous, and keeping it short (Holt & Moore, 1992; Snooks et al., 2007; Taylor et al., 2020), and even having students work in groups or in conferences to provide the feedback (Fluckiger et al., 2010; Veeck et al., 2016).

Hanover Research (2022) concluded that formative evaluation should include three elements: a 7-point Likert-scale question asking how the course is going for the student, followed by an open-ended question explaining the rating; the “Keep, Stop, Start” model with open-ended response questions; and, finally, open-ended questions that allow students to suggest changes and provide additional feedback on the course and/or instructor. The “Keep, Stop, Start” model asks students what they would like the instructor to keep doing, stop doing, and/or start doing.

In the College of Pharmacy, we use the method Hanover presented: we ask students to self-evaluate how well they feel they are doing in the class and then explain their rating in an open-ended, free-response field. The purpose is to collect and analyze the themes associated with the different levels of the rating (Best Practices in Designing Course Evaluations, 2022). This has been in practice at the college for only the past academic year, but anecdotally, faculty report that the data collected have generally been more actionable. Like all evaluations, it is not a perfect system, and some of the data are not actionable, but in our college FEI is an integral part of indirect classroom assessment.

The most important step, however, is to close the feedback loop in a timely manner (Fluckiger et al., 2010; Taylor et al., 2020; Veeck et al., 2016). For our purposes, closing the feedback loop means asking the course coordinator to respond to the FEI feedback, usually within a week, detailing what changes, if any, will be made in the classroom and learning environment. Obviously, not all feedback is actionable, and in some cases best practices in the literature conflict with the suggestions made, but it is important for students to know what can be changed, what cannot or will not be changed, and why.
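Because closing the loop within a week depends on quickly summarizing responses, here is a minimal sketch in R of tabulating “Keep, Stop, Start” comments against the self-rating question. The data frame and its column names are invented for illustration; this is not our actual instrument or data.

```
# Toy FEI responses: a 7-point self-rating plus a "Keep, Stop, Start"
# bucket and an open comment (all invented for illustration).
fei <- data.frame(
  rating  = c(6, 3, 5, 2, 7, 4),
  bucket  = c("Keep", "Stop", "Keep", "Start", "Keep", "Stop"),
  comment = c("worked examples", "cold-calling", "practice quizzes",
              "posting slides earlier", "case discussions", "pop quizzes")
)

# Do certain suggestions cluster among students rating themselves low (< 4)?
table(fei$bucket, struggling = fei$rating < 4)

# Pull one theme's comments to read and hand-code
fei$comment[fei$bucket == "Stop"]
```

A cross-tab like this is only a starting point for the thematic analysis described above, but it makes the rating-by-theme linkage concrete.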

 

What Remains? 

Some accrediting bodies, such as the Accreditation Council for Pharmacy Education (ACPE), require colleges to provide an avenue for formative student feedback as part of their standards. I believe that formative evaluation benefits students and faculty alike, and while it may be too early to make a sweeping change and require FEI at every higher education institution, there is value in educating faculty and assessment professionals about its benefits. Although outside the scope of this short blog post, adopting FEI as a common practice should be approached carefully, intentionally, and with best practices for organizational change management in mind.

Some final thoughts: to get students engaged in providing good feedback, the practice of FEI ideally has to be championed by the faculty. It could be mandated by administration, but that would likely not engender as much buy-in, and if the faculty, who are the primary touchpoints for students, aren’t sold on the practice or participate begrudgingly, the data collected will not be optimal or actionable. Students talk with each other across cohorts, and if students in upper classes have a negative opinion of the process, that will have a trickle-down effect. What is the best way to make students disengage? Don’t close the feedback loop.

 

References and Resources 

Best Practices in Designing Course Evaluations. (2022). Hanover Research. 

Fluckiger, J., Tixier, Y., Pasco, R., & Danielson, K. (2010). Formative Feedback: Involving Students as Partners in Assessment to Enhance Learning. College Teaching, 58, 136–140. https://doi.org/10.1080/87567555.2010.484031 

Holt, M. E., & Moore, A. B. (1992). Checking Halfway: The Value of Midterm Course Evaluation. Evaluation Practice, 13(1), 47–50. 

Snooks, M. K., Neeley, S. E., & Revere, L. (2007). Midterm Student Feedback: Results of a Pilot Study. Journal on Excellence in College Teaching, 18(3), 55–73. 

Taylor, R. L., Knorr, K., Ogrodnik, M., & Sinclair, P. (2020). Seven principles for good practice in midterm student feedback. International Journal for Academic Development, 25(4), 350–362. 

Veeck, A., O’Reilly, K., MacMillan, A., & Yu, H. (2016). The Use of Collaborative Midterm Student Evaluations to Provide Actionable Results. Journal of Marketing Education, 38(3), 157–169. https://doi.org/10.1177/0273475315619652 

 

Filed Under: Evaluation Methodology Blog

Evaluation in the Age of Emerging Technologies

November 15, 2023 by jhall152

By Richard Amoako

Greetings! My name is Richard Dickson Amoako. I am a second-year Ph.D. student in Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. My research interests include program evaluation, impact evaluation, higher education assessment, and emerging technologies in evaluation.

As a lover of technology and technological innovation, I am intrigued by technological advancements in all spheres of our lives. The most recent is the rapid advancement of artificial intelligence (AI) and machine learning (ML). As an emerging evaluator, I am interested in learning about the implications of these technologies for evaluation practice.

In this blog post, I explore the implications of these technologies for evaluation: the relevant technologies, how they can change the conduct of evaluation, the benefits and opportunities they present for evaluators, and the challenges and issues that come with their use.

 

Relevant Emerging Technologies for Evaluation 

Emerging technologies are new and innovative tools, techniques, and platforms that can transform the evaluation profession. These technologies can broadly be categorized into four groups: data collection and management tools, data visualization and reporting tools, data analysis and modeling tools, and digital and mobile tools. Three of the most popular emerging technologies relevant to evaluation are artificial intelligence, machine learning, and big data analytics. Here is how they are changing the work:

  • Data collection and analysis: AI and ML can help evaluators analyze data faster and more accurately, and they can identify patterns and trends that may not be apparent to the naked eye (see the short sketch after this list). Emerging technologies have also led to new data collection methods, such as crowdsourcing, social media monitoring, and web analytics, which give evaluators access to a wider range of data sources and more comprehensive, diverse data.
  • Increased access to data: Social media, mobile devices, and other technologies have made it easier to collect data from a wider range of sources. This can help evaluators gather more diverse perspectives and ideas. 
  • Improved collaboration: Evaluators can collaborate more effectively with the help of video conferencing, online collaboration platforms, and project management software, regardless of where they are located. 
  • Improved visualization: Evaluators can present their findings in a more engaging and understandable way by using emerging technologies like data visualization software and virtual reality. 
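As flagged in the first bullet, here is a minimal sketch of the pattern-finding idea in R, using k-means clustering as a stand-in for the many ML techniques available. Everything here (the simulated ratings and the column names) is invented for illustration.

```
# Sketch: k-means clustering surfaces respondent groups that might be
# missed when eyeballing a large file. Data are simulated.
set.seed(42)
responses <- data.frame(
  satisfaction = c(rnorm(20, mean = 4.5, sd = 0.4),
                   rnorm(20, mean = 2.0, sd = 0.4)),
  engagement   = c(rnorm(20, mean = 4.0, sd = 0.5),
                   rnorm(20, mean = 2.5, sd = 0.5))
)

clusters <- kmeans(scale(responses), centers = 2)

# Average profile of each discovered group
aggregate(responses, by = list(cluster = clusters$cluster), FUN = mean)
```

The same workflow scales to many more variables and respondents, which is where an algorithm genuinely outperforms the naked eye.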

 

Challenges and Issues Associated with Emerging Technologies in Evaluation 

While emerging technologies offer many exciting opportunities for evaluators, they also come with challenges. One of the main challenges is keeping up to date with the latest technologies and trends. Evaluators should have a solid understanding of the technologies they use, as well as the limitations and potential biases associated with those technologies. In some cases, emerging technologies can be expensive or require specialized equipment, which can be a barrier for evaluators with limited resources. 

Another challenge is the need to ensure emerging technologies are used ethically and responsibly. As the use of emerging technologies in evaluation becomes more widespread, there is a risk that evaluators may inadvertently compromise the privacy and security of program participants. In addition, they may inadvertently misuse data. To address these challenges, our profession needs to develop clear guidelines and best practices for using these technologies in evaluation. 

To conclude, emerging technologies are revolutionizing the evaluation landscape, opening new opportunities for evaluators to collect, analyze, and use data. With artificial intelligence and machine learning, as well as real-time monitoring and feedback, emerging technologies are changing evaluation and increasing the potential for action-based research. However, as with any advancing technology, there are also challenges to resolve. Evaluators must keep up to date with the latest technologies and develop clear guidelines and best practices. They must also ensure that these technologies are used ethically and responsibly. 

 

Resources 

Adlakha D. (2017). Quantifying the modern city: Emerging technologies and big data for active living research. Frontiers in Public Health, 5, 105. https://doi.org/10.3389/fpubh.2017.00105 

Borgo, R., Micallef, L., Bach, B., McGee, F., & Lee, B. (2018). Information visualization evaluation using crowdsourcing. STAR – State of The Art Report, 37(7). Available at: https://www.microsoft.com/en-us/research/uploads/prod/2018/05/InfoVis-Crowdsourcing-CGF2018.pdf

Dimitriadou, E., & Lanitis, A. (2023). A critical evaluation, challenges, and future perspectives of using artificial intelligence and emerging technologies in smart classrooms. Smart Learning Environments, 10, 12. https://doi.org/10.1186/s40561-023-00231-3

Huda, M., Maseleno, A., Atmotiyoso, P., Siregar, M., Ahmad, R., Jasmi, K. A., & Muhamad, N. H. N. (2018). Big data emerging technology: Insights into innovative environment for online learning resources. International Journal of Emerging Technologies in Learning (iJET), 13(01), 23–36. https://doi.org/10.3991/ijet.v13i01.6990

Jurafsky, D., & Martin, J. H. (2009). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall. 

World Health Organization. (2016). Monitoring and evaluating digital health interventions: A practical guide to conducting research and assessment. WHO Press. Available at:  https://saluddigital.com/wp-content/uploads/2019/06/WHO.-Monitoring-and-Evaluating-Digital-Health-Interventions.pdf 

Filed Under: Evaluation Methodology Blog

Martinez Coaching Ugandan Olympian in 2024 Paris Games

November 10, 2023 by jhall152

Kathleen Noble, a 2020 Olympic singles rower from Uganda, is being coached by Dr. James Martinez, an assistant professor in the Department of Educational Leadership & Policy Studies (ELPS). Dr. Martinez, himself a five-time U.S. national and Olympic team member between 1993 and 1998, began working with Mrs. Noble this past July, after she moved to Knoxville with her husband, Nico.

“Kathleen is an exceptionally competitive athlete, and an even better person,” says Dr. Martinez. A 28-year-old graduate of Princeton University, Mrs. Noble was an internationally competitive youth swimmer, having competed at the 2012 Short-Course World Championships in Istanbul. Holder of many Ugandan national records in freestyle and butterfly events, she started rowing as a walk-on athlete in her sophomore year of college and ultimately competed at the 2019 Under-23 World Rowing Championships.

Competing for Uganda in the 2020 Olympics (held in 2021 due to COVID), Mrs. Noble became the first rower ever to compete for her country. “Kathleen is a world-class athlete in every sense of the word,” says Martinez. “Her passion to understand every aspect of the sport, from racing, to nutrition, to training, to rigging the boat, is inspiring.” Dr. Martinez and Mrs. Noble recently returned from the African Olympic Qualification Regatta in Tunisia, where she placed fourth among fifteen women single scullers, qualifying her for the Paris Games.

Dr. Martinez balances his UTK research (focused on school administrator self-efficacy), teaching and service demands, and family responsibilities while supervising Mrs. Noble’s preparation for the Olympics. “Days are pretty full,” he says, “but no more so than when I was a schoolteacher and in training myself while raising our young children back in the day.”

Dr. Martinez credits his wife, Elizabeth, who earned her master’s degree from the University of Tennessee, Knoxville’s School of Landscape Architecture, for her incredible support. “She’s the glue that holds it all together,” he says.

Filed Under: News

So, You Want to Be a Higher Education Assessment Professional? What Skills and Dispositions are Essential?

November 1, 2023 by jhall152

By Jennifer Ann Morrow, PhD.

What does it take to be a competent higher education assessment professional? What skills and dispositions are needed to be successful in this field? I get asked this question a lot by my students, and while my go-to answer is often “it depends,” that answer does not suffice in preparing them for this career path. So, to give them a more comprehensive answer, I went to the literature.

Although I have been teaching emerging assessment and evaluation professionals for the past 22 years, and have at various times coordinated both a Ph.D. program and a certificate program in Evaluation, Statistics, and Methodology, I didn’t want to rely on just what our curriculum covers to answer their question. We educate students with diverse career paths (e.g., assessment professional, evaluator, faculty, data analyst, psychometrician), so our curriculum touches on skills and dispositions across a variety of careers. Therefore, I delved deeper into the literature to give my students a more focused answer for their chosen career path.

Guess what I found: “it depends!” There was little consistency or agreement within our field as to which competencies are essential for a higher education assessment professional. Depending on whom you asked and what source you read, the answer was different. While some sources touched on needed knowledge and skills, very few discussed the dispositions essential to our professional practice. My curious mind was racing, and after long discussions and literature review with one of my fabulous graduate students, Nikki Christen, we began compiling lists of needed skills and dispositions from the literature. We soon realized that we needed to hear from higher education assessment professionals themselves. So, a new research project was born! We brought on two other fabulous assessment colleagues, Dr. Gina Polychronopoulos and Dr. Emilie Clucas Leaderman, and developed a national survey to assess higher education assessment professionals’ perceptions of the skills and dispositions needed to be effective in their jobs. I wanted to give my students a better answer than “it depends!”

You can check out our article (https://www.rpajournal.com/dev/wp-content/uploads/2022/03/A-Snapshot-of-Needed-Skills-RPA.pdf) for detailed information on our methodology and results for this project. We had 213 higher education assessment professionals from across the country rate the importance of 92 skills and 52 dispositions for our field. I’ll briefly summarize the results here and then offer my suggestions to those who are interested in this career path. 

 

Summary of Needed Skills 

We found that the most important skills were interpersonal ones! Collaborating with others on assessment, developing collaborative relationships with stakeholders, and working with faculty on assessment projects were the highest-rated skills. One participant even stated, “assessment is about people!” Building relationships, collaboration, facilitation, and communication were all salient themes here. Other highly rated skills related to disseminating information: communicating assessment results to stakeholders, communicating assessment results in writing, and disseminating assessment results were all rated highly by higher education assessment professionals. Leadership skills were also deemed highly important. Advocating for the value of assessment, developing a culture of assessment within an organization, and facilitating change in an organization using assessment data were all seen as key skills. Project management was also rated as highly important for competence in this field: managing time, managing projects, and managing people were all highly valued. Various aspects of assessment design, developing assessment tools, data management, and engaging in ethical assessment were also highly rated. One unexpected finding was that teaching experience was mentioned by a number of assessment professionals as a needed skill in the open-ended responses (ha, the educator forgot to ask about teaching!).

 

Summary of Needed Dispositions 

Many dispositions were rated as highly important by our participants. One mentioned, “personally I feel dispositions are more vital than technical skills. You can learn the techniques but without the personality, you will have trouble motivating others!” Interpersonal dispositions such as being collaborative, honest, helpful, inclusive, and supportive were deemed highly important, and responsiveness was also highly rated. Dispositions like being a problem solver and being adaptable were found to be highly important as well. Having a consistent work approach mattered: dispositions such as trustworthy, reliable, ethical, analytical, detail-oriented, and strategic were highly rated in this category. Expression-related dispositions were also seen as important; being transparent, articulate, and professional were all highly rated. Other themes that emerged from the open-ended responses were flexibility, patience, “thick skin,” and “it depends” (seriously, I didn’t even prompt them for that response!).

 

Next Steps: Starting Your Journey as a Higher Education Assessment Professional 

So now what? Now that you have some idea of the skills and dispositions needed to succeed as a higher education assessment professional, what are your next steps? My advice is threefold: read, engage, and collaborate. Read the latest articles in the leading assessment journals (see the list below); there you will find the latest trends, the leading scholars, and suggestions for all the unanswered questions that still need to be explored in our field. Engage in learning and networking opportunities: attend conferences, webinars, and trainings (some are free!), and join a professional organization and get involved. The Association for the Assessment of Learning in Higher Education (AALHE) is one of my homes; they have always been welcoming, and I’ve made great connections by attending events and volunteering. Finally, collaborate: reach out to others in our field for advice, to discuss research interests, and to explore possible collaborations. Post a message on the ASSESS listserv asking for advice or to connect with others who have similar research interests. There are many ways to learn more about our field and get involved… just put yourself out there. Good luck on your journey!

 

References and Resources 

Christen, N., Morrow, J. A., Polychronopoulos, G. B., & Leaderman, E. C. (2023). What should be in an assessment professionals’ toolkit? Perceptions of need from the field. Intersection: A Journal at the Intersection of Assessment and Learning. https://aalhe.scholasticahq.com/article/57789-what-should-be-in-an-assessment-professionals-toolkit-perceptions-of-need-from-the-field/attachment/123962.pdf 

Gregory, D., & Eckert, E. (2014, June). Assessment essentials: Engaging a new audience (things student affairs personnel should know or learn). Paper presented at the annual Student Affairs Assessment and Research Conference, Columbus, OH. 

Hoffman, J. (2015). Perceptions of assessment competency among new student affairs professionals. Research & Practice in Assessment, 10, 46-62. https://www.rpajournal.com/dev/wp-content/uploads/2015/12/A4.pdf 

Horst, S. J., & Prendergast, C. O. (2020). The Assessment Skills Framework: A taxonomy of assessment knowledge, skills and attitudes. Research & Practice in Assessment, 15(1). https://www.rpajournal.com/dev/wp-content/uploads/2020/05/The-Assessment-Skills-Framework-RPA.pdf 

Janke, K. K., Kelley, K. A., Sweet, B. V., & Kuba, S. E. (2017). Cultivating an assessment head coach: Competencies for the assessment professional. Assessment Update, 29(6). doi:10.1002/au.30113 

Polychronopoulos, G. B., & Clucas Leaderman, E. (2019). Strengths-based assessment practice: Constructing our professional identities through reflection. NILOA Viewpoints. Retrieved from https://www.learningoutcomesassessment.org/wp-content/uploads/2019/08/Viewpoints-Polychronopoulos-Leaderman.pdf 

AALHE: https://www.aalhe.org/  

AEFIS Academy: https://www.aefisacademy.org/global-category/assessment/?global_filter=all  

Assess Listserv: https://www.aalhe.org/assess-listserv 

Assessment and Evaluation in Higher Education: https://www.tandfonline.com/journals/caeh20 

Assessment in Education: Principles, Policy & Practice: https://www.tandfonline.com/loi/caie20 

Assessment Institute: https://assessmentinstitute.iupui.edu/ 

Assessment Update: https://onlinelibrary.wiley.com/journal/15360725 

Educational Assessment, Evaluation, and Accountability: https://www.springer.com/journal/11092 

Emerging Dialogues: https://www.aalhe.org/emerging-dialogues 

Intersection: A Journal at the Intersection of Assessment and Learning: https://www.aalhe.org/intersection 

JMU Higher Education Assessment Specialist Graduate Certificate: https://www.jmu.edu/pce/programs/all/assessment/index.shtml 

Journal of Assessment and Institutional Effectiveness: https://www.psupress.org/journals/jnls_jaie.html 

Journal of Assessment in Higher Education: https://journals.flvc.org/assessment 

Online Free Assessment Course: http://studentaffairsassessment.org/online-open-course 

Practical Assessment, Research, and Evaluation: https://scholarworks.umass.edu/pare/ 

Research & Practice in Assessment: https://www.rpajournal.com/ 

Rider University Higher Education Assessment Certificate: https://www.rider.edu/academics/colleges-schools/college-education-human-services/certificates-endorsements/higher-education-assessment 

Ten Trends in Higher Education Assessment: https://weaveeducation.com/assessment-meta-trends-higher-ed/ 

Weave Assessment Resources: https://weaveeducation.com/assessment-accreditation-webinars-ebooks-guides/?topic=assessment 

 

Filed Under: Evaluation Methodology Blog

Introducing UTK ERO: Your Bridge to Education Excellence

October 24, 2023 by jhall152

By Karina Beltran

The Educational Leadership and Policy Studies Department (ELPS) within the College of Education, Health, and Human Sciences (CEHHS) is proud to announce the launch of the Education Research & Opportunity Center (UTK ERO). UTK ERO represents the merger of The Center for Education Leadership, The Postsecondary Education Research Center, and the College Access and Persistence Services Outreach Center. UTK ERO builds on the long tradition of excellence established by these prior CEHHS efforts with a renewed passion and an enhanced capacity for producing high-quality research, delivering high-impact outreach, and improving policy and practice in education.

The mission of UTK ERO is to produce high-quality research, conduct high-impact outreach, and promote effective policies and practices that increase educational success and opportunity. 

High Quality Research  

Our research spans the entire education spectrum, from early childhood to adult education, addressing critical issues in education policy and practice. At UTK ERO, we hold our research to the highest standards, making sure it is: 

  1. Relevant: We understand the importance of timely, pertinent research. Our work focuses on critical educational issues, and we strive to deliver research that matters when it’s needed most.
  2. Rigorous: Trust is paramount. All our research undergoes a rigorous internal and external review process to ensure methodological soundness, responsible data management, and freedom from errors or biases.
  3. Actionable: We bridge the gap between academic concepts and real-world impact by providing concrete action steps for policymakers, school leaders, and educators.
  4. Accessible: We believe in making knowledge accessible to all. Our research findings are disseminated through a variety of channels, including social media, website blog posts, podcasts, and practitioner-oriented venues. We present results in graphical, text, audio, and video formats.

High Impact Outreach  

As a land-grant university, the University of Tennessee takes pride in enhancing economic, social, and professional opportunities for all Tennesseans. UTK ERO manages five U.S. Department of Education TRiO outreach and student services programs, all designed to increase college access and success for first-generation, low-income students in East Tennessee and surrounding rural areas.  

Our outreach programs are guided by these core values: 

  1. Service: Our main priority and passion are to increase access to and success within postsecondary education for students from disadvantaged and historically underrepresented groups. Every student, every opportunity.
  2. Stewardship: We are committed to managing public resources and funds with the utmost responsibility, transparency, and fiscal integrity. We aim to create a safe, positive, and fair environment for our employees and the students we serve.
  3. Community: Our connection with the communities we serve is central to our success. We value engagement with and support of these communities, maintaining regular and effective contact with our project partners to provide comprehensive, community-specific support for students and communities.

Stay Connected 

For more information, please visit the UTK ERO website at ero.utk.edu. There you can follow UTK ERO on social media, receive news and updates related to our research and outreach, and follow our blog!  

Website: ero.utk.edu 

Social Media Platforms 

Instagram: https://www.instagram.com/utk_ero/ 

Meta: https://www.facebook.com/people/The-Education-Research-Opportunity-Center/100090028087658/ 

LinkedIn: https://www.linkedin.com/company/education-research-and-opportunity-center/about/ 

X: https://twitter.com/utk_ero  

Filed Under: News

White Recognized for Early Career Contributions by University Council for Educational Administration

October 18, 2023 by jhall152

Courtesy of the College of Education, Health, and Human Sciences (October 17, 2023)

Rachel White is passionate about issues of power, voice, and inclusion in education policy making and implementation. Her research focuses on whose voices are heard in policy spaces and how decisions made by political and educational leaders at the school, district, and state levels impact both teachers’ and students’ educational experiences.

[Photo: Rachel S. White]

Now, White has received a prestigious accolade early in her career: the Jack A. Culbertson Award from the University Council for Educational Administration (UCEA). Named for the organization’s long-serving director, the Culbertson Award is presented yearly to an outstanding junior professor in recognition of significant contributions to the field of educational leadership. Eligible nominees must have been professors for six years or fewer and serve at a UCEA-affiliated university.

“It’s a great honor to be selected as a recipient of the Jack A. Culbertson Award,” said White. “I’m grateful for the incredible mentors and colleagues across the nation who contributed to my nomination. It’s truly a privilege to be in this position where I’m empowered to be curious, push boundaries, listen to the voices of kids and educators, and build on my experience as a former school board member and high school cross country and track & field coach to attempt to build a body of work that can positively impact K-12 public school systems, leaders, educators, kids—and, ultimately, our democracy.”

White is an assistant professor in the department of Educational Leadership and Policy Studies (ELPS) in the University of Tennessee, Knoxville, College of Education, Health, and Human Sciences (CEHHS). She joined CEHHS in 2022.

Nominees for the Culbertson Award are selected based on the innovation, originality, and potential impact of their early body of academic work, among other criteria. White has been published in numerous journals, including Educational Administration Quarterly, Educational Evaluation and Policy Analysis, Leadership & Policy in Schools, Journal of School Leadership, Teachers College Record, and Kappan.

Beyond academic publications, White has contributed to a number of well-received stories in the general media, including Education Week, The Conversation, and EdSurge. She also strives to make her research findings accessible to broader audiences using easy-to-understand infographics.

Recently, she was named to a U.S. Department of Education advisory committee that provides advice and recommendations concerning the educational needs of the Appalachian region and how those needs can be most effectively addressed. The committee will submit a report in six months to U.S. Secretary of Education Miguel Cardona.

“It’s not lost on me that there has never been a day where I have woken up and was not excited to do this work,” said White. “This award pushes me to not let up and only fuels my passion to engage in rigorous, robust, and—most importantly—policy- and practice-relevant scholarship and outreach. As I reflect on prior award winners and the ways they have transformed the field of educational leadership, I’m humbled to be a part of that community.”

Through its eight departments and 12 centers, the UT Knoxville College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu

Filed Under: News

So I Like Statistics, Now What?

October 15, 2023 by jhall152

By Jake Working 

Whether you’ve taken a statistics class, recently read a report from your data analyst, or simply want to make data-driven decisions, something about statistics just clicked with you. But what comes next? What can you do with this newfound passion?

I’m Jake Working, a current PhD student in the Evaluation, Statistics, and Methodology program at the University of Tennessee, and I had similar questions after my first statistics class in college. In this post, I’ll discuss ways to improve your statistical skill set, how to refine your rationale for applying statistics, and an introduction to the methodology, evaluation, statistics, and assessment (MESA) fields.

Overview

1. Explore Statistics: Methods to improve your statistical skill set

2. Discover Your Motivation: Refine your rationale for statistical application

3. What is MESA? An introduction to the fields

 

Explore Statistics

Now that you have found an interest, keep learning! If you are still in college, consider a statistics minor, or simply take a few courses outside your major. As an engineering student, I was able to take additional statistics-related courses, such as business statistics and statistical methods in Six Sigma. Most institutions offer topical, statistics-based courses like these, but it is also important to consider foundational statistics courses taught in a mathematical setting, to build a basic understanding of statistical theory and methodology.

Image Credit: XKCD

Creating a foundational knowledge of statistics does not have to be expensive, though. If you aren’t currently a college student, there are endless opportunities to gain statistical knowledge for free! A popular statistical analysis program, R, is available free and open source. I recommend an interface such as RStudio or BlueSky (both also free and open source) to use with R, and a certification course to get started (such as this one offered by Johns Hopkins). In the manufacturing industry, statistical analysis related to Six Sigma or quality control would be more beneficial, and there are many options to become Six Sigma certified.
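To show how low the barrier really is, here is roughly all it takes to produce real statistics in base R once it’s installed. This is a generic illustration on a dataset that ships with R, not part of any particular course.

```
# Base R out of the box: descriptive statistics and a two-group test
# on mtcars, a dataset bundled with R (no packages required).
summary(mtcars$mpg)              # five-number summary plus the mean
sd(mtcars$mpg)                   # standard deviation of fuel economy
t.test(mpg ~ am, data = mtcars)  # Welch t-test: automatic vs. manual cars
```

Three lines, and you already have a defensible answer to a real question: do manual and automatic cars differ in fuel economy?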

 

Discover Your Motivation

Why did you initially enjoy statistics? I was drawn to multiple aspects of statistical analysis, such as data visualization and data-driven decision making, which ultimately led me to the MESA field.

At first, I was motivated by statistical reporting and data visualization techniques that distill complex but useful information into something digestible and easy to understand. While it may come naturally to some, data visualization is a learned and ever-evolving skill. If you are interested in this area, I recommend checking out Stephanie Evergreen’s Evergreen Data for data visualization checklists, best practices, and online courses!
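As a small taste of that distilling process (a generic ggplot2 sketch, not an Evergreen Data example), the few lines below turn a raw table of numbers into a labeled chart with a headline-style title:

```
# From raw numbers to a readable graphic with ggplot2
# (install.packages("ggplot2") the first time).
library(ggplot2)

ggplot(mtcars, aes(x = wt, y = mpg)) +
  geom_point() +
  geom_smooth(method = "lm", se = TRUE) +  # trend line with uncertainty band
  labs(title = "Heavier cars get fewer miles per gallon",
       x     = "Weight (1,000 lbs)",
       y     = "Miles per gallon")
```

Writing the takeaway into the title, rather than a generic “MPG vs. weight,” is one of the easy wins the data visualization checklists push for.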

Most importantly, I enjoyed being able to support any decision I made with data. This motivation allowed me to weave statistical methods for data-driven decision making into any role I held. Data-driven decision making is popular in every field because it gives you substantial rationale and evidence for creating progress. If you are like me, you enjoy the field you already work in and want to formally apply these motivations there. Enter the MESA fields.

 

What is MESA?

The interwoven fields of methodology, evaluation, statistics, and assessment (MESA) include a growing number of career opportunities for those who started with an initial passion for statistics. While you likely understand statistics, how do the other fields connect?

Methodology, in this application, relates to the systems (or methods) of gathering information related to a particular problem (Charles, 2019). It is the “how” you gather and address your question or problem. Examples of methodologies include qualitative, quantitative, and mixed methods. The Grad Coach has a great resource on defining research methodology. You can think of statistics and methodology as the tools used to conduct assessments and evaluations, the other areas of MESA.

Evaluation refers to the process of determining the merit, worth, or value of a process or the product of that process (Scriven, 1991, p. 139). One common area within this field is program evaluation, which focuses on the evaluation of program objectives and will lead to decisions regarding the program.

Assessment is often defined as “any effort to gather, analyze, and interpret evidence which describes institutional, divisional, or agency effectiveness” (Upcraft & Schuh, 1996, p. 18). The main goal of assessment is to gather information in order to improve performance. Examples of assessment include standardized tests, surveys, homework or exams, and self-reflection (Formative, 2021).

If you’d like to gain an understanding of what types of careers lie within these fields, search for jobs related to: evaluation, assessment, methodologist, data analyst, psychometrics, or research analyst.

 

References

Charles, H. (2019). Research Methodology Definition [PowerPoint slides]. SlidePlayer. https://slideplayer.com/slide/13250398/

Formative and Summative Assessments. Yale Poorvu Center for Teaching and Learning. (2021, June 30). Retrieved March 26, 2023, from https://poorvucenter.yale.edu/Formative-Summative-Assessments

Scriven, M. (1991). Evaluation Thesaurus. Sage. https://files.eric.ed.gov/fulltext/ED214952.pdf

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in Student Affairs: A Guide for Practitioners. The Jossey-Bass Higher and Adult Education Series. Jossey-Bass Inc., Publishers, 350 Sansome St., San Francisco, CA 94104.

Filed Under: Evaluation Methodology Blog

Civil Rights Leader Inspires Project Excellence Students

October 12, 2023 by jhall152

Courtesy of the College of Education, Health, and Human Sciences (October 12, 2023)

Renowned civil rights leader Dr. Harold Middlebrook recently inspired student leaders in the University of Tennessee’s Project Excellence program at Austin-East Magnet High School. Pastor Daryl Arnold, the students’ Leadership Studies instructor from the Educational Leadership and Policy Studies department in the College of Education, Health, and Human Sciences (CEHHS), invited him to speak. Dr. Middlebrook’s visit emphasized the importance of activism, leadership, and community engagement, leaving a lasting impact on the students.

Dr. Harold Middlebrook visits with UT Project Excellence Students

 

Dr. Middlebrook, a prominent figure in the civil rights movement in Tennessee, played a pivotal role in the historic Memphis sanitation workers’ strike in 1968. A close confidant of Dr. Martin Luther King Jr., he has dedicated his life to advocating for the rights of African Americans and promoting social justice. Arnold recognized the value of Dr. Middlebrook’s leadership experiences and invited him to Austin-East Magnet High School to give the students an invaluable opportunity to learn from a living legend.

UT’s Project Excellence program at Austin-East Magnet High School empowers students to become community leaders. Dr. Middlebrook’s visit aligned perfectly with this mission, as he encouraged the students to use their voices and talents for positive change. During his visit, he emphasized the importance of activism, leadership, and collaboration, sharing stories of civil rights struggles and resilience that inspired the students to continue the fight for equality. His visit served as a reminder that their voices matter and that they have a vital role in creating a more just society.

His words of wisdom and personal anecdotes resonated deeply, reminding the student leaders of their power to shape the future and of their place in a long legacy of activists and leaders.

As the student leaders carry the torch for social change, they do so with the knowledge that they are part of that legacy, just like Dr. Harold Middlebrook. His visit will undoubtedly continue to inspire them as they strive to make a positive impact in their lives and communities.

Through its eight departments and 12 centers, the UT Knoxville College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu

Filed Under: News

